I use the md5 grunt task to generate MD5 filenames. Now I want to rename the sources in the HTML file with the new filename in the callback of the task. I wonder what's the easiest way to do this.
- I wish there was a renamer and replace-in-file combination, which would both rename the files and search/replace any references to those files as well. – Brain2000 Mar 30 '19 at 01:34
- @Brain2000 I had the same need, so I created a CLI tool named **rev-web-assets** to hash the filenames and update their references. It's intended for use in **npm** scripts and is on GitHub: [rev-web-assets](https://github.com/center-key/rev-web-assets) – Dem Pilafian Oct 01 '22 at 15:48
14 Answers
You could use a simple regex:
var result = fileAsString.replace(/string to be replaced/g, 'replacement');
So...
var fs = require('fs');
fs.readFile(someFile, 'utf8', function (err,data) {
  if (err) {
    return console.log(err);
  }
  var result = data.replace(/string to be replaced/g, 'replacement');
  fs.writeFile(someFile, result, 'utf8', function (err) {
     if (err) return console.log(err);
  });
});
 
    
- Sure, but do I have to read the file, replace the text, and then write the file again, or is there an easier way? Sorry, I'm more of a frontend guy. – Andreas Köberle Jan 06 '13 at 12:50
- Maybe there is a node module to achieve this, but I'm not aware of it. Added a full example btw. – asgoth Jan 06 '13 at 20:40
- Sorry, as far as I know UTF-8 supports many languages, like Vietnamese and Chinese. – vuhung3990 Oct 19 '16 at 02:24
- If your string appears multiple times in your text, it will replace only the first string it finds. – eltongonc Feb 06 '20 at 13:31
- @eltongonc Why doesn't [the `/g` flag](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/RegExp/global) handle that? `"12131415".replace(/1/g, "0")` gives `'02030405'`, for instance. – ruffin Jan 11 '23 at 21:55
Since replace wasn't working for me, I've created a simple npm package replace-in-file to quickly replace text in one or more files. It's partially based on @asgoth's answer.
Edit (3 October 2016): The package now supports promises and globs, and the usage instructions have been updated to reflect this.
Edit (16 March 2018): The package has amassed over 100k monthly downloads now and has been extended with additional features as well as a CLI tool.
Install:
npm install replace-in-file
Require the module:
const replace = require('replace-in-file');
Specify replacement options:
const options = {

  //Files to process (use ONE of the following forms,
  //since duplicate object keys would silently overwrite each other):

  //Single file
  //files: 'path/to/file',

  //Multiple files
  //files: [
  //  'path/to/file',
  //  'path/to/other/file',
  //],

  //Glob(s)
  files: [
    'path/to/files/*.html',
    'another/**/*.path',
  ],

  //Replacement to make (string or regex)
  from: /Find me/g,
  to: 'Replacement',
};
Asynchronous replacement with promises:
replace(options)
  .then(changedFiles => {
    console.log('Modified files:', changedFiles.join(', '));
  })
  .catch(error => {
    console.error('Error occurred:', error);
  });
Asynchronous replacement with callback:
replace(options, (error, changedFiles) => {
  if (error) {
    return console.error('Error occurred:', error);
  }
  console.log('Modified files:', changedFiles.join(', '));
});
Synchronous replacement:
try {
  let changedFiles = replace.sync(options);
  console.log('Modified files:', changedFiles.join(', '));
}
catch (error) {
  console.error('Error occurred:', error);
}
 
    
- Great and easy-to-use turn-key module. Used it with async/await and a glob over quite a large folder and it was lightning fast. – Matt Fletcher May 17 '18 at 09:07
- Will it be able to work with file sizes greater than 256 MB, since I read somewhere that the string limit in Node.js is 256 MB? – Alien128 Jan 10 '19 at 16:23
- I believe it will, but there is also work in progress to implement streaming replacement for larger files. – Adam Reis Jan 10 '19 at 19:31
- Nice, I found and used this package (for its CLI tool) before I ever read this SO answer. Love it. – Russell Chisholm Apr 30 '20 at 16:13
- There is a tiny problem with that implementation: what if I want to replace { KEY } repeatedly during an Angular build process? The regexp replaces the string only once, and then the developer reaches a dead end. This works for only one shot. – Юрій Мориляк Sep 27 '21 at 12:44
Perhaps the "replace" module (www.npmjs.org/package/replace) also would work for you. It would not require you to read and then write the file.
Adapted from the documentation:
// install:
npm install replace 
// require:
var replace = require("replace");
// use:
replace({
    regex: "string to be replaced",
    replacement: "replacement string",
    paths: ['path/to/your/file'],
    recursive: true,
    silent: true,
});
 
    
- Do you know how I can filter by file extension in paths? Something like paths: ['path/to/your/file/*.js'] --> it doesn't work. – Kalamarico Jun 24 '15 at 07:44
- You can use node-glob to expand glob patterns to an array of paths, and then iterate over them. – RobW Aug 31 '15 at 17:55
- This is nice, but has been abandoned. See http://stackoverflow.com/a/31040890/1825390 for a maintained package if you want an out-of-the-box solution. – xavdid Aug 24 '16 at 08:40
- There's also a maintained version called [node-replace](https://github.com/raphamorim/node-replace); however, looking at the code base, neither this nor replace-in-file _actually_ replaces text in the file in place; they use `readFile()` and `writeFile()` just like the accepted answer. – c1moore Feb 08 '18 at 15:19
You can also use the 'sed' function that's part of ShellJS ...
 $ npm install [-g] shelljs
 require('shelljs/global');
 sed('-i', 'search_pattern', 'replace_pattern', file);
Full documentation ...
 
    
- `shx` lets you run ShellJS commands from npm scripts; ShellJs.org recommended it. https://github.com/shelljs/shx – Joshua Robinson Jul 25 '16 at 02:30
- I like this too. Better a one-liner than an npm module with several lines of code ^^ – suther Feb 09 '17 at 11:40
If someone wants to use the promise-based 'fs' module for the task:
const fs = require('fs').promises;
// The statements below must be wrapped inside an 'async' function:
const data = await fs.readFile(someFile, 'utf8');
const result = data.replace(/string to be replaced/g, 'replacement');
await fs.writeFile(someFile, result, 'utf8');
 
    
You could process the file while being read by using streams. It's just like using buffers but with a more convenient API.
var fs = require('fs');
function searchReplaceFile(regexpFind, replace, cssFileName) {
    var file = fs.createReadStream(cssFileName, 'utf8');
    var newCss = '';
    file.on('data', function (chunk) {
        newCss += chunk.toString().replace(regexpFind, replace);
    });
    file.on('end', function () {
        fs.writeFile(cssFileName, newCss, function (err) {
            if (err) {
                return console.log(err);
            } else {
                console.log('Updated!');
            }
        });
    });
}

searchReplaceFile(/foo/g, 'bar', 'file.txt');
 
    
- But... what if the chunk splits the regexpFind string? Doesn't the intention fail then? – Jaakko Karhu Oct 31 '17 at 16:05
- That's a very good point. I wonder if by setting a `bufferSize` longer than the string that you're replacing, and saving the last chunk and concatenating it with the current one, you could avoid that problem. – sanbor Jan 10 '18 at 13:46
- Probably this snippet should also be improved by writing the modified file directly to the filesystem rather than building up a big variable, as the file might be larger than available memory. – sanbor Jan 10 '18 at 13:48
- @JaakkoKarhu I made an npm package that keeps old chunks in memory in case the string spans multiple chunks. It's called [`stream-replace-string`](https://www.npmjs.com/package/stream-replace-string#how-it-works). It doesn't work with regexes, but it is an efficient solution when just finding strings. – programmerRaj Jun 21 '21 at 21:40
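The overlap idea from the comments can be sketched for literal (non-regex) searches: hold back the last `needle.length - 1` characters of each processed chunk, since they might be the start of a match that completes in the next chunk. This is an illustrative sketch, not part of any package, and `makeChunkReplacer` is a made-up name:

```javascript
// Replace every occurrence of a literal `needle` across streamed chunks.
// All complete matches in `tail + chunk` are replaced; the last
// needle.length - 1 characters after the final match are held back because
// they could begin a match that finishes in the next chunk.
function makeChunkReplacer(needle, replacement) {
  let tail = ''; // carry-over that may begin a cross-chunk match
  return {
    push(chunk) {
      const text = tail + chunk;
      let out = '';
      let i = 0;
      let j;
      while ((j = text.indexOf(needle, i)) !== -1) {
        out += text.slice(i, j) + replacement;
        i = j + needle.length;
      }
      const rest = text.slice(i);
      const keep = Math.min(needle.length - 1, rest.length);
      tail = rest.slice(rest.length - keep);
      return out + rest.slice(0, rest.length - keep);
    },
    flush() {
      const out = tail;
      tail = '';
      return out;
    },
  };
}

// A match split across two chunks is still replaced:
const r = makeChunkReplacer('foo', 'bar');
const pieces = r.push('xxf') + r.push('ooyy') + r.flush();
// pieces === 'xxbaryy'
```

The same carry-over trick is what packages like `stream-replace-string` do internally; for regex patterns there is no fixed maximum match length, which is why chunked regex replacement is much harder.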
On Linux or Mac, keep it simple and just use sed with the shell. No external libraries are required. The following code works on Linux.
const shell = require('child_process').execSync
shell(`sed -i "s!oldString!newString!g" ./yourFile.js`)
The sed syntax is a little different on Mac. I can't test it right now, but I believe you just need to add an empty string after the "-i":
const shell = require('child_process').execSync
shell(`sed -i "" "s!oldString!newString!g" ./yourFile.js`)
The "g" after the final "!" makes sed replace all instances on a line. Remove it, and only the first occurrence per line will be replaced.
 
    
Expanding on @Sanbor's answer, the most efficient way to do this is to read the original file as a stream, and then also stream each chunk into a new file, and then lastly replace the original file with the new file.
const fs = require('fs');

async function findAndReplaceFile(regexFindPattern, replaceValue, originalFile) {
  const updatedFile = `${originalFile}.updated`;
  return new Promise((resolve, reject) => {
    const readStream = fs.createReadStream(originalFile, { encoding: 'utf8', autoClose: true });
    const writeStream = fs.createWriteStream(updatedFile, { encoding: 'utf8', autoClose: true });
    // For each chunk, do the find & replace, and write it to the new file stream
    readStream.on('data', (chunk) => {
      chunk = chunk.toString().replace(regexFindPattern, replaceValue);
      writeStream.write(chunk);
    });
    // Once we've finished reading the original file...
    readStream.on('end', () => {
      writeStream.end(); // emits 'finish' event, executes below statement
    });
    // Replace the original file with the updated file
    writeStream.on('finish', async () => {
      try {
        await _renameFile(originalFile, updatedFile);
        resolve();
      } catch (error) {
        reject(`Error: Error renaming ${originalFile} to ${updatedFile} => ${error.message}`);
      }
    });
    readStream.on('error', (error) => reject(`Error: Error reading ${originalFile} => ${error.message}`));
    writeStream.on('error', (error) => reject(`Error: Error writing to ${updatedFile} => ${error.message}`));
  });
}
async function _renameFile(oldPath, newPath) {
  return new Promise((resolve, reject) => {
    fs.rename(oldPath, newPath, (error) => {
      if (error) {
        reject(error);
      } else {
        resolve();
      }
    });
  });
}
// Testing it...
(async () => {
  try {
    await findAndReplaceFile(/"some regex"/g, "someReplaceValue", "someFilePath");
  } catch(error) {
    console.log(error);
  }
})()
- This does not handle the case where the text that `regexFindPattern` matches is split between two chunks. – Tongfa Apr 26 '22 at 21:07
I ran into issues when replacing a small placeholder with a large string of code.
I was doing:
var replaced = original.replace('PLACEHOLDER', largeStringVar);
I figured out the problem was JavaScript's special replacement patterns, described here. Since the code I was using as the replacing string had some $ in it, it was messing up the output.
My solution was to use the function replacement option, which DOES NOT do any special replacement:
var replaced = original.replace('PLACEHOLDER', function() {
    return largeStringVar;
});
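To make the failure mode concrete, here is a small sketch (the strings are made up for illustration). Escaping each `$` as `$$` is a third way out, alongside the function replacer:

```javascript
// `$&` in a replacement string is a special pattern meaning "the matched
// substring", so a replacement that happens to contain `$` can corrupt
// the output:
'price: PLACEHOLDER'.replace('PLACEHOLDER', 'cost $& fees');
// → 'price: cost PLACEHOLDER fees'   ($& re-inserted the match)

// A function replacement is taken literally:
'price: PLACEHOLDER'.replace('PLACEHOLDER', () => 'cost $& fees');
// → 'price: cost $& fees'

// Or escape each `$` as `$$` in the replacement string:
'price: PLACEHOLDER'.replace('PLACEHOLDER', 'cost $$& fees');
// → 'price: cost $& fees'
```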
 
    
ES2017/8 for Node 7.6+ with a temporary write file for atomic replacement.
const Promise = require('bluebird')
const fs = Promise.promisifyAll(require('fs'))
async function replaceRegexInFile(file, search, replace){
  let contents = await fs.readFileAsync(file, 'utf8')
  let replaced_contents = contents.replace(search, replace)
  let tmpfile = `${file}.jstmpreplace`
  await fs.writeFileAsync(tmpfile, replaced_contents, 'utf8')
  await fs.renameAsync(tmpfile, file)
  return true
}
Note, only for smallish files as they will be read into memory.
 
    
- No need for `bluebird`, use native `Promise` and [util.promisify](https://nodejs.org/dist/latest-v8.x/docs/api/util.html#util_util_promisify_original). – Cisco Dec 04 '17 at 00:50
- @FranciscoMateo True, but beyond 1 or 2 functions promisifyAll is still super useful. – Matt Dec 04 '17 at 02:33
This may help someone:
This is a little different than just a global replace.
From the terminal, we run:
node replace.js
replace.js:
function processFile(inputFile, repString = "../") {
    var fs = require('fs'),
        readline = require('readline'),
        instream = fs.createReadStream(inputFile),
        outstream = new (require('stream'))(),
        rl = readline.createInterface(instream, outstream),
        formatted = '';

    const regex = /<xsl:include href="([^"]*)" \/>$/gm;

    rl.on('line', function (line) {
        let url = '';
        let m;
        while ((m = regex.exec(line)) !== null) {
            // This is necessary to avoid infinite loops with zero-width matches
            if (m.index === regex.lastIndex) {
                regex.lastIndex++;
            }
            url = m[1];
        }
        let re = new RegExp('^.* <xsl:include href="(.*?)" \\/>.*$', 'gm');
        formatted += line.replace(re, `\t<xsl:include href="${repString}${url}" />`);
        formatted += "\n";
    });

    rl.on('close', function () {
        fs.writeFile(inputFile, formatted, 'utf8', function (err) {
            if (err) return console.log(err);
        });
    });
}

// path is relative to where you're running the command from
processFile('build/some.xslt');
This is what it does: we have several files that contain xsl:includes.
However, in development we need the path to move down a level.
From this
<xsl:include href="common/some.xslt" />
to this
<xsl:include href="../common/some.xslt" />
So we end up running two regex patterns: one to get the href and the other to write it. There is probably a better way to do this, but it works for now.
Thanks
 
    
Normally, I use tiny-replace-files to replace text in a file or files. This package is smaller and lighter...
https://github.com/Rabbitzzc/tiny-replace-files
import { replaceStringInFilesSync } from 'tiny-replace-files'

const options = {
  files: 'src/targets/index.js',
  from: 'test-plugin',
  to: 'self-name',
}

// synchronous
const result = replaceStringInFilesSync(options)
console.info(result)
 
    
- While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. - [From Review](/review/late-answers/30702529) – Sercan Dec 31 '21 at 10:11
I would use a duplex stream instead, as documented in the Node.js docs on duplex streams:
A Transform stream is a Duplex stream where the output is computed in some way from the input.
 
    
Given an HTML template like:
<p>Please click in the following {{link}} to verify the account</p>
function renderHTML(templatePath: string, object) {
    const template = fileSystem.readFileSync(path.join(Application.staticDirectory, templatePath + '.html'), 'utf8');
    return template.match(/\{{(.*?)\}}/ig).reduce((acc, binding) => {
        const property = binding.substring(2, binding.length - 2);
        return `${acc}${template.replace(/\{{(.*?)\}}/, object[property])}`;
    }, '');
}
renderHTML(templateName, { link: 'SomeLink' })
For sure, you can improve the template-reading function to read as a stream and compose the bytes line by line to make it more efficient.
 
    