Replace a string in a file with nodejs

15

264

I use the md5 grunt task to generate MD5 filenames. Now I want to rename the sources in the HTML file with the new filename in the callback of the task. I wonder what's the easiest way to do this.

Kliman answered 5/1, 2013 at 22:28 Comment(2)
I wish there was a renamer and replace-in-file combination, which would both rename the files and search/replace any references to those files as well.Nombles
@Nombles I had the same need, so I created a CLI tool named rev-web-assets to hash the filenames and update their references. It's intended for use in npm scripts and is on GitHub: rev-web-assetsHouseraising
444

You could use a simple regex:

var result = fileAsString.replace(/string to be replaced/g, 'replacement');

So...

var fs = require('fs')
fs.readFile(someFile, 'utf8', function (err,data) {
  if (err) {
    return console.log(err);
  }
  var result = data.replace(/string to be replaced/g, 'replacement');

  fs.writeFile(someFile, result, 'utf8', function (err) {
     if (err) return console.log(err);
  });
});
Valve answered 6/1, 2013 at 10:12 Comment(7)
Sure, but do I have to read the file, replace the text, and then write the file again, or is there an easier way? Sorry, I'm more of a frontend guy.Vinia
Maybe there is a Node module to achieve this, but I'm not aware of it. Added a full example, btw.Valve
@Zax: Thanks, I'm surprised this 'bug' could survive so long ;)Valve
Sorry, but as far as I know, UTF-8 supports many languages, like Vietnamese, Chinese...Wait
If your string appears multiple times in your text, it will replace only the first occurrence it finds.Rafaelarafaelia
@Rafaelarafaelia Doesn't the /g flag handle that? "12131415".replace(/1/g, "0") gives '02030405', for instance.Pastime
@Pastime my bad, I didn't notice the regexRafaelarafaelia
122

Since replace wasn't working for me, I've created a simple npm package replace-in-file to quickly replace text in one or more files. It's partially based on @asgoth's answer.

Edit (3 October 2016): The package now supports promises and globs, and the usage instructions have been updated to reflect this.

Edit (16 March 2018): The package has amassed over 100k monthly downloads now and has been extended with additional features as well as a CLI tool.

Install:

npm install replace-in-file

Require module

const replace = require('replace-in-file');

Specify replacement options

const options = {

  //Single file
  files: 'path/to/file',

  //Multiple files
  files: [
    'path/to/file',
    'path/to/other/file',
  ],

  //Glob(s) 
  files: [
    'path/to/files/*.html',
    'another/**/*.path',
  ],

  //Replacement to make (string or regex) 
  from: /Find me/g,
  to: 'Replacement',
};

Asynchronous replacement with promises:

replace(options)
  .then(changedFiles => {
    console.log('Modified files:', changedFiles.join(', '));
  })
  .catch(error => {
    console.error('Error occurred:', error);
  });

Asynchronous replacement with callback:

replace(options, (error, changedFiles) => {
  if (error) {
    return console.error('Error occurred:', error);
  }
  console.log('Modified files:', changedFiles.join(', '));
});

Synchronous replacement:

try {
  let changedFiles = replace.sync(options);
  console.log('Modified files:', changedFiles.join(', '));
}
catch (error) {
  console.error('Error occurred:', error);
}
Boyne answered 25/6, 2015 at 3:50 Comment(7)
Great and easy to use turn-key module. Used it with async/await and a glob over quite a large folder and it was lightning fastPesthouse
Will it be able to work with file sizes greater than 256 MB? I read somewhere that the string limit in Node.js is 256 MB.Pismire
I believe it will, but there is also work in progress to implement streaming replacement for larger files.Boyne
nice, I found and used this package (for its CLI tool) before I ever read this SO answer. love itConsequence
There is a tiny problem with that implementation: what if I want to replace { KEY } every time during an Angular build process? The regexp replaces the string only once, and that's it; the developer reaches a dead end. This would work for only one shot.Yucca
Excellent! This works super fast and easy!Ditheism
Absolutely wonderful code, and fitting explanations to boot!Gelt
41

Perhaps the "replace" module (www.npmjs.org/package/replace) also would work for you. It would not require you to read and then write the file.

Adapted from the documentation:

// install:

npm install replace 

// require:

var replace = require("replace");

// use:

replace({
    regex: "string to be replaced",
    replacement: "replacement string",
    paths: ['path/to/your/file'],
    recursive: true,
    silent: true,
});
Jenjena answered 1/8, 2014 at 2:39 Comment(5)
Do you know how to filter by file extension in paths? Something like paths: ['path/to/your/file/*.js'] doesn't work.Myth
You can use node-glob to expand glob patterns to an array of paths, and then iterate over them (see the sketch after these comments).Cradlesong
This is nice, but has been abandoned. See https://mcmap.net/q/108920/-replace-a-string-in-a-file-with-nodejs for a maintained package if you want an out-of-the-box solution.Benzine
There's also a maintained version called node-replace; however, looking at the code base neither this nor replace-in-file actually replace text in the file, they use readFile() and writeFile() just like the accepted answer.Jihad
The library works fine, but it does not have TypeScript support.Ketch
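
A quick sketch of the glob approach from the comment above, assuming the classic glob package API (glob.sync); this is not part of the original answer:

var glob = require('glob');
var replace = require('replace');

// Expand the pattern to an explicit list of paths first, then hand them to replace.
var jsFiles = glob.sync('path/to/your/file/*.js');

replace({
    regex: "string to be replaced",
    replacement: "replacement string",
    paths: jsFiles,
    recursive: false,
    silent: true,
});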
31

You can also use the 'sed' function that's part of ShellJS ...

 $ npm install [-g] shelljs


 require('shelljs/global');
 sed('-i', 'search_pattern', 'replace_pattern', file);

Full documentation ...

Befoul answered 16/7, 2014 at 7:33 Comment(5)
this seems to be the cleanest solution :)Boughton
shx lets you run from npm scripts, ShellJs.org recommended it. github.com/shelljs/shxTorruella
I like this too. Better a one-liner than an npm module with several lines of code ^^Leesa
Importing a third party dependency is not the cleanest solution.Fermat
This won't do multilines.Nickens
13

If someone wants to use the promise-based 'fs' module for the task:

const fs = require('fs').promises;

// Below statements must be wrapped inside the 'async' function:
const data = await fs.readFile(someFile, 'utf8');
const result = data.replace(/string to be replaced/g, 'replacement');
await fs.writeFile(someFile, result, 'utf8');
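
For completeness, a sketch of the async wrapper those statements need; the function name and file path below are just placeholders:

const fs = require('fs').promises;

async function replaceInFile(someFile) {
  const data = await fs.readFile(someFile, 'utf8');
  const result = data.replace(/string to be replaced/g, 'replacement');
  await fs.writeFile(someFile, result, 'utf8');
}

replaceInFile('path/to/file').catch(console.error);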
Motherland answered 18/12, 2021 at 8:59 Comment(0)
5

You could process the file while it is being read by using streams. It's just like using buffers, but with a more convenient API.

var fs = require('fs');
function searchReplaceFile(regexpFind, replace, cssFileName) {
    var file = fs.createReadStream(cssFileName, 'utf8');
    var newCss = '';

    file.on('data', function (chunk) {
        newCss += chunk.toString().replace(regexpFind, replace);
    });

    file.on('end', function () {
        fs.writeFile(cssFileName, newCss, function (err) {
            if (err) {
                return console.log(err);
            } else {
                console.log('Updated!');
            }
        });
    });
}

searchReplaceFile(/foo/g, 'bar', 'file.txt');
Johan answered 23/6, 2016 at 14:53 Comment(4)
But... what if the chunk splits the regexpFind string? Doesn't the intention fail then?Muezzin
That's a very good point. I wonder if, by setting a bufferSize longer than the string you're replacing, and saving the last chunk and concatenating it with the current one, you could avoid that problem (see the sketch after these comments).Johan
Probably this snippet should also be improved by writing the modified file directly to the filesystem rather than building up a big variable, as the file might be larger than the available memory.Johan
@JaakkoKarhu I made an npm package that keeps old chunks in memory in case the string spans multiple chunks. It's called stream-replace-string. It doesn't work with regexs, but it is an efficient solution when just finding strings.Goalkeeper
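
Purely to illustrate the carry-over idea from these comments (this is not part of the original answer, and like the answer it still buffers the whole result in memory), a plain-string version might look like this:

var fs = require('fs');

// Replace every occurrence of the plain string `find` in `fileName`, reading it as a stream.
// The trailing characters of each chunk are held back and prepended to the next chunk,
// so an occurrence split across two chunks is still found.
function searchReplaceStream(find, replace, fileName, done) {
    var readStream = fs.createReadStream(fileName, 'utf8');
    var tail = '';
    var output = '';

    readStream.on('data', function (chunk) {
        var buffer = tail + chunk;
        var out = '';
        var pos = 0;
        var idx;

        // Replace all complete occurrences in this buffer.
        while ((idx = buffer.indexOf(find, pos)) !== -1) {
            out += buffer.slice(pos, idx) + replace;
            pos = idx + find.length;
        }

        // Hold back the last find.length - 1 characters of what is left;
        // they might be the start of an occurrence completed by the next chunk.
        var rest = buffer.slice(pos);
        var keep = Math.max(0, rest.length - (find.length - 1));
        output += out + rest.slice(0, keep);
        tail = rest.slice(keep);
    });

    readStream.on('end', function () {
        fs.writeFile(fileName, output + tail, done);
    });
}

searchReplaceStream('foo', 'bar', 'file.txt', function (err) {
    if (err) return console.log(err);
    console.log('Updated!');
});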
4

On Linux or Mac, keep it simple and just use sed from the shell. No external libraries are required. The following code works on Linux.

const shell = require('child_process').execSync
shell(`sed -i "s!oldString!newString!g" ./yourFile.js`)

The sed syntax is a little different on Mac. I can't test it right now, but I believe you just need to add an empty string after the "-i":

const shell = require('child_process').execSync
shell(`sed -i "" "s!oldString!newString!g" ./yourFile.js`)

The "g" after the final "!" makes sed replace all instances on a line. Remove it, and only the first occurrence per line will be replaced.

Steamheated answered 20/3, 2019 at 19:39 Comment(0)
2

Expanding on @Sanbor's answer, the most efficient way to do this is to read the original file as a stream, stream each transformed chunk into a new file, and finally replace the original file with the new file.

const fs = require('fs');

async function findAndReplaceFile(regexFindPattern, replaceValue, originalFile) {
  const updatedFile = `${originalFile}.updated`;

  return new Promise((resolve, reject) => {
    const readStream = fs.createReadStream(originalFile, { encoding: 'utf8', autoClose: true });
    const writeStream = fs.createWriteStream(updatedFile, { encoding: 'utf8', autoClose: true });

    // For each chunk, do the find & replace, and write it to the new file stream
    readStream.on('data', (chunk) => {
      chunk = chunk.toString().replace(regexFindPattern, replaceValue);
      writeStream.write(chunk);
    });

    // Once we've finished reading the original file...
    readStream.on('end', () => {
      writeStream.end(); // emits 'finish' event, executes below statement
    });

    // Replace the original file with the updated file
    writeStream.on('finish', async () => {
      try {
        await _renameFile(originalFile, updatedFile);
        resolve();
      } catch (error) {
        reject(`Error: Error renaming ${originalFile} to ${updatedFile} => ${error.message}`);
      }
    });

    readStream.on('error', (error) => reject(`Error: Error reading ${originalFile} => ${error.message}`));
    writeStream.on('error', (error) => reject(`Error: Error writing to ${updatedFile} => ${error.message}`));
  });
}

async function _renameFile(oldPath, newPath) {
  return new Promise((resolve, reject) => {
    fs.rename(oldPath, newPath, (error) => {
      if (error) {
        reject(error);
      } else {
        resolve();
      }
    });
  });
}

// Testing it...
(async () => {
  try {
    await findAndReplaceFile(/"some regex"/g, "someReplaceValue", "someFilePath");
  } catch(error) {
    console.log(error);
  }
})()
Octroi answered 27/6, 2019 at 0:21 Comment(1)
This does not handle the case where the text that regexFindPattern matches is split between two chunks.Mckean
1

I ran into issues when replacing a small placeholder with a large string of code.

I was doing:

var replaced = original.replace('PLACEHOLDER', largeStringVar);

I figured out the problem was JavaScript's special replacement patterns, described here. Since the code I was using as the replacement string had some $ characters in it, it was messing up the output.

My solution was to use the function replacement option, which DOES NOT do any special replacement:

var replaced = original.replace('PLACEHOLDER', function() {
    return largeStringVar;
});
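
Another option with the same effect is to escape the dollar signs up front; in a replacement string, '$$' inserts a single literal '$':

// Pre-escaping each '$' as '$$' neutralises the special replacement patterns.
var replaced = original.replace('PLACEHOLDER', largeStringVar.replace(/\$/g, '$$$$'));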
Tortuosity answered 22/12, 2016 at 23:39 Comment(0)
1

ES2017/8 for Node 7.6+ with a temporary write file for atomic replacement.

const Promise = require('bluebird')
const fs = Promise.promisifyAll(require('fs'))

async function replaceRegexInFile(file, search, replace){
  let contents = await fs.readFileAsync(file, 'utf8')
  let replaced_contents = contents.replace(search, replace)
  let tmpfile = `${file}.jstmpreplace`
  await fs.writeFileAsync(tmpfile, replaced_contents, 'utf8')
  await fs.renameAsync(tmpfile, file)
  return true
}

Note, only for smallish files as they will be read into memory.

Acrosstheboard answered 27/10, 2017 at 11:34 Comment(2)
No need for bluebird; use the native Promise and util.promisify (see the sketch after these comments).Wernick
@FranciscoMateo True, but beyond 1 or 2 functions promisifyAll is still super useful.Acrosstheboard
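
As the comments note, the same thing works without bluebird; a minimal sketch using the built-in fs.promises API (Node 10+) instead of util.promisify, keeping the temporary-file rename:

const fs = require('fs').promises;

async function replaceRegexInFile(file, search, replace) {
  const contents = await fs.readFile(file, 'utf8');
  const tmpfile = `${file}.jstmpreplace`;
  await fs.writeFile(tmpfile, contents.replace(search, replace), 'utf8');
  await fs.rename(tmpfile, file);
  return true;
}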
1

This may help someone:

This is a little different from just a global replace.

From the terminal, run:
node replace.js

replace.js:

function processFile(inputFile, repString = "../") {
    var fs = require('fs'),
        readline = require('readline'),
        instream = fs.createReadStream(inputFile),
        outstream = new (require('stream'))(),
        rl = readline.createInterface(instream, outstream),
        formatted = '';

    const regex = /<xsl:include href="([^"]*)" \/>$/gm;

    rl.on('line', function (line) {
        let url = '';
        let m;
        while ((m = regex.exec(line)) !== null) {
            // This is necessary to avoid infinite loops with zero-width matches
            if (m.index === regex.lastIndex) {
                regex.lastIndex++;
            }

            url = m[1];
        }

        let re = new RegExp('^.* <xsl:include href="(.*?)" \/>.*$', 'gm');

        formatted += line.replace(re, `\t<xsl:include href="${repString}${url}" />`);
        formatted += "\n";
    });

    rl.on('close', function () {
        fs.writeFile(inputFile, formatted, 'utf8', function (err) {
            if (err) return console.log(err);
        });
    });
}


// The path is relative to where you're running the command from
processFile('build/some.xslt');

This is what it does: we have several files that contain xsl:include references.

However, in development we need the path to move down a level.

From this

<xsl:include href="common/some.xslt" />

to this

<xsl:include href="../common/some.xslt" />

So we end up running two regex patterns: one to get the href, and the other to write the new line. There is probably a better way to do this, but it works for now.

Thanks

Yawning answered 27/1, 2021 at 20:59 Comment(0)
1

Normally, I use tiny-replace-files to replace text in one or more files. This package is smaller and lighter... https://github.com/Rabbitzzc/tiny-replace-files

import { replaceStringInFilesSync } from 'tiny-replace-files'

const options = {
  files: 'src/targets/index.js',
  from: 'test-plugin',
  to: 'self-name',
}

// synchronous variant (no await needed)
const result = replaceStringInFilesSync(options)
console.info(result)
Crankshaft answered 30/12, 2021 at 7:8 Comment(1)
While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. - From ReviewHarlie
0

I would use a duplex stream instead, as documented in the Node.js docs on duplex streams.

A Transform stream is a Duplex stream where the output is computed in some way from the input.
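
A minimal sketch of that idea (this is not from the original answer, and the file names are placeholders): a Transform stream that rewrites each chunk as it flows from a read stream to a write stream. Note that, as discussed in other answers, a match split across two chunks would be missed.

const fs = require('fs');
const { Transform } = require('stream');

// A Transform (duplex) stream that applies the replacement chunk by chunk.
const replacer = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString('utf8').replace(/string to be replaced/g, 'replacement'));
  },
});

fs.createReadStream('input.html')
  .pipe(replacer)
  .pipe(fs.createWriteStream('output.html'));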

Duala answered 28/8, 2016 at 9:6 Comment(0)
0

<p>Please click in the following {{link}} to verify the account</p>


function renderHTML(templatePath: string, object) {
    const template = fileSystem.readFileSync(path.join(Application.staticDirectory, templatePath + '.html'), 'utf8');
    // Replace each {{binding}} in the template with the matching property of `object`.
    return template.match(/\{\{(.*?)\}\}/ig).reduce((html, binding) => {
        const property = binding.substring(2, binding.length - 2);
        return html.replace(binding, object[property]);
    }, template);
}
renderHTML(templateName, { link: 'SomeLink' })

You can certainly improve the template-reading function to read the file as a stream and compose the bytes line by line, to make it more efficient.

Westerly answered 17/5, 2020 at 13:48 Comment(0)
0

Having had the opportunity to handle the file system in NestJS, I had this kind of requirement. Since there is no replace method defined by default, you can simply save a new file while keeping the same id or key:

async updateFile(
  file: any,
  id: string
): Promise<string> {
  try {
    const filePath = path.join(your directory path, id);
    await fs.promises.writeFile(filePath, file.buffer);
    return id;
  } catch (error) {
    throw new InternalServerErrorException();
  }
}
Akela answered 12/2 at 17:8 Comment(0)
