Write to a CSV in Node.js

I am struggling to find a way to write data to a CSV in Node.js.

There are several CSV plugins available, but they only 'write' to stdout.

Ideally I want to write on a row-by-row basis using a loop.

Porphyria answered 19/4, 2012 at 11:32 Comment(3)
"however they only 'write' to stdout" That seems really surprising. They won't write to any writeable Stream, it has to be stdout?!Waechter
Could you include links to the modules you've tested, so others can review them and/or know which alternates to suggest?Arlon
There is a tutorial on generating a CSV using Node.js: programmerblog.net/generate-csv-using-nodejsMunroe
35

The docs for node-csv-parser (npm install csv) specifically state that it can be used with streams (see fromStream, toStream). So it's not hard-coded to use stdout.

Several other CSV parsers also come up when you npm search csv -- you might want to look at them too.
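
For reference, a rough sketch of piping to a file instead of stdout, assuming a current version of the csv package (which re-exports the csv-stringify Transform stream); file and column names here are just examples:

const { stringify } = require('csv'); // or require('csv-stringify')
const fs = require('fs');

const stringifier = stringify({ header: true, columns: ['id', 'name'] });
stringifier.pipe(fs.createWriteStream('out.csv'));

// write rows one at a time, e.g. inside a loop
stringifier.write([1, 'foo']);
stringifier.write([2, 'bar']);
stringifier.end();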

Arlon answered 19/4, 2012 at 12:2 Comment(0)
59

You can use fs (https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback):

var dataToWrite;
var fs = require('fs');

fs.writeFile('form-tracking/formList.csv', dataToWrite, 'utf8', function (err) {
  if (err) {
    console.log('Some error occurred - file either not saved or corrupted file saved.');
  } else {
    console.log('It\'s saved!');
  }
});
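
If you are building the file row by row (as asked in the question), one minimal approach is to assemble dataToWrite first and then write it in one go; the rows below are just example data:

var rows = [
  ['id', 'name'],
  [1, 'foo'],
  [2, 'bar']
];
var dataToWrite = rows.map(function (row) { return row.join(','); }).join('\n');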
Kyser answered 8/11, 2015 at 0:49 Comment(0)
34

Here is a simple example using csv-stringify to write a dataset that fits in memory to a csv file using fs.writeFile.

import { stringify } from 'csv-stringify'; // in older versions this was a default export: import stringify from 'csv-stringify'
import fs from 'fs';

let data = [];
let columns = {
  id: 'id',
  name: 'Name'
};

for (var i = 0; i < 10; i++) {
  data.push([i, 'Name ' + i]);
}

stringify(data, { header: true, columns: columns }, (err, output) => {
  if (err) throw err;
  fs.writeFile('my.csv', output, (err) => {
    if (err) throw err;
    console.log('my.csv saved.');
  });
});
Touchstone answered 26/1, 2018 at 14:21 Comment(3)
What is the advantage here compared to John's solution? Make sure it is string that is written?Cyrille
This solution helped me out. I needed to have different column names than the JSON field names. Defining columns as that object did the trick.Brief
I had to add braces around the import: import { stringify } from 'csv-stringify'Antitoxin
27

If you want to use a loop as you say you can do something like this with Node fs:

let fs = require("fs")

let writeStream = fs.createWriteStream('/path/filename.csv')

someArrayOfObjects.forEach((someObject, index) => {
    let newLine = []
    newLine.push(someObject.stringPropertyOne)
    newLine.push(someObject.stringPropertyTwo)
    // ...push any other properties you need

    writeStream.write(newLine.join(',') + '\n', () => {
        // a line was written to stream
    })
})

writeStream.end()

writeStream.on('finish', () => {
    console.log('finish write stream, moving along')
}).on('error', (err) => {
    console.log(err)
})
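
If someArrayOfObjects is large, you may also want to respect backpressure: write() returns false when the stream's internal buffer is full, and you can wait for the 'drain' event before writing more. A rough standalone sketch of the same loop with that handling:

const fs = require('fs')
const writeStream = fs.createWriteStream('/path/filename.csv')

async function writeRows (rows) {
    for (const row of rows) {
        const line = row.join(',') + '\n'
        if (!writeStream.write(line)) {
            // buffer is full, wait until it drains before writing the next row
            await new Promise(resolve => writeStream.once('drain', resolve))
        }
    }
    writeStream.end()
}

// usage (hypothetical): writeRows(someArrayOfObjects.map(o => [o.stringPropertyOne, o.stringPropertyTwo]))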
Anzovin answered 20/9, 2019 at 15:4 Comment(0)
8

In case you don't want to use any library besides fs, you can do it manually.

const fs = require('fs')

// jsonObjects: an array of objects that all share the same keys
let fileString = ''
const filename = 'fileExample.csv'

fileString += Object.keys(jsonObjects[0]).join(',')

jsonObjects.forEach((jsonObject) => {
    fileString += '\n' +  Object.values(jsonObject).join(',')
})

fs.writeFileSync(filename, fileString, 'utf8')
Prune answered 16/6, 2020 at 0:58 Comment(0)
4

Writing a CSV is pretty easy and can be done without a library.

import { writeFile } from 'fs/promises';
// you can use just fs module too

// Let's say you want to print a list of users to a CSV
const users = [
  { id: 1, name: 'John Doe0', age: 21 },
  { id: 2, name: 'John Doe1', age: 22 },
  { id: 3, name: 'John Doe2', age: 23 }
];

// CSV is formatted in the following format 
/*
  column1, column2, column3
  value1, value2, value3
  value1, value2, value3
*/
// which we can do easily by
const dataCSV = users.reduce((acc, user) => {
    acc += `${user.id}, ${user.name}, ${user.age}\n`;
    return acc;
  }, 
  `id, name, age\n` // column names for csv
);

// finally, write csv content to a file using Node's fs module
writeFile('mycsv.csv', dataCSV, 'utf8')
  .then(() => { /* handle success */ })
  .catch((error) => { /* handle error */ });

NOTE: If a field value contains a comma (or a quote or newline), you must escape/quote it or use another delimiter. In that case, I suggest using a library like csv-stringify.
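
If you do want to stay dependency-free, a minimal quoting helper (per RFC 4180: wrap the field in double quotes and double any embedded quotes) might look like this; escapeField is a hypothetical name, not part of the code above:

const escapeField = (value) => {
  const s = String(value);
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
};

// usage inside the reduce: `${escapeField(user.id)},${escapeField(user.name)},${escapeField(user.age)}\n`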

Pooley answered 30/3, 2023 at 8:0 Comment(2)
Great, it works nicely, but this only writes one row of data. How do I append multiple rows to this file while writing the header row only once?Bitterling
The reduce is doing exactly that; you must be doing something wrong when mapping your data for the CSV.Pooley
1

For those who prefer fast-csv:

const { writeToPath } = require('@fast-csv/format');

const path = `${__dirname}/people.csv`;
const data = [{ name: 'Stevie', id: 10 }, { name: 'Ray', id: 20 }];
const options = { headers: true, quoteColumns: true };

writeToPath(path, data, options)
  .on('error', err => console.error(err))
  .on('finish', () => console.log('Done writing.'));
Feldspar answered 25/6, 2020 at 18:33 Comment(0)
1

In case you don't want to use any library besides fs, you can do it manually. Moreover, you can filter the data as you want before writing it to the file.

const fs = require('fs');

router.get('/apiname', (req, res) => {
  const data = arrayOfObject; // you will get this from somewhere
  /*
    // Modify old data (new key names)
    let modifiedData = data.map(({ oldKey1: newKey1, oldKey2: newKey2, ...rest }) => ({ newKey1, newKey2, ...rest }));
  */
  const path = './test';
  writeToFile(path, data, (result) => {
    // get the result from the callback and process it
    console.log(result); // 'success' or 'error'
  });
});

const writeToFile = (path, data, callback) => {
  fs.writeFile(path, JSON.stringify(data, null, 2), (err) => { // JSON.stringify(data, null, 2) writes the data with line breaks and indentation
    if (!err) {
      callback('success');
    } else {
      callback('error'); // some error (catch this error)
    }
  });
};
Drone answered 8/7, 2021 at 4:25 Comment(0)
0

Based on Jyotirmoy Upadhaya's answer above.

Node version 22.

No libraries or async; just a Node script with example data.

const fs = require('fs');
     
function writeCsv(){
  const users = [
    { id: 1, name: 'John Doe' },
  ];

  const dataCSV = users.reduce((acc, user) => {
      acc += `${user.id}, ${user.name}\n`;
      return acc;
    },
    `id, name\n`
  );

  fs.writeFileSync('mycsv.csv', dataCSV, 'utf8')
}

// Call function
writeCsv()
Payday answered 28/6, 2024 at 8:31 Comment(0)
-2

This is the code that worked for me in NestJS:

import { Parser } from "json2csv";
import { appendFileSync } from "fs";

const csvtojson = require('csvtojson');

const csvFilePath = process.cwd() + '/' + file.path;

// read data from the CSV into an array of JSON objects
const data = await csvtojson().fromFile(csvFilePath);

// from here, how to write data back out as CSV:

data.push({
  label: value,
  // .......
});

const fields = [
  'field1', 'field2', // ...
];

const parser = new Parser({ fields, header: false }); // remove header: false if you want a header row
const outputCsv = parser.parse(data);
appendFileSync('./filename.csv', `${outputCsv}\n`); // remove \n if you don't want a newline at the end
Archespore answered 29/12, 2022 at 5:28 Comment(0)
