

parse-csv-stream

ayushpratap2494187 · ISC · 2.0.2

Parse any CSV file via stream, or a CSV stream from any other source, in Node.js for various use cases like database insertion, logging, file creation, batch processing, etc.

Keywords: csv-stream, parse-csv, parse-csv-stream, csv-2-json, csv-json-stream, csv-json, csv, csv-parser


code style: prettier

parse-csv-stream

Parse CSV files via stream, or parse any CSV stream from other sources, in Node.js for use cases like batch processing, database insertion, logging, file creation, and data transformations. Supports large CSV files.

Example:

const parse_csv = require('parse-csv-stream');
const fs = require('fs');

const readStream = fs.createReadStream('./test.csv', 'utf8');
const writeStream = fs.createWriteStream('./test.json');

// Default options (shown commented out).
const options = {
    // delimiter: ',',
    // wrapper: '"',
    // newlineSeperator: '\r\n'
};

const parser = new parse_csv(options);
const events = parser.events;

/*
There are two approaches you can take:
[A.] events
[B.] streams

There are three ways to handle data:
[1.] Process each row separately via events.
[2.] Process the resultset (array of rows) returned by parse().
[3.] Pipe the parsed stream.

Choose any one.
*/

// [A.] Working with events.
events.on('data', (row) => {
    console.log(row); // process each row separately.
});

readStream.on('data', (chunk) => {
    const resultset = parser.parse(chunk); // process the resultset (array of rows).
});

// [B.] Working with streams.
readStream.pipe(parser).pipe(writeStream); // pipe the parsed stream.
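
The defaults above correspond to comma-delimited, double-quoted, CRLF-terminated CSV. Other dialects can be handled by setting the options explicitly. A minimal sketch, assuming the option names listed in the defaults above; semicolon.csv is a hypothetical input file:

const parse_csv = require('parse-csv-stream');
const fs = require('fs');

// Semicolon-delimited values with Unix ('\n') line endings.
const semicolonParser = new parse_csv({
    delimiter: ';',
    wrapper: '"',
    newlineSeperator: '\n'
});

fs.createReadStream('./semicolon.csv', 'utf8')
    .pipe(semicolonParser)
    .pipe(fs.createWriteStream('./semicolon.json'));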
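
For the batch-processing and database-insertion use cases mentioned above, the per-row events API combines naturally with a small buffer. A minimal sketch, assuming (as the example above suggests) that feeding chunks via parser.parse() fires a 'data' event per parsed row; insertBatch() and BATCH_SIZE are hypothetical stand-ins for a real database client's bulk insert and your own batch size:

const parse_csv = require('parse-csv-stream');
const fs = require('fs');

const BATCH_SIZE = 500; // flush to the database every 500 rows
const batch = [];

function insertBatch(rows) {
    // Hypothetical stand-in for a real bulk INSERT via your database client.
    console.log(`inserting ${rows.length} rows`);
}

const batchParser = new parse_csv({});
batchParser.events.on('data', (row) => {
    batch.push(row);
    if (batch.length >= BATCH_SIZE) {
        insertBatch(batch.splice(0, batch.length));
    }
});

const input = fs.createReadStream('./test.csv', 'utf8');
input.on('data', (chunk) => batchParser.parse(chunk));
input.on('end', () => {
    if (batch.length > 0) {
        insertBatch(batch); // flush the remaining partial batch
    }
});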

Built With

  • Native Node.js modules
  • No external dependencies

Authors

  • Ayush Pratap
License

MIT License © Ayush Pratap