I currently have a CSV file that is about 1.3 million lines. I'm trying to parse the file line by line and run a process on each line. The problem is that I run out of heap memory. I've read around online and tried a bunch of approaches to avoid holding the entire file in memory, but nothing seems to work. Here is my current code:
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';
import { once } from 'node:events';
// parse2 is csv-parse's synchronous parser
import { parse as parse2 } from 'csv-parse/sync';

let index = 0;
const readLine = createInterface({
  input: createReadStream(file),
  crlfDelay: Infinity
});
readLine.on('line', async (line) => {
  let record = parse2(line, {
    delimiter: ',',
    skip_empty_lines: true,
    skip_lines_with_empty_values: false
  });
  // Do something with record
  index++;
  if (index % 1000 === 0) {
    console.log(index);
  }
});
// wait here until the whole file has been read
await once(readLine, 'close');
This starts off strong, but the heap slowly fills up until the program runs out of memory and crashes. Since I'm reading the file with a stream, I don't understand why the whole file ends up on the heap.
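For reference, one of the patterns I've seen recommended is iterating the interface with for await...of, which I understand is supposed to apply backpressure (the stream is paused while each line is processed). Here is a minimal, self-contained sketch of that pattern; it writes a tiny demo file and uses a naive split() in place of csv-parse so it runs standalone:

```javascript
import { createReadStream, writeFileSync } from 'node:fs';
import { createInterface } from 'node:readline';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

// Count (and "process") lines one at a time via the async iterator.
async function countLines(file) {
  const rl = createInterface({
    input: createReadStream(file),
    crlfDelay: Infinity
  });

  let index = 0;
  // for await...of pauses the underlying stream between iterations,
  // so only the current line is held in memory while it is processed.
  for await (const line of rl) {
    const record = line.split(','); // naive stand-in for csv-parse
    index++;
  }
  return index;
}

// Write a tiny demo CSV so the example runs standalone.
const file = join(tmpdir(), 'demo.csv');
writeFileSync(file, 'a,1\nb,2\nc,3\n');

countLines(file).then((n) => console.log(n)); // logs 3
```

Even knowing about this pattern, I'm still seeing the heap grow with my real workload, so I'd like to understand what is actually being retained.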