
Memory issue

I'm working with a binary file larger than 50 MB. Node can't handle it and prints this to the console:

FATAL ERROR: JS Allocation failed - process out of memory

Finding a solution

So, I've searched Google enough to understand that I need to use streams to handle files this large. But I couldn't find any example of how to read a stream, parse it, and write the result to a file.

My original code is below:

var fs = require('fs');

// Read the entire file into memory at once
var buff = fs.readFileSync('mybinaryfile.dat');
var toBeWritten = "";

// Add 1 to each byte value and append it, as text, to one huge string
for (var i = 0; i < buff.length; i++) {
    toBeWritten += buff[i] + 1;
}

fs.appendFileSync('mybinaryfile_modified.txt', toBeWritten);

The question is: can someone show how to convert this simple parsing logic to the streaming way of handling large files?
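
From what I can gather, a stream-based version might look something like the sketch below: read with fs.createReadStream, run each chunk through a stream.Transform that applies the same per-byte logic, and pipe the result into fs.createWriteStream. This is only a guess on my part, and it assumes a Node version that supports passing a transform function to the stream.Transform constructor (otherwise you have to subclass it). Is this the right direction?

var fs = require('fs');
var stream = require('stream');

// Transform each chunk with the same per-byte logic as the loop above
var addOne = new stream.Transform({
    transform: function (chunk, encoding, callback) {
        // chunk is a Buffer, so chunk[i] is the byte value
        var out = '';
        for (var i = 0; i < chunk.length; i++) {
            out += (chunk[i] + 1);
        }
        callback(null, out);
    }
});

fs.createReadStream('mybinaryfile.dat')
    .pipe(addOne)
    .pipe(fs.createWriteStream('mybinaryfile_modified.txt'))
    .on('finish', function () {
        console.log('done');
    });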
