
Every time I try to send a huge JSON object (containing base64 images) over the network to my Node.js server, I get this error:

Error: request entity too large

There are solutions like this one (Stack Overflow question) that recommend increasing the limit on the Node.js server, but I am looking for a client-side solution. Is there a way to break down the data being sent and have the server wait until it has received all the packets before processing? (Promises?)

Hypothetically speaking, if my file is 20 MB, even with a limit of 5 MB I should still be able to send it; it would just take more time. Is that possible? Can you please put me on the right track?

I am using Express on the back end.

Here is some code to refer to:

    // AJAX CALL
    var promise = $.Deferred();
    $.ajax('/module/report/ddl', {
        data: {
            widget: JSON.stringify(widget), // <--- large, very large
            filename: filename,
            username: username,
            reportname: reportName
        },
        type: 'POST',
        success: function (result) {
            // Do something
            promise.resolve(result);
        },
        error: function (err) {
            /* Act on the event */
            promise.reject(err);
        }
    });
    return promise;
}

Here is my server side

    router.route('/ddl').post(ddlReport);

    function ddlReport(req, res) {
        var filename = req.body.filename;
        var widgets = req.body.widget; // <--- Once again large
        var username = req.body.username;
        var title = req.body.reportname;

        ctrl.ddlProcess(widgets, filename, title, username, function (err, result) {
            if (err) {
                res.status(422).send(err);
            } else {
                res.send(result);
            }
        });
    }

If you need any more information, let me know! I am looking for whichever solution works best, even if it is not the one I suggested!

Thank you for your time! Have a good day!

  • 1) Make a sync request to the server so that the file transfer is ensured; an async request would not stop the user from going to a different URL and killing your long upload. 2) Express need not do anything special here: you can break the result of JSON.stringify by length (say 5,000,000 bytes), name each chunk in serial sequence, and add them up on the server (you'd need to save these chunks in your session). 3) Use a flag to mark the last chunk of the file, like totalChunkCount or lastChunk=true – sbharti May 17 '17 at 19:37
  • 1) So to do that, I just add async: false to my AJAX call, right? 2 & 3) I am new to Express, but I will check how to store my data in the user's session. And when it receives the last chunk, it executes the process. That makes sense, thank you! – TheOneWhoMightKnow May 18 '17 at 13:13
  • Yes, you should use sessions for this; check https://github.com/expressjs/session. Also have a backing store like Redis or MongoDB and you're good to go. – sbharti May 18 '17 at 18:57
  • Thank you for your help. I have a 'temporary' working demo, but I am using a sync AJAX call, which is deprecated. I will try to publish an answer using promises soon. – TheOneWhoMightKnow May 24 '17 at 15:10
  • Yes, you can easily chain promises, and they are async. That's the way to go. – sbharti May 24 '17 at 15:52

1 Answer


Have you tried breaking your large JSON into small pieces and sending them separately? Just add some additional info, like {packet: 1, total: 20, data: ... (one small piece of the large data)}. Then send them at the same time via AJAX, or one by one, and on the Node server side collect all 20 pieces. Additionally, you can include a checksum.
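A minimal client-side sketch of this idea, reusing the jQuery setup from the question. The chunk size, the endpoint name /module/report/ddl-chunk, and the payload field names are assumptions for illustration, not part of the asker's API:

```javascript
// Split a (possibly huge) string into fixed-size pieces.
function chunkString(str, chunkSize) {
    var chunks = [];
    for (var i = 0; i < str.length; i += chunkSize) {
        chunks.push(str.slice(i, i + chunkSize));
    }
    return chunks;
}

// Send the pieces one by one, chaining promises so each request starts only
// after the previous one succeeded. Endpoint and fields are hypothetical.
function uploadInChunks(widget, filename) {
    var pieces = chunkString(JSON.stringify(widget), 1000000); // ~1 MB each
    var total = pieces.length;
    return pieces.reduce(function (prev, piece, index) {
        return prev.then(function () {
            return $.ajax('/module/report/ddl-chunk', {
                type: 'POST',
                data: { packet: index + 1, total: total, filename: filename, data: piece }
            });
        });
    }, $.when());
}
```

Sequential sending (rather than firing all requests at once) keeps each request well under the server's body-size limit and makes the "last chunk" easy to detect on the server.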

Romick
  • How do I make the server wait until it has received all the packets? Is there a way to do that with Express? – TheOneWhoMightKnow May 17 '17 at 17:18
  • Create an object and add each asynchronously received piece to that object. On receiving each piece, check whether all pieces have been received. – Romick May 18 '17 at 19:26
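A sketch of how the Express side could collect the pieces, as the comment above describes. The route name, the session layout, and the addChunk helper are my assumptions (reassembly is kept as a pure function so it is independent of Express; sessions per https://github.com/expressjs/session hold the partial upload between requests):

```javascript
// Collect pieces keyed by packet number; return the full string once all have
// arrived, or null while pieces are still missing.
function addChunk(store, packet, total, data) {
    store.chunks[packet] = data;
    store.received += 1;
    if (store.received < total) return null;
    var full = '';
    for (var i = 1; i <= total; i++) {
        full += store.chunks[i]; // packets are 1-based and contiguous
    }
    return full;
}

// Hypothetical Express route: stash partial uploads in the session and only
// parse/process once the final piece arrives.
function registerChunkRoute(router, ctrl) {
    router.route('/ddl-chunk').post(function (req, res) {
        var store = req.session.upload ||
            (req.session.upload = { chunks: {}, received: 0 });
        var full = addChunk(store, Number(req.body.packet),
            Number(req.body.total), req.body.data);
        if (full === null) {
            return res.send({ status: 'partial', received: store.received });
        }
        delete req.session.upload;     // clean up once everything has arrived
        var widget = JSON.parse(full); // now safe to parse the whole payload
        ctrl.ddlProcess(widget, req.body.filename, req.body.title,
            req.body.username, function (err, result) {
                if (err) return res.status(422).send(err);
                res.send(result);
            });
    });
}
```

Because addChunk indexes pieces by packet number, the result is correct even if requests arrive out of order.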