
I've seen a bunch of similar questions to this get asked before, but I haven't found one that describes my current problem exactly, so here goes:

I have a page which loads a large (between 0.5 and 10 MB) JSON document via AJAX so that the client-side code can process it. Once the file is loaded, I don't have any problems that I don't expect. However, it takes a long time to download, so I tried leveraging the XHR Progress API to render a progress bar to indicate to the user that the document is loading. This worked well.

Then, in an effort to speed things up, I tried compressing the output on the server side via gzip and deflate. This worked too, with tremendous gains; however, my progress bar stopped working.

I've looked into the issue for a while and found that if a proper Content-Length header isn't sent with the requested AJAX resource, the onProgress event handler cannot function as intended because it doesn't know how far along in the download it is. When this happens, a property called lengthComputable is set to false on the event object.
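
As a minimal sketch of this check (the function name is hypothetical), a handler that guards on the flag looks like:

```javascript
// Compute a progress fraction from a ProgressEvent-like object.
// Returns a number in [0, 1] when the browser knows the total, or
// null when it does not (lengthComputable === false, as happens with
// compressed responses that lack a usable Content-Length).
function progressFraction(e) {
  if (!e.lengthComputable || e.total === 0) {
    return null; // can't render a determinate progress bar
  }
  return Math.min(e.loaded / e.total, 1);
}
```

Wired into `xhr.onprogress`, a `null` result is the signal to fall back to an indeterminate spinner.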

This made sense, so I tried setting the header explicitly with both the uncompressed and the compressed length of the output. I can verify that the header is being sent, and I can verify that my browser knows how to decompress the content. But the onProgress handler still reports lengthComputable = false.

So my question is: is there a way to use gzipped/deflated content with the AJAX Progress API? And if so, what am I doing wrong right now?


This is how the resource appears in the Chrome Network panel, showing that compression is working:

network panel

These are the relevant request headers, showing that the request is AJAX and that Accept-Encoding is set properly:

GET /dashboard/reports/ajax/load HTTP/1.1
Connection: keep-alive
Cache-Control: no-cache
Pragma: no-cache
Accept: application/json, text/javascript, */*; q=0.01
X-Requested-With: XMLHttpRequest
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_5) AppleWebKit/537.22 (KHTML, like Gecko) Chrome/25.0.1364.99 Safari/537.22
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3

These are the relevant response headers, showing that the Content-Length and Content-Type are being set correctly:

HTTP/1.1 200 OK
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Content-Encoding: deflate
Content-Type: application/json
Date: Tue, 26 Feb 2013 18:59:07 GMT
Expires: Thu, 19 Nov 1981 08:52:00 GMT
P3P: CP="CAO PSA OUR"
Pragma: no-cache
Server: Apache/2.2.8 (Unix) mod_ssl/2.2.8 OpenSSL/0.9.8g PHP/5.4.7
X-Powered-By: PHP/5.4.7
Content-Length: 223879
Connection: keep-alive

For what it's worth, I've tried this on both a standard (http) and secure (https) connection, with no differences: the content loads fine in the browser, but isn't processed by the Progress API.


Per Adam's suggestion, I tried switching the server side to gzip encoding with no success or change. Here are the relevant response headers:

HTTP/1.1 200 OK
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Content-Encoding: gzip
Content-Type: application/json
Date: Mon, 04 Mar 2013 22:33:19 GMT
Expires: Thu, 19 Nov 1981 08:52:00 GMT
P3P: CP="CAO PSA OUR"
Pragma: no-cache
Server: Apache/2.2.8 (Unix) mod_ssl/2.2.8 OpenSSL/0.9.8g PHP/5.4.7
X-Powered-By: PHP/5.4.7
Content-Length: 28250
Connection: keep-alive

Just to repeat: the content is being downloaded and decoded properly, it's just the progress API that I'm having trouble with.


Per Bertrand's request, here's the request:

$.ajax({
    url: '<url snipped>',
    data: {},
    success: onDone,
    dataType: 'json',
    cache: true,
    progress: onProgress || function(){}
});

And here's the onProgress event handler I'm using (it's not too crazy):

function(jqXHR, evt)
{
    // yes, I know this generates Infinity sometimes
    var pct = 100 * evt.position / evt.total;

    // just a method that updates some styles and javascript
    updateProgress(pct);
}
Jimmy Sawczuk
  • One of the ideologies that is the basis of AJAX is the ability to lazy load pieces of data on demand. Why not load portions of this data as-needed with ajax rather than the whole heap? – Kristian Feb 26 '13 at 19:41
  • @Kristian Without going into too many details, I kind of need the whole thing. The only reason I'm using AJAX at all (rather than just throwing it in with the main request) is because I want to put something on the screen quickly so the user knows something's happening. – Jimmy Sawczuk Feb 26 '13 at 19:45
  • This mozilla bug looks interesting: https://bugzilla.mozilla.org/show_bug.cgi?id=614352 – Rocket Hazmat Feb 26 '13 at 19:49
  • @JimmySawczuk i understand completely. in my experience, usually there is a simpler road to travel down than to juggle headers and compression for the sake of what you're doing. but since i trust you understand what you're getting into, good luck mate! – Kristian Feb 26 '13 at 19:57
  • Reading Rocket's link, I think this "bug" would be there for quite some time. Is it possible to split the JSON in several chunks and use several XHR to fetch them? You can split the progress bar like "loading image...loading audio...etc.". – Passerby Mar 01 '13 at 03:38
  • i think the problem is that if you compress it, there will be a transfer-encoding, which possibly forbids usage of content-length, and the onProgress is trying to measure the uncompressed length. It does not know the size of the final uncompressed content, so it cannot work. The transfer is probably done in a chunked mode where the transmission ends when it ends. – Markus Mikkolainen Mar 01 '13 at 12:16
  • also check if the "transferred" byte amount is updated even though "total" is not known, because you can easily deduce/transfer separately the "total" and get a good progress indication after that. – Markus Mikkolainen Mar 01 '13 at 12:25
  • I don't get it; what's the problem with my answer? – Ivan Castellanos Mar 06 '13 at 14:38
  • Unless that is jsonp, why not break the json file into lots of small strings? You can then measure progress even without the Progress API. – Tiberiu-Ionuț Stan Mar 07 '13 at 20:35
  • Elaborating on Marcus's comment: _If_ the deflate encoding somehow implies a non-identity transfer encoding (and it's not clear to me one way or the other if it does), then regardless of what you put in the Content-Length header, it will be ignored. See http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.4. My hunch is that this is your issue. – dgvid Mar 07 '13 at 22:24
  • Would it be possible for you to show us the code of your progressBar and how you work with the xhr? – Bertrand Mar 07 '13 at 23:35
  • @dgvid, that's the conclusion I've arrived at too, that the transfer encoding is the culprit. I'm trying to figure out how to force it now. – Jimmy Sawczuk Mar 07 '13 at 23:36
  • @Bertrand I added that code to the question. There's not a whole lot of crazy going on though. – Jimmy Sawczuk Mar 07 '13 at 23:41
  • Thank you, I am sorry, I was interested in the way you send your request to the server, I read a couple of notes on requesting binary data using xhr. – Bertrand Mar 07 '13 at 23:55
  • @Bertrand attached, sorry for the mix-up. – Jimmy Sawczuk Mar 07 '13 at 23:57
  • On Mozilla documentation https://developer.mozilla.org/en-US/docs/DOM/XMLHttpRequest/Using_XMLHttpRequest#Handling_binary_data if you had addEventListener("progress", updateProgress, false); the first argument received by the updateProgress is event and not xhr, with the availaible properties: event.lengthComputable, event.loaded and event.total; – Bertrand Mar 07 '13 at 23:59
  • Forgot my previous comment, you use jQuery... – Bertrand Mar 08 '13 at 00:02
  • Which version of jQuery are you using? I will try a solution without jQuery as described in my previous comment – Bertrand Mar 08 '13 at 00:04
  • This is a very helpful post... but any ideas as to _why_ gzipped/deflated content can't be used with the AJAX Progress API? – jedierikb Sep 05 '13 at 18:28

8 Answers


A slightly more elegant variation on your solution would be to set a header like 'x-decompressed-content-length' or whatever in your HTTP response with the full decompressed value of the content in bytes and read it off the xhr object in your onProgress handler.

Your code might look something like:

request.onProgress = function (e) {
  var contentLength;
  if (e.lengthComputable) {
    contentLength = e.total;
  } else {
    contentLength = parseInt(e.target.getResponseHeader('x-decompressed-content-length'), 10);
  }
  progressIndicator.update(e.loaded / contentLength);
};
Nat
  • This is a superior solution to all others on this page. – user145400 Nov 22 '16 at 23:47
  • This solution works very well; see [here](http://stackoverflow.com/questions/41909881/xmlhttprequest-progress-tracking-and-symfony-cors/41945737#41945737). – Tom Tom Jan 31 '17 at 09:04
  • Shouldn't you call `parseInt` on `contentLength` when you acquire it from a response header? – John Weisz Aug 01 '17 at 12:46
  • @JohnWeisz Probably should. Although it works anyway as the `/` operator in JavaScript performs typecasting. I'll fix it. – Nat Jan 05 '18 at 11:49
  • I'm seeing that `e.loaded` shows the number of **compressed** bytes downloaded in Firefox, and the number of **uncompressed** bytes downloaded in Chrome. This solution still works properly, however, because `e.lengthComputable` is `true` in Firefox and `false` in Chrome. – We Are All Monica May 05 '20 at 19:55

I wasn't able to solve the issue of using onProgress on the compressed content itself, but I came up with this semi-simple workaround. In a nutshell: send a HEAD request to the server at the same time as a GET request, and render the progress bar once there's enough information to do so.


function loader(onDone, onProgress, url, data)
{
    // onDone = event handler to run on successful download
    // onProgress = event handler to run during a download
    // url = url to load
    // data = extra parameters to be sent with the AJAX request
    var content_length = null;

    self.meta_xhr = $.ajax({
        url: url,
        data: data,
        dataType: 'json',
        type: 'HEAD',
        success: function(data, status, jqXHR)
        {
            content_length = jqXHR.getResponseHeader("X-Content-Length");
        }
    });

    self.xhr = $.ajax({
        url: url,
        data: data,
        success: onDone,
        dataType: 'json',
        progress: function(jqXHR, evt)
        {
            var pct = 0;
            if (evt.lengthComputable)
            {
                pct = 100 * evt.position / evt.total;
            }
            else if (content_length != null)
            {
                pct = 100 * evt.position / content_length;
            }

            onProgress(pct);
        }
    });
}

And then to use it:

loader(function(response)
{
    console.log("Content loaded! do stuff now.");
},
function(pct)
{
    console.log("The content is " + pct + "% loaded.");
},
'<url here>', {});

On the server side, set the X-Content-Length header on both the GET and the HEAD requests (which should represent the uncompressed content length), and abort sending the content on the HEAD request.

In PHP, setting the header looks like:

header("X-Content-Length: ".strlen($payload));

And then abort sending the content if it's a HEAD request:

if ($_SERVER['REQUEST_METHOD'] == "HEAD")
{
    exit;
}

Here's what it looks like in action:

screenshot

The reason the HEAD request takes so long in the screenshot above is that the server still has to parse the file to know how long it is. That's something I can definitely improve on, and it's definitely an improvement from where it was.

Jimmy Sawczuk
  • Now you have 2 HTTP requests that need to succeed. It's unlikely, but statistically it does increase the probability of a broken page. Plus you have a custom HTTP header that may confuse people who come here to copy/paste the code to solve the same problem. If you want decoupling that badly, better to use GET params you can extract with JS: `download.html?file=data.json&size=6666` – Ivan Castellanos Mar 09 '13 at 11:37
  • @IvanCastellanos if the `HEAD` request fails it's not a big deal, and if the `GET` request fails I'm no better off than before. I could theoretically set the normal `Content-Length` header but it might be misrepresentative of what's actually being sent on the `HEAD` request, so the custom one is more appropriate. (I could probably clarify that part of my answer.) I don't follow how your `GET` parameters address my problem either. – Jimmy Sawczuk Mar 09 '13 at 18:41
  • The link to the download page would always contain the info of the "Content-length" making it way more portable solution (don't need php script plus one less HTTP request). `content_length=document.location.href.match(/(?:size=)(\d+)/)[1]|0;` – Ivan Castellanos Mar 10 '13 at 19:54
  • That doesn't address my problem because I don't know on page load what files the user is going to need or how big they're going to be. – Jimmy Sawczuk Mar 11 '13 at 04:35
  • I see. Maybe update the page itself using PHP? something like `if (!isset($_GET['size'])) header("Location: ". $_SERVER["REQUEST_URI"]."&size=".filesize($_GET["file"]))` but maybe is because I like to do things like that when I need max decoupling. – Ivan Castellanos Mar 11 '13 at 19:11
  • I had this problem and was able to send a header containing the decompressed length as a response to the GET request itself. In my case, at least, this eliminated the need for two separate requests. – charleslparker Apr 03 '14 at 13:49

Don't get stuck just because there isn't a native solution; a one-line hack can solve your problem without messing with the Apache configuration (which on some hosts is prohibited or very restricted):

PHP to the rescue:

var size = <?php echo filesize('file.json') ?>;

That's it. You probably already know the rest, but just as a reference, here it is:

<script>
var progressBar = document.getElementById("p"),
    client = new XMLHttpRequest(),
    size = <?php echo filesize('file.json') ?>;

progressBar.max = size;

client.open("GET", "file.json");

function loadHandler () {
  var loaded = client.responseText.length;
  progressBar.value = loaded;
}

client.onprogress = loadHandler;

client.onloadend = function(pe) {
  loadHandler();
  console.log("Success, loaded: " + client.responseText.length + " of " + size);
};
client.send();
</script>

Live example:

Another SO user thinks I am lying about the validity of this solution, so here it is live: http://nyudvik.com/zip/. It is gzip-ed and the real file weighs 8 MB.




Ivan Castellanos
  • The entire problem is that in the javascript the file size isn't reported as it only can be computed once the entire document is loaded. Knowing the size of the original file won't help anything in that. – David Mulder Mar 06 '13 at 13:40
  • What? Reading the client.responseText.length like you can see in my example you know the exact amount of bytes that are already loaded; I just tested it with Apache and a big .json file that is being transmitted in gzip (using the mod_deflate module). – Ivan Castellanos Mar 06 '13 at 14:12
  • As I can't test it easily on this computer, could you please enlighten me in that case what's contained in your client.responseText string? From my experience that string should be empty till the entire file is loaded (if the file is compressed). If anything `evt.loaded` might work, but I would have to check that. And additionally you would in that case need to use the filesize of the compressed file and not the original file. Oh and, were you the one downvoting my answer? – David Mulder Mar 06 '13 at 14:44
  • This isn't a bad idea either, but I was hoping for a cleaner solution. – Jimmy Sawczuk Mar 06 '13 at 14:45
  • You need something else besides solving your problem, ok. You can also use .getResponseHeaders() and use a little regex; maybe that's clean enough for you but I'm not sure anymore. – Ivan Castellanos Mar 06 '13 at 14:53
  • Enlightening Mr. Mulder: http://nyudvik.com/zip/; you can access the chunked file both in Firefox and Chrome (didn't test any other)... It's been a long time since I answered a question; I'm starting to remember exactly why that is. – Ivan Castellanos Mar 06 '13 at 15:07
  • Why in the world are you making it some kind of personal thing... The point here is just to give a good answer; I tried that and you tried that as well initially... why all the hate? – David Mulder Mar 06 '13 at 15:45
  • Sorry, I wake up everyday with a lot of pain (sick) and is just hard to tolerate many things. – Ivan Castellanos Mar 06 '13 at 16:15
  • Also, that negative point someone gave me get on my nerves; I came here to collaborate, give a nice solution with the full code, related links... and that's what I get? – Ivan Castellanos Mar 06 '13 at 19:30
  • I ended up using a variation of this answer, which I'll write up separately, but I'll give you the bounty. Thanks! – Jimmy Sawczuk Mar 08 '13 at 15:24
  • This is the correct answer, calling an extra head request is unintuitive. Thank you @IvanCastellanos – NiCk Newman Oct 27 '15 at 01:50
  • it's perfect solution for my case, without changing any server settings :) – Kamilos Dec 04 '15 at 09:58

Try changing your server encoding to gzip.

Your request header shows three potential encodings (gzip,deflate,sdch), so the server can pick any one of those three. By the response header, we can see that your server is choosing to respond with deflate.

Gzip is an encoding format that wraps a deflate payload with extra headers and a footer (which includes the original uncompressed length) and uses a different checksum algorithm:

Gzip at Wikipedia

Deflate has some problems. Due to legacy issues with improper decoding algorithms, client implementations of deflate have to run through silly checks just to figure out which implementation they're dealing with, and unfortunately, they often still get it wrong:

Why use deflate instead of gzip for text files served by Apache?

In the case of your question, the browser probably sees a deflate file coming down the pipe and just throws up its arms and says, "When I don't even know exactly how I'll end up decoding this thing, how can you expect me to worry about getting the progress right, human?"

If you switch your server configuration so the response is gzipped (i.e., gzip shows up as the content-encoding), I'm hopeful your script works as you'd hoped/expected it would.

AdamJonR

We have created a library that estimates the progress and always sets lengthComputable to true.

Chrome 64 still has this issue (see Bug)

It is a JavaScript shim that you can include in your page to fix this issue, and you can then use the standard new XMLHTTPRequest() normally.

The javascript library can be found here:

https://github.com/AirConsole/xmlhttprequest-length-computable


This solution worked for me.

I increased the deflate buffer size in the Apache configuration to cover the biggest file size I expect to compress, around 10 MB (which yielded a compression from 9.3 MB down to 3.2 MB). With the buffer large enough, Apache returns a Content-Length header instead of omitting it: the header is dropped in favor of chunked Transfer-Encoding whenever the file being compressed exceeds the buffer size. Refer to https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Transfer-Encoding for more info about the chunked encoding header used with compression, and to https://httpd.apache.org/docs/2.4/mod/mod_deflate.html#deflatebuffersize for more info about the deflate buffer size.

1- Include the following in your Apache configuration; note that the buffer size value is in bytes.

<IfModule mod_deflate.c>
DeflateBufferSize 10000000
</IfModule>

2- Restart the Apache server.

3- Include the following in your .htaccess file to make sure the Content-Length header is exposed to JavaScript HTTP requests.

<IfModule mod_headers.c>
    Header set Access-Control-Expose-Headers "Content-Length"
</IfModule>

4- In the onDownloadProgress event, before calculating the total progress percentage, add the following to retrieve the total bytes value.

var total = e.total;
if (!e.lengthComputable) {
    total = e.target.getResponseHeader('content-length') * 2.2;
}

5- Note: by comparing, I learned that lengthComputable is not really a flag for whether the Content-Length header was omitted; it is actually the Content-Encoding header that matters. When that header is present in the file's response headers, lengthComputable is set to false, which seems to be normal behaviour under the XHR specification. As for the 2.2 multiplier: the loaded total reported by the progress event reflects the decompressed data rather than the compressed data, so multiplying the compressed Content-Length by 2.2 gives a more accurate download/upload progress estimate with my server's compression level and method. You will likely need to tweak this factor for your own server: first examine the general compression ratio across several files, see whether multiplying by, say, 2 gets you closest to the decompressed (original) file sizes, and pick a multiplier whose result stays smaller than or equal to the original size; the loaded total is then guaranteed to reach, and most likely slightly surpass, 100% in all cases. A hacky refinement is to cap the calculated progress at 100, which removes the need to check for overshoot, as long as actually reaching 100% is still handled in the implementation.
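
The ratio heuristic above can be isolated into a small helper (the names and default usage are illustrative; measure your own ratio as described):

```javascript
// Illustrative helper for the ratio heuristic described above: when
// lengthComputable is false, estimate the uncompressed total as the
// compressed Content-Length times an empirically measured ratio.
function estimateTotal(e, getHeader, ratio) {
  if (e.lengthComputable) {
    return e.total;
  }
  var compressed = parseInt(getHeader('content-length'), 10);
  return isNaN(compressed) ? null : compressed * ratio;
}
```

In a handler this would be called as `estimateTotal(e, function (h) { return e.target.getResponseHeader(h); }, 2.2)`, with a `null` result meaning no estimate is possible.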

In my case, this let me know when each file/resource finished loading: check the total with >= (to account for slightly surpassing 100% after multiplying the compressed total) or, if the percentage calculation is capped at 100, with == instead. I also thought about solving this issue at the root by storing the fixed decompressed total for each file (i.e., the original file size) and using it while preloading files, such as the resources in my case, to calculate the progress percentage. Here is a snippet from my onProgress event handling conditions:

// Sometimes 100 is reached in the progress event more than once.
if(preloadedResources < resourcesLength && progressPercentage < 100) {
    canIncreaseCounter = true;
}
if(progressPercentage >= 100 && canIncreaseCounter && preloadedResources < resourcesLength) {
    preloadedResources++;
    canIncreaseCounter = false;
}

Also, note that using known decompressed totals as a fixed solution is valid in all circumstances except when you have no prior access to the files that are going to be preloaded or downloaded, and I think that is seldom the case. Most of the time we know which files we want to preload, so we can retrieve their sizes beforehand, perhaps by having a PHP script serve a list of sizes for the files of interest in a first HTTP request, and then using each file's original size in the second, preloading request; or even storing the fixed decompressed sizes of the preloaded resources in an associative array in the code beforehand, and using that to track loading progress.

For a live example of my loading-progress tracking, see the resource preloading on my personal website at https://zakaria.website.

Lastly, I'm not aware of any downsides to increasing the deflate buffer size, other than extra load on server memory; if anyone has input on this, it would be very much appreciated.

Zakaria

The only solution I can think of is manually compressing the data (rather than leaving it to the server and browser), as that allows you to use the normal progress bar and should still give you considerable gains over the uncompressed version. If, for example, the system is only required to work in the latest generation of web browsers, you can zip it on the server side (whatever language you use, I'm sure there is a zip function or library) and on the client side use zip.js. If more browser support is required, you can check this SO answer for a number of compression and decompression functions (just choose one that is supported in the server-side language you're using). Overall this should be reasonably simple to implement, although it will perform worse (though probably still well) than native compression/decompression. (Btw, after giving it a bit more thought, it could in theory perform even better than the native version if you chose a compression algorithm that fits the type of data you're using and the data is sufficiently big.)

Another option would be to use a websocket and load the data in parts, parsing/handling each part as it is loaded (you don't need websockets for that, but doing tens of HTTP requests one after another can be quite a hassle). Whether this is possible depends on the specific scenario, but to me it sounds like report data is the kind of data that can be loaded in parts and doesn't need to be fully downloaded first.
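
If you do split the payload, aggregating per-chunk progress into one bar is straightforward when the chunk sizes are known up front (a sketch; the idea of a size manifest served ahead of time is an assumption):

```javascript
// Sketch: combine per-chunk loaded byte counts into one overall
// fraction, assuming all chunk sizes are known in advance (e.g.
// served as a small manifest before the real downloads start).
function overallProgress(loadedPerChunk, sizes) {
  var loaded = loadedPerChunk.reduce(function (a, b) { return a + b; }, 0);
  var total = sizes.reduce(function (a, b) { return a + b; }, 0);
  return total > 0 ? loaded / total : 0;
}
```

Each chunk's own progress event updates its slot in `loadedPerChunk`, and the bar is redrawn from the combined fraction.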

David Mulder
  • I had considered this solution but was really hoping to avoid it. Thanks for the reference links though. – Jimmy Sawczuk Mar 06 '13 at 14:42
  • If `evt.loaded` really holds a valid value then using the actual content-length would be better of course (this was bugged previously, but `xhr.getResponseHeader("Content-Length")` might work by now), but from your answer I assumed that it didn't work with compressed data. (Although right now I am confused, as I am not sure anymore even whether the content length is the length of the compressed or the uncompressed content :S It's been awhile since I did some stuff with this...) – David Mulder Mar 06 '13 at 14:47
  • Oh and btw, you can always just add another header yourself which you could read from the javascript making sure everything is in one request (still provided `evt.loaded` works). – David Mulder Mar 06 '13 at 14:50
  • content-length is usually the size of the compressed file. But because Apache uses progressive encoding I would not recommend to use it. (by default you don't know the final size before hand) – Ivan Castellanos Mar 06 '13 at 15:17

I do not clearly understand the issue; it should not happen, since the decompression should be done by the browser.

You may try to move away from jQuery or hack jQuery, because $.ajax does not seem to work well with binary data:

Ref: http://blog.vjeux.com/2011/javascript/jquery-binary-ajax.html

You could try to do your own implementation of the AJAX request. See: https://developer.mozilla.org/en-US/docs/DOM/XMLHttpRequest/Using_XMLHttpRequest#Handling_binary_data

You could try to uncompress the JSON content with JavaScript (see resources in the comments).

* UPDATE 2 *

The $.ajax function does not support the progress event handler, or at least it is not part of the jQuery documentation (see comment below).

Here is a way to get this handler to work, though I never tried it myself: http://www.dave-bond.com/blog/2010/01/JQuery-ajax-progress-HMTL5/

* UPDATE 3 *

The solution uses a third-party library to extend jQuery's ajax functionality, so my suggestion does not apply.

Bertrand
  • might help: http://rosettacode.org/wiki/LZW_compression#JavaScript http://stuk.github.com/jszip/ – Bertrand Mar 08 '13 at 12:38
  • The decompression *is* done by the browser, it's figuring out the content length so I can know how much I've downloaded that I'm struggling with. – Jimmy Sawczuk Mar 08 '13 at 13:55
  • Ok, then could you follow evt in Firebug, for example, with console.log(evt) in the function(jqXHR, evt) of the onProgress event handler? I am just trying to help. – Bertrand Mar 08 '13 at 13:58
  • Would it be possible to have the link in private ? – Bertrand Mar 08 '13 at 14:01
  • progress does not exists in the jQuery documentation as an handler to follow progress – Bertrand Mar 08 '13 at 14:12
  • I'm using a third-party extension to do it. It's not an issue unless I try to use deflate or gzip compression. – Jimmy Sawczuk Mar 08 '13 at 14:46
  • Could we know about the behavior observed with console.log(evt) in the function(jqXHR, evt) of the onProgress event handler, as asked before? What do you see when you do it? – Bertrand Mar 08 '13 at 14:54