
Browser Aborting Ajax Requests Sporadically Without Returning Any Errors

In my project (PHP with Symfony 2) I do a lot of Ajax requests on every page. I'm having a lot of problems with them, because it looks like browsers (tested in Google Chrome and Firefox) are aborting some of the requests sporadically, without returning any errors.

Solution 1:

This could fit your needs: sending the requests one by one should keep the server from rejecting some of the parallel requests:


$(document).ready(function () {
    var tot = 30; // simulate 30 requests
    (function request(i) {
        if (i === tot) return;
        $.get('/echo/html/?' + i, function (data) {
            console.log("success");
        }).always(function () { request(++i); }); // start the next request only when this one finishes
    })(0);
});

Solution 2:

You can serialize requests with a global object:

function AjaxRequester() {
    this.queue = [];
}

AjaxRequester.prototype.doRequest = function (request) {
    if (this.queue.length > 0) {
        this.queue.push(request);
    } else {
        this.handleRequest(request);
    }
};

AjaxRequester.prototype.handleRequest = function (request) {
    /* actually handle the ajax request; on complete, inspect the
       queue and, if it is not empty, call this method again on the
       first element */
};

var requester = new AjaxRequester();

Then, in your code, call:

requester.doRequest(yourRequest);
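The comment above leaves the completion handling open. A minimal sketch of one way to fill it in (the `sendFn` parameter, the `busy` flag, and the `handleNext` name are illustrative, not from the original answer): the actual Ajax call is injected as a callback-taking function, so with jQuery it could be `function (req, done) { $.ajax(req).always(done); }`.

```javascript
// Serial-request queue: at most one request is in flight at a time.
// sendFn(request, done) performs the actual Ajax call and invokes
// done() when the request completes (success or failure).
function AjaxRequester(sendFn) {
    this.queue = [];
    this.busy = false;
    this.sendFn = sendFn;
}

AjaxRequester.prototype.doRequest = function (request) {
    this.queue.push(request);
    if (!this.busy) this.handleNext();
};

AjaxRequester.prototype.handleNext = function () {
    if (this.queue.length === 0) {
        this.busy = false; // queue drained, accept new work immediately
        return;
    }
    this.busy = true;
    var self = this;
    // Send the oldest queued request; recurse when it completes.
    this.sendFn(this.queue.shift(), function () {
        self.handleNext();
    });
};
```

Requests made while another is in flight simply wait in `queue` until the current one's `done` callback fires.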

Solution 3:

I think your server is taking too long to respond and the browser is timing out. It may be that the browser will only have two open connections to the server and is aborting the last 8/10 when the first two take too long. I would check your server logs to confirm this.

What is the server doing when it sees a request to /i18n/javaScript/pt.json? Four seconds is a long time. Try fetching some static content like an image or some static HTML instead of pt.json and see if that fixes the problem.

If you need to do a lot of computation to produce pt.json, can it be cached? Is it very large? Are you using Apache?

Solution 4:

If this is a cache issue, you may try this:

Add the following markup to your page's head. It will prevent the browser from caching any data.

<meta http-equiv="cache-control" content="max-age=0" />
<meta http-equiv="cache-control" content="no-cache" />
<meta http-equiv="expires" content="0" />
<meta http-equiv="expires" content="Tue, 01 Jan 1990 12:00:00 GMT" />

This will disable caching in most browsers, and your data will be fetched on each refresh, since there is no cache.

One more thing: your data is requested in a loop, so the browser sends the same request over and over. The service might treat those repeated, identical requests as duplicates, or as a flood of requests, and cancel them.

Try adding something unique to each call, like this:

for (var i = 0; i < 10; i++) {
    $.get('/i18n/javaScript/pt.json?v='+i, function(data) {
        console.log(data);
    });
}

That way each request has a distinct URL. Try it and see if it works.

If you are using jQuery, as you are, this cache-busting behaviour is built in: passing cache: false to $.ajax() appends a unique timestamp parameter to each request, with the same effect.
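The same cache-busting idea can be sketched as a small helper for non-jQuery code (the helper name `bust` is an assumption for illustration, not from the original post):

```javascript
// Append a unique "_" timestamp parameter so the browser and any
// intermediate caches treat every request URL as distinct
// (this mirrors what jQuery does when cache is set to false).
function bust(url) {
    var sep = url.indexOf('?') === -1 ? '?' : '&';
    return url + sep + '_=' + Date.now();
}
```

Usage would look like `$.get(bust('/i18n/javaScript/pt.json'), handler)`.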

See this older Stack Overflow post: How to solve Firebug's "Aborted" messages upon Ajax requests?

I am sure this will give you a positive lead towards a solution.
