
View Poll Results: Which would be the best method to return 30 data tables from the server at once?

Voters: 2. You may not vote on this poll.

30 simultaneous ajax calls: 0 (0%)
1 large ajax call with all 30 tables: 2 (100.00%)
5 simultaneous ajax calls with 6 tables per call (or some other spread): 0 (0%)
5 ajax calls, one at a time, with 6 tables per call (or some other spread): 0 (0%)

Multiple ajax requests vs one large ajax request?

Assume you have 30 data tables to load into javascript arrays. Would it be faster and more efficient to make 30 ajax requests, one ajax request that returns all 30 tables, or something in between, such as making five ajax requests at a time and waiting for each batch to return? Would 30 ajax requests at once cause issues?

You would add a considerable amount of time to the loading process by requesting each table separately. Receiving all 30 tables in one Ajax response shouldn't take noticeably longer (to a human) than receiving a single table: the time spent actually transferring the data doesn't really increase per request, so what grows as you add requests is mostly the per-request overhead and the processing time in the server-side script.
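To make the point above concrete, here is a rough back-of-the-envelope cost model. The numbers (80 ms of fixed per-request overhead, 5 ms of server processing per table) are made-up assumptions, and it treats the 30 calls as strictly sequential, i.e. a worst case:

```javascript
// Rough model: each request pays a roughly fixed overhead (latency,
// headers, connection setup), plus per-table server processing time.
// overheadMs and perTableMs are invented illustrative values.
function estimateTotalMs(requests, tablesPerRequest, overheadMs = 80, perTableMs = 5) {
  return requests * (overheadMs + tablesPerRequest * perTableMs);
}

const oneBigCall = estimateTotalMs(1, 30);  // 1 * (80 + 30 * 5)  = 230 ms
const thirtyCalls = estimateTotalMs(30, 1); // 30 * (80 + 1 * 5) = 2550 ms
```

Real browsers run some requests in parallel, so 30 calls won't literally cost 30 full round trips, but the fixed overhead per request is still paid 30 times instead of once.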


I would recommend making one Ajax request for each table, but staggering them by, say, 300 ms, and doing them in batches of 5.

This gives you several advantages:

1) The real processor-heavy work for the browser is re-rendering the page. You trigger more render events, but each one changes a smaller portion of the document.

2) By staggering these updates, the processing is spread out over a longer span of time, so there are fewer moments where the web page freezes. You get higher average processor usage over the life of the page, but lower spikes in usage.

3) By staggering requests and doing them in batches, you can guard your backend against what is essentially a distributed denial of service attack by your own users.
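The staggered-batch approach described above could be sketched like this. It is a minimal sketch: the batch size, the 300 ms gap, and the shape of the task functions are taken from the suggestion above, and the task functions themselves are placeholders for whatever Ajax call you actually make:

```javascript
// Resolve after ms milliseconds.
function delay(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// tasks: an array of functions, each returning a Promise (e.g. an Ajax
// request). Runs them in batches of batchSize, pausing gapMs between
// batches so rendering and server load are spread out over time.
async function runInStaggeredBatches(tasks, batchSize = 5, gapMs = 300) {
  const results = [];
  for (let i = 0; i < tasks.length; i += batchSize) {
    // Start this batch's requests concurrently and wait for all of them.
    const batch = tasks.slice(i, i + batchSize).map((task) => task());
    results.push(...(await Promise.all(batch)));
    if (i + batchSize < tasks.length) {
      await delay(gapMs); // stagger before launching the next batch
    }
  }
  return results; // in the same order as the tasks array
}
```

Usage would look something like `runInStaggeredBatches(tableIds.map((id) => () => fetchTable(id)))`, where `fetchTable` stands in for your actual Ajax call.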