Hey folks,
this is probably one of the most exciting things I've come up with since I started playing around with JavaScript. If you've ever written a somewhat complex JS snippet, you might have noticed that slower computers / browsers (like FF2 on Mac) will freeze if you run through a big loop that does some heavy operations (event bindings, DOM manipulations, etc.).
For me as a user this has always been very annoying, since it interrupts your workflow and, if you're unlucky, might even crash your browser. So far I thought the only way to avoid this was to simply write less intense JavaScript and/or optimize the hell out of it.
However, with the recent upgrades coming to PostTask we've reached the point where we often need to bind up to 300 events (!) on page load, as well as perform a whole bunch of other operations. It got to the point where loading the site would freeze FF2 on Mac for an insane ~16 seconds - absolutely unacceptable by any standard. This post is about how a simple trick reduced initialization time to 5 seconds while avoiding any browser freezing whatsoever.
The big secret - Using JavaScript to partition workload
The key problem I identified as responsible for browser freezing is that JavaScript runs in a single thread. This means that while some JS code is executing, no other code can run at the same time. From my old VB days I remembered a function called doEvents() which you could put into big loops to keep your apps from freezing. What it did was essentially check whether there was any other code that needed to run, execute it, and then go back into your loop. JavaScript doesn't have such a function, but it has something that comes very close: timers.
John recently did a good post about them that confirmed my previous experiments: if you set an interval or timeout of, let's say, 50ms in JavaScript, there is no guarantee whatsoever as to when the event will fire. The only thing JS promises is not to fire the event *before* 50ms are over. Other than that, it just tries to execute the event ASAP. That means if any JS code is currently running, no event will fire until that block of code has finished executing. It also means that several events (interval / timeout / click / etc. callbacks) can queue up over time and then fire back to back.
While this seems like a very annoying problem you have to work around if you want, say, smooth animations, it also opens up a broad range of possibilities for writing "asynchronous code" in JS. By asynchronous I mean code that will not execute as part of your normal program flow, but whenever it is convenient for JS / the browser to execute it. You might already have come to the conclusion I reached a little while ago: this is a perfect mechanism for splitting a big operation into smaller chunks that the browser can process whenever it's convenient, without causing any freezes. This works best if you leave the browser some room to "breathe" between executing the code chunks (1-5ms is usually enough).
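To make the idea concrete before showing the real implementation, here is a minimal sketch of such chunking (processChunked and doHeavyWork are made-up names, purely for illustration):

// process a big array in small slices, yielding to the browser in between
function processChunked(items, doHeavyWork) {
    var i = 0;
    function next() {
        // work through a small slice of the items
        var end = Math.min(i + 10, items.length);
        for (; i < end; i++) {
            doHeavyWork(items[i]);
        }
        // more left? give the browser ~2ms to breathe, then continue
        if (i < items.length) {
            setTimeout(next, 2);
        }
    }
    next();
}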
Here is some code that can accomplish this in an easy-to-use fashion:
$.queue = {
    _timer: null,
    _queue: [],

    // Add a function to the queue (optionally with a context and a delay
    // in ms). When called without arguments, this instead executes the
    // next queued function and returns its associated delay.
    add: function(fn, context, time) {
        var setTimer = function(time) {
            $.queue._timer = setTimeout(function() {
                // execute the next queued function; it returns the delay
                // to wait before running the one after it
                time = $.queue.add();
                if ($.queue._queue.length) {
                    setTimer(time);
                }
            }, time || 2);
        };
        if (fn) {
            // enqueue the function; start the timer if the queue was empty
            $.queue._queue.push([fn, context, time]);
            if ($.queue._queue.length == 1) {
                setTimer(time);
            }
            return;
        }
        // no arguments: pop and run the next queued function
        var next = $.queue._queue.shift();
        if (!next) {
            return 0;
        }
        next[0].call(next[1] || window);
        return next[2];
    },

    // Cancel the pending timer and discard anything still queued.
    clear: function() {
        clearTimeout($.queue._timer);
        $.queue._queue = [];
    }
};
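The interface boils down to: pass a function, an optional context, and an optional delay in ms. A trivial usage sketch (the alerts are just for illustration):

// enqueue two chunks of work; each runs in its own timer tick
$.queue.add(function() { alert('first chunk'); });
// optional second and third arguments: a context for "this" and a delay in ms
$.queue.add(function() { alert('second chunk'); }, window, 10);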
Late event binding
One of the things we use this for in PostTask is what I call late event binding. Let's say you have the following code (note: this could be simplified a lot, but this way it's easier to understand how the refactoring works):
$(document).ready(function() {
    // a lot of li's, let's say 500
    $('li').each(function() {
        $(this).bind('click', function() {
            alert('Yeah you clicked me');
        });
    });
});
You will probably notice that it already has a noticeable impact on your page's initialization time while freezing the browser during it. Avoiding this can be as easy as:
$(document).ready(function() {
    // a lot of li's, let's say 500
    $('li').each(function() {
        var self = this, doBind = function() {
            $(self).bind('click', function() {
                alert('Yeah you clicked me');
            });
        };
        $.queue.add(doBind, this);
    });
});
This should bring your page's initialization time down to almost its non-JS speed while still binding the events in a timely fashion. Essentially, instead of binding the events directly in the document ready event, you encapsulate each bind into its own closure that you add to a queue for later execution. This means the user sees the page very fast, while in the background another li element gets its binding done every 2ms. Of course the user could now do something really annoying and click on an element before its binding is in place, which might screw things up. However, this is very unlikely. For one, the user needs to be very fast: he'll need to click on one of the last elements (which is likely outside the current viewport because of the sheer number of elements) in under a second. It's also very unnatural behavior for a user.
And even if you have to make 100% sure the user cannot interact with unbound elements, you can still use this technique to avoid freezes. You can, for example, start with all elements hidden and only show them one by one after they've been processed. Or you can use an overlay like the jQuery blockUI plugin to block any interaction. Either way you'll give the user a better experience by not freezing the browser and letting him see the page as fast as possible (even if people can't use something right away, the visual response of the page loading up is very important).
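Here is a minimal sketch of the hide-then-reveal variant, assuming the same markup as in the example above:

$(document).ready(function() {
    // hide all li's first, then reveal each one once its handler is bound
    $('li').hide().each(function() {
        var self = this;
        $.queue.add(function() {
            $(self).bind('click', function() {
                alert('Yeah you clicked me');
            }).show();
        }, this);
    });
});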
Async calculations
One other use case came up with a client a couple of days ago: he was using a jQuery UI slider that was supposed to dynamically filter a list of keywords depending on the selected value and perform some calculations. My first implementation would constantly freeze the browser, because the values were live-updated as you moved the slider, and moving it across 20 values caused 20 intense loops to run right after each other. The solution? Use the timer queue from above. I simply split the keyword filtering loop into distinct chunks, and whenever the slider was moved I ran $.queue.clear() first, so that if the previous filtering hadn't finished yet it would simply be discarded. This does two things at once: it allows the user to freely drag the slider back and forth without any freezing, while also easing the workload by discarding ongoing operations for the previous value. If the operation for each value takes more than a second, you can also easily add an ajax spinner that indicates activity. It makes sense too, because working with this trick really feels like suddenly being able to use "ajax" for loops executed locally on the client ; ).
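A rough sketch of how that looked (filterChunk and the chunk size are hypothetical stand-ins for the client's filtering code; the slide callback follows the jQuery UI slider API):

var keywords = []; // a long list of keywords to filter
var CHUNK = 50;    // how many keywords to process per queued function

$('#slider').slider({
    slide: function(event, ui) {
        // discard any chunks still queued for the previous slider value
        $.queue.clear();
        for (var i = 0; i < keywords.length; i += CHUNK) {
            (function(start) {
                $.queue.add(function() {
                    // filter one slice of the list against the new value
                    filterChunk(keywords.slice(start, start + CHUNK), ui.value);
                });
            })(i);
        }
    }
});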
Other use cases
I'm sure there are many other situations where you can use this. I've come up with a rough draft for dramatically improving the responsiveness of jQuery UI's sortable plugin in slower browsers, and am working on some other neat things. One could also write a jQuery plugin to ease late event binding and other operations with a syntax like: $('li').delayed('bind', 'click', function() {}).
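Such a plugin could be as small as this sketch (untested, just to show the shape of the idea):

// queue any jQuery method call per element instead of running it immediately
$.fn.delayed = function(method) {
    var args = Array.prototype.slice.call(arguments, 1);
    return this.each(function() {
        var self = this;
        $.queue.add(function() {
            // apply the requested method (e.g. 'bind') to this element
            var $el = $(self);
            $el[method].apply($el, args);
        }, this);
    });
};

// usage: bind the click handlers without freezing the page
$('li').delayed('bind', 'click', function() {
    alert('Yeah you clicked me');
});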
Alright, I hope this is useful to those of you writing heavy JavaScript applications, and maybe interesting for the rest as well. I wish I had more time to provide a more comprehensive solution, but I'm leaving tomorrow for a week of non-computer fun in Thailand.
Let me know what you think and what other usage cases you can think of,
-- Felix Geisendörfer aka the_undefined