I would really like to see support for playlists in SVT Play, but that has not been high enough on the priority list at SVT yet. So in the meantime I’ve built a small hack to provide a simple playlist using a bookmarklet. You can try it out on www.svtplaylist.se

The idea was born when a colleague of mine showed a really smart way to create bookmarklets by dynamically loading a script hosted on github. The only code you need to put in the bookmarklet looks something like this:
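A minimal sketch of what such a loader can look like (the function name and the github URL below are my own placeholders, not the actual SVT Playlist code):

```javascript
// Hypothetical sketch: the bookmarklet only injects a <script> tag, so the
// browser fetches and runs the real code from a file hosted on github.
function loadScript(doc, url) {
  var s = doc.createElement('script');
  s.src = url;
  doc.body.appendChild(s);
  return s;
}

// In the bookmarklet itself this would run as:
// loadScript(document, 'https://raw.github.com/user/repo/master/playlist.js');
```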

This code just creates a <script> tag that loads a JavaScript file hosted on github. You can then put anything you want in that script. To make it a real bookmarklet you also need to pack it into one line, URL-encode it and put a javascript: prefix in front, like this:
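For illustration, the packing step might look something like this (the URL is a placeholder, not the real script location):

```javascript
// Hypothetical sketch: take the one-line loader, wrap it in an immediately
// invoked function, URL-encode it and prepend the javascript: scheme.
var code = "var s=document.createElement('script');" +
  "s.src='https://raw.github.com/user/repo/master/playlist.js';" +
  "document.body.appendChild(s);";

var bookmarklet = 'javascript:' + encodeURIComponent('(function(){' + code + '}())');
```

The resulting string is what you paste into the bookmark's URL field.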

I’ve done a few presentations on SPDY/HTTP2 during the last year. SPDY is now supported by Chrome, Firefox and Opera, and Google has been running SPDY on all their SSL/TLS traffic for quite some time. If you haven’t looked at it yet I recommend that you do, to see how it may affect the way we build web applications. For example, file concatenation is not necessary with SPDY, and domain sharding might actually be harmful. Here are some of my slides:

After working with Ruby and Rails for the last couple of months I must say it is a true pleasure to work with. So when we hosted the local Ruby User Group (SHRUG) at the Valtech office I thought I should contribute with a lightning talk. The presentation I did was about Devise, a really nice authentication solution for Rails applications that covers all the basics you need to let users log in to your site. It was mostly live coding, which may not have been such a good idea after all. I hope some of you found it amusing at least. Anyway, here are the slides:

Performance tests continued

In my previous post I measured how the performance of a web application is impacted by jQuery live events. The conclusion was that live events will normally not have a big performance impact, but that in complex applications with lots of events they can be a problem. An important factor that I didn’t cover very well is the level of nesting in the dom-tree. I have created a test case that better demonstrates how the performance impact changes with a more nested dom-tree.

In this test we have a dom-tree with 30 nested divs and we register 100 live events. We then trigger a click event on different levels of the tree and measure the time required to process it. On my MacBook I get the following result:

click at level 1: 24ms

click at level 5: 54ms

click at level 10: 91ms

click at level 15: 130ms

click at level 20: 174ms

click at level 25: 210ms

click at level 30: 253ms

We see that the performance impact of the live events is greater on clicks that are made deep down a nested dom-tree.
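A sketch of how the nested markup in such a test can be built (the level1..level30 class names are my own invention for illustration, not the actual test code):

```javascript
// Hypothetical helper: build markup for `depth` nested divs, where the div
// at depth i gets the class "level" + i.
function nestedDivs(depth) {
  var open = '', close = '';
  for (var i = 1; i <= depth; i++) {
    open += '<div class="level' + i + '">';
    close = '</div>' + close;
  }
  return open + 'click target' + close;
}
```

The test then triggers a click on the element at a given level and measures how long jQuery needs to process it; the deeper the element, the longer the bubbling path that has to be checked against every registered live selector.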

Can the context help?

In my previous post on performance in jQuery live events I proposed that it would be a good improvement to support a scope for live events, so that the performance impact can be limited to a specific part of the dom-tree. One way to implement this could be through the context parameter in the core jQuery() constructor. When live events were introduced they did not support the use of a context parameter, but the coming jQuery 1.4 release is supposed to support it. More details about this can be found at learningjquery.com. I was curious to see how this context was being used and whether it would allow me to limit the impact of the live events to a specific part of the dom-tree. To investigate this I modified the above test so that a context is provided when registering the live events. Note that in the previous test as well as this one I use the latest jQuery 1.4 alpha 2 release.

In this test we use one of the nested divs as context. I was hoping that jQuery would take advantage of this context so that events triggered above the context element would see no performance impact; only events on elements inside the context should be affected. However, when I run the tests I get the following result:

context set at level 20

click at level 1: 30ms

click at level 5: 66ms

click at level 10: 113ms

click at level 15: 160ms

click at level 20: 207ms

click at level 25: 255ms

click at level 30: 305ms

We see that the resulting figures are similar to the results where no context was used. The live events even have a slightly bigger performance impact now. Events on all elements in the dom-tree get the performance impact regardless of which context we specify, so the context did not help us here. This is probably because jQuery attaches the delegating event to the document instead of the context element. We can verify that jQuery puts the events on the document element with a simple check in Firebug. Open Firebug, reload the test, and then invoke the following command in the console:

jQuery(document).data('events').live

We see that the live events are stored on the document and not on the context element. In my view it would be bad if this behavior is kept in the final release of jQuery 1.4, unless the context remains unsupported as in 1.3. Once the functionality is released it cannot be changed without breaking applications that rely on the current behavior. I guess the idea could be that live events should survive even if you remove the context element, and that you should be able to load the context element dynamically. But I think it would be more valuable to be able to limit the performance impact by putting the delegating event on the context. It would also provide an easy way to control the lifetime of live events by simply removing the context element.
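To illustrate the alternative I am arguing for, here is a rough sketch (plain DOM code, all names hypothetical) of delegation scoped to a container element instead of the document:

```javascript
// Hypothetical sketch of scoped delegation: attach one handler to a
// container, so clicks outside that container never pay the matching cost.
function delegate(container, className, handler) {
  container.addEventListener('click', function (e) {
    // Walk from the original target up to (but not including) the container.
    var el = e.target;
    while (el && el !== container) {
      // Crude class match; enough for a sketch.
      if (el.className && el.className.indexOf(className) !== -1) {
        return handler.call(el, e);
      }
      el = el.parentNode;
    }
  });
}
```

With this approach the delegating handler only ever runs for events that bubble up inside the container, so the performance impact is confined to the chosen part of the dom-tree, and removing the container removes the delegation with it.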

Live events are a new feature in jQuery 1.3. They provide a convenient alternative to traditional event delegation techniques. The following code will bind an event handler to the document. When events bubble up, jQuery will check if the original target or any of its ancestors has the class myButton and delegate the event accordingly.

jQuery('.myButton').live('click', function(){ /* do stuff */ });

This is very convenient when your markup is loaded through ajax or created on the fly by JavaScript. The jQuery documentation provides some more info on how it works.

Performance

Anytime the user clicks anywhere on the document, an event will be triggered and jQuery needs to determine whether it should delegate it or not. jQuery will of course need some CPU time for this check, which could potentially result in sluggish behaviour when the user clicks on the page.

Test cases

To help me understand what performance tradeoff I make when using live events, I have set up a few test cases.

<!DOCTYPE HTML PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <title>jQuery live tester</title>
</head>
<body>
  <div id="result1">waiting for results...</div>
  <div id="result2">waiting for results...</div>
  <div id="result3">waiting for results...</div>
  <div id="result4">waiting for results...</div>
  <div id="result5">waiting for results...</div>
  <div id="result6">waiting for results...</div>
  <div id="result7">waiting for results...</div>
  <div id="button">Button</div>
  <script type="text/javascript" src="jquery.js"></script>
  <script type="text/javascript">
  /*<![CDATA[*/
  jQuery(document).ready(function() {

    // Trigger the button click `count` times and report the average time.
    function measureClick(output, text) {
      var target = jQuery('#button');
      var count = 10;
      var start = new Date();
      for (var i = 0; i < count; i++) {
        target.click();
      }
      var stop = new Date();
      output.html(text + ((stop.getTime() - start.getTime()) / count));
    }

    function addEvents(count, selector, context) {
      context = context || document;
      for (var i = 0; i < count; i++) {
        jQuery(selector + i, context).live('click', function() {});
      }
    }

    function removeEvents(count, selector) {
      for (var i = 0; i < count; i++) {
        jQuery(selector + i).die('click');
      }
    }

    jQuery('#button').bind('click', function() {});
    measureClick(jQuery('#result1'), 'without live events: ');

    addEvents(10, '.garbage');
    measureClick(jQuery('#result2'), 'with 10 live events: ');
    removeEvents(10, '.garbage');

    addEvents(100, '.garbage');
    measureClick(jQuery('#result3'), 'with 100 live events: ');
    removeEvents(100, '.garbage');

    addEvents(200, '.garbage');
    measureClick(jQuery('#result4'), 'with 200 live events: ');

    // Add large DOM-tree
    for (var i = 0; i < 500; i++) {
      jQuery('<div class="garbage' + i + '">garbage</div>').appendTo('body');
    }
    measureClick(jQuery('#result5'), 'with 200 live events and large dom: ');
  });
  /*]]>*/
  </script>
</body>
</html>

The tests were performed using jQuery v1.3.2 on a Pentium 4 2.8GHz with 2GB RAM running Windows XP.

Explanation of tests

Basically the tests are done by registering a number of live events and one normal event. We measure the time it takes to process the normal event to see how it is impacted by the live events. First we measure the time it takes to handle the normal event without any live events registered, just to have something to compare against. The result is a fraction of a millisecond. Next we add 10 live events to see how this impacts the performance. The event now takes 4.9 milliseconds to process in Firefox 3.5. To push it a bit we add more live events: 100 live events result in ca 45 milliseconds and 200 live events in ca 94.2 milliseconds. The performance impact seems to be approximately linear in the number of live events.

What about dom size? In the first tests the markup is very simple with a very small dom-tree. A bigger dom-tree might have some impact on the performance. To test this we simply add 500 divs to the dom and test again. This is what test no 5 does. Surprisingly, the performance impact of the live events does not get worse when the dom size increases.

What about different types of selectors? One might also suspect that the performance depends on the type of selector used. In test 3 a simple class selector was used. Test 6 instead uses an id selector and test 7 uses a more complex selector. We see that there is no big performance difference between a simple id selector and a simple class selector. However, with a more complex selector the performance impact seems to be worse.

Conclusion

The tests show that adding live events will have some performance impact. As long as the number of registered live events is small the impact is not significant. However if you have a complex site with many events you may run into trouble, especially if you need to support browsers with poor JavaScript performance.

What could be done to improve this?

In my view it would have been better if jQuery let you somehow define which element you want to register the delegating event on, so that the impact is limited to a specific context. The default could still be to add it to the document. This could be done in at least two different ways. One option could be to add the delegating event to the context given in the core jQuery(expression, context) function. I don't know enough about the internals of jQuery to say how feasible this is.

Update:

After a comment from Jonathan Sharp I added some more tests to see how a more nested dom-tree affects the performance. When adding 30 nested divs we can see that a click deep down the tree is affected more by the live events.

Here are the slides that I have prepared for my talk at JavaForum this evening. It took me some time to cut it down to 34 slides, and even then many of them had to become spare slides. I had to realize that 30 minutes is really not much time. On the other hand I’ve seen many 10 minute lightning talks where the speaker manages to deliver the essence of a pretty complicated topic, so I shouldn’t really complain.

I have put together some slides about the basic principles of REST. The real presentation is actually implemented as a JAX-RS application. I started out with a Django version, but I needed a Java angle on it so I switched to JAX-RS.

It’s been two days of interesting talks and discussions at Agila Sverige 2009. These are some of my reflections from the event:

Bring back the simplicity

There was a lot of talk about bringing back simplicity as one of the cornerstones in agile development. XP was all about communication, simplicity, feedback and courage. Somehow the principle of using the simplest solution has been lost when we standardize on agile methods. Many of the talks were about simplicity, for example Johan talked about “Standards vs. Cargo cult”, Niklas talked about “Simplicity as means, opportunity as goal” and Marcus asked “Where did the simplest working solution go?”. All very good presentations. A somewhat related presentation was about allowing variation to facilitate evolution (variation + selection = evolution). The advice was to bring in some variation in each project and select the working solutions. When I think of it, that is usually how I work as well.

Kanban and Scrum

Another hot topic was Scrum and Kanban. Kanban is of course the new and trendy thing to talk about. Henrik gave a good talk where he managed to describe the similarities and differences in just 10 minutes. There were also some good presentations about how to apply Kanban and how Kanban and Scrum can be combined.

Is Scrum the new RUP?

I don’t remember who it was, but someone asked the question whether Scrum is the new RUP. The Agile movement was a reaction to stiff processes being forced down from upper management, but today we see the same thing happening with Scrum. Anna Herting, for example, described how they used Kanban to solve their problem with prioritizing maintenance activities, but still had to call it Scrum to satisfy the managers. In my view it is always bad to just apply a process methodology without thinking about what the needs are. You also want to continuously improve the process and extend it with new ways of working. What you need is a set of principles that will take you in the right direction, and simple tools to choose from and use as inspiration.

People, Teams and how we behave

Many of the talks were about how to behave as responsible developers and how to create good team cultures. For example, Åke gave a talk about self-learning and asking good questions, asking them at the right time and asking just the right number of them. We also got good advice from Jocke: promise behavior instead of results. He explained that promises are tools to gain trust, synchronize with people and create motivation. Chris talked about the different stages of learning: how you go from unconsciously incompetent -> consciously incompetent -> consciously competent -> unconsciously competent. The unconsciously incompetent need inspiration while the unconsciously competent need reflection. There was also lots of talk about creating openness and trust in teams, for example by establishing common team values.

All in all I had a great time. I really enjoyed the open and interactive format with Lightning Talks and Open Space sessions.