Details

Description

The book "High Performance Web Sites" by Steve Souders identifies a number of issues which degrade web page rendering.

#1 JavaScript imports listed above CSS imports will delay the download of the CSS imports until the JS imports have completed.

#2 JavaScript blocks (<script></script>) in the header of the page will delay the progressive rendering of the HTML until the entire page has been downloaded.

The Click PageImports class could be modified to support $headerImports, which contains an ordered list of import statements with JS imports following CSS imports. The class could also provide a $footerImports which contains JS blocks to be included at the bottom of the page.

To support backward compatibility, $imports should still be maintained, but should list CSS imports above JS imports, with JS script blocks last.
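As a rough sketch of how such a split could look (the class and method names below are hypothetical, not Click's actual API), a header/footer-aware imports holder might be:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the proposed split: CSS imports render before
// external JS imports in the header, and inline JS blocks are deferred
// to the footer. Not Click's real PageImports API.
class OrderedPageImports {

    private final List<String> cssImports = new ArrayList<String>();
    private final List<String> jsImports = new ArrayList<String>();
    private final List<String> jsBlocks = new ArrayList<String>();

    void addCssImport(String link)  { cssImports.add(link); }
    void addJsImport(String script) { jsImports.add(script); }
    void addJsBlock(String block)   { jsBlocks.add(block); }

    /** $headerImports: CSS first, then external JS. */
    String getHeaderImports() {
        StringBuilder buffer = new StringBuilder();
        for (String css : cssImports) { buffer.append(css).append('\n'); }
        for (String js : jsImports)   { buffer.append(js).append('\n'); }
        return buffer.toString();
    }

    /** $footerImports: inline JS blocks rendered at the bottom of the page. */
    String getFooterImports() {
        StringBuilder buffer = new StringBuilder();
        for (String block : jsBlocks) { buffer.append(block).append('\n'); }
        return buffer.toString();
    }
}
```

With this shape, a page template would place $headerImports in <head> and $footerImports just before </body>.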

Bob Schellink
added a comment - 10/Jan/08 08:50 Hi Malcolm
Perhaps you know this already but Yahoo has a performance plugin called YSlow.
http://developer.yahoo.com/yslow/
http://developer.yahoo.com/performance/rules.html
It lists about 15 items that can improve performance. It also recommends CSS at the top and JavaScript at the bottom. But most JavaScript gurus follow the "Unobtrusive JavaScript" pattern, meaning no JavaScript should appear anywhere in the <body> tag.
I think #1 and #10 will have a good impact on performance. A build script can solve #10, and a Filter can solve #1 by aggregating external files and caching the result.
Note I do not mean Click should solve these; they are just good general practices. Perhaps I can add a FAQ on this topic.
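The Filter idea for #1 boils down to concatenating several resources once and serving the cached result thereafter. A minimal, framework-free sketch of that aggregation step (all names hypothetical):

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of the aggregate-and-cache idea behind the proposed
// Filter: combine several resources into one body and cache the result so
// subsequent requests skip the work. Names are hypothetical, not Click's.
class ResourceAggregator {

    private final Map<String, String> cache = new ConcurrentHashMap<String, String>();

    /** Returns the combined content for the given resources, caching by key. */
    String aggregate(String cacheKey, List<String> resourceContents) {
        String combined = cache.get(cacheKey);
        if (combined == null) {
            StringBuilder buffer = new StringBuilder();
            for (String content : resourceContents) {
                buffer.append(content).append('\n');
            }
            combined = buffer.toString();
            cache.put(cacheKey, combined);  // later hits are served from memory
        }
        return combined;
    }
}
```

A real Filter would additionally set Content-Type and expiry headers on the cached response.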

Malcolm Edgar
added a comment - 14/Jan/08 17:47 Have checked in changes and added a PerformanceFilter to address #1, #3, #4. Will open a new JIRA item for the PerformanceFilter.
There is an issue with the JS Chart controls which are currently rendered in the page: they do not work now that the JS includes are at the bottom of the page. The JS include needs to be updated to write to the contents of an HTML div element.

Bob Schellink
added a comment - 15/Jan/08 01:28 Very nice!
Looking at the code I am not sure how #1 is solved. Unless you mean that by setting expiry headers, there will be less traffic.
Also I think "CSS Sprites" and "Image Maps" can only be solved by the user.
For "Combined files" however the following could potentially work:
say PageImports scans the imports of all controls and comes up with the following:
<link type="text/css" rel="stylesheet" href="/click-examples/click/table.css"/>
<link type="text/css" rel="stylesheet" href="/click-examples/click/control.css"/>
<script type="text/javascript" src="/click-examples/click/control.js"></script>
<script type="text/javascript" src="/click-examples/click/NumberField.js"></script>
The browser will have to hit the server 5 times. Once for the .htm and 4 times for the external files.
If we combine the external files into 2 separate ones (one for all CSS, one for all JS), the browser only needs to hit the server 3 times. So even if we had 10 external files, they would still be combined into 2 files. The header section would be rewritten to this:
<link type="text/css" rel="stylesheet" href="/temp/example-page-all.css"/>
<script type="text/javascript" src="/temp/example-page-all.js"></script>
The aggregated files must be cached on disk or in memory.
I can see some problems with the above in that not all applications could work like this. Some pages might conditionally include controls, thus external files might differ between requests, and the cache will have to be rebuilt, hampering performance again.
So how about we create a method createPageImports in ClickServlet to inject a custom PageImports class and we can experiment with something like this in our own projects? There are already a couple of create* methods which makes it possible to extend Click quite a bit.
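The create* extension-point idea can be illustrated with a self-contained stand-in; the class names echo ClickServlet and PageImports, but the bodies below are stubs, not Click's real API:

```java
// Hypothetical, self-contained sketch of the proposed factory method: a
// servlet exposes createPageImports() which a subclass overrides to inject
// its own PageImports variant, e.g. one serving aggregated files.
class PageImportsSketch {
    String render() {
        return "<link type=\"text/css\" rel=\"stylesheet\" href=\"control.css\"/>";
    }
}

class ClickServletSketch {
    // Factory method: default implementation; subclasses override it
    protected PageImportsSketch createPageImports() {
        return new PageImportsSketch();
    }

    String renderHead() {
        return createPageImports().render();
    }
}

// A subclass swaps in an aggregating variant without touching the servlet
class AggregatingServletSketch extends ClickServletSketch {
    @Override
    protected PageImportsSketch createPageImports() {
        return new PageImportsSketch() {
            @Override
            String render() {
                return "<link type=\"text/css\" rel=\"stylesheet\" href=\"example-page-all.css\"/>";
            }
        };
    }
}
```

This mirrors how the existing create* methods in ClickServlet already allow extension.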

Bob Schellink
added a comment - 15/Jan/08 18:23 It seems an Ant build script can nicely solve both #1 and #10:
http://www.julienlecomte.net/blog/2007/09/16/
The above article is by the author of YUI Compressor. YUI Compressor can be used to strip comments and minify .js and .css files.
After the files are stripped and minified, one gets better GZIP compression.
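The GZIP point can be demonstrated with the JDK alone: stripping comments leaves fewer bytes to compress, so the gzipped payload shrinks as well. The sample strings below are made up for the demo:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPOutputStream;

// Demo: gzipping a comment-stripped source yields a smaller payload than
// gzipping the commented original. The "source" strings are fabricated.
class GzipSizeDemo {

    /** Gzips the text in memory and returns the compressed size in bytes. */
    static int gzippedSize(String text) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        GZIPOutputStream gzip = new GZIPOutputStream(bytes);
        gzip.write(text.getBytes("UTF-8"));
        gzip.close();  // flushes the GZIP trailer
        return bytes.size();
    }

    /** A made-up source file with many distinct comment lines. */
    static String commented() {
        StringBuilder buffer = new StringBuilder();
        for (int i = 0; i < 50; i++) {
            buffer.append("/* explanatory comment number ").append(i * 31).append(" */\n");
        }
        buffer.append(stripped());
        return buffer.toString();
    }

    /** The same file with comments stripped. */
    static String stripped() {
        return "function addLoadEvent(fn) { window.onload = fn; }\n";
    }
}
```

The actual stripping would be done at build time by a tool such as YUI Compressor; this only shows why it still pays off even when GZIP is enabled.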

Malcolm Edgar
added a comment - 17/Jan/08 01:24 Hi Bob,
thanks for the feedback and ideas. With regard to item #1, yes I was thinking that setting the Expiry headers to cache static content will effectively address this item on subsequent requests. I appreciate that this is not the case with the home page, where the initial page load will have to pull down the various static resources.
I have been thinking about this stuff over the last week or so, and how we best balance the design requirements:
1 Simple development model, for Click framework developers
2 Backward compatibility
3 Improved performance
4 Click Framework code base complexity
We need to get the right balance with this stuff, and I think the priority order above is about right.
One issue we have with the Click Extras controls is that they can include their own JS and CSS files, so you can end up with a lot of imports and requests. However with the new caching of these resources, this problem will largely go away. Most home pages would only include a table and a login form, so I don't think there would be so many includes that people should worry too much.
<link rel="stylesheet" type="text/css" href="/click-examples/style.css" title="Style"></link>
<link type="text/css" rel="stylesheet" href="/click-examples/click/menu.css"></link>
<script type="text/javascript" src="/click-examples/click/control.js"></script>
<script type="text/javascript" src="/click-examples/click/menu.js"></script>
<script type="text/javascript">addLoadEvent( function()
{ initMenu() }
);</script>
In this use case, if I was writing a really really fast home page, I would probably now do the imports by hand, and not rely on Click, but that would be the exception rather than the rule.
Adding the capability to condense all these imports into a couple of files would be pretty cool, but it would make the framework more difficult to configure for developers, as we would need to introduce new configuration options for click.xml and the PerformanceFilter. And explaining how this all works is a bunch of documentation. So I am pretty hesitant to implement this.
With regard to minifying the JS files, doing so would make debugging harder, and I think GZIP compressing the files addresses this comms issue anyway.
However, all that said I would be happy to add a method to the ClickServlet for people to override, so this concept could be implemented.
Your CSS and JS Block idea is very interesting, I have been thinking around this issue for a little while now, and was thinking about adding a getHtmlImports() method to the Page class, so it can work with the PageImports.
public class AjaxPage extends BorderPage {
..
    public static final String HTML_IMPORTS =
        "<script type='text/javascript' src='{0}/click-examples/ajax/prototype.js'></script>\n"
        + "<script type='text/javascript' src='{0}/click-examples/ajax/rico.js'></script>\n"
        + "<script type='text/javascript'>addLoadEvent(registerAjax)</script>";
..
    public String getHtmlImports() {
        return ClickUtils.createHtmlImport(HTML_IMPORTS, getContext());
    }
}
However this wouldn't really solve the issue of developing complex pages. For example, in the Click AJAX examples:
<script type='text/javascript'>
function registerAjax()
{
ajaxEngine.registerRequest('getCustomerInfo', '$context/ajax/ajax-customer.htm');
ajaxEngine.registerAjaxElement('customerDetails');
}
function onCustomerChange(select)
{
ajaxEngine.sendRequest('getCustomerInfo', 'customerId=' + select.value);
}
</script>
I think I should move the rest of this discussion onto your other issue.
regards Malcolm Edgar

Bob Schellink
added a comment - 17/Jan/08 05:36 Hi Malcolm,
I agree that solving #1 is pretty tricky. Also I can imagine that requirements would differ from page to page. Some pages can be pretty static, e.g. a home page, so aggregating all imports could work. Other pages could be more dynamic, for example hiding/showing different controls depending on certain state. In such cases the imported files would differ on each request.
So I was thinking about adding a performance FAQ and pointing to some tools or info on performance tuning. The above blog could be mentioned there. What I like about the blog entry is that it shows how to tune performance as part of the build process, making it orthogonal to the application implementation.
Lastly having getHtmlImports on Page makes sense, since it maps logically to the html page. I think that even if more methods are added to Page, the API would still feel natural.

Malcolm Edgar
added a comment - 17/Jan/08 18:02 Hi Bob,
I have been going back and forth about #1, but the designs I think up always end up being complex and not "largom".
I like your idea of adding a "Performance" topic; we could add this as a new doco page or under "Best Practices". I would probably add it to "Best Practices" down the bottom, and if it gets really big, we could break it out as another page.
I am wondering if we should roll up the table.css file into control.css, and introduce an extras-controls.js file which rolls up the simple extras JS files:
CreditCardField.js
EmailField.js
menu.js
NumberField.js
RegexField.js
On the getHtmlImports(), yes I think it would be good to add this to the Page as well. I will play around with this in Click Examples to see how it works out with the Ajax examples.
regards Malcolm

Bob Schellink
added a comment - 17/Jan/08 18:44 You probably noticed, but the CheckList and FormTable examples do not work currently. (As well as the charts)
1) CheckList
The problem with CheckList is that scriptaculous.js contains code that matches the string 'scriptaculous.js' and then downloads the other libraries. In fact that is really all scriptaculous.js does: it is a wrapper to download the other libraries.
By renaming scriptaculous.js, it fails to download its libraries and CheckList fails. One workaround is to not rename scriptaculous.js. Another is to not use scriptaculous.js at all, but rather specify the libraries explicitly:
buffer.append("<script type=\"text/javascript\" src=\"");
buffer.append(path);
buffer.append("/click/prototype/effects_");
buffer.append(getClickVersion());
buffer.append(".js\"></script>\n");
There are five libraries in total: 'builder.js', 'effects.js', 'dragdrop.js', 'control.js' and 'slider.js'
2) FormTable
DateField does not render a correct <script> tag for each field; the field name is null. In my browser the <script> tag is rendered as -> <script ....>Calendar.setup({ inputField : 'table_form_null'...</script>
Not yet sure what is wrong with the Date
regards
bob

Malcolm Edgar
added a comment - 18/Jan/08 01:29 Thanks Bob for the analysis. Have done a bunch of changes to click-examples, so please keep an eye out for anything broken.
1) - I think your option of including the js files specifically is the correct approach.
2) - Interesting, I think the form table will have to iterate over its fields using a visitor pattern in getHtmlImports()
regards Malcolm Edgar
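The visitor-style iteration Malcolm describes for #2 could look roughly like this; FieldStub and FormTableStub are minimal stand-ins for Click's Field and FormTable, not the real classes:

```java
import java.util.List;

// Sketch: the table walks its fields inside getHtmlImports(), assigns each
// field its row-qualified name first, then collects that field's imports.
// This avoids the 'table_form_null' symptom seen when names are unset.
class FieldStub {
    String name;

    String getHtmlImports() {
        return "<script>Calendar.setup({ inputField : '" + name + "' });</script>\n";
    }
}

class FormTableStub {
    private final List<FieldStub> fields;

    FormTableStub(List<FieldStub> fields) {
        this.fields = fields;
    }

    String getHtmlImports() {
        StringBuilder buffer = new StringBuilder();
        int row = 0;
        for (FieldStub field : fields) {
            // Set the name before rendering so it is never null
            field.name = "table_form_" + row++;
            buffer.append(field.getHtmlImports());
        }
        return buffer.toString();
    }
}
```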

Bob Schellink
added a comment - 18/Jan/08 07:17 Ah you are right about the getHtmlImports. By iterating over all rows in getHtmlImports one can set the field name and add the Field.getHtmlImports to the current buffer.
I wonder if the PageImports performance will suffer from all the imports. Currently it uses a List and has to iterate through it each time a new item is added. Perhaps a LinkedHashSet would be faster. It also keeps its order like a List.
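The attraction of LinkedHashSet can be shown in a few lines: duplicate imports are rejected by hashing rather than a linear contains() scan, while insertion order (and thus CSS-before-JS ordering) is preserved:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

// Demo of LinkedHashSet for import de-duplication: first occurrence wins,
// iteration order matches insertion order, lookups are hash-based.
class ImportSetDemo {
    static List<String> dedupe(List<String> imports) {
        return new ArrayList<String>(new LinkedHashSet<String>(imports));
    }
}
```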

Malcolm Edgar
added a comment - 18/Jan/08 20:09 Have checked in the CheckList fix suggested. Haven't tested it yet though.
Regarding using a LinkedHashSet, you could give it a try in a Unit test, I would be surprised if there was much in it, but every little bit helps.

Bob Schellink
added a comment - 19/Jan/08 12:03 Attached is a test for ListVsSet.
In short the performance varies depending on the length of the string and number of iterations.
The Set outperforms the List when the strings are less than ~10000 chars. As the strings get longer the Set becomes slower, and the degradation is much worse than linear. I assume it's because hashing larger strings is expensive.
The List becomes slower as its size increases, obviously, since it must do more comparisons.
So it looks like the Set can provide a little boost if the strings are < 10000 chars and the iterations are large. In Click's case I think an import would at most be ~500 chars. A cssBlock or jsBlock could be much larger though...
But the number of imports would seldom be > 10, except for tables, e.g. FormTable, where the number of imports depends on the number of rows. But this is the extreme case rather than the norm.
So Malcolm is probably right and the set wouldn't provide any advantage at this stage.
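A stripped-down version of the kind of List-vs-Set timing described above might look like this; absolute numbers depend on the JVM, so it only illustrates the shape of the measurement, not Bob's attached test:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;

// Rough micro-benchmark sketch: time contains()+add over n strings for a
// List versus plain add for a LinkedHashSet. Warm-up and repetition are
// omitted for brevity, so results are indicative only.
class ListVsSetSketch {

    static long timeList(int n, String payload) {
        ArrayList<String> list = new ArrayList<String>();
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            String item = payload + i;
            if (!list.contains(item)) {  // O(n) scan per add
                list.add(item);
            }
        }
        return System.nanoTime() - start;
    }

    static long timeSet(int n, String payload) {
        LinkedHashSet<String> set = new LinkedHashSet<String>();
        long start = System.nanoTime();
        for (int i = 0; i < n; i++) {
            set.add(payload + i);  // hash-based, order preserving
        }
        return System.nanoTime() - start;
    }
}
```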

Malcolm Edgar
added a comment - 31/Jan/08 07:16 Updated JSBarChart, JSLineChart and JSPieChart to support JavaScript includes at the bottom of the page. Introduced an abstract JSChart control which these controls extend.
Checked in and will be available in release 1.4RC3