I am building an app, and I tried YSlow and got Grade F on most of my practices. I have loads of JavaScript that I am working on reducing. I want to be able to cache some of these because the pages get called many times.

I have one master page, and I wanted to cache the scripts and CSS files.

How do I achieve this?
Are there any recommended best practices?
Are there any other performance improvements that I can make?

YSlow suggests the different performance improvements you could make. As to caching your scripts, make sure they are externalized into separate files and they will be automatically cached by client browsers. Also try to reduce their number if possible by grouping them into fewer files, in order to reduce network calls.
– Darin Dimitrov, Aug 20 '10 at 11:56

Not sure what you are asking here. YSlow recommendations are about the client side. ASP.NET caching is about the server side. Implementing the ASP.NET cache will not change anything in your YSlow rating.
– MainMa, Aug 20 '10 at 11:57

Note: Ignore what YSlow says about e-tags if you aren't on a farm. If you are on a farm take the more complicated approach suggested. The advice to turn off e-tags is worse than useless and hurts performance.
– Jon Hanna, Aug 20 '10 at 11:59

Thanks guys! Very useful comments! I have Grade F on "Add Expires headers". How do I work on this?
– Kenyana, Aug 20 '10 at 12:02

3 Answers

Have you re-read RFC 2616 yet this year? If not, do. Trying to build websites without a strong familiarity with HTTP is like trying to seduce someone when you're extremely drunk; just because lots of other people do it doesn't mean you'll have good performance.

If a resource can be safely reused within a given time period (e.g. safe for the next hour/day/month), say so. Use the max-age component of the Cache-Control header as well as Expires (max-age is better than Expires, but doing both costs nothing).
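In ASP.NET this is exposed through `Response.Cache`. A sketch of setting both headers for a dynamically generated page might look like the following; the one-hour window and the `Page_Load` placement are illustrative choices, not requirements:

```csharp
// Illustrative ASP.NET WebForms fragment (needs an ASP.NET page context).
protected void Page_Load(object sender, EventArgs e)
{
    // Allow shared (proxy) caches as well as the browser cache.
    Response.Cache.SetCacheability(HttpCacheability.Public);

    // max-age is what HTTP/1.1 clients honour...
    Response.Cache.SetMaxAge(TimeSpan.FromHours(1));

    // ...but Expires still helps HTTP/1.0 clients, and costs nothing extra.
    Response.Cache.SetExpires(DateTime.UtcNow.AddHours(1));
}
```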

If you know the time something last changed, say so in a Last-Modified header (see note below).

If you don't know when something last changed, but can add the ability to know, do so (e.g. timestamp database rows on UPDATE).

If you can keep a record of every time something changed, do so, and build an e-tag from it. While e-tags should not normally be based on times, an exception is when you know the resource can't change at a finer resolution (time to the nearest 0.5 second is fine if you can't have more than one change every 0.5 seconds, etc.).

If you receive a request with an If-Modified-Since header matching the last change time, or an If-None-Match header matching the e-tag, send a 304 Not Modified instead of the whole page.
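A minimal, framework-neutral sketch of that 304 decision might look like this (the class and method names are mine, not from the answer; plug the result into `Response.StatusCode` in whatever framework you use):

```csharp
using System;
using System.Globalization;

static class ConditionalGet
{
    // Returns true when a 304 Not Modified may be sent instead of the body.
    // ifNoneMatch and ifModifiedSince are raw request header values (or null).
    public static bool CanSend304(
        string ifNoneMatch, string etag,
        string ifModifiedSince, DateTime lastModifiedUtc)
    {
        // If-None-Match takes precedence when both validators are present.
        if (ifNoneMatch != null)
            return ifNoneMatch == etag || ifNoneMatch == "*";

        if (ifModifiedSince != null &&
            DateTime.TryParse(ifModifiedSince, CultureInfo.InvariantCulture,
                              DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal,
                              out var since))
        {
            // HTTP dates have one-second resolution; truncate before comparing.
            var lm = new DateTime(
                lastModifiedUtc.Ticks - lastModifiedUtc.Ticks % TimeSpan.TicksPerSecond,
                DateTimeKind.Utc);
            return lm <= since;
        }
        return false;
    }
}
```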

Use gzip or deflate compression (deflate is slightly better when the client says it can handle both), but do note that you must change the e-tag when you do. Sending the correct Vary header for this breaks caching in IE, so Vary on User-Agent instead (an imperfect solution for an imperfect world). If you roll your own compression in .NET, note that flushing the compression stream causes bugs; write a wrapper that swallows intermediate calls to Flush() and only flushes the output once, prior to the final flush on Close().
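A sketch of such a wrapper is below; the class name is mine, and a real response filter would need to delegate more members, but this shows the flush-suppressing idea:

```csharp
using System;
using System.IO;

// Write-only wrapper that ignores intermediate Flush() calls, so the
// compression stream it wraps only ever emits complete blocks.
class FlushBlockingStream : Stream
{
    private readonly Stream _inner;
    public FlushBlockingStream(Stream inner) { _inner = inner; }

    // The whole point: intermediate flushes are deliberately no-ops.
    public override void Flush() { }

    public override void Close()
    {
        _inner.Flush();   // the one real flush, just before closing
        _inner.Close();   // closing a GZip/Deflate stream writes its footer
        base.Close();
    }

    public override void Write(byte[] buffer, int offset, int count)
        => _inner.Write(buffer, offset, count);

    // Remaining members delegate or are unsupported for a write-only wrapper.
    public override bool CanRead => false;
    public override bool CanSeek => false;
    public override bool CanWrite => _inner.CanWrite;
    public override long Length => _inner.Length;
    public override long Position
    {
        get => _inner.Position;
        set => throw new NotSupportedException();
    }
    public override int Read(byte[] b, int o, int c) => throw new NotSupportedException();
    public override long Seek(long o, SeekOrigin s) => throw new NotSupportedException();
    public override void SetLength(long v) => throw new NotSupportedException();
}
```

In ASP.NET you would assign something like `new FlushBlockingStream(new GZipStream(Response.Filter, CompressionMode.Compress))` to `Response.Filter`.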

Don't defeat the caching done for you. Turning off e-tags on static files gives you a better YSlow rating and worse performance (except on web farms, where the more complicated solution recommended by YSlow should be used). Ignore what YSlow says about turning off e-tags (maybe they've fixed that bug now and don't say it any more) unless you are on a web farm where different server types can deal with the same request (e.g. IIS and Apache serving the same URI; Yahoo is, which is why this worked for them, but most people aren't).

Favour Cache-Control: public over private unless inappropriate.

Avoid doing anything that depends on sessions. If you can turn off sessions, so much the better.

Avoid sending large amounts of viewstate. If you can do something without viewstate, so much the better.

Go into IIS and look at the HTTP Headers section. Set appropriate values for static files. Note that this can be done on a per-site, per-directory and per-file basis.
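On IIS 7+ the same static-file settings can also be placed in web.config; the 30-day value below is illustrative, not a recommendation from the answer:

```xml
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Emit Cache-Control: max-age for all static files in this scope -->
      <clientCache cacheControlMode="UseMaxAge"
                   cacheControlMaxAge="30.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>
```

Because web.config files inherit per directory, this can likewise be scoped per site, per directory, or (with a `<location>` element) per file.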

If you have a truly massive file (.js, .css), then give it a version number and put that version in the URI used to access it (blah.js?version=1.1.2). Then you can set a really long expiry (1 year) and/or a hard-coded e-tag and not worry about cache staleness, since next time you will change the version number and, to the rest of the web, it is a new resource rather than an updated one.
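The versioning trick needs nothing more than a tiny helper when emitting the script tag; this one is hypothetical, not from the answer:

```csharp
using System;

static class AssetUrl
{
    // Bakes a version into the query string, so each release is a brand-new
    // URI as far as every browser and proxy cache is concerned.
    public static string Versioned(string path, string version)
        => path + "?version=" + Uri.EscapeDataString(version);
}
```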

Edit:

I said "see note below" and didn't add the note.

The last modified time of any resource is the most recent of:

1. Anything (script, code-behind) used to create the entity sent.
2. Anything used as part of it.
3. Anything that was used as part of it, but has now been deleted.

Of these, number 3 can be the trickiest to work out, since it has, after all, been deleted. One solution is to keep track of changes to the resource itself, updating its timestamp whenever anything used to create it is deleted; another is a "soft delete", where you keep the item but mark it as deleted and never use it in any other way. Just what the best way to track this is depends on the application.
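The "most recent of all parts" rule, with the soft-delete variant, can be sketched as follows; the `Part` type and property names are hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical component of a resource. A soft-deleted part keeps its
// DeletedAtUtc stamp, so the deletion itself still counts as a change.
class Part
{
    public DateTime ModifiedUtc;
    public DateTime? DeletedAtUtc;   // null = still present
}

static class ResourceDates
{
    // Last-Modified of the whole resource: the newest of the template/script
    // that builds it and every part, where deletion time trumps edit time.
    public static DateTime LastModified(DateTime templateModifiedUtc,
                                        IEnumerable<Part> parts)
    {
        return parts
            .Select(p => p.DeletedAtUtc ?? p.ModifiedUtc)
            .Concat(new[] { templateModifiedUtc })
            .Max();
    }
}
```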

You can also reference common JS and CSS libraries from trusted online hosts. For example, if you add jQuery as <script src="http://code.jquery.com/jquery-latest.js"></script>, the jQuery file has probably already been cached by the client's browser because another web site referenced that address before, even though it was not cached because of your web site.
This approach has pros and cons, but it is an option.
Also, I don't know whether YSlow's rating changes with this approach.

The pro is as you describe; the con is an extra DNS lookup if the client hasn't seen anything from that domain recently. The pro strongly outweighs the con if you are using several files from the same domain. I don't know if YSlow's response changes either, but who cares; tools should work for us, not us for tools.
– Jon Hanna, Aug 20 '10 at 12:20