JavaScript minification — is it worth it?

I've been secretly working on UglifyJS v2. I'll release some nice code, very soonish.
Here's just a thought I've been pondering.

A quick test against my 650K DynarchLIB yields an
interesting observation. It seems that the lion's share of the compression
comes from name mangling (i.e. converting local names to single characters)
and whitespace removal. Everything else is, well, quite insignificant.

“Compression”, meaning those complicated AST transformations that replace
if/else with the ternary operator, or remove block brackets,
saves about 5.5K on my file. That's not bad, but after
gzip the net savings drop to 500 bytes.
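To make the kind of transformation concrete, here is a hand-written sketch of what the squeeze phase does; the function and its names are illustrative, not taken from DynarchLIB or from UglifyJS's actual output:

```javascript
// Before squeezing: explicit if/else with block brackets.
function clampVerbose(x, min, max) {
    if (x < min) {
        return min;
    } else if (x > max) {
        return max;
    } else {
        return x;
    }
}

// After squeezing: the same logic as nested ternaries, with block
// brackets gone. A handful of bytes saved, most of which gzip's
// redundancy elimination would have found anyway.
function clampSqueezed(x,m,M){return x<m?m:x>M?M:x}

// Both behave identically:
console.log(clampVerbose(5, 0, 3), clampSqueezed(5, 0, 3)); // 3 3
```

Whitespace removal and name mangling apply everywhere in a file, which is why they dominate; transformations like this one only fire where the matching pattern occurs.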

For those 5.5K (a net 500 bytes after gzip), UglifyJS spends two long seconds
on squeezing the code alone (all the other steps combined, like
parsing, mangling, lifting variables and generating code, take far less than
that).

Is it worth it? I'm not so sure... In any case, I do plan to make v2
compression as good as v1's, but I couldn't help noticing this:
if I had 100,000 unique visitors each day (quite an astronomical
figure for me), and if that script were served to each of them (no caching
involved), then 500 fewer bytes in the file would save me about 1.5G/month,
which is about 0.015% of my monthly bandwidth (which I pay for anyway). I'm
not sure it's worth the trouble.
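The arithmetic behind that figure is simple enough to check; all the inputs below are the assumptions stated above (100,000 uncached visitors per day, 500 bytes saved per request, a 30-day month):

```javascript
// Back-of-the-envelope check of the bandwidth saved by squeezing.
const bytesSavedPerRequest = 500;      // net saving after gzip
const visitorsPerDay = 100000;         // assumed, no caching
const daysPerMonth = 30;

const savedPerMonth =
    bytesSavedPerRequest * visitorsPerDay * daysPerMonth;

console.log(savedPerMonth / 1e9); // prints 1.5 (i.e. ~1.5 GB/month)
```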

Update: rather than “is it worth it?”, perhaps the question
should be “how could I make it better?” Every byte matters, after
all, but spending 2 seconds for a net saving of 500 bytes,
which is 0.68% of the gzipped file size, suggests we're doing something wrong...

I agree, it makes sense to put the savings into perspective, e.g. the amount of bandwidth saved. Perhaps some of the more time-consuming but low-yield compressions could be switched on/off with a flag, e.g. --fast or --slow.
It could be either opt-in (skipping the slow compressions by default) or opt-out, such that all compressions are applied by default, while those who care about speed use a --fast flag to disable some.
For our app we used to apply maximum compression, but we have fallen back to preserving line breaks. Sure, the output is slightly larger, but after gzip the difference isn't that big, and being able to debug code on the production servers (e.g. set breakpoints) is worth the somewhat larger files.

I've actually reconsidered everything I said here. See, I was thinking from the position of having almost unlimited bandwidth.
Everything changed in the past few days because I moved to a new place; it's a new neighborhood and no ISP has wired the area yet, so my only option is a mobile connection. My prepaid plan includes 1.5G of traffic; every byte beyond that costs extra. For a guy like me, 1.5G is a painful limit.
Now I realize that those 500 bytes, if saved on every website I visit, would truly make a difference!
UglifyJS v2 supports source maps, which makes debugging minified production code much more straightforward.

Sure, some compression algorithms are complex and slow to run, typically producing only small savings. However, this depends hugely on the code UglifyJS is run on.
Languages that compile to JavaScript tend to dump their whole standard library into the compiled file. Similarly, a JS build workflow may concatenate libraries with the application code. In both cases, dead-code elimination is invaluable.
I've also worked on JS code with a large amount of data in constants (for example, a program containing long lists of colour names with their hex values). In such cases, some of UglifyJS's syntactic tricks produce significant savings.
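A tiny illustrative sketch of why dead-code elimination pays off in the concatenated-bundle case (the names and data here are invented for the example, not from any real codebase):

```javascript
// A bundled "library" of colour helpers, of which the application
// only ever uses one function.
const COLOUR_HEX = { red: "#ff0000", green: "#00ff00", blue: "#0000ff" };

// The only function the application actually calls.
function hexFor(name) { return COLOUR_HEX[name]; }

// Never referenced anywhere: a minifier that tracks usage can drop
// this function entirely, a far bigger win than any ternary rewrite.
function rgbFor(name) {
    const h = COLOUR_HEX[name];
    return [parseInt(h.slice(1, 3), 16),
            parseInt(h.slice(3, 5), 16),
            parseInt(h.slice(5, 7), 16)];
}

console.log(hexFor("green")); // prints #00ff00
```

Whole unused functions and data tables are exactly the redundancy that survives gzip, which is why elimination matters more here than syntactic squeezing.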
So, for many styles of writing code, a sophisticated minifier makes a huge difference. Thanks for UglifyJS{,2} :)