First, removing as much synchronization as possible while staying
compatible is certainly a good thing. Synchronizing is _really_ costly.

Have you heard of "thin locks" and other VM-level optimizations that
significantly reduce the cost of locks? Do not estimate the cost of locks
from their implementation in Sun's old JDK, or from any implementation that
lacks these state-of-the-art techniques.

Yes, it is a good thing not to "over-synchronize" classes. But you have to
make sure that your optimization is "sound".

As for Hashtable's size(): is there any point where it matters?

Yes. The "synchronized" keyword will force the current processor to update
its data cache. Without it, your processor could read an obsolete value from
its local cache, even though all other methods are synchronized. E.g.:

Time 1, Processor 1:
    reads "var1", which is also stored in the local "read" cache.

Time 2, Processor 2:
    uses locks to read var1. This forces all "write" caches on all
    processors to "flush" to memory, and the local "read" cache to be
    updated.

Time 3, Processor 2:
    uses locks to update var1. This forces the local "write" cache to
    "flush" to memory.
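The size() case can be sketched in a few lines of Java. The class and
method names here are hypothetical, not Classpath's actual code; the point
is only that the reader must take the same lock as the writers to be
guaranteed a fresh value:

```java
// Hypothetical sketch of the size() discussion above: writers and the
// reader all synchronize on the same object (the instance's monitor).
class SketchTable {
    private int count = 0;

    // Writer: synchronized, so the updated count is flushed to main
    // memory when the lock is released.
    public synchronized void put() {
        count++;
    }

    // Reader: if "synchronized" were dropped here, the Java memory model
    // would allow a thread to keep returning a stale cached value of
    // count. Keeping it synchronized guarantees a fresh read.
    public synchronized int size() {
        return count;
    }
}
```

Acquiring the same lock is what establishes the flush/refresh ordering
described in the timeline above.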

Rule of thumb => if you are unsure, use synchronization and let the system
optimize the execution.

Systems like "HotSpot/Jalapeno" are getting to the point where they are able
to identify things like "single-threaded programs" and thus eliminate all
lock operations when possible.

There are some C programmers who still use pointer arithmetic to "optimize"
their programs; what they do not realize is that "gcc -O2" generates better
code when programs are written using array notation than when pointers
navigate through arrays. => Some optimizations are better left for the
system to do.

Now that Jalapeno has been "open-sourced", be patient a little, and you'll
see Free Java implementations catch up with these state-of-the-art
techniques. I personally think it would be a better time investment to write
a "bug-free" and "complete" Classpath than to optimize it already (other
than obvious optimizations like O(n) -> O(log n)).