I have about 20 layers that I'd like to put on the Google Maps API. I assume I'll need some form of tile cache server, perhaps sitting on top of MapServer or GeoServer. The layers amount to about 400 MB, so serving them as static KML/KMZ overlays is not an option. The data lives in two geodatabases, one ESRI and one PostGIS.

Can you provide insights on publishing large amounts of data to the Google Maps API, especially in a way that lets users run queries and load and unload layers at an acceptable speed? Are there any good reads that would improve my understanding of the issues and challenges?

You might take a look at the new Google Fusion Tables. I personally haven't worked with it much yet and don't know how well it handles the amount of data you have, but storing the data on Google's own servers might be a good thing. Here are a couple of links to some blogs and to Fusion Tables itself.
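In case it helps, here is a rough sketch of how a Fusion Tables layer is typically attached to a map with the Maps JavaScript API (v3). The table ID, map centre, and the `where` filter are made-up placeholders, not something taken from your data; the point is that the layer is rendered server-side and delivered as tiles, so the browser never has to pull down the raw 400 MB.

```typescript
// Hypothetical table ID; replace with the ID of a table you have uploaded.
const TABLE_ID = '1234567';

function initMap(): void {
  const map = new google.maps.Map(document.getElementById('map') as HTMLElement, {
    center: { lat: 45.0, lng: -93.0 }, // placeholder centre
    zoom: 8,
  });

  // The layer is drawn by Google from the uploaded table; only the query
  // definition lives in the client.
  const layer = new google.maps.FusionTablesLayer({
    query: {
      select: 'geometry',
      from: TABLE_ID,
      where: "type = 'parcel'", // optional attribute filter
    },
  });
  layer.setMap(map);
}
```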

It is often pointless to try to display such large amounts of data, because humans have a limited capacity to read information (see this example). Even if there were no network constraints on serving and displaying big datasets, we would still face this readability constraint.

If a dataset is too large to be served and displayed, it has to be simplified according to the visualization scale. One solution is to enrich your data into a multi-scale dataset using generalisation techniques and to use a 'scale-aware' web visualisation client, as in the sketch below.
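To make the 'scale-aware' part concrete, here is a minimal sketch of a client that swaps between a generalised and a full-detail tile overlay as the user zooms, assuming the two sets of tiles have already been cut and cached (e.g. by GeoWebCache in front of GeoServer/MapServer). The URL template, layer names, and zoom threshold are hypothetical placeholders.

```typescript
// Below DETAIL_ZOOM the map shows tiles cut from a generalised copy of the
// data; at or above it, the full-detail tiles. Names and URLs are placeholders
// for whatever your tile cache actually exposes.
const DETAIL_ZOOM = 12;

function makeTileLayer(layerName: string): google.maps.ImageMapType {
  return new google.maps.ImageMapType({
    getTileUrl: (coord, zoom) =>
      `https://example.org/tiles/${layerName}/${zoom}/${coord.x}/${coord.y}.png`,
    tileSize: new google.maps.Size(256, 256),
    opacity: 0.7,
    name: layerName,
  });
}

function attachScaleAwareLayer(map: google.maps.Map): void {
  const generalised = makeTileLayer('parcels_generalised');
  const detailed = makeTileLayer('parcels_full');

  const swap = () => {
    const wantDetail = (map.getZoom() ?? 0) >= DETAIL_ZOOM;
    map.overlayMapTypes.clear();
    map.overlayMapTypes.push(wantDetail ? detailed : generalised);
  };

  map.addListener('zoom_changed', swap);
  swap(); // set the initial layer for the starting zoom
}
```

The same idea extends to more than two levels of detail; the key design choice is that the simplification happens once, offline, during generalisation, and the client only decides which pre-built layer to show.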