Yesterday, Google’s Webmaster Blog gave some great advice for Web 2.0 application developers in their post titled “A Spider’s View of Web 2.0”.

In that post, they recommend providing alternative navigation options on Ajaxified sites, both so that the Googlebot spider can index your site’s pages and for the benefit of users who have certain dynamic functions disabled in their browsers. They also recommend designing sites with “Progressive Enhancement” — building a site in layers, beginning with the basics. Start out with simple HTML linking navigation and then add Javascript/Java/Flash/AJAX structures on top of that simple HTML foundation.

Before the Google Webmaster team posted those recommendations, I’d published a little article earlier this week on Search Engine Land on the subject of how Web 2.0 and Map Mashup developers neglect SEO basics. A month back, my colleague Stephan Spencer also wrote an article on how Web 2.0 is often search-engine-unfriendly and how using Progressive Enhancement can help make Web 2.0 content findable in search engines like Google, Yahoo!, and Microsoft Live Search.

We’re not just recycling each other’s work in all this — we’ve each independently concluded that problematic Web 2.0 site design can limit a site’s traffic. If your pages don’t get indexed by the search engines, there’s a far lower chance of users finding your site. With just a mild amount of additional care and work, Web 2.0 developers can optimize their applications, and the benefits are clear. Wouldn’t you like to make a little extra money every month on ad revenue? Better yet, how about if an investment firm or a Google or Yahoo were to offer you millions for your cool mashup concept?!?

But, don’t just listen to all the experts at Netconcepts — Google’s confirming what we’ve been preaching for some time now.

It is just too easy to jump right to the mashup stage with today’s tools. In my mind, that is the problem.

I am not trying to claim perfection, but my process is always:

1. Take the design and carve out the main elements (e.g. we’ll need lists, vertical navigation, headings, subheadings, and paragraphs)

2. Build the page without CSS or JS, using just good semantic markup (H1, et al). At this point you’ve catered to the lowest common denominator (the bot). The important part here: links. Each link should point to something, not merely activate a script. You can use JavaScript later to change the behavior of the link. But for the bot (or the person without JS) you’ve now made sure your content is 100% available.

3. Start styling the page with CSS. You’ll probably need to add SPANs and DIVs to realize the design, but that’s fine, as it doesn’t materially affect the look of the unstyled page. Don’t hide objects that you’ll want a bit of JavaScript to later show. If the user doesn’t have JS activated, they may never see it!

4. Use window.onload to load your JavaScript. Walk the DOM, add “onclick” events and “return false;” to any links you want to activate a script (instead of linking to another page). Hide/animate/etc.

5. Profit (?)
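The enhancement step (step 4) can be sketched in a few lines of JavaScript. This is a minimal, hypothetical illustration — the link hrefs and the `handler` callback are made up, and the “DOM” is simulated with plain objects so the pattern reads clearly outside a browser. In a real page you’d run the same loop over `document.links` inside `window.onload`.

```javascript
// Hypothetical links as they exist in the plain-HTML version (step 2):
// each one points at a real page, so bots and no-JS users can follow it.
const links = [
  { href: '/photos', onclick: null },
  { href: '/map',    onclick: null },
];

// Step 4 (sketch): walk the "DOM" and layer script behavior on top of
// the working links. Returning false cancels the default navigation.
function enhance(links, handler) {
  for (const link of links) {
    link.onclick = function () {
      handler(this.href); // run the fancy Ajax/animation behavior...
      return false;       // ...and suppress the normal page load
    };
  }
}

const handled = [];
enhance(links, (href) => handled.push(href));

// Simulate a click on the first link: the script handles it, and the
// false return value tells the browser not to follow the href.
const followed = links[0].onclick.call(links[0]);
console.log(handled);
console.log(followed);
```

The point of the pattern: with JS off (or for a bot), the `onclick` never exists and the href works as a plain link; with JS on, the same link drives the script instead.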

It is definitely not as easy (or fun) as just tossing together a bunch of scripts and creating something neat, but it does guarantee that as you peel back each layer of web tech (behavior, style) you still have a usable site.