I attended the first 30 minutes of Cal Henderson’s talk about Scalable Web Architectures. He’s among the very select club of technical speakers who manage to be funny, deeply knowledgeable, and able to speak in terms that non-geeks can actually understand. I worked for several years as a Performance Engineer, so I admire people who can talk about non-functional requirements and make them interesting. Performance and capacity talks tend to be some of the most boring in the IT industry, so that’s no small feat.

I left the session midway simply because I felt the content was not the right fit for me. In my current role as an emerging technologies consultant I don’t typically have to deal with capacity assessment and planning, so I decided to attend the SEO session instead, presented by Chris Smith and Neil Patel. I would have stayed to the end if I could have caught the SEO session at another time. Some conferences have two time slots for each session, and I like that model better: I often find myself having to choose between two topics I’m deeply interested in, and speakers are sometimes in a sunk cost situation anyway. The travel and the time commitment have already been made, so adding one more session has only a marginal cost.

Smith & Patel’s session was very good, but the slides are self-explanatory, so I don’t think I’d add much value by describing the content here. The overall message: companies large and small leave a lot of money on the table simply by not leveraging entirely ethical Search Engine Optimization techniques. The presentation can be used as a comprehensive checklist to assess how well your site maximizes its potential to be found.

My personal take from the SEO session is that the whole list of SEO to-dos seems very “automatable” to me, like those scripts for Java performance analysis that scan your code and highlight areas for improvement. You still need the human component for the high-level assessment, but by automating the more mechanical tasks you can hopefully focus on the ultimate driver of traffic: good, relevant content.
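To make the “automatable” point concrete, here is a rough sketch of such a tool: a scanner that parses a page’s HTML and flags common on-page issues. The specific checks and the 65-character title threshold are my own illustrative choices, not a definitive SEO rule set.

```python
# Sketch of an automated on-page SEO audit: parse HTML and flag
# mechanical issues so humans can focus on content quality.
# The checks and thresholds here are illustrative assumptions.
from html.parser import HTMLParser


class SEOAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.in_title = False
        self.has_meta_description = False
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.has_meta_description = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_data(self, data):
        if self.in_title:
            self.title += data

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def issues(self):
        problems = []
        if not self.title.strip():
            problems.append("missing <title>")
        elif len(self.title) > 65:  # arbitrary example threshold
            problems.append("<title> longer than ~65 chars")
        if not self.has_meta_description:
            problems.append("missing meta description")
        if self.h1_count != 1:
            problems.append("expected one <h1>, found %d" % self.h1_count)
        if self.images_missing_alt:
            problems.append("%d image(s) without alt text" % self.images_missing_alt)
        return problems


page = ("<html><head><title>Web 2.0 Expo notes</title></head>"
        "<body><h1>Notes</h1><img src='a.png'></body></html>")
audit = SEOAudit()
audit.feed(page)
print(audit.issues())  # flags the missing meta description and alt text
```

A real tool would crawl a whole site and check many more signals (broken links, duplicate titles, sitemaps), but the shape is the same: mechanical checks a script can do, freeing people for the content work.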

Well, it’s more like tape delay, actually. I’m attending Dion Hinchcliffe’s workshop “Building Successful Next Generation Web 2.0 Applications” at the Web 2.0 Expo in New York. The room is practically full, and I’ve always been a big fan of Hinchcliffe’s great diagrams and clear thinking. His background is as an Enterprise Architect, so he speaks a language I can understand. I’m really bad at paying attention and taking notes at the same time, so I’ll just write down the points that made an impression on me, or what I take to be the key messages of the session, not a summary of everything Dion said. As a final caveat, this is not necessarily what was said, but my imperfect and biased notes of what I think was said. The slides will probably be available on the conference website anyway. Here is the gist of it, in bullet points:

- Whoever has the best data wins: the most successful apps are fundamentally powered by data.

- Attracting people to your website is a very expensive proposition; it makes sense to go where people already are.

- RSS: not only for people to subscribe in their readers. It’s machine-readable, so it lets others add your info to their apps (he gave a mutual funds example, whose data was absent from many aggregation services just because it did not have a feed).

- Twitter had 10 times more users coming from its API than from the website. I’m surprised by how low that is, actually; I would have guessed something like 30 times or more. He mentioned later that 90% of Twitter traffic is via the API, and related it to unpredictable scaling and peaks.

- The days of the 3-tier app (presentation/app/backend) are long gone! Each of the 3 tiers is now broken into highly distributed components such as mashups, widgets, APIs, RSS, and storage.

- Providers of 3rd-party sourcing need to make their services more consumable and be good citizens for their partners.

- Amazon’s S3 costs 10 to 15 times less than building your own storage capability.

- The platform overtakes the website: he showed how the bandwidth consumed by Amazon Web Services passed the bandwidth consumed by Amazon’s global websites.

- TechCrunch reported this morning that Google’s Chrome browser already represents 8.12% of their hits. I just checked: Chrome is about to overtake Safari (8.84%) for TC visitors. Is the Googlezon Orwellian world happening already?

- The major issue holding back widespread adoption of mashups in business contexts is the lack of access to a user’s private data.

- A key Web 2.0 strategy: turning applications into platforms, openly exposing the features of your software and data to customers, end-users, partners, and suppliers for reuse and remixing.

- Google’s OpenSocial: maybe the future of building social networking applications.

- Apache now lets you run OpenSocial (and all Google Gadgets, for that matter) on any Apache server.

- He demoed Flash Earth as a mashup example. Mashups are also moving towards standards.
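The RSS point above is easy to illustrate: a feed is just structured XML, so any aggregator can pull your data in with a few lines of code, which is exactly what the feed-less mutual fund site was missing out on. The feed content below is made up for the example.

```python
# Sketch: why a feed makes your data machine-readable. Any aggregator
# can extract items from RSS with standard XML tooling; no feed means
# no easy way in. The fund data here is fabricated for illustration.
import xml.etree.ElementTree as ET

feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Fund Prices</title>
    <item>
      <title>Fund NAV 2008-09-17</title>
      <link>http://example.com/nav/2008-09-17</link>
    </item>
    <item>
      <title>Fund NAV 2008-09-16</title>
      <link>http://example.com/nav/2008-09-16</link>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(feed)
items = [(item.findtext("title"), item.findtext("link"))
         for item in root.iter("item")]
for title, link in items:
    print(title, "->", link)
```

That is the whole barrier to entry for an aggregation service: publish the feed and your data can travel; don’t, and it stays stuck on your site.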

Overall, I think it was a really good session; Dion’s message feels solid and authoritative. Some feedback for the organizers:

1. This was not really a workshop, just a regular lecture with Q&A at the end. I found the duration a bit too long, but I understand that having Dion speak is a privilege and the session was dense with content, so maybe there’s no really good solution for that.

2. Crowdvine is great. Speakers should ask attendees to provide feedback and rate the session immediately after it is over.