MVC is a great pattern as long as you don't take it too far. The concept is sound: separating the controller logic (queries, etc.) from the presentation will make things *infinitely* easier in the long run.

It especially comes in handy when developing alternate versions of sites (think mobile). All your logic is already in place; you just swap in a different view (presentation layer) and you're good to go. Can you imagine creating a mobile version if all the app logic and views are in the same file? Nightmare.

What I don't understand is when people insist on NO PHP code being in the presentation. They would rather have it full of things like {row.value} instead of <?php echo $row['value']; ?>. Why bother? PHP is perfectly fine for templating, so why increase the processing load by running every template through a separate templating language? Unless you're building the presentation layer to work across multiple backends (ASP, JSP, etc.), I see this as a total waste of time.
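For what it's worth, here's the kind of thing I mean — a view that's just a plain PHP file, pulled in by a little render() helper. The helper name is made up for the example, and the template is written to a temp file here only so the snippet is self-contained; in a real app it would live in something like views/row.php.

```php
<?php
// render() is a hypothetical helper: pull the data array into local
// variables, include the template, and return the output instead of
// printing it directly.
function render($template, array $data) {
    extract($data);
    ob_start();
    include $template;
    return ob_get_clean();
}

// Self-contained demo: write the "view" to a temp file.
$template = tempnam(sys_get_temp_dir(), 'tpl');
file_put_contents($template,
    '<td><?php echo htmlspecialchars($row[\'value\']); ?></td>');

$row = array('value' => 'Hello & welcome');
echo render($template, array('row' => $row)), "\n";
unlink($template);
```

No extra parsing pass, no new syntax to learn — the "template engine" is PHP itself, and htmlspecialchars() handles the escaping.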

A lot of people, myself included, stayed at the Watertown Hotel last year. It's walking distance from the seminar and it's very nice (and affordable). I'll be making reservations there again for this one. June 18, 2008

I third the VPS option. It's still relatively cheap — I pay about $40/month through Linode — but you're actually getting what you're promised.

Pretty much every hosting plan that offers hundreds of gigs of bandwidth for $5-$10/month these days is promising more than it delivers. The fine print: if you go over the unpublished CPU, memory, or database limits, you're done. That can be a tough lesson for a startup to learn, as it can mean days of downtime.

Shared plans aren't meant for the high-traffic, database-intensive sites people are building today; if you're trying to run one on shared hosting, caching had better be your best friend.
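Even something as crude as a file-based cache goes a long way on shared hosting. A minimal sketch — the function names, key, and five-minute TTL are all made up for the example:

```php
<?php
// Return the cached value for $key if the cache file is younger than
// $ttl seconds, otherwise false.
function cache_get($key, $ttl, $dir = '/tmp') {
    $file = $dir . '/cache_' . md5($key);
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file));
    }
    return false;
}

// Store $value under $key (serialized to a file).
function cache_set($key, $value, $dir = '/tmp') {
    file_put_contents($dir . '/cache_' . md5($key), serialize($value));
}

// Usage: only hit the database when the cache misses.
$rows = cache_get('front_page_rows', 300); // 5-minute TTL
if ($rows === false) {
    $rows = array('expensive', 'query', 'result'); // stand-in for the real query
    cache_set('front_page_rows', $rows);
}
```

That one if-block can turn hundreds of identical queries per minute into one, which is usually the difference between staying under those unpublished limits and getting your account suspended.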

I track outgoing clicks on a couple of the sites I manage at work and was getting seriously skewed results from spiders/bots following the links on the site. I tried filtering by user agent, but found that a lot of bots spoof it. The only successful way I've found of filtering them out is to check for empty($_SERVER['HTTP_REFERER']) in PHP. I haven't had one crawler trigger the script since I started that.

I'm not sure how you'd apply that to scrapers, since you don't want to filter out users who type in the URL or bookmark it, but I thought I'd throw it out there in case it gives someone an idea.
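The check itself is one line. Here's roughly what my click tracker does — the helper and the log_click() call are made up names, but the referer test is the real trick: browsers arriving from a page on the site send a Referer header, while most crawlers hitting the redirect URL directly don't.

```php
<?php
// Treat a request with no Referer header as a probable bot.
// $server is passed in (rather than reading $_SERVER directly)
// so the check is easy to test.
function is_probable_bot(array $server) {
    return empty($server['HTTP_REFERER']);
}

// In the tracking script, before logging and redirecting:
// if (!is_probable_bot($_SERVER)) {
//     log_click($_GET['url']);   // hypothetical logging function
// }
// header('Location: ' . $target);
```

Note the caveat above still applies: empty($_SERVER['HTTP_REFERER']) also matches real users who typed the URL or followed a bookmark, which is fine for an outgoing-click tracker (those clicks always come from a page) but not for general traffic filtering.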