A blog about web development, marketing on the web and web project management.

Okay, one of the reasons posting has been light on this blog is that we have been busy. My wife and I have also had our first child (well, two years ago) and, to be honest, our work has become less innovative in terms of doing cool new things and more iterative, in that we have been applying a lot of the cool things we learned and developed over the last few years.

This happened because we changed our business model from agency style to "plug us into your operations and we will be your dev crew" style.

Anyways, every once in a while we like to take stock and see where we may be able to gain some time so as to try and work in our next direction or new model, whatever that may be. The applications that we use are often places where we can find cool new stuff and gain time.

What things have you done to find extra time? Please, share below!

Here are three things that, over the last year and a half, have helped us find some extra time.

Navicat: we moved all of our phpMyAdmin access over to Navicat around the end of 2007 and this was an excellent move. Tonnes of time gained.

Buying a netbook and using MaxiVista and UltraMon: this adds two new monitors to my existing setup. I now have a four-monitor, two-computer setup which not only gives me two more monitors but also lets me keep open two instances of many of the apps that I use (useful for multitasking with teams, etc.)

Moving simpler client sites to WordPress: now that it has one-step upgrades, and on some hosts it almost doesn't require the use of an FTP client to get up and running.

Lately I've taken to subscribing to many newsletters and the "free" programs being offered by internet marketers - think Frank Kern, Yanik Silver et al. - and people like Aaron Wall and Shoemoney and that Brian fellow over at Copyblogger.
If you, like me, receive some of these emails, you may have noticed how their sales methods have taken the typical long web page sales pitch and turned it on its side. They've spliced it into emails and videos and feed that info to us in a much more interactive and entertaining manner than the long-winded sales pages of old.
So this morning while trying to convince my 21 month old son to go to the park (that's right, to convince him to go to the park) I found that the usual things were not working. That is when it hit me.
Parental persuasion à la Frank Kern
Please note that I have not met nor do I know Frank Kern, and I am only singling him out because his name stuck with me. I suppose this parody below could be recognizable to Jeff Walker's children as well. One last note, I have found the free info they give away as they get you to the offer/pitch/monthly service to be quite valuable.
Dad: Hey son, want to go to the park with your favorite car and play on the swings?
Son: No!
Dad: Oh, did I mention that I found some extra strawberries, your favorite fruit? I figured you would want them and washed some for you. Want to go to the park with your favorite car and play on the swings and have some strawberries?
Son: My car...? No!
Dad: We can take the little soccer ball and some balloons. We know you like balloons so we bought some extras last night. Want to go to the park with your favorite car and play on the swings, with the ball and the balloons and have some strawberries? You don't have to do anything, just have fun!
Son: Balloons...? Where's my car? No!
Dad: Hey listen. It's 8:00, the street cleaner will pass soon. If we go now we can also see the street cleaner! You love the street cleaner and if we don't go now who knows when he will pass again... Maybe never! Want to go downstairs and see the street cleaner, then go to the park with your favorite car and play on the swings, with the ball and the balloons and have some strawberries?
Son: Street cleaner...? Balloons...? Where's my car? Hmm... (asks for shoes, walks over to the car...) No!
Dad: Listen. If we go down now, we can stop at the bakery. I'll order an espresso (you love the noise from the espresso machine!) and get you some toast and jam (I'll pay; you get this fre.e!) and we can sit at a table outside and watch the street cleaner. Then we can go to the park with your favorite car and play on the swings, with the ball and the balloons and have some strawberries. And one last BONUS: we can stop at the fountain and throw in some rocks!! Look, if you don't have fun doing this, later I will take you to the pool. YOU CAN'T LOSE!!
Son: (getting up on the car...) Yes! Let's do it! (makes some vroom vroom noises with the car).
Father and son head down the elevator out to the bakery. Son demands the water fountain in the park so we head straight there, where strawberries are eaten and some rocks thrown in. Then he says "casa" (hey, we live in Spain) and demands to go home. We stop to watch the street cleaner pass by and head home, not having visited the swings, played with the balloon or balls or had breakfast at the bakery.
(Okay, so in the end I also pick on those of us who buy these things, be they live-the-internet-lifestyle products or self-help books, and don't implement them to the fullest :)
[...]

Let's face it: when a visitor does arrive, it only follows that we should do our best to help them see the value in our website, no?

Welcome new visitor, here is our feed, blah blah... Can't we do better than that?

I see a lot of variations on the Welcome new visitor, here is our feed type of thing when I arrive at blogs and such these days. Sometimes this gets customized if the site determines that I am a "Googler" (visiting from a search engine) and then offers me some piece of text to try and make me become a passionate user of their site.

This strategy never makes me a passionate user.

What does work is when I read the page in question and then navigate around the site and find more great content.

So the trick should be to make great-content discovery the goal.

Welcome Googler, let us help you out

Here we present one solution that works for helping people discover your site. As a side effect it will increase your pageviews in a proper, natural way. (We have a whole pile of other solutions for this, however that post isn't quite ready yet.)

Check referer string

If search engine, grab query text

Do a full text search on your content to find other articles on your site that are related to their search query

Pass the resulting list to the reader in a user friendly way

Maybe keep that list persistent for the session, unless they close it

What we have done is create a custom, on-the-fly navigation system based on the visitor's search query! This little widget should work to keep them poking around your site.
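The steps above can be sketched in a few lines. This is a minimal sketch rather than our production code: the engine-to-parameter map, the function names, and the search_content callback are all assumptions you would adapt to your own stack.

```python
from urllib.parse import urlparse, parse_qs

# Illustrative map of search engine hosts to their query parameter names.
# Real referer strings vary; extend this for the engines you care about.
SEARCH_ENGINES = {
    "www.google.com": "q",
    "search.yahoo.com": "p",
    "www.bing.com": "q",
}

def query_from_referer(referer):
    """Steps 1 and 2: return the search query, or None for non-search visits."""
    parts = urlparse(referer)
    param = SEARCH_ENGINES.get(parts.netloc)
    if param is None:
        return None
    values = parse_qs(parts.query).get(param)
    return values[0] if values else None

def related_articles(referer, search_content):
    """Steps 3 and 4: feed the query to your site's full-text search.

    search_content is whatever function performs the full-text search
    against your own content (an assumption here).
    """
    query = query_from_referer(referer)
    if not query:
        return []
    return search_content(query)
```

Step five, keeping the list persistent, is then just a matter of stashing the resulting list (or the query itself) in the visitor's session until they close it.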

Placement etc.

We've been using this on several sites now (along with some other ideas alluded to above) and it works. Pageviews per user go up. Bounce rate falls (more on that in the future too).

We have had to play with the placement of this box: top of the page? Floated to the right/left of the main page content? Following them down the page (with js)?

As they say, your mileage may vary, but chances are you will get more mileage out of more readers, and that is a good sticky thing.

This post comes a bit late in the whole web 2.0 cycle. I feel that it bears repeating because I have come across sites that don't follow some basic principles when pulling in 3rd party data from sites such as Flickr, Twitter, et al.

APIs and data portability

The blessing of popular and easy to use APIs and the data portability of web 2.0 applications has had an unfortunate side effect, and that is that some implementations that use these services do not integrate appropriate contingency design should these 3rd party services fail.

Caching data calls to APIs is a good bit of contingency design. Many APIs - like Amazon's - require caching, but I suspect this is intended to help limit resource use of the API host, not to help the site using the API. The reasons a person using API-accessed data on their website would want to cache the data are:

To speed up the load time of their website

To have a back up plan if the API call fails

A simple implementation to handle those two cases would be one that caches an API call for a given amount of time, refreshes stale cached data when it expires, and falls back to the stale copy (while logging an error) should the refresh call fail.

Caching is good contingency design practice

As I said above, this post is a bit late to the party, but it is worth writing as recently I have come upon at least three sites where Firebug and other widgets have revealed issues retrieving API-fetched data, and the site loading times have been horrible.

A decent implementation idea would be to roll your own caching wrapper and agnostically plug it in to a stable caching tool, perhaps something like Cache_Lite for PHP. In this manner you have a reusable, caching-library-independent piece of code that can handle the caching, flushing and refreshing of data, covering the two cases discussed above.
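A sketch of what that wrapper might look like, assuming only that a backend exposes get and set; the class and method names are hypothetical, and an adapter would translate the same interface to Cache_Lite, memcached, or whatever you settle on:

```python
class CacheWrapper:
    """Application-facing cache API, independent of the underlying library."""

    def __init__(self, backend):
        self.backend = backend  # any object with get(key) and set(key, value)

    def remember(self, key, producer):
        """Return the cached value for key, computing and storing it on a miss."""
        value = self.backend.get(key)
        if value is None:
            value = producer()
            self.backend.set(key, value)
        return value

    def flush(self, key):
        self.backend.set(key, None)

class DictBackend:
    """Trivial in-memory backend, used here only for illustration."""
    def __init__(self):
        self.store = {}
    def get(self, key):
        return self.store.get(key)
    def set(self, key, value):
        self.store[key] = value
```

Swapping caching libraries later then means writing one new backend adapter, not touching application code.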

And that's it. It's been 541 days since my last post. Wow. I hope this is a re-start of a new phase of blogging. Right, and it looks like I had not built the commenting functionality into this version of the site. What a surprise. I'd still like feedback so if anyone has any email me at mike at this domain and I'll pop a comment right into the database. Off to build some commenting functionality... Comments should be working now.

This is an interesting consequence that seems to be getting blamed on SEO, rather than being looked at from the aspect of accountable reporting, no?

Nicholas states that:

With search engine optimization - or SEO, as it's commonly known - news organizations and other companies are actively manipulating the Web's memory. They're programming the Web to "remember" stuff that might otherwise have become obscure by becoming harder to find.

The result is that:

People are coming forward at the rate of roughly one a day to complain that they are being embarrassed, are worried about losing or not getting jobs, or may be losing customers because of the sudden prominence of old news articles that contain errors or were never followed up.

In Summary

So, in the past, as the print info (newspaper issues) simply disappeared or, more recently, was hidden behind paywalls and poor SEO, newspapers didn't have to worry about the consequences of articles that contained errors or were never followed up. Now, however, people may suffer from these mistakes and lapses of integrity.

What do you think the answer should be? Nicholas Carr asks Should the Net forget? I'm not so sure, and I don't think that the answer is that simple.

There's a learning curve to moving print onto the web, and this case encompasses one facet of what needs to be considered, but it would be great if some form of integrity from those doing the reporting kept these kinds of things from happening.

In our CMS/Framework, we set up a controller with the code from above to respond at a given URL, for example http://www.example.com/__FOO. By passing a function name as a GET variable, in this case 'f', and the parameters necessary for that function to work as subsequent GET parameters, the result of that function will be printed to the screen.
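Since the original controller code isn't reproduced here, the following is a hypothetical reconstruction of the dispatch idea in Python. The function names are invented; the 'f' parameter handling mirrors the description above, and the whitelist is there because dispatching raw user input to arbitrary functions would be dangerous.

```python
from urllib.parse import parse_qs

# Example handlers; in the real CMS these would be actual controller functions.
def greet(name="world"):
    return "Hello, %s!" % name

def add(a, b):
    return str(int(a) + int(b))

# Whitelist of callable names: never eval or look up raw input globally.
DISPATCH = {"greet": greet, "add": add}

def handle(query_string):
    """Dispatch the function named by 'f', passing other GET params as kwargs."""
    params = {k: v[0] for k, v in parse_qs(query_string).items()}
    func = DISPATCH.get(params.pop("f", ""))
    if func is None:
        return "unknown function"
    return func(**params)
```

For a URL like http://www.example.com/__FOO?f=greet&name=Mike, the controller would print the return value of handle("f=greet&name=Mike") to the screen.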

Some simple suggestions
While I don't consider myself an expert, I do have experience working with larger datasets, and there are a couple of things that I always do to keep queries performing well.
Optimize Queries with EXPLAIN
EXPLAIN is your friend; get to know it well. If you take the time to read through the EXPLAIN documentation on the MySQL site, you will find some valuable information, some of which is highlighted below.
Optimizing joins
Single sweep what?
MySQL resolves all joins using a single-sweep multi-join method. This means that MySQL reads a row from the first table, and then finds a matching row in the second table, the third table, and so on. When all tables are processed, MySQL outputs the selected columns and backtracks through the table list until a table is found for which there are more matching rows. The next row is read from this table and the process continues with the next table.
Why is this important? Imagine a main table - tableA - with 80,000 rows of data. This table has a corresponding n:n table that maps entries in tableA with a locations table. A query could be written as:
SELECT tableA.*, locations.location FROM tableA
LEFT JOIN tableA2locations
  ON tableA2locations.tableA_id = tableA.id
LEFT JOIN locations
  ON tableA2locations.location_id = locations.id
WHERE locations.location = 'sometown'
Keeping the above quote in mind, MySQL will read a row from the first table, join the corresponding data from the joined tables for that row, and then sweep through the rest of the data, joining as it goes along.
This leads us into the following section.
Number of rows needed to execute a query
You can get a good indication of how good a join is by taking the product of the values in the rows column of the EXPLAIN output. This should tell you roughly how many rows MySQL must examine to execute the query.
From the above, you can determine that for a query on tables that have not been properly indexed, a join can quickly become unwieldy when dealing simply with three tables with records in the thousands (1000*1000*1000 = a slow query). See HackMySQL for a good example of this.
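That rule of thumb is simple enough to express as arithmetic; the EXPLAIN rows values below are hypothetical:

```python
from math import prod

def estimated_examined_rows(explain_rows):
    """Multiply the rows column of EXPLAIN output, one value per joined table."""
    return prod(explain_rows)

# Three poorly indexed tables with rows estimates of 1,000 each:
# 1000 * 1000 * 1000 = 1,000,000,000 row combinations, i.e. a slow query.
slow_join = estimated_examined_rows([1000, 1000, 1000])
```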
Reducing the number of rows needed to execute a query
So beyond indexing properly for joins, you can still end up with a query that runs in a way that causes a bottleneck.
Taking our example from above, imagine that we use a where clause that limits the tableA selection to half (tableA.foo = 'bar' below):
SELECT tableA.*, locations.location FROM tableA
LEFT JOIN tableA2locations
  ON tableA2locations.tableA_id = tableA.id
LEFT JOIN locations
  ON tableA2locations.location_id = locations.id
WHERE locations.location = 'sometown' AND tableA.foo = 'bar'
This starts us out with 40,000 rows of tableA data to examine. If EXPLAIN then estimates a further 20 matching rows in tableA2locations for each of those, that's 40,000 × 20 = 800,000 rows of data. Not astronomical, but significant. If this were a 3 or 4 table join, things could get ugly. What to do? The answer may be obvious to some: select first from the most limiting table:
SELECT tableA.*, locations.location FROM locations
LEFT JOIN tableA2locations
  ON tableA2locations.location_id = locations.id
LEFT JOIN tableA
  ON tableA2locations.tableA_id = tableA.id
WHERE locations.location = 'sometown' AND tableA.foo = 'bar'
This starts us out with 1 selection from the locations table, then 2000 from tableA2locations. If the join between tableA2locations and tableA is indexed correctly, we are then left with an index join based on ID, rather than having to initially select 40,000 rows from tableA as in the previous example.
When I first started programming, it made sense to me to select from the main table (tableA) and join the lookups. But once you add some data to the mix and start to play with Explain, you quickly realize that selecting from the limiting table can make your server's life a little easier.
For further reading on the topic, I always send[...]

This post was originally published on May 13th, 2004. As others are writing about the topic, I thought bringing it out of the archives would be worthwhile.
A little recap
The idea of placing multiple states of buttons and other elements in a single background image has its roots, I believe, in Pixy's Fast Rollovers. The CSS Zen Master extended this to another purpose in CSS Sprites: Image Slicing's Kiss of Death. Didier Hilhorst came up with a nice application of this method, and I worked it backwards in Responsible CSS - Recycle your background images.
The idea behind the 'sprites' method can obviously be extended to any HTML element, and there are tangible benefits for doing this, just as long as the designer does his or her usual homework.
Benefits of using the 'sprites' method
What are the possible benefits of using this method? Essentially, they lie in faster download times for your web content.
Readers of Andy King's book, Speed Up Your Site: Web Site Optimization, will notice that this method reduces HTTP requests and makes more efficient use of the data packets used to transfer files to the user's computer, and that that is a good thing.
Packet size and http requests
From Web Page Design and Download Time, by Jing Zhi of Keynote Systems (seen here - pdf), cited in Andy's book:
The basic performance principle is therefore to make fewer requests and transmit fewer packets. From this principle, we can derive two basic design rules for well-performing Web pages. First, reduce the overall size of the page, thereby reducing the number of bytes (and packets) to be transferred over the Internet. Second, limit the number of embedded objects on the page, such as images, each of which must be requested and transferred separately from server to browser.
They also found that it was the number of packets and not necessarily the overall size of the page that was important. If a packet could hold 1460 bytes (the figure given in the article) and your object was 1600 bytes, it would require two packets. They found that this object would transfer at the same speed as another object that was greater in size but still fit in two packets.
Potential payoff
The potential payoff for using this method versus individual images, then, is a faster download time due to a reduced number of packets and fewer HTTP requests.
Reducing HTTP requests is easy: one file instead of two or three, etc. But reducing the number of packets? That depends...
An example
The number of packets sent will depend on the size of the file and the user's internet connection. As an example, let's look at the fiftyfoureleven.com logo at the top of the page. When this design was first being coded, that link consisted of two 3.34 kb images, one for the link state and one for the hover state. Now, by using one image that contains both states and simply bumping it back and forth depending on the hover state, that has been reduced to one 5.35 kb image. Right there is a savings of 1.33 kb. Good news.
Now, for argument's sake, let's say that a packet can hold 1460 bytes (packet size for connections greater than 128kb/s = 1500 bytes, minus 40 bytes for TCP/IP headers). The two-image method used 6 packets, 3 for each image (3.34/1.46, rounded up). The single-image method uses 4 packets (5.35/1.46, rounded up).
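The packet arithmetic above, as a quick sketch; the 1460-byte payload is the assumed figure from the article:

```python
from math import ceil

# Assumed payload per packet: 1500-byte packet minus 40 bytes of TCP/IP headers.
PACKET_PAYLOAD = 1460

def packets_needed(size_in_bytes):
    """Packets required for one object: its size divided by payload, rounded up."""
    return ceil(size_in_bytes / PACKET_PAYLOAD)

two_images = packets_needed(3340) * 2  # two separate 3.34 kb images: 6 packets
one_image = packets_needed(5350)       # one combined 5.35 kb image: 4 packets
```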
Things are looking good.
How to optimize
In his A List Apart article, Dave refers to the image that holds all of the sprites as his 'master image'. The key to benefitting from this method is to ensure that the file size of your master image isn't a bloated equivalent of the sum of its pieces.
Conclusion
Great benefits can be realized when combining into a master image slices that each fall well below the size of one packet, since served individually, each of those slices wastes its unused packet space.
After doing a little more research, it seems that packet size can vary depending on the co[...]