When Design Best Practices Become Performance Worst Practices

Your design team has come up with a gorgeous prototype for the next iteration of your home page. It conforms to known design and usability best practices, and your testers loved it in the lab. You push the design to your live site and the results are … well, a little disappointing. Not terrible, but your conversion rate hasn’t made the leap you expected.

Why? You’ve done everything by the book, but perhaps the book is missing a chapter or two. Chances are, you’ve accidentally made one of three common design mistakes, and these mistakes have affected how your pages load, which ultimately hurts page views, bounce rate, conversions, and pretty much every business metric you care about.

The Relationship Between Page Speed and Business Metrics

I’ll keep this short, in case you’re already aware that there’s a direct relationship between how quickly a web application performs and how effective it is at doing whatever it’s intended to do. Over the past few years, a growing body of evidence has made an undeniable connection between page speed and key performance indicators such as page views, bounce rate, cart size, sales, customer satisfaction, and pretty much any other business metric you can think of.

For years, the rule of thumb has been that users expect pages to render in three seconds or less. Unfortunately, most retailers—even big household-name e-commerce shops—aren’t hitting this three-second target. Among the top 100 online retailers, the median time to interact (aka TTI, the moment when a page’s primary content renders and becomes interactive) is 5.3 seconds; only 18% of these sites had a TTI of 3 seconds or less, and 26% took 8 seconds or longer to become interactive.

A Bit of Background into Our Research …

Since 2010, Radware has been measuring and tracking the performance and page composition of leading e-commerce sites. The purpose of this research is to gain ongoing visibility into the real-world performance of leading e-commerce sites—to learn how these sites perform for visitors sitting at home using the internet under normal browsing conditions—and to provide strategies and best practices that will enable site owners to serve pages faster. (Full disclosure: We have a technology called FastView that automates many of these best practices.)

WebPagetest has an under-appreciated and under-utilized feature that generates a timed, frame-by-frame, filmstrip-style view of a page’s entire load. Using it, we discovered that a surprising number of these sites are making the same three design/usability mistakes over and over, and that these mistakes are profoundly hurting load times, and ultimately users.

When Good Practices Go Bad

The reason these mistakes proliferate is because they’re insidious. To understand how they happen, we first have to understand how they’re masked by the guise of various design best practices. (Note: Throughout this post, I’ll be citing examples from specific websites, but I don’t want this to be perceived as throwing anyone under the bus. These are incredibly common issues, so there’s no need for any one site owner to bear more than their share of performance shame.)

Best Practice #1: Using a feature banner or carousel to promote sales and specials

The feature banner or carousel pushing sales and special promotions is such an entrenched aspect of e-commerce design that we don’t even think about it. In fact, we take feature banners so much for granted that we don’t always put much thought into when and how they end up on the page.

Why this best practice is a worst practice: For many of the pages we studied, the feature banner was the last element to load. This filmstrip view (which has been compressed to 5-second increments) of the QVC.com home page is typical of many of the pages we looked at:

The navigation elements don’t load until close to the five-second mark, and then the feature area displays a progress indicator for almost five more seconds before the banner image suddenly renders.

A separate eye-tracking study by Jakob Nielsen demonstrates exactly why this loading sequence fails both users and site owners: a user who is served feature content within the first second of page load spends 20% of his or her time within the feature area, whereas a user who is subjected to an eight-second delay of a page’s feature content spends only 1% of his or her time visually engaging with that area of the page.

In other words, rendering your most important content last is a great way to ensure that it won’t get seen.

The solution: There are a few straightforward fixes here. First is optimizing the order in which your page objects render. This tip is so obvious that it feels silly to write it out, but given the number of household-name sites that fail to do this, it bears mentioning.
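One way to make render order concrete is to audit where blocking resources sit relative to your feature content. The sketch below is a crude illustrative heuristic of our own (the class name, the sample markup, and the file names are all hypothetical), not a full browser render model: it flags external scripts that appear before the first image and carry neither an async nor a defer attribute, since those are the scripts most likely to hold up your banner.

```python
from html.parser import HTMLParser

class BlockingScriptAudit(HTMLParser):
    """Illustrative heuristic only: list external scripts that appear
    before the first <img> and have neither async nor defer, i.e.
    scripts that can delay rendering of feature imagery."""

    def __init__(self):
        super().__init__()
        self.seen_image = False
        self.blocking = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)  # boolean attributes map to None
        if tag == "img":
            self.seen_image = True
        elif tag == "script" and "src" in attrs and not self.seen_image:
            if "async" not in attrs and "defer" not in attrs:
                self.blocking.append(attrs["src"])

page = """
<head>
  <script src="analytics.js"></script>
  <script src="app.js" defer></script>
</head>
<body><img src="feature-banner.jpg"></body>
"""

audit = BlockingScriptAudit()
audit.feed(page)
print(audit.blocking)  # ['analytics.js']
```

Here the deferred script is ignored, while the plain script in the head is flagged as a candidate for deferral or relocation below the feature content.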

The next tip—or really, set of tips—is to optimize the feature images themselves. Images make up more than half of a typical page’s total payload, and much of this bulk is unnecessary. Image compression is a basic performance technique that allows developers to reduce this payload, but of the 100 sites we tested, only 9% had properly implemented it.

We also found that 87% of sites don't take advantage of progressive image rendering, probably because of the bad reputation progressive JPEGs earned in the 1990s, back when the entire Internet moved at glacial speed. A progressive JPEG loads in a series of scans, beginning with a low-resolution version of the image and progressing at increasing resolution levels until the final full resolution is achieved. Implemented properly, progressive image rendering improves perceived load time because the user receives visual feedback earlier than with a baseline JPEG.
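You can check whether the JPEGs you serve are progressive without opening an image editor: a progressive file carries a start-of-frame marker SOF2 (0xFFC2) where a baseline file carries SOF0 (0xFFC0). Below is a minimal stdlib-only sketch; the function name is ours, and the sample byte strings are synthetic marker sequences standing in for real files. It handles well-formed files where SOF precedes the scan data, and ignores rarer SOF variants for simplicity.

```python
def is_progressive_jpeg(data: bytes) -> bool:
    """Walk JPEG marker segments and report whether the frame header
    is SOF2 (progressive) rather than SOF0/SOF1 (baseline).
    Simplified: assumes a well-formed stream with SOF before SOS."""
    if not data.startswith(b"\xff\xd8"):        # SOI marker
        raise ValueError("not a JPEG stream")
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break                               # malformed stream
        marker = data[i + 1]
        if marker == 0xC2:                      # SOF2: progressive DCT
            return True
        if marker in (0xC0, 0xC1):              # SOF0/SOF1: baseline
            return False
        if marker == 0xD8 or 0xD0 <= marker <= 0xD7:
            i += 2                              # standalone, no length field
            continue
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        i += 2 + seg_len                        # length includes its own 2 bytes

# Synthetic streams: SOI + APP0 segment + a frame marker.
sample_progressive = b"\xff\xd8\xff\xe0\x00\x04\x00\x00\xff\xc2\x00\x11"
sample_baseline = b"\xff\xd8\xff\xe0\x00\x04\x00\x00\xff\xc0\x00\x11"
print(is_progressive_jpeg(sample_progressive))  # True
print(is_progressive_jpeg(sample_baseline))     # False
```

A quick script like this, run over your image directory, tells you immediately how much of your feature imagery would benefit from re-encoding.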

Best Practice #2: Primary call-to-action buttons/links are located at the bottom of the feature banner

Again, this is a practice that, like banners themselves, we’ve grown to expect. A typical banner’s layout, as illustrated here on the Costco home page, goes something like this: background image, header copy, descriptive text, and then CTA button.

Why this best practice is a worst practice: While some feature banners do become interactive before they’ve fully rendered, many do not become interactive until the image has fully loaded, which is signaled by the rendering of the CTA button. If you’re a user who has been conditioned to believe that the feature banner isn’t clickable until the CTA button is visible, then you could be sitting on your hands for several seconds.

In the case of this version of the Costco home page, the link to “Click here for online-only coupon offers” should ostensibly be very important to shoppers. Yet it’s buried at the very bottom of the banner. Let’s see how this positioning renders in the last 5 seconds that it takes for the Costco home page to fully render:

Note that the CTA doesn’t load until the very last frame. In other words, some users could spend the entire nine seconds that it takes this page to fully render before they see the CTA button and begin to interact with the page.

The solution: Again, optimizing the order in which the page objects render, plus optimizing the images and buttons themselves to render progressively, would fix this issue. And here’s a crazy idea: if the CTA is important, consider simply moving it to the top of the banner.

Best Practice #3: Designing and usability testing in an ivory tower

In the spirit of total honesty, I will confess up front that, back in my usability testing days, I was totally guilty of doing this. I’d work with designers to develop prototypes on their souped-up Macs, then bring our testers into the lab and put the site through its paces. I’d take rigorous notes, oversee the necessary revisions, re-test as needed, lather, rinse, and repeat. We’d move the prototype down the assembly line for implementation and then issue a round of high fives.

Why this best practice is a worst practice: Where to even start … Designing and testing wireframes and prototypes in studios and labs only tells you how a site performs in an ideal setting—using a fast system, the latest browser, a speedy connection, and a preloaded browser cache—all while sitting just a few feet away from the host server. But most visitors aren't in an ideal setting. A typical visitor's at-home computer setup could be five times slower (or even more!) than your lab setting.

And then there’s the issue of third-party scripts that can slow down or even block page rendering. This is a common and often excruciating problem that can only be felt on a live site.

At some point between the design phase and the go-live phase, performance gets lost as each team (design, test, development) assumes that the granular aspects of page load and performance are someone else's responsibility. When that happens, the result is pages like this, which take upwards of 20 seconds to load for actual users:

The solution: I can’t stress this enough: performance should be the responsibility of every person who touches a page, from conception through to deployment. Too often, performance is an afterthought and becomes the responsibility of the last person to touch the page: namely, developers. Not only is this unfair, it also neglects the many ways that, as designers and user experience people, we can add our unique expertise to make pages faster.

Performance must be integrated into every phase of the design, testing, and development process. Web teams need to see their pages the way that real people see their pages: slowed down frame by frame.

Before deployment, one way to do this is to artificially throttle the connection during the design and test phases. Observe the order in which objects render and where things stall. After deployment, you can use synthetic tools such as WebPagetest (which has the advantage of being free and easy to use) in conjunction with real-user monitoring tools, of which there are many to choose from.

Fixing These Problems = Happier Users + Happier Site Owners

Let’s bring this back to metrics. I’ll close with a few more case studies for the unconvinced among you:

Comments

This is a superbly written, concise summary of the most important points that many get wrong (including me most of the time; making performance-optimised techniques a default part of your workflow is hard!). Very much enjoyed reading it and thinking about whether we are paying sufficient attention to this stuff at FT.

I was a bit surprised to see no mention of the Navigation Timing API though. Is there some reason why you don't rate it as a way of diagnosing performance bottlenecks?

Thanks Tammy. Just interesting... have you ever considered optimizing your own website webperformancetoday.com? According to siteloadtest.com it's way too far from being optimized, in many aspects: 38 external scripts, no image optimization, no CSS or JS merging, no minification, Cache-Control=4800 seconds for static content, etc.

Hey Tammy,
I thought this was a great article. Regardless of the terminology used, you don't have to look far to see the issues you describe all over the place. Anyway, I included your post in my roundup of the month's best web design/development, security, and CMS content. http://www.wiredtree.com/blog/januarys-best-web-designdevelopment-cms-security-content/

I used Webpagetest -- http://www.webpagetest.org/ -- a free online tool supported by Google. It lets you simulate real-world load times across a number of browser and connection types. To generate videos and filmstrip views of the tests, you just need to select the "Capture Video" option.

Cool! I didn't realize Webpagetest had that feature. It reminds me of video editing software, thanks I will check it out! I learned about these two other resources the other day via a thread related to a post from your article on LinkedIn. You should check them out, if you are not already familiar with them:

I think you're confusing "design best practice" with "things that often happen" in a way that is bordering on irresponsible. What UX designer do you know who REALLY thinks a carousel is best, or that a CTA should be inside a banner, or that user testing is best done on ideal equipment? None of the things you listed are best practices that any designer in his/her right mind would advocate. These are frequent mistakes that happen when business owners start dictating design, or when corners have to be cut due to circumstances beyond the designer's control. Any good designer is concerned about the performance consequences of his/her designs - design isn't art, the way the design will be consumed is part of its success. Design best practices are not in conflict with performance best practices - the goal of design is to get the right content to the right user at the right time.

I'm quite curious to know why a comment critiquing the characterization of these practices as "design best practices" disappeared almost immediately upon being posted. Are comments on this site only allowed to be positive? That's distressing.

Your comment raises a really good question about the phrase "best practice". I agree with you: a best practice *should* be a tried-and-true method for completing a task most effectively. What I saw in my analysis of the top 100 ecommerce sites points to the reality: that certain conventions have become so widely adopted that they've emerged as seemingly "best" practices, when in fact they are not.

I also agree with you that it's tough to challenge when site owners start dictating design. One of the purposes of my article is to give designers ammunition they can use to push back.

Regarding corners being cut: I don't see how that applies to the points I raised. Optimizing page elements so that critical objects load first instead of last is about education, not time or resources. And testing outside ideal lab settings doesn't have to be time-consuming or expensive. There's software available that can artificially throttle network speed to simulate slower load times. After pages go live, there are free tools -- such as WebPagetest.org, the tool we used in our research -- which simulate how pages load for various browsers and connection speeds and can generate videos and filmstrip views showing exactly how the page renders. These tools are free and easy to use. I suspect that the reason they're not used more widely is due to lack of awareness that they exist and that they're important -- not time and budget constraints.

My goal wasn't to pick on designers, and if my post came across that way, then I apologize. I'm very aware of how challenging it is to balance the best interests of users while also pleasing clients and other bosses. This post was intended to be a wake-up call that these practices are widespread, that they hurt the user experience, and that they're preventable.

It seems like you're talking about conventions then, not best practices. Best practices in design are things like, "The visual hierarchy of the elements on the page should match the hierarchy of importance of information," not specific visual solutions like the ones in your post. Those will always change as design and technology evolve. Bad design is everywhere; the ubiquity of something does not mean that it is a best practice, merely that it has, for better or worse, become a convention. I can find a lot of people who smoke, but that doesn't mean it's a best practice for health, you know?

My comment about corners being cut was with regard to design resources, not anything to do with performance. Most people would love to do fully contextualized research, but sometimes there is simply not time & money to do it. Some user research is always better than none, so we cut corners and fall back to a more idealized environment.

I work for a huge website that relies heavily on organic traffic; all designers are expected to understand how their designs impact page performance and work with engineering to find elegant solutions to design problems while maintaining performance. There are probably plenty of designers working at agencies who don't understand these factors, and should definitely take time to understand it and learn to design for it. I think an article pointing out how common design elements impact performance is great - I think what you wrote is great for helping designers to understand.

My issue was simply that it is a pretty strong misrepresentation of designers and the discipline of design to suggest that these are our best practices.

This information is invaluable. I discovered similar things when I did extensive A/B testing of homepages. What was a best UX practice only hurt the page in the end and I couldn't understand why. Now I know. Thank you!

Thanks! And I agree: a follow-up post about techniques for improving load time would be a good idea. It's a massive topic that can't be covered in a single post, but I'll start thinking about where to start.

In the meantime, I suggest checking out Steve Souders's excellent book, High Performance Web Sites, if you haven't already: http://stevesouders.com/hpws/. It's the bible for the performance community.

Would love to see (for selfish reasons) how news and content sites perform. I have noticed anecdotally that sites like Huffington Post and BuzzFeed are painfully slow to load. I suspect all those third-party scripts and image compression issues are to blame.

Also, it would be great to see how these new website services rank in terms of performance - services like Wix, Weebly, Square Space.

And yes, you're right: third-party scripts are a major issue with news providers. Not sure why this is still an issue when there have been known workarounds (e.g. deferring scripts, or programming them to load asynchronously) for at least a couple of years now.

Way back in 2009, AOL found that visitors who were in the top 10th percentile for page speed viewed 50% more pages than viewers in the bottom 10th percentile:

Thanks so much for all the feedback and insight so far, folks. There's been some interesting discussion on Twitter as well. One person brought up the idea that users' need for speed is "borderline insane" and has a lot to do with an irrational need for instant gratification. I've encountered this belief before, which is why I created this post ("Our need for web speed: It's about neuroscience, not entitlement") about the psychology behind web performance:

Nice article. May I please ask what connection speed these times are being calculated on? Is there an agreed industry standard or do you base each case on analytics gathered specifically for that site?

Thanks, Greg. The test simulates a DSL connection at 1.5 Mbps. It's the low end of normal, a deliberate choice because we want to create awareness that these users exist in significant numbers. It's too easy to fall into the trap of thinking that everyone's on a souped-up line in an urban centre.
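To put 1.5 Mbps in perspective, here's a back-of-envelope calculation (the page size is illustrative, not from our data set): even ignoring latency and connection overhead, a 1.5 MB page needs a minimum of 8 seconds of pure transfer time on that connection.

```python
def min_transfer_seconds(payload_bytes, bandwidth_bps):
    """Lower bound on load time: raw transfer only. Real loads are
    slower still (latency, TCP slow start, and request overhead all
    add to this)."""
    return payload_bytes * 8 / bandwidth_bps

# An illustrative 1.5 MB page over the 1.5 Mbps DSL test profile:
print(min_transfer_seconds(1_500_000, 1_500_000))  # 8.0
```

So before a single rendering optimization comes into play, payload size alone can push a page past any reasonable patience threshold for these users.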

Too right. It's gotten so that I don't even use the internet at home because I have no patience with how long it takes sites to render. (Facebook being a primary offender.) At work, using a government connection, you would think that you could have very fast page loads. You would be wrong. Because of that, I only use the web to look up information I need to do my job. At home I only use two sites, Google and Amazon, and those as little as possible. Apparently all of these site designers are only designing for people on Fios or Xfinity, forgetting the masses of us whose only options are DSL and dial-up.

Very informative article. Thank you. So many users are browsing the web on their mobile phones. More than one second feels like too long to wait—especially because users on their phones are choosing to be immersed in the content on their screen instead of their surroundings. Waiting a few seconds for content to load is frustrating and starts to defeat the purpose of why a user visited the site in the first place: fleeting curiosity. Why not give them something to read in the first second?

Thanks, Michael. I agree with you. The key is to load the most meaningful content first, which is completely do-able programmatically. It's just that someone needs to think of doing it, and then make it a priority.

Hate to tell you, but speed has been an issue since day one of the Internet. I was there, I know. I could go into all kinds of stuff about 0.5 MHz processors and 300-baud modems; but it's like everything else. The more powerful the technology, the more it gets used with the assumption that everyone has it. Do I hear you saying "Less is More"? Good article!

Thanks, John! I was there, too, dialling up on my 14.4 modem that would lose the connection about five times per hour. I have a good friend who taught herself to play guitar while waiting for pages to load. True story. :)

Great article backed up with data and a nice reminder that good UX practice is incredibly holistic in nature, comprising so many different elements, not just visual aesthetics or interaction design, but the total experience, including things such as speed and a user's environment.

Thank you, Peta. That's exactly the point I was trying to make. As UX people, we need to take the user's complete context into consideration. If all we do is create beautiful prototypes that don't function optimally in the real world, then we're only doing half our job.