Optimizing user-perceived performance, or “how sluggish your app feels, and what to do about it”, is one of my favorite themes in web development, and I was happy to see a lot of talks related to it at this year’s Nordic.js conference.

Nordic.js itself was a really awesome experience, and it was a real privilege to be able to attend it with a couple of other Mates. Even though it is a JavaScript conference, it had a full-stack diversity of subjects, from accessibility and design systems to the internal workings of the V8 JavaScript engine and Node.js monitoring.

In this post I’m sharing the insights I got and some tools to get started with performance in any web project. We’ll soon see that providing the best experience equally to all user groups is a necessity for generating optimal business value.

She presented statistics about device ranges and mobile network coverage showing that, on a global scale, 60% of people are browsing at 2G speeds. This means we need to design and implement for more challenging network conditions to really be able to do global business. That’s easy to forget during development, when high-end devices and stable 4G access are available.

Global coverage means that user perceived performance optimizations will also make your site or application more accessible.

Based on Google Analytics data, 53% of mobile site loads are abandoned when loading takes longer than 3 seconds. Case study results from Google, Pinterest and Netflix also show that performance optimizations increase ad views, conversions and SEO traffic, driving more value for the business.

There’s also an older study from Amazon, which found that every “100 milliseconds of extra loading time costs them 1% in sales”.

So what about the solutions? Start by measuring first paint, first meaningful paint and time to interactive, e.g. with a Lighthouse audit. Lighthouse is a performance audit tool integrated into Chrome (though it is also available as a standalone version), which provides a report like the one shown below with concrete steps for improvement.
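Besides running a Lighthouse audit, a page can also read its own paint timings at runtime via the browser’s Performance API. Here is a minimal sketch; the parsing helper is separated out so the logic runs anywhere, while in a real page the entries would come from `performance.getEntriesByType('paint')`:

```javascript
// Helper: pick the standard paint timings out of a list of
// PerformanceEntry-like objects ({ name, startTime } in milliseconds).
function paintMetrics(entries) {
  const byName = {};
  for (const entry of entries) {
    byName[entry.name] = entry.startTime;
  }
  return {
    firstPaint: byName['first-paint'],
    firstContentfulPaint: byName['first-contentful-paint'],
  };
}

// In a browser you would call:
//   paintMetrics(performance.getEntriesByType('paint'));
// Here mock entries show the shape of the result.
const metrics = paintMetrics([
  { name: 'first-paint', startTime: 120.5 },
  { name: 'first-contentful-paint', startTime: 180.2 },
]);
console.log(metrics);
```

Note that this only covers the paint metrics; time to interactive is a derived metric that Lighthouse computes for you.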

That talk extended the definition of perceived performance from optimizing metrics such as “First meaningful paint” and “Time to interactive” to include the requirement of being “Smooth at all times”.

In this context, smoothness means that the browser is able to render content at the device frame rate (usually 60 frames per second). To optimize for it, one needs to figure out when the browser is not able to keep up with that frame rate.
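One way to spot this is to record frame timestamps and flag frames that blow the roughly 16.7 ms budget of a 60 fps display. A minimal sketch, with the detection logic kept pure so it can be tested anywhere; in a browser the timestamps would come from a requestAnimationFrame loop:

```javascript
// Flag frames whose duration exceeds 1.5x the frame budget (~16.7 ms at
// 60 fps), i.e. places where the browser likely dropped a frame.
function droppedFrames(timestamps, budgetMs = 1000 / 60) {
  const drops = [];
  for (let i = 1; i < timestamps.length; i++) {
    const delta = timestamps[i] - timestamps[i - 1];
    if (delta > budgetMs * 1.5) {
      drops.push({ index: i, delta });
    }
  }
  return drops;
}

// In a browser, collect the timestamps like this:
//   const stamps = [];
//   function tick(now) { stamps.push(now); requestAnimationFrame(tick); }
//   requestAnimationFrame(tick);
console.log(droppedFrames([0, 16, 33, 100])); // the 33 -> 100 gap is a drop
```

The Chrome DevTools Performance panel gives the same information with much more detail, but a sketch like this can feed frame-drop counts into your own monitoring.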

For a good reference on the details, tools and optimization techniques, I really recommend checking out Anna’s slides.

One of the interesting techniques that had totally slipped under my radar was the will-change CSS property. It is used to hint the browser about expected changes to CSS properties, allowing the browser to optimize for them in advance. It is a kind of last-resort option that should be used responsibly. We have all probably had situations where we have to prioritize something, and suddenly every item is the most important thing. The browser doesn’t need to feel the same…
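To keep the hint scoped, one pattern (sketched below under the assumption that the change is a CSS transition on `transform`) is to set will-change right before animating and reset it once the transition finishes:

```javascript
// Hypothetical helper: promote the element only for the duration of the
// animation, then let the browser reclaim the optimization afterwards.
function animateWithHint(el, startAnimation) {
  el.style.willChange = 'transform';
  startAnimation(el);
  el.addEventListener(
    'transitionend',
    () => {
      el.style.willChange = 'auto';
    },
    { once: true }
  );
}
```

This way the browser only pays the cost of the hint while the animation is actually running, instead of treating every element as the most important thing forever.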

Consider running Lighthouse in continuous integration, and maybe enforce a score threshold.
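As a sketch of what such a gate could look like: assuming your CI job produces a Lighthouse JSON report (e.g. via `lighthouse <url> --output=json`), a small script can fail the build when the performance score drops below a threshold. Only the check itself is shown here; the report shape follows Lighthouse’s JSON output, where category scores range from 0 to 1:

```javascript
// Fail the build if the Lighthouse performance score is below threshold.
// `report` is a parsed Lighthouse JSON result.
function checkPerformanceScore(report, threshold = 0.9) {
  const score = report.categories.performance.score;
  if (score < threshold) {
    throw new Error(
      `Performance score ${score} is below the threshold ${threshold}`
    );
  }
  return score;
}
```

Wiring this into CI turns performance from a one-off audit into a regression test.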

This kind of work isn’t always easy to sell in the consulting business, but I hope the insights from Isa’s talk help us all further.

And remember: performance (just like security, accessibility, usability and friends) isn’t oregano sprinkled on top at the end – it’s a continuous whole-team effort, helped by a performance budget and monitoring for regressions.

If you want to know even more about the available tooling and considerations regarding modern frontend stacks, I recommend continuing to Addy Osmani’s post “The Cost of JavaScript in 2018”.