Measuring the Critical Rendering Path

The foundation of every solid performance strategy is good measurement and
instrumentation. You can't optimize what you can't measure. This doc
explains different approaches for measuring CRP performance.

The Lighthouse approach runs a series of automated tests against a page,
and then generates a report on the page's CRP performance. This approach
provides a quick and easy high-level overview of CRP performance of a
particular page loaded in your browser, allowing you to rapidly test,
iterate, and improve its performance.

The Navigation Timing API approach captures Real User
Monitoring (RUM)
metrics. As the name implies, these metrics are captured from real user
interactions with your site and provide an accurate view into
real-world CRP performance, as experienced by your users across a variety
of devices and network conditions.

In general, a good approach is to use Lighthouse to identify obvious CRP
optimization opportunities, and then to instrument your code with the
Navigation Timing API to monitor how your app performs out in the wild.

Auditing a page with Lighthouse

Lighthouse is a web app auditing tool that runs a series of tests against a
given page, and then displays the page's results in a consolidated report. You
can run Lighthouse as a Chrome Extension or as an npm module; the npm module is
useful for integrating Lighthouse with continuous integration systems.
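For a quick start from the command line, the Lighthouse CLI (installed from the npm package of the same name) can audit a page and write a report. This is a minimal sketch; the URL is a placeholder, and the flags shown are the CLI's standard output options:

```shell
# Install the Lighthouse CLI globally (requires Node.js and Chrome).
npm install -g lighthouse

# Audit a page and write an HTML report to the current directory.
# https://example.com/ is a placeholder; substitute your own page.
lighthouse https://example.com/ --output html --output-path ./report.html
```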

Instrumenting your code with the Navigation Timing API

The combination of the Navigation Timing API and other browser events emitted
as the page loads allows you to capture and record the real-world CRP
performance of any page.

Each of these metrics corresponds to a high-resolution timestamp that the browser tracks for each and every page it loads. In fact, we're only looking at a fraction of all the different timestamps here: for now we're skipping all network-related timestamps, but we'll come back to them in a future lesson.

So, what do these timestamps mean?

domLoading: the starting timestamp of the entire process; the browser is
about to start parsing the first received bytes of the HTML document.

domInteractive: marks the point when the browser has finished parsing all
of the HTML and DOM construction is complete.

domContentLoaded: marks the point when both the DOM is ready and there are no stylesheets blocking JavaScript execution, meaning we can now (potentially) construct the render tree.

Many JavaScript frameworks wait for this event before they start executing their own logic. For this reason the browser captures the domContentLoadedEventStart and domContentLoadedEventEnd timestamps, allowing us to track how long this execution took.

domComplete: as the name implies, all of the processing is complete and
all of the resources on the page (images, etc.) have finished downloading;
in other words, the loading spinner has stopped spinning.

loadEvent: as a final step in every page load the browser fires an
onload event which can trigger additional application logic.

The HTML specification dictates specific conditions for each and every event: when it should be fired, which conditions should be met, and so on. For our purposes, we'll focus on a few key milestones related to the critical rendering path:
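Here is a minimal sketch of such a measurement function. It reads the PerformanceTiming fields described above (exposed on window.performance.timing); the helper name measureCRP and the on-page output are illustrative, not part of any standard:

```javascript
// Minimal sketch: compute CRP milestones from Navigation Timing timestamps.
// The timestamp fields come from the PerformanceTiming interface
// (window.performance.timing); the name measureCRP is our own.
function measureCRP(t) {
  return {
    interactive: t.domInteractive - t.domLoading,     // HTML parsed, DOM built
    dcl: t.domContentLoadedEventStart - t.domLoading, // DOM ready, no blocking CSS
    complete: t.domComplete - t.domLoading            // all resources downloaded
  };
}

// In the browser, wait for onload so that all timestamps are populated:
// window.addEventListener('load', function () {
//   var stats = measureCRP(window.performance.timing);
//   var p = document.createElement('p');
//   p.textContent = 'interactive: ' + stats.interactive + 'ms, ' +
//       'dcl: ' + stats.dcl + 'ms, complete: ' + stats.complete + 'ms';
//   document.body.appendChild(p);
// });
```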

Measuring these milestones may seem a little daunting at first sight, but it is actually pretty simple: the Navigation Timing API captures all the relevant timestamps, and our code simply waits for the onload event to fire (recall that the onload event fires after domInteractive, domContentLoaded, and domComplete) and computes the differences between the various timestamps.

All said and done, we now have some specific milestones to track and a simple function to output these measurements. Note that instead of printing these metrics on the page, you can also modify the code to send them to an analytics server (Google Analytics does this automatically), which is a great way to keep tabs on the performance of your pages and identify pages that could benefit from some optimization work.
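As a sketch of that approach, a small helper can serialize the milestones and hand them to navigator.sendBeacon, a browser API designed for fire-and-forget analytics posts. The '/analytics/crp' endpoint and the buildCrpPayload name are hypothetical:

```javascript
// Sketch: send CRP metrics to an analytics endpoint instead of printing
// them on the page. '/analytics/crp' is a hypothetical endpoint.
function buildCrpPayload(page, t) {
  return JSON.stringify({
    page: page,
    dcl: t.domContentLoadedEventStart - t.domLoading,
    complete: t.domComplete - t.domLoading
  });
}

// In the browser:
// window.addEventListener('load', function () {
//   navigator.sendBeacon('/analytics/crp',
//       buildCrpPayload(location.pathname, performance.timing));
// });
```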

What about DevTools?

Although these docs sometimes use the Chrome DevTools Network panel to
illustrate CRP concepts, DevTools is currently not well-suited for CRP
measurements because it does not have a built-in mechanism for isolating
critical resources. Run a Lighthouse audit to help
identify such resources.
