Since Chrome's very first release, performance has been one of Google's top priorities. But Google is up against a competing force: Web developers. The Web of today is a more complex, bandwidth-intensive place than it was when Chrome first shipped, which means that—although Internet connections and the browser itself are faster than they've ever been—slow pages remain an everyday occurrence.

Google engineers have been developing "Never Slow Mode" in a bid to counter this. Spotted at Chrome Story (via ZDNet), the new mode places tight limitations on Web content in an effort to make its performance more robust and predictable.

The exact design and rationale of Never Slow Mode aren't public—the changelog for the feature mentions a design document but says it's currently Google-internal. Taken together, though, the restrictions ensure that the browser's main thread never has to do too much work and never gets too delayed, and that only limited amounts of data are pulled down over the network. This should make the browser more responsive to user input, lighter on the network, and a bit less of a memory hog than it would otherwise be.

As well as capping various data sizes and limiting the amount of time that JavaScript can run, Never Slow Mode also blocks access to certain features, currently available to webpages, that incur a performance cost. In particular, scripts are prohibited from using document.write(), which is widely used to dynamically emit HTML (potentially including embedded CSS and JavaScript) into a page. They're also blocked from making synchronous XMLHttpRequests to transfer data to and from servers. Synchronous requests tend to make pages feel slow because the browser can't run any other scripting while it's waiting for the request to complete. Asynchronous XMLHttpRequests remain supported, as these let the browser do other things while it waits for the remote server to respond.
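As an illustration (not Chrome's actual enforcement, and with invented function names and parameters), the forbidden synchronous pattern and its still-permitted asynchronous counterpart look something like this:

```javascript
// Hypothetical sketch: the synchronous XMLHttpRequest pattern that Never
// Slow Mode blocks, and the asynchronous pattern that remains allowed.
// XMLHttpRequest is a browser-only API, so these functions are defined
// here but not invoked.

// Blocked: the third argument `false` makes send() block the main thread
// until the server responds; no other script (or rendering) runs meanwhile.
function loadDataSync(url) {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", url, false); // synchronous: freezes the page
  xhr.send();
  return xhr.responseText;
}

// Allowed: `true` (the default) makes the request asynchronous; the main
// thread keeps running and the callback fires when the response arrives.
function loadDataAsync(url, onDone) {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);
  xhr.onload = () => onDone(xhr.responseText);
  xhr.send();
}
```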

The budgets for execution resources get reset each time a user interacts with the page. So each time a page gets scrolled or tapped, it can run a bit more JavaScript and pull down a bit more data.
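The description suggests an accounting model along these lines—a purely illustrative sketch, since the real design document is Google-internal and the class name and numbers here are invented:

```javascript
// Illustrative model of interaction-reset resource budgets; the limits
// and names are invented, not Chrome's actual figures.
class PageBudget {
  constructor(scriptMs, networkBytes) {
    this.limits = { scriptMs, networkBytes };
    this.reset();
  }
  // A user gesture (tap, scroll, click) refills both budgets.
  reset() {
    this.remaining = { ...this.limits };
  }
  // Returns false once a budget is exhausted; a real implementation
  // would then throttle or block further work.
  spend(kind, amount) {
    if (this.remaining[kind] < amount) return false;
    this.remaining[kind] -= amount;
    return true;
  }
}

const budget = new PageBudget(200, 500 * 1024); // 200 ms script, 500 KiB network
budget.spend("scriptMs", 150); // within budget: returns true
budget.spend("scriptMs", 100); // over budget: returns false
budget.reset();                // user scrolled: budgets refill
budget.spend("scriptMs", 100); // within budget again
```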

The size and execution time limitations are draconian, to say the least, and the JavaScript changes will outright break many existing pages. Together, they make Never Slow Mode a little mysterious: this isn't a mode that can be used for general-purpose Web browsing, because pages will either run out of resources or depend on forbidden JavaScript. The current implementation of Never Slow Mode notes that it's only a prototype (with an "idiotic implementation," no less). So whatever its ultimate purpose, there's still much development left.

We've asked Google for comment but had heard nothing at the time of writing.

As long as it stays a “mode,” it’s actually a great thing. It’ll allow developers to test against it. And if Google proves that the majority prefers that mode, then developers will code against it by default.

It's not optimum. Being optimum would involve summary execution of the web developers if a page takes longer than X to load. But it's maybe the best that can be done. Everybody knows that if they don't face the axe, developers will load pages with all whatnot, because shiny!

Google controlling web standards, not good. But the rationale seems decent? It might be best for mobile or secure devices.

Oh, and maybe some web designers will actually pay attention when I say make your damn site work with javascript disabled!

It's 2019. I have little interest in using a website that can't fetch new content without a full page reload. JS is necessary, full stop.

It's interesting to hear this because I find that sites that rely on JS for updates often end up sucking up too much memory and slowing the browser down, so I rely on explicit full page reloads to clean up the memory footprint.

I find JS updates useful for very simple dashboard-like features, but otherwise, just reload the darn page.

Can they come up with an experimental "don't gobble up 3+ GB of memory after being open for more than a few hours" mode?

I don't get that gripe: unused memory is wasted memory, I *want* my apps to gobble up as much as they can productively use. Granted, they also have to free it up gracefully if something else needs it. But "Gobbles up 3GB" isn't an issue if there's a couple of GB left. On the contrary.

The trouble is, there's a huge difference between *needing* 3GB of RAM, and merely caching 1 GB while actively using the other two.

Can they come up with an experimental "don't gobble up 3+ GB of memory after being open for more than a few hours" mode?

I don't get that gripe: unused memory is wasted memory, I *want* my apps to gobble up as much as they can productively use. Granted, they also have to free it up gracefully if something else needs it. But "Gobbles up 3GB" isn't an issue if there's a couple of GB left. On the contrary.

I don't think you understand how RAM works....

A more useful response might explain the problem with the post you replied to. Not trying to just be a jerk, I think you missed a great opportunity to teach them something.

Can they come up with an experimental "don't gobble up 3+ GB of memory after being open for more than a few hours" mode?

I don't get that gripe: unused memory is wasted memory, I *want* my apps to gobble up as much as they can productively use. Granted, they also have to free it up gracefully if something else needs it. But "Gobbles up 3GB" isn't an issue if there's a couple of GB left. On the contrary.

Other than "free it up gracefully" being a fantasy in my experience, I prefer the efficiency of a "take what you need" mentality rather than a "take what you can use" one when it comes to how applications use my computer's resources.

Can you say how that would break the page of an average Ars Technica article? Does that change if a person has subscribed?

Almost certainly it won’t break a thing. No one is using synchronous XMLHttpRequest these days. document.write() is weird. I used it once as a CDN fallback, to write script tags that load local files instead. That site is still up and would break if the CDN failed. Unlikely, of course, but I'm sure other people use it for stuff that I'm unaware of.
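For what it's worth, that CDN-fallback pattern can be written without document.write() by injecting a script element instead—a hedged sketch, with the function name and URLs invented:

```javascript
// Hypothetical document.write()-free CDN fallback: try the CDN copy of a
// library first; if it fails to load, inject a <script> element pointing
// at a local copy. Browser-only (uses `document`), so only defined here.
function loadWithFallback(cdnUrl, localUrl, onReady) {
  const script = document.createElement("script");
  script.src = cdnUrl;
  script.onload = onReady;
  script.onerror = () => {
    // CDN failed: fall back to the local copy.
    const fallback = document.createElement("script");
    fallback.src = localUrl;
    fallback.onload = onReady;
    document.head.appendChild(fallback);
  };
  document.head.appendChild(script);
}
```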

Therefore this Never Slow concept, while potentially benefiting humanity in the same way as Gmail, G-Maps, G-Search, G-Fibre, Android, etc., should be seen through the same lens. How so? Not sure.

Maybe the concept is faster and lighter webpages will encourage people to stop using AdBlock?

Meanwhile, if you want to roll your own Never Slow, use uBlock Origin, turn off all images, use Ghostery, etc. (Look Ma, I can do my own Never Slow browser!) It makes a huge difference.

I wish they'd apply the rules to AdSense. I run Google's own Lighthouse audit on a site I have AdSense on, and AdSense is guilty of document.write() and other naughty stuff that Google frowns on, which drags the site's performance score down. Google needs to carry out automated testing against its own ad network and ensure that the ads it serves comply with its own performance rules.

Can they come up with an experimental "don't gobble up 3+ GB of memory after being open for more than a few hours" mode?

I don't get that gripe: unused memory is wasted memory, I *want* my apps to gobble up as much as they can productively use. Granted, they also have to free it up gracefully if something else needs it. But "Gobbles up 3GB" isn't an issue if there's a couple of GB left. On the contrary.

Other than "free it up gracefully" being a fantasy in my experience, I prefer the efficiency of a "take what you need" mentality rather than a "take what you can use" one when it comes to how applications use my computer's resources.

Right now, my 16 GB PC has 8GB free and is using 20% CPU. I'd love it if it used that to background-load and -render the handful of top bookmarks it knows I'll click. Or to pre-launch Civ5, even if only to flush it if I eventually visit other sites / launch other apps.

It's 2019. I have little interest in using a website that can't fetch new content without a full page reload. JS is necessary, full stop.

I emphatically disagree — while I have little interest in using Web sites whose basic navigational features and UI are inconsistent with standard browser behavior and Web conventions, page loads are only problematic when they're slow.

Can they come up with an experimental "don't gobble up 3+ GB of memory after being open for more than a few hours" mode?

I don't get that gripe: unused memory is wasted memory, I *want* my apps to gobble up as much as they can productively use. Granted, they also have to free it up gracefully if something else needs it. But "Gobbles up 3GB" isn't an issue if there's a couple of GB left. On the contrary.

Other than "free it up gracefully" being a fantasy in my experience, I prefer the efficiency of a "take what you need" mentality rather than a "take what you can use" one when it comes to how applications use my computer's resources.

Right now, my 16 GB PC has 8GB free and is using 20% CPU. I'd love it if it used that to background-load and -render the handful of top bookmarks it knows I'll click. Or to pre-launch Civ5, even if only to flush it if I eventually visit other sites / launch other apps.

I'd love it, too...

But if it's not implemented at the OS level, it just screws up my intentional background load of Civ to have Chrome caching every link it sees on the page.

Maybe if the OS has a way to tell an app to knock it the fuck off because the user is trying to use the OS for more than just the app...

Maybe give the OS the job of managing optional memory use, with an API to communicate and negotiate with apps about it. But the OS still doesn't seem smart enough to know what I want to do well enough for that to be particularly useful.

Right now, my 16 GB PC has 8GB free and is using 20% CPU. I'd love it if it used that to background-load and -render the handful of top bookmarks it knows I'll click. Or to pre-launch Civ5, even if only to flush it if I eventually visit other sites / launch other apps.

Your OS tries to do that now by filling spare memory with data from the disk that it thinks you are likely to read soon. One application should never cause another application to give up memory; that is too easy to abuse. The only thing that can do that is the OS-managed in-memory disk cache, because the OS is always aware of all memory allocation by definition. So what you are describing happens to a degree now. The OS only prefetches the files you are very likely to load, because to do otherwise is a waste of power. You may not care on your PC, but a laptop user does. Lastly, with SSDs, file loading is much less impactful, so the benefit is far more limited than with slow spinning disks. Spinning disks can't be too busy reading potentially useful data, because that would slow down fetching data needed right away.

However, to your example: how would it know that it's Civ5 you want to run? Why not Civ6, which you just downloaded? Is that the only application you use besides the browser? My point is that it's very hard for code to be intuitively smart like that, because it can never understand the full context of the moment (and when it does, we will hail our overlords!).

If you want another page you commonly use preloaded, then open it in a new tab. Why would the browser make that choice for you?

The only type of software that really benefits from intentionally taking all available memory is a database. It knows that its data and indexes will be accessed, and it can reasonably assume that the computer's/server's sole purpose is to run that piece of DB software.

TL;DR: Trying to use all available memory is not beneficial for most applications.