You're absolutely right, even though I suspect you were half-joking. We all know by now that programs tend to expand to use the available resources and there's little reason to think this will be any different. If we're loading pages "34% faster" then we can soon expect those pages to be filled with 34% more ads!

Now, the next logical step is to have this algorithm analyze the actual scripts and figure out a way to convince the various malwares that they've been loaded satisfactorily even though they haven't. That way you could avoid downloading almost 99% of modern web pages.

I run the NoScript extension, so I already get all of those benefits without any need for fancy page analysis.

Two things... I prefer uMatrix; it's awesome. It is better than NoScript in my opinion and is whitelist-based. It now has a neat feature that lets you sync it across multiple profiles via the same mechanism that lets you sync your tabs. Yes, it is that awesome. You just need to enable the cloud mode. It's an odd name, so I didn't notice it until the last release.

The other thing is, if they increase the speed by 34% and people start doing that, they'll just cram more shit into their sites so that it ends up actually taking just as long as before.

Dillo is the crazy stupid fast browser; it's fast like running Lynx but is a real graphical browser. The biggest downside is you can't log in to anything, though that used to be possible a decade ago. (I wish the middle-click scrolling worked more like Firefox too.)

Funnily enough, Slashdot is the site that needs JavaScript the most out there, in my experience. Most pages, even dreadful ones, are read top to bottom; /. needs that JavaScript/AJAX thing to load threaded comments, else it's a pain.

There is SSL support (or let's say, some HTTPS support) and some CSS support. Best of all, the browser is actively developed; the changelog for the upcoming/unreleased version has the kinds of improvements you wish for (well, I don't know what CSS guessing is, but they mention things that have to do with 'width' and CSS).

Yeah, I'm going to recompile it with HTTPS support. I'm not sure why I didn't do so the first time. I should probably ask 'em what kind of other switches are available. Who knows? Maybe there's something I can do to help. Probably not but it's an idea. I've got a ton of stuff on my plate at the moment so it's hard telling. Maybe I'll just send 'em a few bucks - they can get together and have pizza and beer on me.

It does look like an interesting project and like it has some good potential. I should ask about

Sorry if I'm bugging you - I'm always trying to learn new things and to get additional opinions. If it helps, I only ask for such from people who seem intelligent, and I value their opinions greatly. Contrary to popular opinion, I do not know everything, and there are opinions other than my own that are quite valuable.

Because with the default configuration the web interface is available only from localhost, I think, not the LAN. So I thought it would be fun to ssh in and do it from the terminal, although I had other options. As for a browser for daily use (any use), a buddy asked me what to do. My opinion was that a "lightweight browser" that is still full-featured (such as Midori or another) can't be an answer anymore, because the web itself is monstrously bloated. I left it at that, and he went with NoScript. :)

If your buddy has a technical bent and doesn't mind a small learning curve, there's something called uMatrix. It's a bit like an old-school software firewall, except it's just for the browser and is whitelist-based. Yes, yes it is awesome. Once you get up to speed, add your regular sites, and configure for least privilege, you're golden, and it's trivial to browse the web with reasonable security from browser exploits.
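
If it helps, the rules are just plain-text lines of "source destination type action". A minimal whitelist-style setup looks roughly like this (typed from memory, so treat it as a sketch; the last line is just an example of letting one site run its own scripts):

    * * * block
    * * css allow
    * * image allow
    * 1st-party * allow
    slashdot.org fsdn.com script allow

Everything else stays blocked until you deliberately click it open in the matrix.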

Dillo user here. Using Dillo right now. I mostly browse with it if I can, since it is so damn fast. I get deeply aggravated by the sluggishness of Chrome and Firefox by comparison. If you go into settings on Slashdot and enable classic mode, not only does it work perfectly in Dillo, it's like going back to a rose-tinted version of a decade ago, because not only is shit fast, it's REALLY FAST, since it's as lightweight as back then but your computer and browser are much faster.

I actually had an issue with NoScript in Chrome that caused my tabs to sometimes use a large amount of CPU and other times crash. I couldn't even go to Google.com without it crashing. Some pages with absolutely no external links or even JavaScript would take tens of seconds to load with it enabled and were pretty much instant with it disabled. I was able to find people complaining about this starting many, many years ago. I eventually built a new computer, fresh install of Windows. Gave it another try before I eve

No, the next step is to kill Javascript which has now become a cancer that is destroying the Internet.

I'd agree, but sadly, a huge number of sites won't work at all without Javascript. Even sadder, I actually need to use some of those sites.

And when I say "need", I mean "need", they're not optional for me, I have to use them for work or work-related stuff.

To be honest, I like some of the functionality that Javascript provides (AJAX, responsive menus, etc.), but yeah, it's wormed its way into even the most basic functions of many sites these days; a lot of sites won't even load a page without it.

Go one better, implement a php interpreter in JS, then have your page load a JS script containing the interpreter which loads the PHP scripts and runs them client-side and finally renders the html output (which, of course, can contain lots of JS again - so you can even have your php output JS which contains the interpreter and go full inception mode).

Then market it to companies as having all the power of PHP but client side so they don't need such powerful servers to host pages.
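
The whole thing is maybe ten lines of glue, something like this (pure sketch; interpretPHP() stands in for whatever hypothetical PHP-in-JS engine you'd bolt on):

    // Sketch: fetch the PHP source as plain text, run it through a
    // (hypothetical) PHP-in-JS interpreter, and inject the HTML it produces.
    fetch('/index.php.txt')                    // served as text, never executed server-side
      .then((resp) => resp.text())
      .then((phpSource) => {
        const html = interpretPHP(phpSource);  // made-up interpreter call
        document.body.innerHTML = html;
        // ...and that HTML can of course contain more <script> tags that
        // load the interpreter again. Full inception mode.
      });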

That'd be kind of neat. I've done jack and squat for any web work as of late. I am supposed to be involved on a project with a friend of mine (a competition of sorts) but he's asked for an extension to our start date. I should probably be using this time to pick up some skills and refresh my memory.

Go one better, implement a php interpreter in JS, then have your page load a JS script containing the interpreter which loads the PHP scripts and runs them client-side and finally renders the html output (which, of course, can contain lots of JS again - so you can even have your php output JS which contains the interpreter and go full inception mode).

Then market it to companies as having all the power of PHP but client side so they don't need such powerful servers to host pages.

Oh dear, I actually almost like this....
(almost)
My current position of a year hired me as a LAMP stack programmer, then revealed they effectively have 0 server access, and need all this junk developed in javascript - Including saving and recovering settings, etc, etc... So for the last year I've been developing in javascript, which I now totally loathe. But the benefit of having PHP (or whatever server side language) is that it.... runs on the server, and (for most websites of appreciable size) work with

I like your sig. ;-) I write novellas on a very regular basis. (See comment history if curious.)

However, I'll spare you - this once.

My current position of a year hired me as a LAMP stack programmer, then revealed they effectively have 0 server access, and need all this junk developed in javascript

Umm... How the hell does that even happen? A bit more specifically, the "0 server access" part is also intriguing.

I work for a small government department on one of their websites currently. Please don't expect me to be able to explain any of the business practices here; they make my head hurt. As for server access, apparently the hosting (and original development) was contracted out, and any server changes require about a month of communication and meetings. And my boss loves JavaScript.
Your sig isn't bad either; I love how casual dolphins are. :)
I should update my sig however. At least half the time I manage to keep my

I modeled traffic and, as such, I worked mostly for municipalities/governments of various sizes including federal and some international work. (Long since sold and retired.) I understand... Just "government" means that I understand that I do not understand, nor do I want to.

That said, anything worth saying is usually too long to fit on a bumper sticker or make a sound-bite for television. (Bite or byte? I have no idea, having seen it both ways and being too lazy to look.) I also hate repeating myself and, t

It uses the exact same blocklists every ad-blocking service uses, from ad blocking via a DD-WRT-level router plugin to Alternate DNS, a free DNS service that blocks ads at the DNS level: http://www.alternate-dns.com/ [alternate-dns.com]
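
The DNS-level stuff is trivial, by the way; on a DD-WRT box running dnsmasq it's just sinkhole entries along these lines (domains only as examples):

    # dnsmasq: answer lookups for ad domains with a dead address
    address=/doubleclick.net/0.0.0.0
    address=/ads.example.com/0.0.0.0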

The only thing Adblock Plus added was a checkbox that defaults to unchecked. But if you do check it, the only ads it will allow are AdSense. Yet no one checks the exclude box.

I have 7 blocklists in my adblock plugin, from Peter Lowe's list to the "Adblock Detector blocker" list.

With the Adblock Detector blocker, you no longer get sites that say "We detected you are running adblock, blah blah blah."

Also, by not allowing the detection of adblock, you can now easily block ads on sites that have embedded ads in video, like Hulu. :P

There is also a Greasemonkey script to hide and block the detection of ad-blocking plugins.
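
The script is nothing fancy, roughly this kind of thing (a sketch only; the exact globals sites sniff for vary, these are just common examples):

    // ==UserScript==
    // @name     pretend-no-adblock (sketch)
    // @match    *://*/*
    // @run-at   document-start
    // ==/UserScript==
    // Pre-define the objects naive detectors look for, so the
    // "please disable your ad blocker" overlay never fires.
    window.adsbygoogle = window.adsbygoogle || { loaded: true, push: function () {} };
    window.canRunAds = true;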

Adblock Plus does not get paid to exclude any sites. That's a false statement, as the code is open source. You can allow AdSense if you want to support some dumbass YouTuber begging for shekels, or just leave it off, as it is by default.

And never run just one blocklist: get the ad-blocking list, then get Peter Lowe's malware list, plus the social media one (like, plus, share, all social media embeds). And block statistics gathering like StatCounter and Alexa from measuring your visits.

I repeat myself but you can save a lot of time and effort just by using uMatrix. Seriously, it does all that and it's tiny as all hell. It's like an old-school software firewall except it's just for your browser. And it is awesome. Err... I've collected a bunch of people who've tried it and claimed to like it. I'm on a mission!

Actually, I'm not even remotely affiliated and I don't particularly care what you use, but I've seen you post before and you seem reasonably sane. So, I figure you *really* might want to give it a try.

I don't know about better. A bunch of shrunken heads would be kind of awesome. And, if you've never tried uMatrix, give it a spin. *nods*

I'm actually really, really satisfied with it. Sometimes I don't even bother with uBlock. I just use a live USB OS and keep things in RAM. I do that a lot, oddly. I don't really store any data locally, so it's maybe five minutes to get things up to a good browsing experience. I should probably do a few roll-my-owns with persistent storage.

That is completely and utterly wrong. It says so right on their website [adblockplus.org]. 10% of Adblock Plus's "Acceptable Ads" sponsors are on the paid whitelist. To be very clear, companies can pay Adblock Plus to be excluded from blocking as long as they still abide by the Acceptable Ads policy.

This page loaded 488KB of data. It took my browser 25.5 seconds* to download it all. What you're suggesting is all of that page data be included in a single response? So the browser would have to wait 25.5 seconds before it could even start rendering the page? Where it'd be difficult for the browser to then cache content that could be shared across multiple pages? Compare that to the current dependency structure where the DOM was loaded in 2.41 seconds & the page was considered loaded at 5.91 seconds.
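
(Figures like the DOM/load times above are exactly what the browser's Navigation Timing API reports; a quick way to pull the same kind of numbers yourself, just as a sketch, is to paste something like this into the dev-tools console:)

    // Rough sketch: DOMContentLoaded vs. full load, from the Navigation Timing API
    const [nav] = performance.getEntriesByType('navigation');
    console.log('DOM loaded:', (nav.domContentLoadedEventEnd / 1000).toFixed(2), 's');
    console.log('Page loaded:', (nav.loadEventEnd / 1000).toFixed(2), 's');
    console.log('Main document transferred:', nav.transferSize, 'bytes');
    // Per-resource sizes are in performance.getEntriesByType('resource').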

They would all load a lot freaking faster if they would stop designing them with multiple, stupid, scrolling, 20 megapixel background images and dozens of megabytes of irritating javascript "special effects". Just saying.

A lot of different ideas have been tried. From Microsoft Chrome https://en.wikipedia.org/wiki/... [wikipedia.org], which allowed the user's own computer to produce rich content.
Add more gzip https://en.wikipedia.org/wiki/... [wikipedia.org]?
Give the user the site text or images to get them looking, then load in the ads? Load the ads first, then present the full page?
The problem is all the trackers, ads, super cookies need to be connected. Giving the users a bit of quick up front content to then allow ads to load is fun. Keep the use

What can a site do? Run a script to detect an ad blocker? Suggest a monthly payment and block the page from that user or request the ad block is removed?
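
(The detection trick itself is dead simple, for what it's worth; roughly a bait element and a height check. Sketch below:)

    // Sketch of the usual ad-blocker detection trick: insert a bait element
    // with class names that filter lists hide, then see if it got collapsed.
    function detectAdBlock(callback) {
      const bait = document.createElement('div');
      bait.className = 'adsbox ad-banner textads banner-ads';
      bait.style.height = '10px';
      document.body.appendChild(bait);
      setTimeout(() => {                 // give cosmetic filters a moment to apply
        const blocked = bait.offsetHeight === 0;
        bait.remove();
        callback(blocked);
      }, 100);
    }

    detectAdBlock((blocked) => {
      if (blocked) console.log('Ad blocker detected -- nag or paywall here.');
    });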

Wired http://www.wired.com/ [wired.com] has started doing that and I've started not visiting their site, even though I whitelisted them so I could do it for free. Screw them...

On the other hand, Stack Overflow https://stackoverflow.com/ [stackoverflow.com] has stated publicly that they are fine with ad blockers. Their reasoning is that if you're running one, you don't want ads, and wouldn't click on any if you saw them.

Came here to say that. "Designers" and most people who create websites are all concerned about style and not about how it functions. Toss in the fact that they are all on well-equipped machines and good networks, which makes everything load quickly, so they never see the problem and it never gets fixed. It would be great if they even just optimized the images for size!

About ten years ago I was on the maintenance group for a bunch of government websites and there was one site on ColdFusion. It was slow as

The Joomla! CMS took six seconds or longer to load itself before displaying a dynamically-generated page on one of my websites. After I converted the website to static pages, each page loaded in less than five seconds. More tweaking is needed to reduce the load times. The average Internet user has an attention span of a goldfish (i.e., six seconds or less).
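
For anyone wondering, one way to do that kind of conversion is just to mirror the dynamic site into flat files with wget and serve those, roughly like this (domain is a placeholder, and this is only a sketch of the approach):

    # Mirror a dynamic site into static HTML that can be served as-is
    wget --mirror --convert-links --adjust-extension \
         --page-requisites --no-parent https://www.example.com/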

One place I was at was looking at implementing a new CMS and at one meeting we were discussing the options for the architecture of the system. I was from the maintenance group and was there to provide feedback because we would be looking after it long term. I was in favour of having the system generate static HTML pages whenever a change happens (something the software supported and that we saw another department implement) because it would reduce the hardware required for serving the site. Well, we had th

I'm a computer professional and I've been working full time on the Internet since before it went commercial. I spend eight or more hours a day with a web browser open, and I make enough money to own a home and an acre of productive land, four vehicles, have two children in college, and donate a significant portion of my income to social causes I support. I hope to have a comfortable retirement on my savings, assumin

Vulcanising and HTTP2 Push are the way to go. Although I do wonder if this method then still has a chance of improving a site's performance. Personally, I'd say well and automatically curated HTTP2 Push and automated minifying and compression are probably the best method overall. I do doubt that this method could improve much more if that were in place.
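
For reference, server push on nginx (1.13.9 or newer, if I remember right) is just a directive per resource; something like this sketch:

    # Push the critical CSS/JS along with the HTML response over HTTP/2
    location = /index.html {
        http2_push /css/site.min.css;
        http2_push /js/app.min.js;
    }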

But I could be wrong.

Does anyone have experience with http-push and perhaps some insights to offer? Please comment below. Thanks.

Google has been working on a new compression scheme where the dictionary is fixed and stored in the browser. It makes sense since a lot of HTML and Javascript is highly repetitive and would likely be selected for inclusion in the dictionary by gzip anyway. You can even optimize Javascript to be more compressible under this scheme.
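
(That sounds like Brotli, which ships with a fixed built-in dictionary; if so, comparing it against gzip from a shell looks roughly like this:)

    # Compare gzip vs. Brotli output sizes on the same file
    gzip -9 -k page.html                     # writes page.html.gz, keeps the original
    brotli -q 11 -o page.html.br page.html
    ls -l page.html page.html.gz page.html.br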

Not having 14 scripts be needed to post a comment, not having 8 other scripts clogging the pipes for one advertisement, 6 scripts for tracking you, and multiple other scripts for whatever reason.

Nor having a giant, moving graphic as the base part of your page which can't be turned off, menus which bounce up or down when you hover your mouse over them, or needing to have the latest and greatest browser so you don't miss out on the latest and greatest "features" of a site.

But no, finding an algorithm to speed web page loading is what we should concentrate on.

I went and actually read TFA. It seems all they've done is create a bastardized version of a less efficient SPDY/HTTP2 protocol fetching system. Essentially, they're trying to solve a problem that is already solved, but the existing solution is already faster, more efficient, and more well thought out in general.

When they get their degrees from MIT, they're already well-prepared to go to work on systemd.

I guess the browser-side performance isn't so much what they're talking about (rather, reducing network round-trips), but still, I have always wondered why we're still sending xml and js, plain or gzipped, rather than sending compact binary formats.

EXI is a W3C standard; it's more compact than gzipped xml and it's more than a dozen times faster to parse.

Rather than coding in JS all the time, lots of people are using javascript as an intermediate representation or bytecode. This is tremendously inefficient.

For binary XML, as well as for the various fledgling binary json or binary yaml formats, the binary representation can be quickly converted to a plaintext one that has basically only minor formatting differences from the original. (I was about to say "to a human-readable one that..." but that's a stretch for a lot of XML.)

An AST / IR / bytecode is decompilable; e.g. from what I understand LLVM can do a good job of translating its IR back to C. Obviously a lot more information that could help with human comp

``The larger and more resources a web page contains, the better the algorithm's efficiency gets -- which should be useful on today's JavaScript-heavy sites.''

Browsers don't have enough trouble properly dealing with all the JavaScript that web sites shove down our Internet connections now. How nice that you've found a way for web sites to lard up their pages with even more of the stuff.

Not (apparently) having learned anything from the switch to digital TV broadcasting (where the higher bandwidth was not used for better quality, but was co-opted to shovel more channels of low-quality shit), this "34% faster" algorithm will simply result in web coders programming at least 34% more crap ads and scripts into web pages.

Hey, Spartacus, put your name to this post, it's the most beautiful thing I've ever seen! Fuck scripting, just write the motherfucking content! I've only been saying that for the past twenty fucking YEARS!

I may be missing the "satire" here but Line-Width, seriously? I shouldn't have to scroll vertically to read your text because you've made a stylistic decision to limit the viewable area to 60-80 characters.

Surely that's the responsibility of a window manager to adjust horizontal width?

We are now only discovering the terrible price of web standardisation and browser stability.

Web design was a lot simpler when the lowest common speed was a 56K dial-up. Now that everyone is connected to the Internet with a fire hose on the last mile, most web designers don't even stop to optimize their pages.

I still write for dialup because I want page loads to take 0.01s, not the, what, 15 SECONDS that Facebook takes on 200Meg wired fucking BROADBAND. It seriously takes the piss, and I'm still trying to figure out what takes fucking Wikipedia so long to load when I run a WM instance on a dual-core netbook and, with 380GB of content, it's still INSTANT.

Sure they do. Designers now make their pages behave like "apps" with AJAX everywhere, loading content on demand to reduce overall traffic!

Of course, the actual effect is that the "Back" button and all other standard navigation breaks, URLs don't update, scroll bars get fucked up, direct links are impossible, and browser memory usage balloons, making the site feel more like a Flash-based page from the 1990's.
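
The maddening part is that none of that breakage is necessary; the History API has covered it for years. Roughly (just a sketch, loadArticleIntoPage() is made up):

    // Push a history entry when content is swapped in, and restore it
    // when the user hits Back/Forward, so URLs and navigation keep working.
    function showArticle(id) {
      loadArticleIntoPage(id);                        // hypothetical content loader
      history.pushState({ id: id }, '', '/article/' + id);
    }

    window.addEventListener('popstate', (event) => {
      if (event.state && event.state.id) {
        loadArticleIntoPage(event.state.id);
      }
    });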