In a situation experts say is comparable to what could have happened with the Year 2000 (Y2K) bug, the Internet's routing table exceeded a memory limit on old routers this week, causing slow browsing speeds and halting access to certain websites.

The memory max-out caused problems for eBay users in Britain who were prevented from logging on to the auction site. Elsewhere, browsing speeds were slow and websites had sporadic outages.

Experts say we should have been prepared — just as we tried to be for Y2K, a much-hyped disaster that never happened.

Not so this time.

“If this sounds like Y2K, don’t be surprised because it’s very much like it,” said Carmi Levy, an independent technology analyst based in London, Ont.

“We knew this was coming, we didn’t prepare, we got caught … Like so many things in today’s online era, we simply didn’t take it seriously until it was too late.”

Mr. Levy said the slowdown and outages are largely the fault of Internet service providers and companies that have made large investments in data centres and who are responsible for keeping the Internet running.

The issue is that the Internet's routing table outgrew the memory capacity of old routers.

“For the longest time, the standard size of a routing table was what was known as 512,000 [routes] … which was pretty good up until recently because the Internet was smaller than 512,000 routes,” Mr. Levy explained.

“The problem is, this week, thanks to the very rapid growth of the Internet, the actual map of the overall Internet exceeded 512,000.”

A memory maximum of 512,000 routes was once an ample size, considering there were only 256,000 routes in 2008. But routing hardware designed in the 1990s and 2000s has now been outgrown.

Integral to the problem — and the solution — is the Border Gateway Protocol (BGP).

This is, in effect, a map of the Internet stored in a router's memory: a routing table that matches the address typed into a browser to an Internet Protocol (IP) route, managing online traffic. It directs website requests from router to router, allowing users to reach a site.

“The problem when the Internet becomes bigger than the size of the memory is there’s a chance … that I’ll be going to a website that is so new that it is not on that routing table. That resource has not yet been mapped. Then [the BGP] throws up its arms and goes, ‘What the hell do I do now?’” he said. “There are some pieces of equipment that can no longer see the entire Internet and that’s causing problems.”
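The failure mode Mr. Levy describes can be sketched in a few lines of code. The following is an illustrative toy, not real router firmware: it models a routing table with a fixed capacity (mirroring the 512,000-entry limit) and shows what happens when a new route will not fit, so that traffic to the unmapped destination simply cannot be delivered. All names here (`ToyRoutingTable`, the example prefixes and next hops) are invented for illustration.

```python
import ipaddress

class ToyRoutingTable:
    """A toy fixed-capacity routing table, for illustration only."""

    def __init__(self, capacity=512_000):
        self.capacity = capacity   # hardware memory limit on route entries
        self.routes = {}           # destination prefix -> next hop

    def add_route(self, prefix, next_hop):
        if len(self.routes) >= self.capacity:
            # A router that is out of slots cannot learn the new route,
            # so freshly announced networks stay invisible to it.
            return False
        self.routes[ipaddress.ip_network(prefix)] = next_hop
        return True

    def lookup(self, address):
        # Longest-prefix match: the most specific matching route wins.
        addr = ipaddress.ip_address(address)
        matches = [p for p in self.routes if addr in p]
        if not matches:
            return None            # destination "not on that routing table"
        return self.routes[max(matches, key=lambda p: p.prefixlen)]

# A tiny table with room for only two routes:
table = ToyRoutingTable(capacity=2)
table.add_route("203.0.113.0/24", "hop-A")
table.add_route("198.51.100.0/24", "hop-B")
table.add_route("192.0.2.0/24", "hop-C")   # rejected: table is full
print(table.lookup("203.0.113.7"))   # hop-A
print(table.lookup("192.0.2.1"))     # None: unreachable from this router
```

A real router holds its table in specialized fast memory, and the 2014 incident involved exactly this kind of hard cap, but the core behaviour is the same: once the table is full, some part of the Internet is simply missing from the map.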

Companies like network equipment maker Cisco Systems have been advising customers to upgrade for months.

“Since the early 1990s, we’ve watched as the number of entries on the Internet routing table has steadily grown,” the company said in an online statement in May.

It added the possibility of a “512K resource exhaustion” has been known for some time and advised users to reconfigure accordingly.

It’s difficult to pinpoint which websites will be negatively affected, considering the process is somewhat arbitrary, said Martin Karsten, associate professor of computer science at the University of Waterloo.

“It’s viewed as a random process,” he said. “They don’t deliberately throw out any particular one.”

However, if the backlog affects a popular site, such as eBay, the extent of the problem appears larger.

To fix the issue, companies need to reconfigure their old routers, a measure that should hold off the problem for another year or two, Mr. Karsten said.

After that, upgrading hardware and buying new routers will be necessary.

While some reports call the problem evidence the Internet is “full” because it is running out of IP addresses, Mr. Levy said that’s an exaggeration.

“The Internet has growth capability,” he said.

“The problem that we’re seeing now is that some companies are still using ancient hardware … in a modern era.”