You can all relax now. The near-unprecedented outage that seemingly affected all of Google's services for a brief time on Friday is over.

The event began at approximately 4:37pm Pacific Time and lasted between one and five minutes, according to the Google Apps Dashboard. All of the Google Apps services reported being back online by 4:48pm.

The incident apparently blacked out every service Mountain View has to offer simultaneously, from Google Search to Gmail, YouTube, Google Drive, and beyond.

Big deal, right? Everyone has technical difficulties every once in a while. It goes with the territory.

But then, not everyone is Google. According to web analytics firm GoSquared, worldwide internet traffic dipped by a stunning 40 per cent during the brief minutes that the Chocolate Factory's services were offline. Here's the graph of what that looked like:

SNAP! Almost exactly the same thought occurred to me - and probably a million other people on the planet - upon reading this news. I recall sometime in (I think it was) 2008/9 reading of a spate of mysterious outages in international undersea cable telecoms links to some countries in the Middle East and elsewhere. The comment was vouchsafed that a deep-sea fishing trawler's net must have accidentally snagged the cables, or something... At the time, I thought to myself, "Yeah, right. I wonder if they snagged the satellite transmissions network too?"

The problem is that so many webpages hit Google themselves (for analytics and what not), so even if you weren't trying to get to Google or Google's apps directly, there's still a very good chance that the sites you were trying to access didn't work very well.

I recommend reading through my Bio before responding to any of my posts. It could save both of us a lot of time and frustration.

Google has plenty of good products: Gmail, search... the usual. But their ad network and analytics are not in that category. The analytics product, used by many webmasters and therefore potentially affecting many sites (even allowing for the 'place the code in the footer' trick), is one of the most overrated online tools.

Yes, you can host it yourself, but by tapping into the codebase that Google provides you save yourself quite a bit of maintenance. That also points to what it provides for the end user: with a shared library, if there is a security update, the sites that load the code directly from the shared source are exposed for less time.

But for things like shared libraries stored on high-bandwidth CDNs and whatever, any benefit for the user is marginal. If a site can't load the 43 kb of jquery in addition to the images and everything else from the site, a CDN for jquery isn't going to help at all. For security, well, that's a two-edged sword. Users have less security and less privacy to start with by virtue of sites using shared libraries, so how does that balance out? My bet is that the small worry of an exploit in jquery (or whatever) is by far outweighed by the guaranteed loss of security and privacy from shared libraries.

I guess this goes back to all those fights we saw a decade or so ago about cookies and privacy. The pro-cookie side cited technical functionality while the anti-cookie side looked at the bigger picture. The big picture seems to be more important today, or perhaps it might be better to say "more obvious today".

Blocking at the DNS level may behave differently from DNS directing a request to a server that doesn't respond in a timely manner. In the first case things fail fast; in the second case things might take a looong time to fail, making the site load slowly or not at all. That was my experience during the Google blackout. I was thinking of loading my hosts file into an editor and adding the various DNS names that seemed to be causing my browser to stall, pointing them at localhost or something, but I figured it would be more effort than just waiting out the problem.
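The fail-fast difference the commenter describes is easy to demonstrate. Below is a minimal Python sketch (the hostnames are illustrative, not taken from the article) that distinguishes a name that doesn't resolve, which errors immediately, from a server that simply never answers, and that generates the kind of hosts-file lines the commenter considered adding:

```python
import socket

def classify_failure(host, port=80, timeout=1.0):
    """Probe how a request to host:port fails.

    A blocked or unknown DNS name raises socket.gaierror immediately
    (fail fast); an unresponsive server only errors after `timeout`
    seconds, which is what makes pages appear to stall.
    """
    try:
        socket.getaddrinfo(host, port)
    except socket.gaierror:
        return "dns-failure"   # fails fast: nothing to connect to
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "connected"
    except socket.timeout:
        return "timed-out"     # the slow, page-stalling case
    except OSError:
        return "refused"       # reachable host, closed port: also fast

def hosts_entries(names, target="127.0.0.1"):
    """Render hosts-file lines that send lookups for `names` to localhost."""
    return "\n".join(f"{target} {name}" for name in names)

# Illustrative hostnames; on a real system these lines would be
# appended to /etc/hosts (with administrator rights).
print(hosts_entries(["www.google-analytics.com", "ajax.googleapis.com"]))
```

Pointing a name at 127.0.0.1 in the hosts file converts the slow "timed-out" case into the fast "refused" case, which is why the trick makes pages that embed third-party scripts load promptly even when the third party is down.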

Whether there's value or not in the Google service is a separate issue (and the value is often to the website, not the reader).

The station is able to tap into and extract data from the underwater fibre-optic cables passing through the region.

The information is then processed for intelligence and passed to GCHQ in Cheltenham and shared with the National Security Agency (NSA) in the United States. The Government claims the station is a key element in the West’s “war on terror” and provides a vital “early warning” system for potential attacks around the world.

The Independent is not revealing the precise location of the station but information on its activities was contained in the leaked documents obtained from the NSA by Edward Snowden. The Guardian newspaper’s reporting on these documents in recent months has sparked a dispute with the Government, with GCHQ security experts overseeing the destruction of hard drives containing the data.

The Middle East installation is regarded as particularly valuable by the British and Americans because it can access submarine cables passing through the region. All of the messages and data passed back and forth on the cables are copied into giant computer storage “buffers” and then sifted for data of special interest.

Information about the project was contained in 50,000 GCHQ documents that Mr Snowden downloaded during 2012. Many of them came from an internal Wikipedia-style information site called GC-Wiki. Unlike the public Wikipedia, GCHQ’s wiki was generally classified Top Secret or above.

The disclosure comes as the Metropolitan Police announced it was launching a terrorism investigation into material found on the computer of David Miranda, the Brazilian partner of The Guardian journalist Glenn Greenwald – who is at the centre of the Snowden controversy.

Scotland Yard said material examined so far from the computer of Mr Miranda was “highly sensitive”, the disclosure of which “could put lives at risk”.

The Independent understands that The Guardian agreed to the Government’s request not to publish any material contained in the Snowden documents that could damage national security.

As well as destroying a computer containing one copy of the Snowden files, the paper’s editor, Alan Rusbridger, agreed to restrict the newspaper’s reporting of the documents.

The Government also demanded that the paper not publish details of how UK telecoms firms, including BT and Vodafone, were secretly collaborating with GCHQ to intercept the vast majority of all internet traffic entering the country. The paper had details of the highly controversial and secret programme for over a month. But it only published information on the scheme – which involved paying the companies to tap into fibre-optic cables entering Britain – after the allegations appeared in the German newspaper Süddeutsche Zeitung. A Guardian spokeswoman refused to comment on any deal with the Government.

A senior Whitehall source said: “We agreed with The Guardian that our discussions with them would remain confidential”.

But there are fears in Government that Mr Greenwald – who still has access to the files – could attempt to release damaging information.

He said after the arrest of Mr Miranda: “I will be far more aggressive in my reporting from now. I am going to publish many more documents. I have many more documents on England’s spy system. I think they will be sorry for what they did.”

One of the areas of concern in Whitehall is that details of the Middle East spying base which could identify its location could enter the public domain.

The data-gathering operation is part of a £1bn internet project still being assembled by GCHQ. It is part of the surveillance and monitoring system, code-named “Tempora”, whose wider aim is the global interception of digital communications, such as emails and text messages.

Across three sites, communications – including telephone calls – are tracked both by satellite dishes and by tapping into underwater fibre-optic cables.

Access to Middle East traffic has become critical to both US and UK intelligence agencies post-9/11. The Maryland headquarters of the NSA and the Defence Department in Washington have pushed for greater co-operation and technology sharing between US and UK intelligence agencies.

The Middle East station was set up under a warrant signed by the then Foreign Secretary David Miliband, authorising GCHQ to monitor and store for analysis data passing through the network of fibre-optic cables that link up the internet around the world.

The certificate authorised GCHQ to collect information about the “political intentions of foreign powers”, terrorism, proliferation, mercenaries and private military companies, and serious financial fraud.

However, the certificates are reissued every six months and can be changed by ministers at will. GCHQ officials are then free to target anyone who is overseas or communicating from overseas without further checks or controls if they think they fall within the terms of a current certificate.

The precise budget for this expensive covert technology is regarded as sensitive by the Ministry of Defence and the Foreign Office.

However, the scale of the Middle East operation, and GCHQ’s increasing use of sub-sea technology to intercept communications along high-capacity cables, suggest a substantial investment.

Intelligence sources have denied the aim is a blanket gathering of all communications, insisting the operation is targeted at security, terror and organised crime.