GCHQ documents detailing the system show the goal was "either (a) a Web browsing profile for every visible user on the Internet, or (b) a user profile for every visible website on the Internet."

To get the data, the agency spliced taps into the fiber-optic cables that form the Internet's international backbones and stored the intercepted traffic in a database known as the "Black Hole." Between 2007 and 2009, Black Hole vacuumed up raw data for later analysis at rates of up to 10 billion entries a day. By 2012, it was pulling in 50 billion entries a day, and GCHQ wanted to double that rate.