NSA Dragnet Debacle: What It Means To IT

PRISM shows companies can't assume their data is safe in the hands of commercial providers.

New York's 32-Story Data 'Fortress'


Director of National Intelligence James Clapper confirmed Thursday that the U.S. government has been secretly collecting information since 2007, exploiting backdoor access to the systems and data of major Internet and tech companies in search of national security threats. That NSA dragnet, revealed by The Washington Post and The Guardian and code-named PRISM, reportedly taps into user data from Facebook, Google, Apple and other U.S.-based companies. (Those providers have mostly denied that the NSA has such backdoor access.)

If news of the NSA dragnet is true -- and at this point it's hard to believe it's not -- combing through all of the providers' data and records without specific due process is hard to justify. One contributor to Forbes.com, a fellow at the Adam Smith Institute in London, thinks it's a capital idea: "This is in fact what governments are supposed to do, so I'm at something of a loss in understanding why people seem to be getting so outraged about it."

I strongly disagree. While Clapper's release states that surveillance is "subject to oversight by the Foreign Intelligence Surveillance Court, the Executive Branch and Congress" and must be "specifically approved by the court to ensure that only non-U.S. persons outside the U.S. are targeted," the release also acknowledges that information about U.S. persons could be acquired in this dragnet. The release states that such acquisition, retention and dissemination of "incidental" findings about citizens will be minimized, but surely there are other, more nuanced ways to catch bad guys.

In any case, we need to be extraordinarily careful of using surveillance technology in a way that ever starts to put ordinary, law-abiding citizens under the microscope, even "incidentally" or "minimally." There should always be probable cause and a precise investigation, not broad, sweeping data collection. There is always a tension and balance between liberty and security. This type of broad data collection is unbalanced and has a huge potential for abuse; it feels like a police state.

The NSA operation isn't only bad for personal freedom; it's also bad for business. What foreign company will want to do business in the U.S. if it's our government's acknowledged practice to perform warrantless collection of the data stored in the cloud by major U.S. companies in order to combat non-specific threats? If I worked for a foreign company, I'd also suspect state-sponsored corporate espionage as part of the U.S. government effort.

And if you work for a multinational corporation, you're going to have to think seriously about how a provider might be disclosing your data to the U.S. government. While the disclosure thus far seems limited to consumer companies (AOL, Google, Yahoo, Skype, Facebook, Apple), that's only what we know now. It's not much of a leap to assume that the feds are also monitoring enterprise cloud providers. And the NSA trumps contractual obligations every time.

The NSA operation also calls the cloud computing movement into question -- where there's scale and centralization, monitoring is far easier. It's much harder to monitor many small providers and thousands of businesses running on-premises computing.

Another key takeaway for enterprise IT leadership: You'd better make sure that your data is encrypted when it leaves your premises. The paranoid among us might note that the Patriot Act, which gave U.S. law enforcement far-reaching powers, was signed into law in October 2001, and the Advanced Encryption Standard was announced in November 2001 -- an eerie timing coincidence. However, AES, based on the work of Belgian researchers, has been publicly inspected globally and is considered technically sound.

But will the software itself be flawed? Would the U.S. government go so far as to coerce independent software vendors to install backdoors? In a country where officials can search your laptop at the border based on a "hunch," and where law enforcement can sample your DNA whenever you're arrested, and where the Patriot Act and Digital Millennium Copyright Act are allowed to stand, why would you be surprised by this dragnet or any further revelations?

My final business technology takeaway: The lack of clear boundaries on government surveillance should be a major motivation to use open source software for security and encryption. While the very largest multinational corporations have the buying power to compel proprietary software vendors to let a third party inspect their source code for flaws and backdoors, smaller enterprises don't have such clout or finances. Proprietary software often has richer feature sets, but until the U.S. government regains the trust of citizens and businesses alike, it's better to ensure that the encryption software you use hasn't been tampered with.

The other shoe already dropped, but everyone seems to be politely ignoring it.

When Bush started all of this, a technician revealed that massive-bandwidth fiber trunks were being tapped in special rooms at all the phone and Internet providers.

Then last year, a Wired article quoted a systems-designer/whistleblower as saying that every phone call, email, ATM transaction, HTTP request, electronic toll-booth use, and library book loan made by anyone in the country is being recorded and stored in the new NSA data center in Utah at the rate of petabytes per day.

THIS is the data made available through PRISM, which is just a GUI into the massive database. I guess some of it is cached on disk.

That triggered a Senate investigation at which Clapper famously lied about it going on. For some reason, everyone insists on talking about mere telephone billing records--something their realtime data collection can't get.

It was also revealed then that there was a back door in every cell phone firmware allowing the NSA to turn on the GPS and microphone remotely.

I myself had a job interview with a friend of a friend contractor in an Arlington bar about a job tuning up the heuristics of the ontological model for the software that reads every single email anyone sends. I would have worked at UM, where the software was developed and where I did graduate work in knowledge representation and natural language understanding. I said "what about encryption", and he said they can brute-force anything, and if they can't, then they know it's important and they'll let the big machine crunch on it until they do.

I'm sure he wasn't supposed to tell me that, but he was drunk, bragging, and wanted to get in my pants. I didn't get the job because my DOJ security clearance had expired and the FBI was backlogged with clearance checks after 9/11. They needed someone NOW, with an active clearance.

I told the Washington Post, but they couldn't officially believe it unless I worked on the project or had documents.

We also now know details of the massively-parallel "big machine", also in Utah. I calculated that with 100 of the Nvidia CUDA arrays available now at Amazon, they could generate every 12-character password using every keyboard character in 20 minutes.
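For anyone who wants to check that arithmetic, here's a quick back-of-envelope sketch (my assumptions: roughly 95 printable keyboard characters; the GPU throughput is the real unknown):

```python
# Keyspace for a 12-character password over the ~95 printable ASCII
# keyboard characters, and the aggregate guess rate that finishing
# in 20 minutes would imply.
ALPHABET_SIZE = 95       # printable ASCII keyboard characters
PASSWORD_LENGTH = 12

keyspace = ALPHABET_SIZE ** PASSWORD_LENGTH
print(f"keyspace: {keyspace:.2e} candidates")            # ~5.40e+23

implied_rate = keyspace / (20 * 60)  # guesses/sec to finish in 20 min
print(f"implied rate: {implied_rate:.2e} guesses/sec")   # ~4.50e+20
```

Whether any real cluster reaches that rate depends entirely on the hash or cipher being attacked and the hardware you assume.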

It also came out then that the NSA position is that they're not "intercepting" your phone calls until a human actually plays back the recordings.

Yes, that's very reassuring. I'm uploading (handing over) the source code for my new software even as we speak!

> all of the providers are denying it.

Not only do they have no incentive whatsoever to affirm it, and not only would the users throw rocks through their windows if they did, but if they 'fess up, they'll be shown the Patriot Act and hauled away to Security Prison for revealing state secrets.

A good analysis. It's dispiriting that the government's surveillance has been confirmed in this manner-- but is it really surprising? As Andrew suggests, it's a bit unnerving that so much data is being collected, but the point about tax dollars puts the dragnet into perspective.

I don't think that makes it okay--but it makes me less concerned about being personally targeted than about (again, to follow Andrew) the dynamic between citizens and government. Things were already pretty bad; the far right has decided that "facts" matter less than "principles" and "faith," and the left hasn't held the Obama administration accountable for its failings (e.g., why has President Obama forgotten how much Candidate Obama talked about transparency? Would Candidate Obama have taken such a unilaterally harsh stance on whistleblowers?). These conditions, among others, had already polarized rhetoric and neutered Congressional efficacy.

Now, you have to wonder if there's any reversing the widespread disillusionment this will cause. The President points out that Congress has been briefed on this program-- but that's not nearly good enough, and he knows it. Perhaps the realities of a digital, post-9/11 world demand that certain assumptions and entitlements be discussed-- but that discussion never really happened. Never in my life have I seen such a huge gulf between Americans' collective perception of a law and what the law actually does. And that's not okay.

Effective democracy only works when there can be informed debate. I appreciate that national security must be maintained, and that means the government has to keep certain secrets. But I have to believe we could have, as a society, had some conversation about giving up privacy that would also have allowed the government to keep its methods and strategies under wraps. Instead of doing that, we rushed the Patriot Act into law. As a result, whenever the government doesn't feel like having a debate, it can point to "national security" and refuse to admit anything, let alone divulge additional details.

I don't think it's surprising that the government runs a program like this. To be honest, I'm not even sure how I feel about the way the data is being used, now that they have it. I just think it's discouraging that the government didn't have to have a conversation - let alone break any clear laws - to get this far.

"What foreign company will want to do business in the U.S. if it's our government's acknowledged practice that it performs warrantless collection of the data stored in the cloud by major U.S. companies in order to combat non-specific threats?"

Oh, that's quite an easy question to answer... United Kingdom, Canada, Australia, New Zealand and The Netherlands - which happens to be all of the countries (aside from the US) that participate in the Echelon program of sharing SIGINT back and forth.

This process has been going on for years and I highly doubt that public outcry will stop it. The big thing that's happening at this point in time is that the American public is losing faith and trust in its government doing the right things with the data that it is collecting (mostly due to the incompetence and political motivation of the existing administration). If you trust your government to do the right thing, does it inherently matter if they're analyzing data and metadata regarding your activities?

Point blank - here's how it all works... if something is in a digital format, unless it's stored on a system that's powered down, disconnected from all external cabling and is stored in a locked room, you should treat it as public knowledge, period. Once you overcome the obstacle (more of a mental block) that there are no secrets in the digital age, things aren't quite so bad.

If someone at the NSA feels the need to read through my e-mail and finds it interesting how I plan to restore my Imperial convertible, have at it! I'm quite sure there are better uses of my tax dollars... like running the systems and analyzing the petabytes of data available to find the small voice that may be rallying troops to cause another incident in the United States in the wilderness of funny LolzCat pictures and gossip about the latest Hollywood starlet who can't hold her liquor.

When it comes to cloud security - see above, once it's digital, it should be considered public knowledge. There are way too many large targets out there and way too many determined hackers who do these sorts of things (for fun AND profit). Remember the 80s when there was a technology embargo against the Soviet Union? Ever seen what those determined Russian programmers could make even an old IBM XT do, because it's all they had to work with? Now with technology being commoditized and ubiquitous, just about anyone on the face of the planet can order up a used PIII system with enough oomph to run Linux and get to work on coding the next great 0day assault. It comes down to where an organization wants to put their risk, what they feel comfortable with, and how they plan to mitigate the issues inherent with the level of risk that they're comfortable with.

I agree. The basic question is "Can the government be trusted to protect our privacy?" Recent activity gives me no reason to suspect that the answer might be "yes". Therefore, it behooves me to take my own actions to ensure that my information can't get into the wrong hands (i.e., encrypt the *%$#!! out of EVERYTHING).

Nevertheless, cases like this one clearly show the risks. So your proprietary data is somewhere in the cloud, and the next thing you know, a hack at some three-letter agency spills your stuff out into the public. The cloud has its benefits, but having full control over your data is not one of them. Anything can happen, from having your data stolen, to losing it all when the cloud vendor ceases to exist, to having no access because the cloud or your Internet connection is down somewhere. That is fine for non-critical systems that have a backup on site; for everything else it is like playing Russian roulette - quite a disservice by IT to any company. There is a reason why the cloud is called the cloud: it is nothing more than vapor, here today, gone tomorrow.

While I agree with the statements about government overreach, I think it is far too early to be making broad conclusions about the security of your data in the cloud. First, we have no idea what, if anything, actually happened here; all of the providers are denying it. Second, there is a huge difference between cloud providers providing internal access to the U.S. government, while likely staring a FISA warrant in the face, and them wantonly granting access to just anyone or leaving the doors wide open for foreign hackers. If the Feds want information in your private data center, they'll get it. Of course, you'll know about it, but, pursuant to the Patriot Act, you can't tell anyone (i.e., your employees or customers) about it.

In sum, I don't think that what we know so far justifies making damning conclusions about data security in the cloud vis-à-vis on-premises systems or private networks.
