NSA’s UN Hacking Raises Yet More Questions: Is Nothing Left Sacred?

Last weekend, German newspaper Der Spiegel revealed the rather unsurprising news that the NSA had successfully bugged both the UN and a number of European countries' embassies in the US, and had also hacked into the UN's video conferencing software, giving it access to any and all calls made using it.

Such is the extent of the NSA's pervasive spying network that this news isn't the least bit surprising anymore. As we've mentioned several times on SiliconANGLE before, such 'intelligence gathering' is the bread and butter of spy agencies all over the world – after all, few people are outraged if it's government officials who are the ones being spied upon.

The problem is not these tit-for-tat spying games. Rather, it's the individual citizens who get irate when they learn that they're being watched – for no good reason, we should add. Of course, the NSA has stressed time and time again that it does everything humanly possible to avoid unnecessarily spying on US citizens at least, but with the latest news out of Germany, one has to wonder how genuine that claim is when the agency has been caught directly tapping calls made from inside the UN – which, as we know, is based in New York.

No doubt the NSA will argue that the UN headquarters is "technically" not actually in the US, as it falls under the control of the UN, despite the organization's agreement to abide by US laws. The status of the UN headquarters is quite similar to that of foreign embassies, which, although based in the US, are treated as the sovereign territory of whichever country occupies that space. No doubt the NSA uses the same reasoning to justify spying on these too.

A second question raised by the Der Spiegel article has to do with President Obama's insistence that the NSA's activities are designed to protect the nation against terrorism. If so, why on earth does the agency feel the need to spy on European diplomats, who are supposedly America's closest allies in the War on Terror? Then again, as I mentioned earlier, spying on foreign officials is supposedly "fair game", and so few people will care, even if it does embarrass the US in front of its friends.

These questions will undoubtedly be brushed under the carpet as the next 'revelation' from Ed Snowden leaks into the media, but another talking point is the following snippet, which tells how the NSA was able to hack into the UN's video conferencing system:

“Furthermore, NSA technicians working for the Blarney program have managed to decrypt the UN’s internal video teleconferencing (VTC) system. The combination of this new access to the UN and the cracked encryption code have led to “a dramatic improvement in VTC data quality and (the) ability to decrypt the VTC traffic,” the NSA agents noted with great satisfaction: “This traffic is getting us internal UN VTCs (yay!).” Within just under three weeks, the number of decrypted communications increased from 12 to 458.”

Yay! Hooray! Just how obsessed have we become with hacking into every communications channel known to man? Is nothing sacred to these guys anymore?

It's kind of dumb that these 'agents' are celebrating their little victories in such childish fashion, but hey, why should I care? It's only European diplomats, right? They're all "fair game".

Except, of course, the snippet above doesn't say anything about the company that provides the UN with its video conferencing software. If the NSA has cracked its most secure encryption protocols – which are likely to be fairly robust, seeing as it's the UN – it certainly doesn't bode well for the rest of us who use software from non-PRISM companies like Line, WhatsApp and so on precisely to avoid being spied upon. How many of these have also been hacked?

Mike Wheatley is a senior staff writer at SiliconANGLE. He loves to write about Big Data and the Internet of Things, and explore how these technologies are evolving and helping businesses to become more agile.

Before joining SiliconANGLE, Mike was an editor at Argophilia Travel News and an occasional contributor to The Epoch Times, and has also dabbled in SEO and social media marketing. He usually bases himself in Bangkok, Thailand, though he can often be found roaming through the jungles or chilling on a beach.
