From Shannon to Snowden: the Human Target in Information Theory from the Beginning until Today

Tags:

Short thesis:

There is a surprising continuity in information theory, from Claude Shannon's and Norbert Wiener's work on fighting airplanes in WWII up to in-game advertising driven by big-data analysis with Hadoop and pattern detection in mass surveillance data.
From this point of view, even metadata analysis is only a refinement of statistical analysis: a filter that reduces the amount of data and makes it easier to handle.
The speaker is personally involved in several ways: he wrote his PhD and diploma theses on subjects related to Wiener's theory, refused to serve as a soldier at an air base equipped with the Nike missile system, and works in big-data environments.

Description:

The story starts with submarine warfare in WWI. Without computers, fighting submarines was not an easy task. The best strategy for a submarine to escape a destroyer was to attempt a random walk.

Norbert Wiener, on board a ship crossing the Atlantic, described this random motion as what is now called the Wiener process. The process is fairly unknown to the public, even though it is the foundation for explaining Brownian motion. This is surprising, as the theory is nothing less than fundamental and looks completely innocent.
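The evasive motion can be made concrete with a minimal sketch of the Wiener process as a scaled random walk: independent Gaussian increments whose variance equals the time step. The parameters below are arbitrary, chosen only for illustration.

```python
import math
import random

def wiener_path(n_steps=10_000, t_total=1.0, seed=42):
    """Approximate a Wiener process W(t) on [0, t_total] as a scaled
    random walk: each increment is an independent Gaussian with
    mean 0 and variance dt, so Var[W(t)] grows linearly with t."""
    random.seed(seed)
    dt = t_total / n_steps
    w = [0.0]  # W(0) = 0 by definition
    for _ in range(n_steps):
        w.append(w[-1] + random.gauss(0.0, math.sqrt(dt)))
    return w

path = wiener_path()
# Over many independent runs, the endpoints W(t_total) scatter
# around 0 with variance close to t_total.
```

The key property, and the reason the process describes an "unpredictable" evader, is that every increment is independent of the past: knowing the path so far tells you nothing about the next step beyond its statistics.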

Claude Shannon used this description to analyze the motion of an airplane steered by a human pilot. His work on how to communicate in a feedback loop with a rocket system attacking the airplane decisively improved the effectiveness of anti-aircraft rockets.

The result was the start of information theory. His statistical analysis at AT&T led to the development of the Nike weapon system, which persisted in updated versions until 1976. The brute-force scaling during the Cold War era was simply to put nuclear warheads into the Nike rockets. This was absolutely unacceptable from a civil point of view and caused mayors of certain cities to leave the staff exercises in the West German government bunker in the Eifel.

The nuclear Nike program was stopped with the SALT talks.

With big-data analysis, the general pattern is back. The "random walk" is now completely virtual. The goal is to lead the user to the "BUY" button on a website or in a computer game. It is well known that users about to leave a game receive extra rewards, and that by evolutionary development, called A/B testing, users are guided through a multi-page website to the "BUY" button.
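The "evolutionary development" by A/B testing can be sketched in a few lines: serve two page variants at random, count conversions, keep the winner, repeat. All variant names and conversion rates here are invented for illustration.

```python
import random

def ab_test(rate_a, rate_b, n_users=20_000, seed=1):
    """Simulate a basic A/B test: each user is randomly assigned
    variant A or B and 'converts' (clicks BUY) with that variant's
    hidden probability. Returns the observed rate per variant."""
    random.seed(seed)
    shown = {"A": 0, "B": 0}
    bought = {"A": 0, "B": 0}
    for _ in range(n_users):
        variant = random.choice(["A", "B"])
        shown[variant] += 1
        true_rate = rate_a if variant == "A" else rate_b
        if random.random() < true_rate:
            bought[variant] += 1
    return {v: bought[v] / shown[v] for v in shown}

# Hypothetical: variant B places the BUY button more prominently.
observed = ab_test(rate_a=0.03, rate_b=0.05)
winner = max(observed, key=observed.get)
```

Iterating this selection over many small page changes is exactly the evolutionary pressure that herds users toward the purchase, without anyone needing a model of the individual user.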

In a military or surveillance environment, the target is less virtual, and it can be anybody, not just the gamer or the customer of a website. Thus, the amount of data is much bigger than even big data can handle today. Therefore, metadata is the perfect filter to extract the important sample from the enormous flood of input.

All these statistical methods are purely probabilistic. This means you get error rates of some ten percent, which is not a problem for a gamer. However, in a surveillance or military environment, given the current distribution of military power, this means that the "war on terror" causes more innocent deaths than the terror itself and therefore cannot be distinguished from terror by statistical methods.
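Why a ten-percent error rate is catastrophic at surveillance scale is plain base-rate arithmetic: when the real targets are extremely rare, almost everyone flagged is innocent. The numbers below are hypothetical, chosen only to show the effect.

```python
def flagged_innocent_fraction(population, targets,
                              false_pos_rate, true_pos_rate):
    """Of everyone flagged by a probabilistic classifier, what
    fraction is actually innocent? Straight base-rate arithmetic."""
    innocents = population - targets
    false_alarms = innocents * false_pos_rate   # innocents flagged
    true_hits = targets * true_pos_rate         # targets flagged
    return false_alarms / (false_alarms + true_hits)

# Hypothetical: 100 real targets in a population of 10 million,
# a 10% false-positive rate and a 90% detection rate.
frac = flagged_innocent_fraction(10_000_000, 100, 0.10, 0.90)
# frac ≈ 0.9999: virtually every flagged person is innocent.
```

No refinement of the classifier alone fixes this; as long as the base rate of real targets stays tiny, the flagged set stays dominated by innocents, which is the statistical core of the argument above.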