I see our world on a trajectory into a dystopian future that is frightening and undesirable. Technological progress is deeply transforming our societies, and while most of it is for the better, we need to step back occasionally and look at the bigger picture.

Storage costs are on a steep downward curve, similar to CPU costs - except that the end of Kryder's Law (the storage equivalent of Moore's Law) is not in sight yet. Kryder's conservative forecast in 2009 estimated that a zettabyte of storage would cost about 2.8 billion USD by 2020. Extrapolating that prices will halve roughly every two years, a zettabyte might be as cheap as 100 million USD sometime between 2030 and 2040.
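That extrapolation is easy to sanity-check. The sketch below assumes an exact halving every two years from the 2.8-billion-USD estimate for 2020; the halving rate is a simplification, not a claim from Kryder's forecast:

```python
# Extrapolate the cost of one zettabyte of storage, assuming it halves
# exactly every two years starting from ~2.8 billion USD in 2020.
cost_usd = 2.8e9
year = 2020
while cost_usd > 100e6:
    year += 2
    cost_usd /= 2
print(year, round(cost_usd / 1e6, 1))  # 2030 87.5
```

Under that assumption the 100-million-USD mark is crossed around 2030, the optimistic end of the range; any slower halving rate pushes the date toward 2040.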

All human speech ever spoken, sampled as 16 kHz audio, is estimated to total roughly 42 zettabytes. This means that by the time I reach retirement age, storage systems that can keep a full audio record of everything humanity has said in the last 10 years will be within the reach of many larger nation-states. Perhaps I will live long enough to get CD quality, too. Impressive, but also terrifying.
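To make the nation-state claim concrete, here is a back-of-envelope sketch. Every input is my own assumption, not a figure from the 42-zettabyte estimate: roughly 7 billion people, about two hours of actual speech per person per day, and 16 kHz audio at 16 bits per sample:

```python
# Rough size and cost of a 10-year audio archive of all human speech,
# under assumed (not authoritative) inputs.
people = 7e9
speech_seconds_per_day = 2 * 3600   # assumed ~2 hours of speech per person
bytes_per_second = 16000 * 2        # 16 kHz at 16-bit samples
days = 10 * 365

total_bytes = people * speech_seconds_per_day * bytes_per_second * days
zettabytes = total_bytes / 1e21
cost_usd = zettabytes * 100e6       # at 100 million USD per zettabyte
print(round(zettabytes, 1), round(cost_usd / 1e6))  # ~5.9 ZB, ~589 million USD
```

Under these assumptions, a decade of humanity's speech is a handful of zettabytes and a few hundred million dollars of storage - an amount many national budgets could absorb without much difficulty.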

A future where every word ever spoken and every action ever taken is recorded somewhere will lead to a collapse of what we understand as freedom in society. There is good reason that both the East German Stasi and the KGB kept vast troves of "kompromat" on anyone who showed political ambitions - such data was useful for discrediting people who were politically active, or for blackmailing them into cooperation.

The trouble with kompromat is, though, that nobody needs to actually use it, or threaten its use, for it to become an effective deterrent to political activity. We can see this in western societies already: It is not uncommon for qualified and capable individuals to decide against standing in elections for fear of having their lives examined under a microscope. When everything you have ever done has been recorded, are you sure that none of it could be used to make you look bad?

What about the famous "three felonies a day" that even well-meaning and law-abiding citizens run into?

Clapper's argument that "it isn't collection until you look at it" is disingenuous and dangerous. By this logic, vast files tracking people's lives in pedantic detail are not problematic until that data is retrieved from a filing cabinet and read by a human. Transported to the East Germany of the early 1980s, his logic would imply that collecting excruciating detail about people's private lives was fine, and that things only went wrong when the Stasi actively used this data.

The discussion of whether phone metadata records should be held by the government or by private entities misses the point. Data should only be held for the period necessary to perform a task; storing data beyond this period, without allowing people to view, edit, or remove it, carries the implicit threat that this data may be used to harm you in the future. Involuntary mass retention of data is oppressive. And while checks and balances exist now, we cannot be sure how they will hold up over time. Deleting the data is the only prudent choice.

Well-intentioned people can build highly oppressive systems without realizing what they are doing. Erich Mielke, who had built the most oppressive security agency in living memory in order to protect "his" country from external and internal foes, famously said "but I love all people" in front of the East German parliament. He did not grasp the extent of the evil he had constructed and presided over.

Nobody wants a full-packet-capture society. It is fundamentally at odds with freedom. Arbitrary collection and retention of data on people is a form of oppression.

Policy recommendation: A different form of SIGINT budget

How do we balance the need to protect our countries against terrorism and foreign aggression with the need for privacy and data deletion that is necessary to have functioning democracies and non-oppressive societies?

This question has been much-discussed in recent months, culminating in a set of recommendations made by the panel of experts that the Obama administration had convened.

I agree with the principles set forth in the above-mentioned document, and with several of the recommendations. On the other hand, I feel that some of the recommendations focus too narrowly on the "how" of collection, rather than on the overall goal of preventing mass surveillance. Regulations that are overly specific tend to be both cumbersome in practice and easily side-stepped -- properties you do not want in a law.

My personal recommendation is a different form of SIGINT budget: aside from the monetary budget that Congress allots to the different intelligence agencies, a surveillance budget would be allotted, along the lines of:

For the fiscal year 2014, you are allowed to collect and retain data on [X number] citizens and [Y number] non-citizens for a year. Restrictions on the purposes of collection and retention of data still apply.

This budget could be publicly debated - and it would make sure that data collection is focused on the areas that truly matter, instead of rewarding SIGINT middle managers who try to improve their career trajectories by showing how much "more data" they can collect. The budget would be accounted for in "storage-hours" to create incentives for early deletion. People could get promoted by showing the ability to do the same work while retaining less data, or retaining it for briefer periods.

This may look similar in practice to the way cloud providers (like Amazon) charge for storage. The agencies get to store and keep data, but they get charged internally for this, daily or weekly. Retain too much data and your collection system runs out of budget - but you can free up budget by deleting old data. The overall budget is public, so the public can have a clear view of how much data is collected under all programs, instead of the undignified spectacle of "we do not collect this data under this program" non-denials.
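As a toy sketch of how such storage-hour accounting could work - all class names, method names, and numbers here are hypothetical illustrations, not a proposal for a real system:

```python
# A minimal sketch of "storage-hours" budgeting: every retained record is
# charged against a fixed annual budget for each hour it is kept, so
# deleting data early frees budget for new collection.
class SurveillanceBudget:
    def __init__(self, record_hours: int):
        self.remaining = record_hours  # budget, in record-hours
        self.records = {}              # record id -> still retained?

    def collect(self, record_id: str) -> bool:
        """Ingest a record only if budget remains."""
        if self.remaining <= 0:
            return False
        self.records[record_id] = True
        return True

    def tick_hour(self) -> None:
        """Charge one hour of retention for every record still held."""
        held = sum(1 for kept in self.records.values() if kept)
        self.remaining -= held

    def delete(self, record_id: str) -> None:
        """Deleting stops further charges - the incentive for early deletion."""
        self.records[record_id] = False


budget = SurveillanceBudget(record_hours=10)
budget.collect("record-001")
budget.collect("record-002")
budget.tick_hour()          # two retained records cost two record-hours
budget.delete("record-001")
budget.tick_hour()          # only one record is still charged
print(budget.remaining)     # 10 - 2 - 1 = 7
```

The point of the sketch is the pricing structure, not the code: because retention itself consumes budget, the rational move for an agency is to delete anything it no longer needs.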

The big trouble with sniffing internet traffic is that it is fundamentally addictive. You can see the spiral of escalation in almost every criminal hacking career, and it is easy to underestimate that the same addictive property of data collection applies to organisations. Middle managers can shine by showing growth in collection; upper management can speak of "total domain dominance" and similarly powerful-sounding words. Collection becomes an end in itself. By imposing hard limits on the number of people whose lives can be touched by surveillance, we make sure that our efforts stay focused on the real problems -- and remain liberty-preserving.

If, for whatever reason, a SIGINT agency runs out of "surveillance budget" in a given fiscal year, it can always ask Congress to grant an "emergency loan" - provided, of course, that this remains an exception.

Public budgeting and proper accounting of retained data, implemented in modern democracies, would give citizens a clean and understandable method to evaluate and discuss the extent of governmental data collection for national security, without introducing detailed micro-management rules on the "how" of collection. It would provide a clear answer to "how much data are you actually keeping?", and create strong incentives for early data deletion. It is not perfect, but it may be the cleanest way of achieving both the security and the privacy that a free society needs.

Sunday, January 12, 2014

This is the first part of a two-part blog post on the need for intelligence reform.

Why do I even feel entitled to an opinion?

I have been dealing with the technical side of computer network attacks for more than 15 years, and have written exploits for about as long as the now-famous "tailored access operations" team inside the NSA has existed. Many people consider me to be an expert on all things related to reverse engineering and exploitation. Through my work, I have had as much exposure to government-organized hacking as you can have without getting a clearance. I understand this stuff, and as a firm believer in the ability of democracies to right themselves through informed debate, I feel the need to stray from my usual technical stomping grounds and talk about politics.

Over the years, I have met and talked with a number of people who used to work in, or close to, the intelligence community. I have found the vast majority of them to be conscientious, hard-working, idealistic (after all, pay in the government sector is often significantly below the private sector, so a sense of duty plays a large role), and overall good people. In political discussions, we had more commonalities than disagreements. While I am slightly left-of-center on many political questions, I am a defense and intelligence hawk (at least by European standards) - I do believe that intelligence agencies have a legitimate role to play in both foreign policy and counter-terrorism, and I am aware enough of the realities of international law to know that countries that neglect their defense and intelligence organizations do so at their own peril.

At the same time, having grown up in a country more heavily burdened by historical abuse of state security institutions than most, and in a region of the world where - in living memory - many countries lost 5-10%+ of their entire population in wars fueled by nationalist ideals, I am instinctively worried about concentrating excessive powers in state security institutions. I am also easily alarmed by nationalist thoughts and ideology.

The Snowden revelations, but much more so the reactions to the Snowden revelations, have caused me to think about the implications of the technological changes we are in the midst of - for both society and surveillance. I conclude that our societies need a reform of the legal frameworks for signals intelligence in a digitized world - not only in the English-speaking countries, but also in all those countries that aspire to obtain the same capabilities.

Policy ideas are always the result of a combination of practical considerations and personal ideology. In order to be transparent with my personal ideology, I should explain as much of it as possible before delving into my ideas for reform. To do this, I will address a few common arguments that I have encountered that express incredulity at the public outrage, and explain why I think the outrage is (partially) justified.

"The Russians and Chinese are much worse, so where's the outrage about them?"

People are outraged at the disclosures about widespread espionage by English-speaking countries, while they are not outraged by Russian or Chinese espionage, because people expect different behavior from friends than from adversaries. Most of the world considers the English-speaking countries to be committed to principles of democracy, justice, and fairness. When dealing with them, these countries are treated as friends and allies. Nobody in central Europe, for example, is worried about a US invasion, while a faint fear of Russian invasion is never far away.

Expectations are different when it comes to Russia or China: these countries have such an abysmal record on human rights and the rule of law that nobody expects anything from them. Russia is, for all practical purposes, treated as an aging and wounded bear, unpredictable but still dangerous. China is even compared to 1910-1914 Germany in the current issue of The Economist, hardly a flattering comparison.

In short, it is entirely normal to expect different behavior from your friends than from your enemies or rivals. Having your apartment burgled by a known criminal gang is one thing; having your friend, whom you had over for dinner repeatedly, burgle your apartment is a very different thing.

"We do not violate the privacy of our own citizens, and everything we do is outside our territory, so what's the damage?"

The problem with this argument is a discrepancy between the legalistic interpretation of the constitution and the emotional interpretation of the constitution - a discrepancy between "the letter of the law" and "the spirit of the law".

A constitution is aspirational - it outlines the basic principles and values to which a society aspires. These principles are universally recognized by a country's population as "the right thing to do".

In practice, though, the US cannot reasonably grant the rights in the 4th Amendment to people living in China, and Germany could not enforce the constitutionally guaranteed equality of all humans in apartheid-era South Africa. As a result, constitutional rights end at borders. It is important to keep in mind that this is not because we think that the Chinese do not deserve protection from unreasonable search & seizure, or because we think that Freedom of Speech should not apply outside of our borders - but only because we are in no practical position to grant rights to someone living under the jurisdiction of another government. (There is the other matter that we would be violating international law, but if history is any guide, international law does not exist unless the strongest player wants to enforce it.)

Nobody extends their constitutions across their borders because it would mean intervening in other countries. But the principles in the constitution are good principles, and we should try to adhere to them wherever possible. We cannot force the Chinese government to allow Freedom of Speech in China, but that does not mean that it would be OK for us to further suppress Freedom of Speech there - just because China happens to be outside of our borders.

Secondly, there is the Universal Declaration of Human Rights. This is as close to a universal constitution as humanity has gotten, and it explicitly states in Article 12:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.

The UDHR is a good document, and one that all important powers signed after the atrocities of the two world wars. The US was a driving force in drafting it and getting it ratified - why are we completely ignoring it now, arguing that privacy protections do not apply to non-citizens outside of our territory?

"Corporations collect vastly more data, and they are not under democratic control."

There is an important bit of truth in this statement: Corporations are collecting ever-more data, and it is quite unclear whether existing legal frameworks are sufficient to protect privacy. In my personal opinion, all developed nations should pass legislation that enforces something similar to the OECD's "seven principles for the protection of personal data", and hold companies accountable for this. People need to understand what data is collected, for what purpose, and have wide-ranging ability to inspect, edit and delete the collected data.

At the same time, the argument that insufficient legal oversight in one area justifies insufficient legal oversight in another area is clearly wrong. Both areas, corporate and government data collection, need to have their oversight fixed.

"Sufficient controls are in place to prevent abuse of power"

I'd be strongly inclined to believe this argument - but there are two important points that we should keep in mind. First off, checks and safety procedures are hardly ever perfect, and tend to erode in times of crisis. One could say that most democracies are two terror strikes and one opportunist away from a dictatorship, and safeguards are much more quickly eroded than they are rebuilt. Democratic societies need to stay in constant debate about where the limits of surveillance are supposed to lie.

I believe that today the controls in the US are sufficient to prevent the most egregious abuse of power. I do not have much faith, though, that they would survive one major terrorist strike combined with a wrongly ambitious president or vice president.

Legal safeguards in a democracy buy you time. If you elect a madman, dismantling the safeguards will take him some time. Hopefully, the safeguards take longer than 8 years to dismantle. Being a security-minded person, I'd like to have some margin of error on this.

The second point to consider is that of "creeping abuse". Post-9/11, exceptional powers were granted to the security apparatus to protect our societies from further terrorist strikes. These powers were explicitly granted for counter-terrorism. The natural inclination of the security apparatus is then to slowly and carefully widen the definition of terrorism. We can see this in action: Glenn Greenwald's partner, David Miranda, was detained under legislation explicitly drafted for counter-terrorism - using powers only granted for fighting terrorists bent on mass killing, something Mr Miranda was clearly not about to do. We have also watched Mr Clapper publicly twist the meaning of the word "collection" until it implied that a stamp collector doesn't collect stamps unless he looks at them.

In short: I am uncomfortable with what I perceive as an insufficiently wide safety margin against abuse - and we have all seen anti-terror legislation abused for an entirely unrelated cause, that of the security organizations defending themselves against embarrassment. We need much stronger safeguards, and much more transparency.

"Spies spy, why are people surprised?"

I am not surprised, or even particularly worried, about state-to-state espionage. My opinion on this is that where matters are truly vital (nuclear proliferation, questions of war and peace etc.) intelligence collection should lead to better-informed leaders and hopefully peaceful outcomes.

My ethics dictate that strength should not be abused - e.g. I would consider it unethical for a strong developed nation to use espionage against a weak developing country to get a leg up in trade negotiations - but in general, nobody is surprised or outraged that the people in the White House want to know what the leaders in Tehran are thinking, and vice versa.

People are surprised because governments everywhere have been hesitant to explain to their own population what exactly intelligence agencies do. Similar to internet companies that hide the true extent of data collection in a gigantic EULA that no user understands, governments everywhere "hide" what these agencies do in plain sight: Large quantities of dispersed legalese and vague formulations.

Democratic governments need to become better at explaining what these agencies are for and what the exact authorities and limitations of these agencies are. Voters can then decide if they are cool with that. The historical tendency to hide these organisations from public view is wrong, antidemocratic, and ultimately harmful to both the democracies and the mission of these organisations.

"Everybody does it and has always done it!"

One could easily get into an argument about whether this statement is true - historically, many countries (including the US) only performed intercept and cryptanalysis during times of war. Then again, when politicians tried to disband signals intelligence (SIGINT) organisations, these organisations had a tendency to be preserved elsewhere in the bureaucracy. But even if we accept that SIGINT collection in times of peace is an unchangeable fact of life, the nature of collection has changed significantly in recent decades.

Even during the height of the cold war, when the US had all its ears focused on Russia, the odds that some random Russian person had their communication intercepted and archived by the US were near-zero.

The technological explosion we're living in changed this: International communication has grown exponentially, and it is likely that the majority of the population of most industrialized nations have participated in communications that were intercepted (if not necessarily read by a human being).

This is a radical change. Technology has amplified everybody's ability to communicate, but also created a society where virtually everybody's data has been touched by one, if not more, security organisations - both domestic and foreign. The legal framework has simply been outpaced by technological progress, and the security agencies have been extremely happy to not draw attention to this.

This new reality needs to be addressed - not only in the countries that were hit by the recent revelations, but in all modern democracies (many of which have even weaker oversight over their intelligence agencies than the famous "5 eyes").

Summary:

Technology has changed the world, vastly expanding everybody's ability to communicate - but at the same time, also vastly expanding not only the potential for surveillance, but actual surveillance.

Intelligence collection should not be done "in bulk" - a regular person should have negligible odds of ever having their communication intercepted and archived.

Intelligence reform is needed - in all modern democracies - to ensure that people can have privacy, to combat the mistaken view that "all is fair if it's not on my territory", and to strengthen the safeguards against abuse.

My next post will talk a bit more about what reforms should be enacted, and what may happen if we fail to act.

Sunday, June 02, 2013

I was honored to be invited as a keynote speaker at SOURCE Dublin 2013, which was held on the 23rd and 24th of May. Given that I had nothing technical to speak about, and given that keynote talks are not supposed to be technical, I needed to come up with an entertaining topic related to IT security.

A few weeks earlier, I had watched Dave Aitel give an interview on TV somewhere, and he said a particular sentence that I found thought-provoking: he compared the hiring of computer security folks by the DoD to "building a new Navy" that will control the trade routes of the future, the internet.

I found the thought interesting, and decided to see how far I could stretch this sentence. When thinking about the internet as the ocean, one is half a step away from the word "pirates". And boy, everybody loves hearing about pirates - the topic is rich in historical sources of dubious veracity and colorful lore. Clearly, I had found a great way to entertain the audience for 50 minutes.

So I used this as an excuse to buy and read some books on the history of piracy, and managed to construct what I immodestly think is a great piece of entertainment - a talk that manages to engage the audience, draw parallels between the early Boucaniers and Hackers (it's always good to flatter the audience a bit), comment on how the Boucaniers turned into Privateers, and generally get people to dream a bit. (Slides)

While flattering the audience is good and well, I also wanted to get the audience to question something they believe in. With everybody arguing about the evils of government-tolerated industrial espionage, I wanted to tell the audience that one country's criminal is often another country's hero - and what better way to do that than to draw a parallel between today's attackers and 16th-century Britain, an upcoming power attempting to gain the upper hand against almighty Spain.

All in all, I gave the talk (somewhat nervously), and I think I managed to engage and entertain the audience. I was quite happy with how it went, particularly because I was extremely nervous about giving a presentation with no technically verifiable truth in it.

I had expected the talk to be an entertaining diversion, with little real relevance. Now, some very surprising things happened after the talk:

First, a number of people took the talk way too seriously, attempting to derive policy recommendations from my very tenuously constructed analogy. Analogies are great for examining a problem - given something unknown, there are few more interesting activities than to construct different analogies and then reason about where they fit and where they do not fit. Thus, they are great tools for understanding and examining - while being dangerously bad for prediction and policy advice.

Secondly, a different set of people began arguing that the analogies are flawed because at some lower level of abstraction they break down ("I can't use the internet to turn sewage into shrimp, hence the internet can't be like an ocean"). I had difficulty understanding the effort and emotion people put into finding places where the analogy breaks down - given that I had meant it as entertainment, they seemed to me like the guy at a superhero movie who complains that some action scene was unrealistic.

Then something else happened that caught me completely off-guard: almost exactly one week after my presentation went online, the NYT published an op-ed contributed by JC Hirsch and Sam Adelsberg titled "An Elizabethan Cyberwar" - which was clearly strongly inspired by my keynote, down to individual details of my constructed analogies. The article takes the Britain/Spain analogy, mentions the deniability afforded to the British Crown by the privateering constructs that I had highlighted, and then proceeds to provide policy advice based on this.

I was stunned - first off, that something I had constructed for entertainment would end up inspiring an NYT op-ed a week later and secondly, that people are really trying to construct advice from it.

To clarify: I used Dave's analogy of "the internet as sea" and constructed the analogy to the Spanish Main as a form of entertainment, something to discuss over a glass of wine - not as something that should be used to draw any real-life lessons about the internet, or about cyberwarfare.

So what real-life lessons did I learn through this? A good analogy is like a good joke: it is impossible to contain, travels fast, and can have surprising unintended consequences. Also, everybody is so desperate to understand "the internet" that the path from "small conference talk in Dublin" to "heavily influencing an NYT op-ed" is short. This highlights how little we understand technology's impact, and how much even tenuously constructed analogies fill an emotional need. Finally, as with good jokes and cyber attacks, attribution for good analogies seems hard - selfishly, I would have really liked a footnote to the op-ed.

Update: It seems analogies are like 0days - often discovered by multiple parties in parallel, confusing anyone who wants to do attribution :-). It seems the authors of the NYT op-ed had developed these ideas independently prior to my talk, and just delayed the publication of the article due to current events. They were not influenced in any way by my keynote. :-)

Sunday, March 31, 2013

I am happy to announce that we have a winner for the reverse engineering challenge: among the submitters, Marion Marschalek's report stood out, both in technical depth and in the structure and readability of the report. Remarkably, this is Marion's first reverse engineering project. :-)

At the same time, I would like to say "Thank you" to everyone who submitted - I will make time in the next few weeks to send emails with more detailed feedback for each submission. It was great to see that this contest encouraged a number of first-time analysts to tackle a relatively thorny piece of malware.

Sunday, January 20, 2013

As a field, reverse engineering has undergone a rapid change in recent years: a rise in importance and visibility has led to a rapidly growing community of reverse engineers. More people are doing reverse engineering, better tools are developed, and it has mutated from a "dark art" to an almost-mainstream endeavor. However, as the community grows, the most visible parts remain unchanged. While there are female reverse engineers in the field, they are still under-represented in absolute numbers and in the visibility of their work in conference attendance and presentations. What can we, as a growing field, do to change this? Progress can be made on the macro level by many small and decentralized contributions on the micro level. So, when I heard about the Syscan speaker's honorarium this year, I decided to put it to good use.

I asked a few friends if they'd be willing to form a panel of judges for a women-only reverse engineering challenge, with the first (and only) prize being a ticket to fly to and attend Syscan Singapore 2013. Luckily for me, they agreed :-)

So, the details:

1) The panel of judges posts a particularly interesting piece of malicious software. The executable file in question is found at the end of this mail (base64-encoded .zip with password 'infected').

2) The challenge itself is to reverse engineer the software - create a report which:
- documents the features and capabilities of the malware
- discusses the obfuscation and anti-reverse-engineering measures employed
- makes all these things accessible to a layperson with a computer security or computer science background

Reports will be judged on technical merit and how well the submitter explains the technical details. Understanding is one part of the reverse engineering process; documenting and communicating the result is an equally important part.

3) Only women are eligible to submit.

4) The author of the best submission receives an economy-class round-trip flight ticket to attend Syscan 2013 in Singapore. The author's entrance fee is also covered.

The panel of judges consists of: Carrie M. Jung, Ralf-Phillip Weinmann, Tim Kornau, Thomas Dullien (me), Skylar Rampersaud, Shyama Rose.

Submissions should be sent to thomas.dullien+REChallenge2013@googlemail.com. The deadline for submission is the 24th of March 2013, 23:59 GMT+1. Winners will be announced on the 31st of March 2013. Unless the author explicitly objects, the winning report will be published. Furthermore, while this is not required, a lightning talk at Syscan about the solution would be appreciated.

Good luck! Syscan is a great conference, and Singapore is definitely worth a visit, so don't hesitate! :-)

About Me

I like simple things. And complex things. And drinking beer with people like Fyodor Yarochkin.
I like South America. And some parts of Asia, specifically Kuala Lumpur.
I like French. I like Spanish. I'd like to like more languages.