April 27, 2012

To paraphrase Ben Franklin, he who sacrifices online freedom for the sake of cybersecurity deserves neither. Last night, the Cyber Intelligence Sharing and Protection Act (CISPA, H.R. 3523) was sent to a vote in the United States House of Representatives a day earlier than scheduled. CISPA passed the House by a vote of 250-180, defying a threatened veto from the White House. Its passage now sets up a fierce debate in the Senate, where Senate Majority Leader Harry Reid (D-NV) has indicated that he wishes to bring cybersecurity legislation forward for a vote in May.

The votes on H.R. 3523 broke down along largely partisan lines, although dozens of Democrats and Republicans crossed party lines in the final tally. CISPA was introduced last November and approved by the House Intelligence Committee by a 17-1 vote before the end of 2011, which meant that the public had months to view and comment upon the bill. The bill has 112 cosponsors and received no significant opposition from major U.S. corporations, including the social networking giants and telecommunications companies who would be subject to its contents.

In fact, as an analysis of campaign donations by Maplight showed, over the past two years interest groups that support CISPA have outspent those that oppose it by 12 to 1. Supporters include defense contractors, cable and satellite TV providers, software makers, cellular companies and online computer services.

As TechDirt observed last night, the final version of CISPA, available as a PDF from docs.house.gov, expanded the scope of the information types that can be collected in the name of security. Specifically, CISPA would now allow the federal government to use information for the purpose of investigation and prosecution of cybersecurity crimes, protection of individuals, and the protection of children. In this context, a "cybersecurity crime" would be defined as any crime that involves network disruption or "hacking."

Civil libertarians, from the Electronic Frontier Foundation (EFF) to the American Civil Liberties Union, have been fiercely resisting CISPA for months. "CISPA goes too far for little reason," said Michelle Richardson, the ACLU legislative counsel, in a statement on Thursday. "Cybersecurity does not have to mean abdication of Americans' online privacy. As we've seen repeatedly, once the government gets expansive national security authorities, there's no going back. We encourage the Senate to let this horrible bill fade into obscurity."

The Center for Democracy and Technology issued a statement that it was:

"... disappointed that House leadership chose to block amendments on two core issues we had long identified — the flow of information from the private sector directly to NSA and the use of that information for national security purposes unrelated to cybersecurity. Reps. Thompson, Schakowsky, and Lofgren wrote amendments to address those issues, but the leadership did not allow votes on those amendments. Such momentous issues deserved a vote of the full House. We intend to press these issues when the Senate takes up its cybersecurity legislation."

Alexander Furnas included a warning in his nuanced exploration of the bill at The Atlantic:

"CISPA supporters — a list that surprisingly includes SOPA opponent Congressman Darrell Issa — are quick to point out that the bill does not obligate disclosure of any kind. Participation is 'totally voluntary.' They are right, of course, there is no obligation for a private company to participate in CISPA information sharing. However, this misses the point. The cost of this information sharing — in terms of privacy lost and civil liberties violated — is borne by individual customers and Internet users. For them, nothing about CISPA is voluntary and for them there is no recourse. CISPA leaves the protection of peoples' privacy in the hands of companies who don't have a strong incentive to care. Sure, transparency might lead to market pressure on these companies to act in good conscience; but CISPA ensures that no such transparency exists. Without correctly aligned incentives, where control over the data being gathered and shared (or at least knowledge of that sharing) is subject to public accountability and respectful of individual right to privacy, CISPA will inevitably lead to an eco-system that tends towards disclosure and abuse."

The context that already exists around digital technology, civil rights and national security must also be acknowledged for the purposes of public debate. As the EFF's Trevor Timm emphasized earlier this week, once national security is invoked, both civilian agencies and law enforcement wield enormous powers to track and log information about citizens' lives, without citizens' knowledge or practical ability to gain access to the records involved.

On that count, CISPA provoked significant concerns from the open government community, with the Sunlight Foundation's John Wonderlich calling the bill terrible for transparency because it proposes to limit public oversight of the work of information collection and sharing within the federal government.

"The FOIA is, in many ways, the fundamental safeguard for public oversight of government's activities," wrote Wonderlich. "CISPA dismisses it entirely, for the core activities of the newly proposed powers under the bill. If this level of disregard for public accountability exists throughout the other provisions, then CISPA is a mess. Even if it isn't, creating a whole new FOIA exemption for information that is poorly defined and doesn't even exist yet is irresponsible, and should be opposed."

What's the way forward?

The good news, for those concerned about what passage of the bill will mean for the Internet and online privacy, is that now the legislative process turns to the Senate. The open government community's triumphalism around the passage of the DATA Act and the gathering gloom and doom around CISPA both meet the same reality in this respect: checks and balances in the other chamber of Congress and a threatened veto from the White House.

Well done, founding fathers.

On the latter count, the White House has made it clear that the administration views CISPA as a huge overreach on privacy, driving a truck through existing privacy protections. The Obama administration has stated (PDF) that CISPA:

"... effectively treats domestic cybersecurity as an intelligence activity and thus, significantly departs from longstanding efforts to treat the Internet and cyberspace as civilian spheres. The Administration believes that a civilian agency — the Department of Homeland Security — must have a central role in domestic cybersecurity, including for conducting and overseeing the exchange of cybersecurity information with the private sector and with sector-specific Federal agencies."

At a news conference yesterday in Washington, the Republican leadership of the House characterized the administration's position differently. "The White House believes the government ought to control the Internet, government ought to set standards, and government ought to take care of everything that's needed for cybersecurity," said Speaker of the House John Boehner (R-Ohio), who voted for CISPA. "They're in a camp all by themselves."

Representative Mike Rogers (R-Michigan) -- the primary sponsor of the bill, along with Representative Dutch Ruppersberger (D-Maryland) -- accused opponents of "obfuscation" on the House floor yesterday.

While some are not comfortable with the Department of Homeland Security (DHS) holding the keys to the nation's "cyberdefense" — particularly given the expertise and capabilities that rest in the military and intelligence communities — military surveillance of citizens within the domestic United States, especially without significant oversight from Congress, is not a prospect the founding fathers would likely have supported.

CISPA does not, however, formally grant either the National Security Agency or DHS any more powers than they already hold under existing legislation, such as the Patriot Act. It would enable more information sharing between private companies and government agencies, including threat information pertinent to legitimate national security concerns.

It's crucial to recognize that cybersecurity legislation has been percolating in the Senate for years without passage. Civilian oversight is a key issue in that wrangling, which spans proposals from Senator Lieberman's office on cybersecurity, the ICE Act from Senator Carper and Senator McCain's proposals.

If the fight over CISPA is "just beginning," as Andy Greenberg wrote in Forbes today, it's important that everyone getting involved out of concern over civil liberties or privacy recognize that CISPA is not like SOPA, as Brian Fung wrote in the American Prospect, particularly after provisions regarding intellectual property were dropped:

"At some point, privacy groups will have to come to an agreement with Congress over Internet legislation or risk being tarred as obstructionists. That, combined with the fact that most ordinary Americans lack the means to distinguish among the vagaries of different bills, suggests that Congress is likely to win out over the objections of EFF and the ACLU sooner rather than later. Thinking of CISPA as just another SOPA not only prolongs the inevitable — it's a poor analogy that obscures more than it reveals."

That doesn't mean that those objections aren't important or necessary. It does mean, however, that anyone who wishes to join the debate must recognize that genuine security threats do exist, even though massive hype about a potential "Cyber 9/11" perpetuated by contractors that stand to benefit from spending continues to pervade the media. There are legitimate concerns regarding the theft of industrial secrets, "crimesourcing" by organized crime and the reality of digital agents from the Chinese, Iranian and Russian governments — along with non-state actors — exploring the IT infrastructure of the United States.

The simple reality is that in Washington, national security trumps everything. It's not like intellectual property or energy or education or healthcare. What anyone who wishes to get involved in this debate will need to do is to support an affirmative vision for what roles the federal government and the private sector should play in securing the nation's critical infrastructure against electronic attacks. And the relationship of business and government complicates cybersecurity quite a bit, as "Inside Cyber Warfare" author Jeffrey Carr explained here at Radar in February:

"Due to the dependence of the U.S. government upon private contractors, the insecurity of one impacts the security of the other. The fact is that there are an unlimited number of ways that an attacker can compromise a person, organization or government agency due to the interdependencies and connectedness that exist between both."

On Monday, more than 60 distinguished IT security professionals, academics and engineers published an open letter to Congress urging opposition to any "'cybersecurity' initiative that does not explicitly include appropriate methods to ensure the protection of users’ civil liberties."

The open question now, as with intellectual property, is whether major broadcast and print media outlets in the United States will take their role of educating citizens seriously enough for the nation to meaningfully participate in legislative action.

This is a debate that will balance the freedoms that the nation has fought hard to achieve and defend throughout its history against the dangers we collectively face in a century when digital technologies have become interwoven into the everyday lives of citizens. We live in a networked age, with new attendant risks and rewards.

Citizens should hold their legislators accountable for supporting bills that balance civil liberties, public oversight and privacy protections with improvements to how the public and private sector monitors, mitigates and shares information about network security threats in the 21st century.

March 27, 2012

Over a century ago, Supreme Court Justice Louis Brandeis "could not have imagined phones that keep track of where we are going, search engines that predict what we're thinking, advertisers that monitor what we're reading, and data brokers who maintain dossiers of every who, what, where, when and how of our lives," said Federal Trade Commission Chairman Jon Leibowitz yesterday morning in Washington, announcing the release of the final version of the agency's framework on consumer privacy.

"But he knew that, when technology changes dramatically, consumers need privacy protections that update just as quickly. So we issue our report today to ensure that, online and off, the right to privacy, that 'right most valued by civilized men,' remains relevant and robust to Americans in the 21st century as it was nearly 100 years ago."

The final report clearly enumerates the same three basic principles that the draft of the FTC's privacy framework outlined for companies:

Privacy by design, where privacy is "built in" at every stage that an application, service or product is developed

Simplified choice, wherein consumers are empowered to make informed decisions by clear information about how their data will be used at a relevant "time and context," including a "Do Not Track" mechanism, and businesses are freed of the burden of providing unnecessary choices

Greater transparency, where the collection and use of consumer data is made more clear to those who own it.

"We are demanding more and better protections for consumer privacy not because industry is ignoring the issue," said Leibowitz today. "In fact, the best companies already follow the privacy principles we lay out in the report. In the last year, online advertisers, major browser companies, and the W3C -- an Internet standard setting group -- have all made strides towards putting into place the foundation of a Do Not Track system, and we commit to continue working with them until all consumers can easily and effectively choose not to be tracked. I'm optimistic that we'll get the job done by the end of the year."

The final framework differs from the draft in several respects. First, it will not apply to "companies that collect and do not transfer only non-sensitive data from fewer than 5,000 consumers a year," which would have been a burden on small businesses. Second, the FTC has brought action against Google and Facebook since the draft report was issued. Those actions -- and the agreements reached -- provide a model and guidance for other companies.

Third, the FTC made specific recommendations to companies that offer mobile services that include improved privacy protections and disclosures that are short, clear and effective on small screens. Fourth, the report also outlined "heightened privacy concerns" about large platform providers, such as ISPs, "operating systems, browsers and social media companies," seeking to "comprehensively track consumers' online activities." When asked about "social plug-ins" from such a platform, chairman Leibowitz provided Facebook's "Like" button as an example. (Google's +1 button is presumably another such mechanism.)

Finally, the report also included a specific recommendation with respect to "data brokers," which chairman Leibowitz described as "cyberazzi" on Monday, echoing remarks at the National Press Club in November 2011. Over at Forbes, Kashmir Hill reports that the FTC officially defined data brokers as those who “collect and traffic in the data we leave behind when we travel through virtual and brick-and-mortar spaces."

During the press conference, chairman Leibowitz said that American citizens should be able to see what information is held about them and "have the right to correct inaccurate data," much as they do with credit reports. Specifically, the FTC has called on data brokers to "make their operations more transparent by creating a centralized website to identify themselves, and to disclose how they collect and use consumer data. In addition, the website should detail the choices that data brokers provide consumers about their own information."

While the majority of the tech media's stories about the FTC today focused on "Do Not Track" prospects and mechanisms, or the privacy framework's impact on mobile, apps and social media, the reality of this historic moment is that it's the world's data brokers that currently hold immense amounts of information regarding just about everyone "on the grid," even if they never "Like" something on Facebook, turn on a smartphone, or buy and use an app.

In other words, even though the FTC's recommendations for privacy by design led TechMeme yesterday, that wasn't the new news. CNET's Declan McCullagh, one of the closest observers of Washington tech policy in the media, picked up on the focus, writing that the FTC stops short of calling for a new DNT law but "asks Congress to enact a new law that 'would provide consumers with access to information about them held by a data broker' such as Lexis Nexis, US Search, or Reed Elsevier subsidiary Choicepoint -- many of which have been the subject of FTC enforcement actions in the last few years." As McCullagh reported, the American Civil Liberties Union "applauded" the FTC's focus on data brokers.

They should. As Ryan Singel pointed out at Wired, the FTC's report does "call for federal legislation that would force transparency on giant data collection companies like Choicepoint and Lexis Nexis. Few Americans know about those companies’ databases but they are used by law enforcement, employers and landlords."

Another year without privacy legislation?

Whether it's "baseline privacy protections" or more transparency for data brokers, the FTC is looking to Congress to act. Whether it will is another matter. The online privacy debate was nearly as hot in Washington two years ago as it is today, yet no significant laws were passed, and the probability of significant consumer privacy legislation advancing in this session of Congress currently appears quite low. While at least four major privacy bills have been introduced in the U.S. House and Senate, "none of that legislation is likely to make it into law in this Congressional session, however, given the heavy schedule of pending matters and re-election campaigns," wrote Tanzina Vega and Edward Wyatt in the New York Times.

The push the FTC gave yesterday was welcomed in some quarters. "We look forward to working with the FTC toward legislation and further developing the issues presented in the report," said Leslie Harris, president of the Center for Democracy and Technology (CDT), in a prepared release. CDT also endorsed the FTC's guidance on "Do Not Track" and focus on large platform providers. Earlier this winter, a coalition of Internet giants, including Google, Yahoo, Microsoft, and AOL, committed to adopting "Do Not Track" technology in most Web browsers by the end of 2012. These companies, which deliver almost 90 percent of online behavioral advertisements, have agreed not to track consumers who opt out of online tracking using the Do Not Track mechanism, which will likely manifest as a button or browser plug-in. All companies that have made this commitment will be subject to FTC enforcement.
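Mechanically, the opt-out signal that Do Not Track relies on is simple: a browser with DNT enabled adds a `DNT: 1` header to each HTTP request, and the receiving site decides whether to honor it. As a minimal sketch of server-side handling — the function name and header-dict shape here are illustrative, not drawn from any spec or bill quoted in this piece:

```python
# A browser with Do Not Track enabled sends the HTTP request header
# "DNT: 1". Honoring it is up to the server: tracking code should be
# skipped for any request that carries the opt-out.

def should_track(headers):
    """Return False when the request carries a DNT opt-out.

    `headers` maps HTTP header names to values; names are compared
    case-insensitively, as HTTP requires.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") != "1"

# Example request headers a browser might send:
opted_out = {"User-Agent": "ExampleBrowser/1.0", "DNT": "1"}
no_preference = {"User-Agent": "ExampleBrowser/1.0"}

print(should_track(opted_out))      # False: skip tracking
print(should_track(no_preference))  # True: no preference expressed
```

The hard part, as the FTC's five criteria suggest, is not reading the header but making sure every third party in the ad chain actually respects it — which is exactly where enforcement comes in.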

By way of contrast, Jim Harper, the Cato Institute's director of information policy studies, called the framework a "groundhog report on privacy," describing it as "regulatory cheerleading of the same kind our government’s all-purpose trade regulator put out a dozen years ago." In May of 2000, wrote Harper, "the FTC issued a report finding “that legislation is necessary to ensure further implementation of fair information practices online” and recommending a framework for such legislation. Congress did not act on that, and things are humming along today without top-down regulation of information practices on the Internet."

Overall, the "industry here has a self-interest beyond avoiding legislation," said Leibowitz during today's press conference. Consumers have very serious concerns about privacy, he went on, alluding to polling data, surveys and conversations, and "better, clearer privacy policies" will lead to people having "more trust in doing business online."

This FTC privacy framework and the White House's consumer privacy bill of rights will, at minimum, inform the debates going forward. What happens next will depend upon Congress finding a way to protect privacy and industry innovation. It will be a difficult balance to strike, particularly given concerns about protecting children online and the continued march of data breaches around the country.

Ed Felten, the FTC's chief technologist, launched "Tech at the FTC" last Friday morning, a new blog that he hopes will play a number of different roles in the discussion of technology, government and society.

"It will combine Freedom to Tinker posts," he said, "some of which were op-ed, some more like teaching. The latter is what I'm looking for: explanations of sophisticated technical information that cross over to a non-technical audience."

Felten wants to start a conversation that's "interesting to general public" and "draws them into the discussion" about the intersection of regulation and technology. One aspect of that will be a connected Twitter account, @TechFTC, along with his established social identity, @EdFelten.

Possible future topics will include security issues around passwords and authentication of people in digital environments, both of which Felten finds interesting as they relate to policy. He said that he expects to write about technology stories that are in the news, with the intent of helping citizens understand, at an accessible level, what the takeaway is for them.

Social media and the Internet are "useful to give people a window into the way people in government are thinking about these issues," said Felten. "They let people see that people in government are thinking about technology in a sophisticated way. It's easy to fall into the trap where people in government don't know about technology. That's part of the goal: speak to the technical community in their language.

"Part of my job is to be an ambassador to the technology community, through speaking to and with the public," said Felten. "The blog will help people know how to talk to the FTC and who to talk to, if they want to. People think that we don't want to talk to them. Just emailing us, just calling us, is usually the best way to get a conversation started. You usually don't need a formal process to do this -- and those conversations are really valuable."

In that context, he plans to write more posts like the one that went live Monday morning, on tech highlights of the FTC privacy report, in which he highlighted four sections of the framework that the computer science professor thought would be of interest to techies:

De-identified data (pp. 18-22): Data that is truly de-identified (or anonymous) can’t be used to infer anything about an individual person or device, so it doesn’t raise privacy concerns. Of course, it’s not enough just to say that data is anonymous, or that it falls outside some narrow notion of PII. But beyond that, figuring out whether your dataset is really de-identified can be challenging. If you’re going to claim that data is de-identified, you need to have a good reason — the report calls it a “reasonable level of justified confidence” — for claiming that the data does not allow inferences about individuals. What “reasonable” means — how confident you have to be — depends on how much data there is, and what the consequences of a breach would be. But here’s a good rule of thumb: if you plan to use a dataset to personalize or target content to individual consumers, it’s probably not de-identified.

Sensitive data (pp. 47-48): Certain types of information, such as health and financial information, information about children, and individual geolocation, are sensitive and ought to be treated with special care, for example by getting explicit consent from users before collecting it. If your service is targeted toward sensitive data, perhaps because of its subject matter or target audience, then you should take extra care to provide transparency and choice and to limit collection and use of information. If you run a general-purpose site that incidentally collects a little bit of sensitive information, your responsibilities will be more limited.

Mobile disclosures (pp. 33-34): The FTC is concerned that too few mobile apps disclose their privacy practices. Companies often say that users accept their data practices in exchange for getting a service. But how can users accept your practices if you don’t say what they are? A better disclosure would tell users not only what data you’re collecting, but also how you are going to use it and with whom you’ll share it. The challenging part is how to make all of this clear to users without subjecting them to a long privacy policy that they probably won’t have time to read. FTC staff will be holding a workshop to discuss these issues.

Do Not Track (pp. 52-55): DNT gives users a choice about whether to be tracked by third parties as they move across the web. In this section of the report, the FTC reiterates its five criteria for a successful DNT system, reviews the status of major efforts including the ad industry’s self-regulatory program and the W3C’s work toward a standard for DNT, and talks about what steps remain to get to a system that is practical for consumers and companies alike.
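The de-identification rule of thumb in the first item above can be made a bit more concrete. One common sanity check (necessary but not sufficient, and not one the report itself prescribes) is k-anonymity: every combination of quasi-identifying columns should be shared by at least k records, so that no row can be singled out. A sketch, with purely illustrative data:

```python
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Smallest group size over the given quasi-identifier columns.

    `rows` is a list of dicts. A result of 1 means at least one record
    is unique on those columns, so the dataset fails even this weak
    test of de-identification.
    """
    groups = Counter(
        tuple(row[col] for col in quasi_identifiers) for row in rows
    )
    return min(groups.values())

records = [
    {"zip": "20001", "age_band": "30-39", "purchase": "soap"},
    {"zip": "20001", "age_band": "30-39", "purchase": "books"},
    {"zip": "90210", "age_band": "40-49", "purchase": "music"},
]

print(k_anonymity(records, ["zip", "age_band"]))  # 1: the 90210 row is unique
```

Passing a check like this is only a starting point; re-identification research has repeatedly shown that datasets satisfying simple anonymity metrics can still leak, which is why the report frames the standard as "justified confidence" rather than a single test.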

When asked about what the developers and founders of startups should be thinking about with respect to the FTC's privacy framework, Felten emphasized those three basic principles -- privacy by design, simplified choice, greater transparency -- and then offered some common sense:

"Start with the basic question of 'what Section 5 means for you,' he suggested. "If you make a promise to consumers in your privacy policy, consumers are entitled to rely on that. The FTC has brought cases against companies that made them and didn't hold up their responsibility around privacy. You have a responsibility to protect consumer data. If not, you may find yourself on the wrong side of the FTC act if there's a breach and it harms consumers."

December 21, 2011

The U.S. House of Representatives recently passed legislation that will, if it becomes law, make it easier for companies and users to share video viewing habits.

The current law, adopted in 1988 as the Video Privacy Protection Act (VPPA), requires written consent before rental information can be shared — which is nearly impossible for companies like Netflix or Facebook to accommodate. As a post on Wired.com points out, the VPPA also "requires the user to separately consent to each disclosure, which some users might find annoying." Annoying, yes, but also so cumbersome for all involved that it makes sharing more trouble than it's worth.

The new legislation looks to change this. The Wired post explains that it will "tweak the Video Privacy Protection Act to allow users to consent to video sharing over the web. It would also allow users to consent once to all future sharing."

The original law clearly failed to have the foresight of technological advances and, thus, had the unintended consequences of impeding consensual sharing activity. To get further insight on the new legislation and the effects it might have — intentional and otherwise — I reached out to transactional and intellectual property attorney Dana Newman (@DanaNewman). Our short interview follows.

The Video Privacy Protection Act had some unintended consequences. Do you foresee something similar for this legislation?

Dana Newman: Just as the distribution of video has changed dramatically since the passage of the original law, it will undoubtedly continue to evolve in unforeseen ways such that the definition of a "video rental service" may become obsolete or much broader than currently envisioned. Also, the granting of a blanket consent to sharing video watching information opens up greater privacy concerns than one providing for consent to be granted on a case-by-case basis.

What's your take on what appears to be a course correction in this new legislation?

Dana Newman: The original law was written at a time when there was far less sharing of personal, consumer and media consuming information, so the impact wasn't as pronounced as it is in the digital era. The new law takes into account the fact that video rentals have shifted online and that we now choose to publicly share our media preferences and entertainment watching/listening/reading activities. It could pave the way for less selective control over disclosure of consumers' rental (or subscription) habits for other digital content, such as music or books.

What wider-reaching effects might this legislation have? Could it lead to updates in other areas, such as copyright?

Dana Newman: I see this as being about the licensee's privacy around their use of the content rather than a copyright issue. We've already had online click-wrap license agreements for some time. But this is an example of Congress recognizing the need to update the laws to adapt to a digital distribution model for content as opposed to the physical copies/physical stores model, as many have argued is needed in the area of copyright law.


November 2, 2011

As we do more online — shop, browse, chat, check in, "like" — it's clear that we're leaving behind an immense trail of data about ourselves. Safeguards offer some level of protection, but technology can always be cracked and the goals of data aggregators can shift. So if digital data is and always will be a moving target, how does that shape our expectations for privacy? Terence Craig (@terencecraig), co-author of "Privacy and Big Data," examines this question and related issues in the following interview.

Your book argues that by focusing on how advertisers are using our data, we might be missing some of the bigger picture. What are we missing, specifically?

Terence Craig: One of the things I tell people is I really don't care if companies get more efficient at selling me soap. What I do care about is the amount of information that is being aggregated to sell me soap and what uses that data might be put toward in the future.

One of the points that co-author Mary Ludloff and I tried to make in the book is that the reasons behind data collection have nothing to do with how that data will eventually be used. There's way too much attention being paid to "intrusions of privacy" as opposed to the problem that once data is out there, it's out there. And potentially, it's out there as long as electronic civilization exists. How that data will be used is anybody's guess.

What's your take on the promise of anonymity often associated with data collection?

Terence Craig: If we assume that companies have good will toward their consumers' data — and I'll assume that most large corporations do — these companies can still be hacked. They can be taken advantage of by bad employees. They can be required by governments to provide backdoors into their systems. Ultimately, all of this is risky for consumers.

Assuming that data can't be anonymized and companies don't have malicious plans for our personal data, what expectations can we have for privacy?

Terence Craig: We've moved back to our evolutionary default for privacy, which is essentially none. Hunter-gatherers didn't have privacy. In small rural villages with shared huts between multi-generational families, privacy just wasn't really available there.

The question is how we address a society that mirrors our beginnings, but comes with one big difference. Before, anyone who knew the intimate details of our lives was someone we had met physically, and often related to us. But now the geographical boundary has been erased by the Internet, so what does that mean? And how are we as a society going to evolve to deal with that?

With that in mind, I've given up on the idea of digital privacy as a goal. I think you have to if you want to reap the rewards of being a full participant in a digitized society. What's important is for us to make sure we have transparency from the large institutions that are aggregating data. We need these institutions to understand what they're doing with data and to share that with people so we, in aggregate, can agree whether or not this is a legitimate use of our data. We need transparency so that we — consumers, citizens — can start to control the process. Transparency is what's important. The idea that we can keep the data hidden or private, well ... that horse has left the stable.

What's the role of governments here, both in terms of the data they keep but also the laws they pass about data?

Terence Craig: Basically anything the government collects, I believe should be made available. After all, governments are some of the largest aggregators of data from all sorts of people. They either purchase it or they demand it for security needs from primary collectors like Google, Facebook, and the cell phone companies — the millions of requests law enforcement agencies sent to Sprint in 2008-2009 were a big story we mentioned in the book. So, it's important that governments reveal what they're doing with this information.

Obviously, there's got to be a balance between transparency and operational security needs. What I want is to have a general idea of: "Here's what we — the government — are doing with all of the data. Here's all of the data we've collected through various means. Here's what we're doing with it. Is that okay?" That's the sort of legislation I would like, but you don't see that anywhere at this point.

This interview was edited and condensed.

Privacy and Big Data — This book introduces you to the players in the personal data game, and explains the stark differences in how the U.S., Europe, and the rest of the world approach the privacy issue.

September 27, 2011

The Slow Way to SPDY -- Trying SPDY for yourself is, at this point, an uphill climb on a slow mudslide: the protocol is currently on its third draft but not really stable, most of the available code is outdated, and despite the links on this page, hardly any of it is easy to get working in a weekend. (via Nelson Minar)

Referendum Tool -- New Zealand faces a referendum on its voting system (currently "mixed member proportional"), and this page is an interesting approach to helping you figure out which system you should endorse based on your preferences for how a voting system should work ("It is better if the Government is made up of one party, with a majority in Parliament, so that that party can implement its policies, and react decisively to events as they come up" vs. "It is better if the Government is made up of a group of parties (a coalition), so that its decisions better reflect what the majority of voters want, even if that means important decisions might be delayed."). I like this because it helps you translate your preferences into a specific vote.