Big Data

Many of the information technologies that we take for granted today are not the result of some initial grand design. Rather, they emerged as entrepreneurs used preexisting data sets in interesting new ways. That’s the power of big data.

Data collection and data sets are used to tailor new and better digital services to us. Online content and services are cheap or even free thanks to targeted ads based on big data.

Data-driven innovations are all around us in services such as language translation tools, mobile traffic services, digital mapping technologies, spam and fraud detection tools, instant spell-checkers, and more. The economic benefits associated with data-driven innovation are profound but can be hard to measure and are likely underestimated as a result.

You can find more Mercatus research and commentary on big data and privacy concerns below.

Smart cities—places where connected devices, big data, the urban environment, and city dwellers intersect—have been widely lauded for their potential to improve government services and transparency. However, while most coverage of smart city innovations has focused on their potential benefits, these technologies also carry real risks for residents’ safety and civil rights.

A study from the Mercatus Center at George Mason University looks at both the positive and the negative aspects of smart cities. The study incorporates numerous case studies from cities that have experienced both success and failure in implementing smart city tools. The study recommends that reformers focus on the incentives of local officials, not just on the promises of new technologies, to ensure that smart city innovations provide a true benefit to citizens.

This study shows that preemptive, top-down regulation would derail the many life-enriching innovations that could come from new technologies. The study argues that permissionless innovation, which allows new technologies to flourish and develop in a relatively unabated fashion, is the superior approach to the Internet of Things. Combining public education, oversight, industry best practices, and transparency in a balanced, layered approach will be the proper way to address concerns about the Internet of Things—not prospective regulation based on hypothetical scenarios.

Privacy law today faces two interrelated problems. The first is an information control problem. Like so many other fields of modern cyberlaw—intellectual property, online safety, cybersecurity, etc.—privacy law is being challenged by intractable Information Age realities. Specifically, it is easier than ever before for information to circulate freely and harder than ever to bottle it up once it is released. This article sketches out some general lessons from online safety debates and discusses their implications for privacy policy going forward.

This article makes a seemingly contradictory argument: benefit-cost analysis is extremely challenging in online child safety and digital privacy debates, yet it remains essential that analysts and policymakers attempt to conduct such reviews. While we will never be able to perfectly determine either the benefits or costs of online safety or privacy controls, the very act of conducting a regulatory impact analysis will help us to better understand the trade-offs associated with various regulatory proposals. However, precisely because those benefits and costs remain so remarkably subjective and contentious, this article argues that we should look to employ less restrictive solutions—education and awareness efforts, empowerment tools, alternative enforcement mechanisms, etc.—before resorting to potentially costly and cumbersome legal and regulatory regimes that could disrupt the digital economy and the efficient provision of services that consumers desire. This model has worked fairly effectively in the online safety context and can be applied to digital privacy concerns as well.

This article—which focuses on privacy rights against private actors rather than against the government—suggests that expanded regulation is not the most constructive way to go about ensuring greater online privacy. The inherent subjectivity of privacy as a personal and societal value is one reason why expanded regulation is not sensible. Privacy has long been a thorny philosophical and jurisprudential matter; few can agree on its contours or cite firm constitutional grounding for the rights or restrictions they articulate. Part I discusses some of the normative considerations raised by the debate over privacy rights and argues that there may never be a widely accepted, coherent legal standard for privacy rights or harms in the United States. Part II considers the many enforcement challenges that are often ignored when privacy policies are being proposed or formulated. Part III of the article argues that the best way to protect personal privacy is to build on the approach now widely utilized to deal with online child safety concerns, where the role of law has been constrained by similar factors.

This paper deconstructs the cultural and economic myths that arise from fear-based arguments in technology policy debates, with a particular focus on online child safety, digital privacy, and cybersecurity. Information controls could stifle free speech, limit the free flow of ideas, and retard social and economic innovation. It is vital that public policy debates about information technology not be driven by technopanics and threat inflation. Instead, a four-part framework should be used to analyze the risks associated with new technological developments and determine the proper course of action. Technology fears that lead to harmful public policy can be countered by hard evidence and reasoning before they do serious damage to a free society.

Are social networking sites like Facebook, LinkedIn, and Twitter “information monopolies” that should be regulated as public utilities? While calls for social networking regulation are on the rise, there are good reasons why policymakers should avoid the rush and rethink classifying them as “public utilities.” Public utility regulation has traditionally been the arch-enemy of innovation, and regulation could have lasting effects on such a dynamic industry. Treating today’s leading social media providers as essential facilities threatens to convert predictions of “natural monopoly” into self-fulfilling prophecies.

In the field of Internet policy, 2011 has been the year of privacy. Congress has introduced six bills related to online privacy, and the Obama administration released two major reports recommending greater federal oversight of online markets. The Federal Trade Commission appears poised to step up regulatory activity on this front. State-level activity is also percolating, led by California, which floated two major bills recently.

These efforts would expand regulatory oversight of online activities in various ways. Some measures would institute “Fair Information Practice Principles,” governing the collection and use of personal information online. Others would limit some types of data collection, ban certain data or advertising practices, or create new mechanisms to help consumers block online ad-targeting techniques. Another measure would mandate that websites adopt a so-called Internet “Eraser Button,” which would allow users to purge unwanted personal information from online sites and services.

Federal policymakers, state legislators, and state attorneys general have recently shown interest in regulating commercial advertising and marketing. Several new regulatory initiatives are being proposed, or are already underway, that could severely curtail or restrict advertising or marketing on a variety of platforms. The consequences of these stepped-up regulatory efforts will be profound and will hurt consumer welfare both directly and indirectly.

This book, published by O’Reilly Media, is a collection of essays and case studies, in which contributors inside and outside of government share their ideas on how information technology can make government more transparent and collaborative.

My message here today, condensed from two recent law review articles, boils down to three points. First, no matter how well-intentioned, restrictions on data collection could negatively impact the competitiveness of America’s digital economy, as well as consumer choice. Second, it is unwise to place too much faith in any single, silver-bullet solution to privacy, including “Do Not Track,” because such schemes are easily evaded or defeated and often fail to live up to their billing. Finally, with those two points in mind, we should look to alternative and less costly approaches to protecting privacy that rely on education, empowerment, and targeted enforcement of existing laws. Serious and lasting long-term privacy protection requires a layered, multifaceted approach incorporating many solutions.

This public interest comment addresses how the Children’s Online Privacy Protection Act (COPPA) affects online content and digital innovation. COPPA is a complicated law and rule, and when considering the rule and proposals to amend it, it is easy to get lost in the weeds and ignore the bigger picture. That would be a mistake. There are broader, more important questions that need to be asked as part of the Federal Trade Commission’s effort to expand this regulatory regime.