"Exquisite Source," by Harvey Blume (August 12, 1999)
Heads turned in June when the Linux operating system was awarded first prize by the judges of an international art festival. How far, one wonders, can the open source model go?

"With Liberty and Justice for Me," by Mark Dery (July 22, 1999)
Is the Internet giving ordinary people more control over their lives? An e-mail exchange with Andrew L. Shapiro, the author of The Control Revolution.

"Bits of Beauty," by Harvey Blume (June 3, 1999)
Yes, it's art. Now what is there to say about it? An assessment of the first-ever Cyberarts Festival in Boston, where art criticism is forced to play catch-up with technology.

"The MP3 Revolution," by Charles C. Mann (April 8, 1999)
The recording industry may indeed have something to worry about. If the much-talked-about digital format (or something like it) catches on, we could be carried back into our musical and cultural past.

WS: We all know that there are political values built into the design of constitutions and governments, not to mention everything from buildings to furniture to symphonies and poems. Now, as you point out, we're seeing that there are political values embedded in the design of software -- of computer interfaces, operating systems, and networks. In other words, we're realizing that there is ideology inherent in the design of code. What is the ideology embedded in the design of Code? Can you summarize, in a nutshell, the political values that inspired, and clearly animate, this book?

LL: I guess the fundamental value is responsibility -- that we, as a people, be responsible for the world we construct. Much of my book is aimed simply at showing what the values of cyberspace are, how they are changing, and how they relate to values from real space, not so much to argue for one set of values over the other, but rather to argue that we should choose.

Beyond this, I believe the values I push in the book are values that come from our constitutional tradition. The values of free speech and privacy; of limited and balanced protection for intellectual property; of limits on the power of government.

WS: In his Atlantic Unbound review, Charles Mann argues that you have oversimplified the nature of code and that you tend to view business (incorrectly, he says) "as a monolithic force driving the Net toward more regulation." According to Mann, you don't make enough of the distinction between "soft code" and "hard code," while at the same time you place too much importance on encryption and its use by commercial interests to threaten values (such as fair use and privacy) previously taken for granted. Mann's point is that "market forces push both ways," and that your picture is therefore too dark. How do you respond to this?

LL: Charles Mann is certainly right that "market forces push both ways," and he is right that I have not done enough in my book to emphasize the incentives business would have to protect values such as "fair use." (A bookstore, for example, is not compelled to allow readers to browse books; but market incentives are enough to guarantee a fairly extensive right to browse.) But I do think Mann has missed an important aspect of my argument, and that he has been tripped up by an argument I don't make.

If the government took no sides in the battle over encrypted "trusted systems" (copyright management systems that make it possible to control the distribution of copyrighted material) then I would have more of the faith that Mann does that market forces could push in both directions. But a premise of my argument is that government won't stand neutral; that it will back up "code regulation" with the regulation of law; and that code plus law will tilt the market in the direction of perfect trusted systems.

Indeed, the government has done this already. The Digital Millennium Copyright Act makes it a felony to write code that attempts to circumvent the encryption technologies Mann describes, even if the intended use of the underlying material would be permissible under the copyright law. That means, for example, if the code blocks you from making an extraction that would be "fair use," then it is a felony to write code to circumvent that block.

This is law and code conspiring to tilt market forces quite decidedly in one direction rather than another. And it is not the first example. I describe one other in my book -- the Audio Home Recording Act. That statute requires manufacturers of DAT recording technology to build in a serial copy management system to make it difficult for users to make multiple copies of the same digital recording. The aim was to use code to protect digital recordings, and the law mandates that code. I don't know of a single U.S. manufacturer of DAT recording technology that offers, contrary to the law, a recording device that does not have that copy-control technology built in. No doubt, but for the law, many would like to; but my point is that with the law, very few will.
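The serial-copy rule Lessig describes can be sketched schematically. This is not the actual SCMS bitstream format -- the real system uses status bits in the digital audio subcode -- but a simplified illustration of the logic a compliant recorder is required to enforce: a first-generation copy of a protected original is permitted, and that copy is flagged so the recorder refuses to copy it again.

```python
# Schematic sketch of the serial-copy rule mandated by the Audio Home
# Recording Act. The dict fields ("protected", "generation") are
# illustrative stand-ins for the status bits a real recorder reads.

def scms_record(source):
    """Return the copy a compliant recorder would make, or None if blocked."""
    if source["protected"] and source["generation"] >= 1:
        # The source is already a copy of a protected work:
        # a compliant recorder refuses to make a second generation.
        return None
    generation = source["generation"] + 1 if source["protected"] else 0
    return {"protected": source["protected"], "generation": generation}

original = {"protected": True, "generation": 0}
first_copy = scms_record(original)       # allowed: flagged as generation 1
second_copy = scms_record(first_copy)    # blocked: returns None
```

The point of the sketch is Lessig's: the restriction lives in the device's code, but it is the statute that keeps every manufacturer's code this way.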

So Mann is right that just now, with MP3 technologies, there are companies building machines that make it possible to rip CDs and send the copies to friends on the Net. But if the recording industry succeeds in getting a court to rule that those technologies violate the DMCA, or if they succeed in getting Congress to pass a law against these technologies, then I think one would be naive to believe "market forces" would effect a revolution. The knights of Silicon Valley want to retire to large homes on the water, not to federal penitentiaries.

Finally, Mann makes much of the distinction between "hardware" and "software." As I wrote when I introduced the term, by "code" I mean "the software and hardware that make cyberspace what it is" (p. 6). Both make up the "code" my book is about. Moreover, while I do believe that on the margin, burning code into hardware would make it more difficult to change, I think it is a mistake to argue that there is any necessary connection between the ease with which one can modify code and whether that code is soft or hard. Depending upon the implementation, one can write software to bypass hardware code; and depending upon the implementation, it can be practically impossible to write code that escapes the control of software. The ease of bypassing any bit of code is a function of how the code is designed, and not so much a function of whether it is hard or soft.

WS: You write of the potential for "open code" (or open source) to limit the power of government and business to regulate cyberspace in ways that could be detrimental. As you put it, "open code functions as a kind of Freedom of Information Act for network regulation." If Judge Jackson, in his "findings of fact" in the Microsoft case, is right about Linux and open source -- that it's essentially a fringe phenomenon that doesn't pose much of a threat to Microsoft or anyone else -- is there less reason to be hopeful about the benefits of open code? In other words, if Judge Jackson is right, does it make the vision of Code even darker?

LL: I don't see the two as linked. As I argued in the book, open code feeds a type of commons in cyberspace, and the commons, both in real space and in cyberspace, is an important check on government's power. But it doesn't follow that either real space or cyberspace should simply be a commons. To argue in favor of a commons is not to argue against private property. The point is balance. Thus I don't think the brightness of the future depends upon whether Linux displaces Windows. The brightness of the future depends upon a balance in the code space. A world that permitted only open code would be dark; a world that had only closed code would be dark as well.

WS: I think it's fair to say that you don't have a great deal of confidence in the ability of courts to resolve the fundamental questions we face as the Net changes. What, then, do you think can (or should) be achieved by the government's impending decision in the Microsoft case?

LL: Again, I don't think they are linked. I have little faith in courts resolving what they and the public view as "new" questions -- questions that the framers of our constitution, or the framers of a particular law did not anticipate or resolve. But I think courts are quite good in applying law to facts. Both sides in the Microsoft case have argued that traditional antitrust principles support their own view of the case. In this, I think both sides are right.

WS: You speak several times in the book about generations. Did the "first generation" of Net intellectuals, or digerati, have a different set of values than what we see on the Net today? Has there been a shift? And is there a new generation representing this shift? Where do you see yourself fitting in?

LL: I don't think it is an issue of values. I like the values of John Perry Barlow and Esther Dyson (emphasizing the freedom and creativity of the Net). I think the difference between us comes from a difference in experience. I've spent my professional life learning how law learns to regulate; I'm therefore skeptical of arguments that presume law can't learn. I view my work as building on the values Barlow and Dyson spoke of -- as well as the insights of people like Mitch Kapor ("architecture is politics") and William Mitchell (author of City of Bits) -- to tell a story that more realistically captures the threats that should make one work harder to defend the founding values of the Net, as well as the values from our tradition that the Net might threaten.

Charles Mann replies:

Lawrence Lessig is right -- with its ban on "circumvention," the Digital Millennium Copyright Act (DMCA) is a perfect example of the unholy collaboration between government and business that he fears. And he's also right about his larger point: the market's tendency to "push both ways" may be neutralized if the government aids one side. Here's why I nonetheless think he might be too worried.

In the 1980s a lot of software companies tried to stop "piracy" with copy-protection schemes. They didn't work. First, people broke the protection, in much the same way that some folks recently broke the copy-protection scheme on DVDs. As Lessig notes, the DMCA makes it a felony to circumvent copy protection, but it would not stop clever teenagers from breaking the protection abroad and putting the code-breaking method on the Web, which is exactly what the teenagers who broke the DVD scheme did. Nor would the law stop people from circumventing copy protection in their own homes -- at present, it would simply be too hard to enforce.

Second, copy protection didn't work because software companies realized that the restrictions it imposed so annoyed customers that the absence of copy protection became a selling point. Indeed, one of the reasons that id Software became a giant in the world of computer games was that its early products, such as Wolfenstein 3D, could be copied freely. Thus copy protection proved both ineffectual and disadvantageous; unsurprisingly, it vanished.

Lessig points to digital audio tape (DAT) as a counterexample -- and it's a pretty good one. (No one has broken the DAT copy-protection scheme, because it's burned into the hardware.) But DAT has been a major disappointment to hardware manufacturers, because the copyright restrictions are so annoying to consumers that they have avoided DAT in droves. Indeed, DAT's failure to live up to expectations is supposedly one reason that hardware companies have jumped in to make digital music players -- they didn't want to lose this profit opportunity, too.

Still, DAT has some market -- high-end recording studios and the like. This would suggest that efforts to privatize the code of the Net will, in part, depend on whether those who stand to lose by this arrangement will put up with a smaller piece of the pie. Because this depends on so many external factors, it may not be susceptible to prediction. In any case, Lessig's fine, thoughtful book should be read by anyone who cares about the potential of the Internet to increase the realm of human freedom.