We have all gone to a website and, in accessing the website’s services, have agreed to “terms and conditions” that include a litany of policies, including privacy policies governing how the company maintaining the website will use our personal information obtained while accessing the website. And let’s be honest, even as attorneys or soon-to-be-attorneys, many of us usually do not actually take the time to read the laundry list of items we are agreeing to just so we can obtain a 20% coupon. I know I’m guilty of regularly clicking “I agree” without reading every term and condition.

While we may think our assent to a website’s terms and conditions has little effect on our everyday lives, our agreement does in fact matter, and not just for us but also for the company maintaining the website. Take one website that most, if not all, of us have used: Facebook. While we likely have not paid close attention to Facebook’s privacy policies, such as its data and cookie policies, those policies explain that Facebook uses cookies and browser fingerprinting to identify users and track which third-party websites users browse. This use of cookies and browser fingerprinting is why you see ads for products or services that are, or at least should be, most relevant to you. Indeed, these processes are why I now regularly see ads for Nintendo products on Facebook after having searched for and purchased Nintendo’s handheld 3DS video game system for my ten-year-old son. Continue reading “‘Click’ . . . You Just Agreed To Sell Your Privacy”
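To make the tracking mechanism concrete, here is a minimal sketch of the idea behind browser fingerprinting: several individually unremarkable browser attributes are combined and hashed into an identifier that is stable across sites. The attribute names below are illustrative assumptions, not Facebook’s (or anyone’s) actual implementation, which draws on many more signals (canvas rendering, installed fonts, and so on).

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Combine stable browser/device attributes into one identifier.

    Real trackers use far more signals; these fields are illustrative.
    """
    # Sort keys so the same attributes always hash the same way.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080x24",
    "timezone": "America/Chicago",
    "language": "en-US",
}

# The same browser yields the same ID on every site that computes it,
# which lets an ad network recognize a visitor without setting a cookie.
print(browser_fingerprint(visitor))
```

The point the sketch illustrates is legal as much as technical: no file is stored on the user’s machine, so deleting cookies does not defeat this kind of identification.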

You only needed to read the title of the 2016 Nies Lecture in Intellectual Property presented Tuesday at Marquette Law School to know that Brad Smith was offering a generally positive view of the future of technological innovation. “A Cloud for Global Good: The Future of Technology—Issues for Wisconsin and the World” was the title.

Indeed, Smith spoke to the potential for what he called the fourth industrial revolution to improve lives across the world. But he also voiced concerns about the future of privacy and security for personal information in a rapidly changing world, and he called for updating both American laws and international agreements related to technology to respond to those big changes.

For those who don’t know, Pokémon Go is an app for smartphones; the app is free, but players can make in-app purchases. The idea is for each player to “catch” creatures known as Pokémon, which the player does by “throwing” what is called a Pokéball at them. Once you catch the creatures, each of which has its own special powers and abilities, you can “evolve” them into stronger, more powerful creatures and you can go to gyms to “battle” other players.

Pokémon Go uses GPS to determine where a player is located and presents the player with a map of the surrounding area. Pokéstops (where players can go to get free goodies they need to play the game) and gyms are represented on the map as actual places, usually public places like parks, sculptures, or churches. To get to a Pokéstop or to battle at a gym, a player needs to physically move herself to that location. For example, the Marquette University campus is full of Pokéstops—e.g., a few sculptures on the southeast side of campus and one of the signs for the Alumni Memorial Union. Dedicated players certainly get some exercise.
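The “you must physically be there” mechanic boils down to a distance check between two GPS coordinates. Here is a sketch using the standard haversine formula; the coordinates and the 40-meter interaction radius are assumptions for illustration, not the game’s actual values.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical coordinates for a player and a nearby Pokéstop on campus.
player = (43.0389, -87.9289)
stop = (43.0391, -87.9287)

IN_RANGE_M = 40  # illustrative; the game's real interaction radius may differ
print(haversine_m(*player, *stop) <= IN_RANGE_M)  # prints True here
```

A change of 0.0002 degrees of latitude is roughly 22 meters, which is why players have to walk rather than tap their way to a stop.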

Pokémon Go is also interesting because of how it mixes your real-life location with the mythical creatures. When a creature appears, you can take its picture, as if the Pokémon is right there in your real world. (See the pictures in this post.)

But Pokémon Go has been at the root of a number of accidents and incidents, and it raises a number of interesting legal issues.

Apple is marketing its newest smartphone operating system, iOS 8, as a bulwark of personal privacy. Apparently, not even Apple itself can bypass a customer’s passcode and extract data from an iPhone that runs the new operating system. This means that even when served with a court order, the company will be unable to comply. Competitors are likely to follow suit.

This is a development with profound implications for law enforcement, which views the ability to obtain such data with a warrant as crucial in its efforts to combat crime and terrorism. Defenders of the new technology point out that law enforcement may be able to obtain the same data in different ways; for example, if the data is stored “in the cloud” or if the password can be deduced somehow.

All of the interest in the Supreme Court tomorrow is likely to be focused on Hobby Lobby and, to a lesser extent, Harris v. Quinn. But I’ll be watching something that happens before either of those decisions is announced. I’ll be looking to see if the Supreme Court granted cert in the StreetView case. I hope the answer is no.

The StreetView case — Google v. Joffe — is one that I’ve blogged extensively about over the past year. (See Part I and Part II; see also my coverage of the Ninth Circuit opinion, Google’s petition for rehearing, and the filing of Google’s cert. petition.) Briefly, Google’s StreetView cars intercepted the contents of transmissions from residential wi-fi routers whose owners had not turned on encryption. A number of class actions have been filed claiming that the interceptions were violations of the federal Wiretap Act. Google moved to dismiss them, arguing that radio communications (like wi-fi) basically have to be encrypted to be protected by the Wiretap Act. The district court and the Ninth Circuit disagreed, holding that the exception Google points to applies only to traditional AM/FM radio broadcasts.

Rebecca Tushnet points to this column by Cory Doctorow arguing that Hachette is being held hostage in its fight with Amazon over e-book versions of its books because of its “single-minded insistence on DRM”: “It’s likely that every Hachette ebook ever sold has been locked with some company’s proprietary DRM, and therein lies the rub.” Doctorow argues that because of the DMCA Hachette can no longer get access, or authorize others to get access to, its own books:

Under US law (the 1998 Digital Millennium Copyright Act) and its global counterparts (such as the EUCD), only the company that put the DRM on a copyrighted work can remove it. Although you can learn how to remove Amazon’s DRM with literally a single, three-word search, it is nevertheless illegal to do so, unless you’re Amazon. So while it’s technical child’s play to release a Hachette app that converts your Kindle library to work with Apple’s Ibooks or Google’s Play Store, such a move is illegal.

It is an own-goal masterstroke.

Everyone loves irony, but I can’t figure out how to make Doctorow’s argument work. First, I can’t figure out what the anticircumvention problem would be. Second, I can’t figure out why Hachette wouldn’t be able to provide other distributors with e-book versions of its books. Continue reading “Is Hachette Being Hoisted by Its Own DRM Petard?”

I noted back in October that Google had hired “noted Supreme Court advocate Seth Waxman” as it was preparing its petition for rehearing in the Street View case, “indicating perhaps how far they intend to take this.” (For background, see my earlier posts Part I, Part II, after the panel decision, and on the petition for rehearing.) My suspicions were accurate — after losing again at the rehearing stage in late December, Google has now filed a petition for certiorari, asking the Supreme Court to reverse the Ninth Circuit.

Google’s petition primarily makes the same substantive arguments it made in its petition for rehearing. The Ninth Circuit in the decision below adopted what I’ve called the “radio means radio” approach — “radio communications” in the Wiretap Act means only communications that you can receive with, you know, an ordinary AM/FM radio. I’ve argued that that is mistaken, and Google unsurprisingly agrees with me. Google provides three reasons why the Ninth Circuit’s interpretation cannot be sustained. Continue reading “Google Files Cert. Petition in Street View Case”

You won’t find out from this New York Times front-page story from yesterday, which is disappointingly long on alarmism but short on details, a phenomenon all too frequent in privacy reporting. In the third sentence — immediately after anthropomorphizing smartphones — the story tells us that “advertisers, and tech companies like Google and Facebook, are finding new, sophisticated ways to track people on their phones and reach them with individualized, hypertargeted ads.” Boy, that sounds bad — exactly what horrible new thing have they come up with now?

The third paragraph tells us only what privacy advocates fear. The fourth mentions the National Security Agency. The fifth quotes privacy scholar Jennifer King saying that consumers don’t understand ad tracking.

The sixth paragraph finally gives us a specific example of the “new, sophisticated ways” advertisers and tech companies are “track[ing] people on their phones”: Drawbridge. What does Drawbridge do? It’s “figured out how to follow people without cookies, and to determine that a cellphone, work computer, home computer and tablet belong to the same person, even if the devices are in no way connected.” But this doesn’t tell us much. There are more and less innocuous ways to accomplish the goal of tracking users across devices. On the innocent end of the scale, a website could make you sign into an account, which would allow it to tell who you are, no matter what computer you use. On the malevolent end of the scale, it could hack into your devices and access personal information that is then linked to your activity. The key question is, how is Drawbridge getting the data it is using to track users, and what is in that data? Continue reading “What the Heck Is Drawbridge?”
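To see why the “how” question matters, consider a deliberately toy version of cookieless cross-device linking. This is not Drawbridge’s actual method (which the article never reveals); it is an assumption-laden sketch of the innocuous end of the spectrum, where devices are grouped merely because they repeatedly appear behind the same home network signature.

```python
from collections import defaultdict

# Toy log of (device_id, observed_ip, timezone) observations. The premise,
# purely illustrative, is that devices that keep showing up behind the same
# residential IP in the same timezone probably belong to one household.
observations = [
    ("phone-1",  "203.0.113.7",  "America/Chicago"),
    ("laptop-1", "203.0.113.7",  "America/Chicago"),
    ("tablet-9", "198.51.100.2", "Europe/Berlin"),
]

def group_devices(obs):
    """Group device IDs that share a network/timezone signature."""
    groups = defaultdict(set)
    for device, ip, tz in obs:
        groups[(ip, tz)].add(device)
    # Only signatures linking two or more devices are interesting.
    return {sig: devs for sig, devs in groups.items() if len(devs) > 1}

linked = group_devices(observations)
# phone-1 and laptop-1 share a signature, so an ad network would infer
# they belong to the same person or household.
print(linked)
```

Whether the real system is closer to this benign correlation or to mining personal data off the devices themselves is exactly the detail the story never supplies.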

I do intend to get back to my four-part series on whether Google’s collection of information from residential Wi-Fi networks violated the Wiretap Act. That issue is being litigated in the Northern District of California in a consolidated class action of home wireless network users, and the earlier posts in my series examined the plaintiffs’, Google’s, and the district court’s arguments on this issue. See Part I; Part II. Since I wrote the first two posts, the Ninth Circuit weighed in, affirming the district court’s denial of Google’s motion to dismiss, allowing the plaintiffs to proceed with their complaint.

Since that post, there’s been another development: Google has filed a petition for rehearing and rehearing en banc. And they’ve brought in a bigger gun to do so — noted Supreme Court advocate Seth Waxman — indicating perhaps how far they intend to take this. Google has two basic arguments for why a rehearing should be granted. First, Google argues that what I called the panel’s “radio means radio” interpretation of the term “radio communications” (that is, “radio communications” means “stuff you listen to on a radio”) is unworkable. Second, Google argues that the panel should never have reached the issue of whether wi-fi communications are “readily accessible to the general public” under an ordinary-language approach to that term, because that question involves disputed issues of fact. In the rest of this post I’ll review these two arguments. Continue reading “Google Calls in the Cavalry in the Street View Case”

On the most basic level, clicking on the “like” button literally causes to be published the statement that the User “likes” something, which is itself a substantive statement. In the context of a political campaign’s Facebook page, the meaning that the user approves of the candidacy whose page is being liked is unmistakable. That a user may use a single mouse click to produce that message that he likes the page instead of typing the same message with several individual key strokes is of no constitutional significance.

Time, and the Ninth Circuit, wait for no man. You may recall that I was halfway through my four-part series on the arguments in Joffe v. Google, the “Wi-Spy” case in which Google’s Street View cars intercepted and stored data captured from residential wireless networks. Google argued that that activity did not violate the Wiretap Act, because the Wiretap Act does not apply at all to Wi-Fi. There’s an exception in the Wiretap Act for “electronic communications readily accessible to the general public,” and the Act defines “readily accessible” for “radio communications” to mean that the communications are not scrambled, encrypted, or otherwise protected. Wi-Fi is broadcast over radio, and the plaintiffs did not set up encryption. Here’s Part I and Part II if you want to read more.

A couple of weeks ago Salon reported that the NSA had allegedly sent a request to self-printing site Zazzle asking that it take down a parody t-shirt that used an altered version of the NSA logo. When contacted, the NSA first claimed that “[t]he NSA seal is protected by Public Law 86-36, which states that it is not permitted for ‘ . . . any person to use the initials “NSA,” the words “National Security Agency” and the NSA seal without first acquiring written permission from the Director of NSA.'” But shortly after that, the NSA updated its statement to add that it had not contacted Zazzle to request the removal of any item since 2011, when it asked that a coffee mug with the NSA seal be removed from the site.

Putting the two statements together, it looked as though someone at Zazzle, remembering the earlier incident, mistakenly thought that all uses of the logo were forbidden. It seemed to be an isolated incident.

Except that now it’s happened again. This time, a computer science professor at Johns Hopkins, Matthew Green, received a request from his dean that he pull down a blog post on university servers that linked to some of the leaked NSA documents and contained the NSA logo. The university later confirmed that the reason for the request was that it “received information” that Green’s post “contained a link or links to classified material and also used the NSA logo.”