Ever gotten an email that your package could not be delivered? A recent USA Today report warns online shoppers to think before they click. Holiday season is open season for hackers who send out these fake “undelivered package” emails to lure unsuspecting gift-givers to open attachments containing malware or ransomware.

It’s December and—you may have noticed—we’re not running a new Futurography course this month. Never fear, though: We’ll be back in January to celebrate the 199th anniversary of Mary Shelley’s Frankenstein, looking at how the book has changed the way we think about scientific advances. We’ll be featuring discussions of bioethics, the language of scientific innovation, and maybe even why our monsters have gotten so much sexier than they used to be. Stay tuned: It’ll be fun.

And that’s not all! In February, we’re planning our most practical unit yet—a series on self-defense against cybercrime. Then in March, we’ll be blasting back toward the stars with a course on the new space race. And in April we’ll be taking on synthetic biology. We hope you’ll continue to follow along.

In the meantime, this is the perfect moment to revisit our recently concluded course on internet governance. Here’s what we published in November:

Introduction: Who, if anyone, runs the internet? And does it have anything to do with subterranean lizard people? Find out here.

Cheat Sheet: What debates swirl around the problem of internet governance? What should you read if you want to get the details? All that and more.

Once you’ve worked through all of that, test what you’ve learned against our quiz about internet governance. And then find out what your fellow Slate readers think in our write-up of our survey on the topic.

Slate readers don’t always agree with one another. But there was one area of consensus among the readers who wrote in to our survey: When asked to name the most important internet governance debate, nearly all said that it is net neutrality. As Charles Kenny argued in Futurography, that opinion may not hold everywhere in the world—in developing regions, mere access is more important for many. Other issues our respondents identified as important included “transparency in tracking by private companies,” government spying, and open internet borders.

Most of our readers also agreed about which debates are overblown, pointing out that despite the conspiracy theories, it isn’t really worth talking about who runs the Internet Corporation for Assigned Names and Numbers. In line with Danielle Kehl, who laid out the ins and outs of ICANN for Future Tense, readers seemed to agree that the organization is largely clerical. As one observed, “Those who help govern the internet are not some kind of secret cabal,” but are instead mostly “industry experts and academics.”

Accordingly, it’s not surprising that most respondents also felt that individual countries shouldn’t have the right to control the internet within their borders—going against calls for internet sovereignty that have been pushed by countries such as China. While one reader held that the issue is too complex for a simple yes or no answer, most others seemed convinced that information should be able to flow freely between and within nations. One ambitious reader went so far as to propose, “The internet should be engineered in such a way to make it impossible for a government to even have the choice either way.” That may be easier said than done, though: “Realistically there’s nothing that can be done to change this,” one reader observed.

The question of practicality also haunted responses to a question about whether countries should work to preserve a free and open internet. Most felt that nations should, but only a few readers laid out ideas about how to do it. One suggested “requiring network neutrality and prohibiting zero-rating,” a type of service in which an internet provider “allows internet users to access certain sites or apps without accumulating data charges,” as we explained on our cheat sheet. Another got more specific, suggesting that we “mandate the separation between content creators/owners (on the one hand) and ISPs/backbone (on the other).” As that same respondent noted, it may also be important to “work on international oversight so that companies can’t simply circumvent restrictions by hopping a border,” though that too presumably presents another set of complications.

When it came to which internet governance trends truly worried them, though, some readers didn’t seem especially concerned about nation-states. To the contrary, many echoed Emily Taylor’s assertion that companies—not countries—control the internet. “Vertical integration of content providers and ISPs” troubled one reader, and another wrote that “mega media mergers are a concern.” Still, a few did discuss issues originating in specific countries, as did one who pointed to the U.K. Investigatory Powers Act 2016—a recent bill that gave government authorities the right to hack and otherwise surveil individuals—and another who gestured more broadly to “China’s ability and willingness to control IT content.”

The United States, for its own part, did not come out clean, with one reader claiming that we have a “Congress and SCOTUS who clearly have little or no understanding of how the internet … works.” We applaud the readers who took the time to find out over the past month.

Jill Stein’s recount effort isn’t going to change the results of the presidential election, but Frank Pasquale argues that it’s still a good idea. Voting machines in the U.S. are rife with vulnerabilities, Pasquale writes, and re-examining the vote could help make sure that future elections are secure and reliable. It’s better to find and fix problems now instead of waiting till 2020.

Though there is no evidence foreign actors interfered with the vote, it’s almost certain that Russia hacked the computers of U.S. officials and used its findings to sway the election. Maria Farrell examines why Russia has such a fraught relationship with Western countries when it comes to the internet. And while we’re talking about the election and its fallout: Will Oremus argues that we should be careful about how we use the term fake news. (But guess who else is worried about fake news? Russia.)

Law and order: I recapped our recent Future Tense event in Washington, D.C., in which experts discussed the potential for technology to prevent crime and improve the relationship between law enforcement agencies and the communities they serve. You can watch the full event on New America’s website.

Upcoming events:

Join Future Tense in New York the evening of Wednesday, Dec. 14, for a happy hour conversation with Tim Wu and New York Times writer (and Slate alumna) Amanda Hess to discuss the impact of advertising and the consequences we might face for leading more artificially moderated lives. For more information and to RSVP, visit the New America website.

RESCHEDULED: Will the internet always be American? On Tuesday, Jan. 24, Future Tense will host a live event in Washington, D.C., to explore the internet’s nationality: the extent to which it’s an expression of American culture, and how that may be changing. You can RSVP to attend in person or watch online here.

In approximately 90 cities throughout the United States, ShotSpotter sensors—gunfire detection technology—time-stamp and triangulate the location of shots fired in an effort to aid law enforcement in cracking down on gun violence. In many underserved communities, residents are unlikely to call 911 to report gunshots, so the technology is meant to make police aware of gun use in their jurisdictions that would otherwise go unreported.
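The triangulation such systems perform is typically based on comparing when the same bang reaches different sensors (time difference of arrival). Here is a minimal sketch of the idea; the sensor positions, shot location, and brute-force grid search are all hypothetical, and real deployments use many sensors, precise clocks, and proper least-squares solvers:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20°C (assumed constant)

# Hypothetical sensor positions (metres) and a true shot location.
sensors = [(0.0, 0.0), (500.0, 0.0), (0.0, 500.0), (500.0, 500.0)]
true_shot = (120.0, 340.0)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Each sensor records an arrival time. Only the *differences* between
# arrivals matter, since no sensor knows when the shot was actually fired.
arrivals = [dist(true_shot, s) / SPEED_OF_SOUND for s in sensors]

def locate(sensors, arrivals, step=5.0, cells=101):
    """Pick the grid point whose predicted time-difference-of-arrival
    pattern (relative to sensor 0) best matches the observed one."""
    best, best_err = None, float("inf")
    t0 = arrivals[0]
    for ix in range(cells):
        for iy in range(cells):
            x, y = ix * step, iy * step
            pred = [dist((x, y), s) / SPEED_OF_SOUND for s in sensors]
            err = sum((p - pred[0] - (t - t0)) ** 2
                      for p, t in zip(pred, arrivals))
            if err < best_err:
                best, best_err = (x, y), err
    return best

print(locate(sensors, arrivals))  # close to the true shot at (120, 340)
```

With four sensors the relative delays pin the source down to a single point; the grid search simply makes the geometry visible without any algebra.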

Across the country law enforcement agencies are already deploying powerful technologies like ShotSpotter, surveillance systems, body-worn cameras for police, and predictive algorithms that attempt to anticipate where crime will happen. Proponents of the technologies argue that they have the potential to increase accountability and improve the ways police engage with the community.

On Nov. 30, Future Tense—a partnership of Slate, New America, and Arizona State University—convened experts for a live event in Washington, D.C., to discuss the potential for technology to prevent crime and improve the relationship between law enforcement agencies and the communities they serve.

Ralph Clark, president and CEO of ShotSpotter, says that his company’s product gives police the ability to proactively protect the public by responding to unreported gun violence. Yet in many communities, the problem isn’t necessarily whether police are responding to crime—it’s how they respond. Kami Chavis, director of the criminal justice program at Wake Forest University, says, “Police departments indeed have an obligation to use technology and anything they can to keep communities safe. At the same time, though, we have this very delicate balance of protecting the civil liberties of people who live in those communities as well.”

Critics argue that high-tech law enforcement tools, when left unchecked, can threaten privacy and perpetuate harmful biases. Predictive policing, for example, relies on historic crime data to anticipate where crime is most likely to take place or who is most likely to commit a crime. But this data shows an incomplete picture. Logan Koepke, an analyst at Upturn, points out that patterns of biased enforcement will be translated into the algorithms that predict future crime. The result is increased police presence and surveillance in already over-policed and over-surveilled communities.
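Koepke’s feedback-loop point can be made concrete with a toy simulation. The numbers below are entirely invented: two districts have identical true incident rates, recorded incidents scale with patrol presence, and each period patrols are reallocated in proportion to what was recorded—the basic shape of a predictive system trained on its own enforcement data.

```python
# Toy model: two districts with IDENTICAL true incident rates.
# Recorded incidents depend on patrol presence, and next period's
# patrols follow the recorded data — a self-reinforcing loop.
true_rate = [100.0, 100.0]   # actual incidents per period (hypothetical)
patrols = [30.0, 70.0]       # initial allocation is skewed
TOTAL_PATROLS = 100.0

for period in range(10):
    # Assumption: a district's recorded count scales with patrol share.
    recorded = [true_rate[i] * patrols[i] / TOTAL_PATROLS for i in range(2)]
    # "Predictive" reallocation: patrols follow the recorded data.
    total = sum(recorded)
    patrols = [TOTAL_PATROLS * r / total for r in recorded]
    print(period, [round(p, 1) for p in patrols])
```

Even this mild linear model never corrects itself: the 30/70 skew is reproduced period after period, because the data keeps “confirming” that the more-patrolled district has more crime. Models in which predictions trigger additional stops or surveillance amplify the skew rather than merely preserving it.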

Even when we know the problems with a law enforcement technology, it isn’t easy to find solutions. Lauren Kirchner, senior reporting fellow at ProPublica, warns, “The speed at which technology is advancing and the infrastructure of surveillance are building up so fast that oversight and regulation can’t possibly keep up. Even public knowledge can’t keep up.”

In many cases, the people being surveilled by these technologies aren’t aware of it or don’t understand its implications for their future. For example, body-worn cameras, which many called for after the shooting and protests in Ferguson, Missouri, have the potential to alter police-community relations for the better. But some companies are working to integrate facial recognition software into their bodycams—and without strong regulation, that could make them just another tool to catalog, track, and monitor individuals. It’s easy to think there’s nothing to worry about if you’re not doing anything criminal, but Jennifer Lynch, staff attorney for the Electronic Frontier Foundation, pointed out that many social movements, such as the push for LGBT rights, were once seen as unacceptable. Imagine how things might be different had those communities been under constant surveillance by people who considered their behavior unlawful.

That’s why it’s so important for members of communities to work with law enforcement to create fair structures and systems as departments adopt and deploy these technologies. And for that to happen, we need better police-community trust. Samuel Sinyangwe, co-founder of WeTheProtesters, reminds us, “It is impossible to have police-community trust in a black community when the majority of black youth, either themselves personally or somebody that they know personally has experienced police violence.” He suggests a good start to addressing this issue is increased accountability and transparency from law enforcement. And technology itself can be part of it: As Philadelphia City Councilman David Oh put it, “The technology is not a problem—the technology is a promise of transparency and a uniformed protocol.”

There are a few ways to fulfill that promise—and chief among them is bringing together citizens, law enforcement, and the companies that create these technologies to identify areas of improvement and understanding. Denice Ross, co-founder of the White House Police Data Initiative, believes “data transparency can shift the dialogue from one of confrontation to collaboration.” Comprehensive data documenting police conduct is necessary for police reform. With it, communities can hold police accountable for their actions. Without it, citizens are limited in their ability to review the behavior of their police, widening the trust gap between citizens and law enforcement.

Data and technology have the potential to do a lot of good—but unless there is an open dialogue with members of the community, they could also perpetuate critical problems within our justice system. NYPD Deputy Commissioner Tracie Keesee says that we must be careful about how much we rely on technology to mediate human interaction. Or as she put it: “Trust is a human component, not something technology can build for you.”

It’s nearly impossible to do anything today without being exposed to some form of branding, messaging, or sponsored content meant to capture our attention.

Every successful medium over the past century—from newspapers to radio and television to Google and Facebook—has attained commercial viability as an advertising platform. And the reach of advertising seems to know no limits—have you seen those ads on the bottom of Transportation Security Administration bins at airport security checkpoints? In his new book, The Attention Merchants: The Epic Scramble to Get Inside Our Heads, Tim Wu documents how the capture and resale of human attention has grown into the defining industry of our time. He argues that technologies—the devices and social media platforms that never leave our sides—are not the source of our distraction; rather, they’re a conduit for the messaging that makes it more inescapable than ever before.

Join Future Tense on the evening of Wednesday, Dec. 14, in New York for a happy hour conversation with Tim Wu and New York Times writer (and Slate alum) Amanda Hess to discuss the impact of advertising and the consequences we might face for leading more artificially moderated lives. For more information and to RSVP, visit the New America website.

Watching a recent trailer for the forthcoming Rogue One: A Star Wars Story, I caught a brief glimpse of an old-fashioned control panel, its huge, brightly lit buttons shining out against the imperial dark. For all the familiar futuristic trappings of the surrounding shots, that one image is immediately recognizable as the residue of an older fantasy. It is a trace not just of the earlier films in the series but also of the moments in which those films were made, evidence of a way we once imagined the world could someday be.

As I contemplated that shot, I found myself pulled back to Tom Gauld’s recent comic book, Mooncop. At once poignant and quietly comical, Gauld’s short novel, published in September, follows an unnamed police officer as he goes about his business during a peaceful lunar colony’s final days. Everything has an oddly outmoded quality, from the modular modernist buildings—which look like Le Corbusier dwellings as interpreted by NASA—to the fishbowl glass helmets people wear in place of more functional spacesuits. There’s no crime to solve, his therapist robot is on the fritz, and the satellite’s population is dwindling as the colonists give up and return to Earth, leaving him increasingly alone among its washed-out craters and crags. This is an earlier idea of the future, one that was largely forgotten when it failed to line up with the unfolding present.

Tom Gauld/Drawn & Quarterly

Over email, Gauld told me that he’d gone back into his own past to find inspiration for the book. “I was thinking of the science fiction I ingested as a child and which, at the time, I perhaps didn’t entirely realize was fiction,” he wrote. More recently, he recalls seeing “a 1960s tin toy of a silver moon patrol police car with a figure driving from within a glass dome,” a reminder, as he puts it, that we once imagined “not only that we would colonize the Moon but that it would be so successful that we’d need a police force up there to maintain order.” Yet the box of that same toy showed the officer surveilling an empty lunar landscape: Much like the hero of the book Gauld would eventually write, the officer was patrolling just to patrol.

Gauld’s most resonant themes here are disconnection and isolation. In one elegant, melancholy sequence, our protagonist observes the distant Earth from a lunar bluff, watching enormous clouds kiss the continents. “It seems like the party’s over and everybody’s going home,” he complains later. That lament vibrates throughout the surrounding pages: A few notable exceptions aside, he is almost always the only character in any given panel. When others join him, he still typically stands apart, secure and distant in the glass bubble of his space helmet.

Tom Gauld/Drawn & Quarterly

Even the most dogged residents of Gauld’s moon aren’t quite sure what they’re doing on its inhospitable surface. “Whatever were we thinking? It seems rather silly now,” one of the earliest colonists wonders aloud, shortly before departing. Silly is exactly the right word: Despite the somber blue and gray shades of Gauld’s art, Mooncop’s tone is much like that of the science fiction he enjoyed as a child—“hopelessly naive, but often sweetly so,” as he puts it.

Of course, such naiveté isn’t always a design choice, just as it isn’t always sweet. In ways Gauld couldn’t have anticipated while working on the book, it’s tempting to read it in dialogue with Donald Trump’s anti-scientific agenda. At least one of the president-elect’s advisers has suggested that the incoming administration plans to abandon NASA’s climate change research and instead pump funds into space exploration. As with most of Trump’s nebulously defined goals, that plan has more to do with re-creating past glories—Space Age triumphs, in this case—than it does with building a better, more livable tomorrow. He would rather reconstruct an obsolete vision of the future than protect against imminent threats in the present.

On a too-facile reading, Mooncop’s retrofuturistic style might seem to align Gauld’s work with Trump’s perverse rejection of immediate, terrestrial realities. Like the aesthetics of Star Wars—aesthetics that inspired some of Gauld’s own design choices—this is a dream that we cling to, even as the waking world chips away at it. Ultimately, though, Mooncop instead revels in the persistent promise of possibilities that never quite found their way into the waking world. It hollows out a space within those evacuated visions and invites us to live within it.

Mooncop, in other words, makes poetry from the act of dwelling on what could have been, however improbable it might seem in retrospect. This is a story about recommitting to dreams that others have abandoned. If Mooncop is a lonely book, it’s likely because dreaming is lonely work, even when we struggle—as Gauld’s hero ultimately does—to dream together. But as Gauld shows, the very effort can also be surprisingly beautiful.

This erosion of civil rights didn’t start in the era of Donald “Burn the flag, go to jail” Trump, though he’s made clear his intentions to accelerate it (particularly when it comes to freedom of expression). And the United States isn’t the only place it’s happening.

Some of the most disappointing moves to limit people’s human rights have already taken place in Europe. The United Kingdom—which gave the world the Magna Carta—just enacted its so-called “snooper’s charter,” which will enshrine what can only be called the first full-on surveillance state among Western democracies. Among other notable intrusions into people’s privacy, it will give government vast new authority to hack people’s devices; force companies to decrypt personal information when the government asks for it; and require telecommunications companies to store users’ online activities, with a host of government agencies being able to fish around in that data without a warrant. As Jack Schofield observes in the Guardian, “It more or less removes your right to online privacy.”

Even before Trump takes office, the U.S. has been heading down some similar paths. As Edward Snowden showed, American surveillance services have been longstanding abusers of privacy and the law while Congress has, in general, winked at it all. For example, beginning Dec. 1—barring an unlikely last-second miracle in Congress—federal law enforcement will have a legal right to hack remotely into everyone’s devices with just a wink and nod from any magistrate who’ll authorize it. You probably haven’t heard about this, because our news media have largely failed to notice it except in passing.

In fact, Congress has, with few exceptions, been a champion of surveillance for years. Democrats and Republicans have supported all kinds of incursions on the privacy not just of foreigners but of Americans as well, in the name of protecting us. So it would be foolish to expect much pushback from our national lawmakers when Trump takes power, even though what’s coming should alarm anyone who believes in basic liberty. (Congress does occasionally do the right thing on freedom of speech, as it has with the just-passed legislation forbidding companies from gagging their customers’ right to post negative online reviews.)

If Congress is feckless about liberty, Trump looks reckless. He has been clear throughout his campaign: He will be as authoritarian as he’s permitted to be, using the levers of government—including our vast surveillance apparatus and law enforcement—to accomplish his goals.

His campaign tirade against Apple when the company was battling with the FBI over the San Bernardino shooter’s iPhone was telling. Apple and some other tech companies are moving to help users encrypt more of their data, a trend that governments, especially authoritarian ones, hate with a passion. It will not surprise me at all if Trump insists that Congress pass a law banning the use of encryption that the government cannot break. Never mind that security experts overwhelmingly agree that compromising strong encryption leaves everyone less safe in the end.

In a climate for liberty that can only be called chilly, we have to start planning for the worst even if we hope for the best. Given our climate of fear, I suspect most people don’t care enough. But what should you do if you do care?

As Schofield notes, there’s relatively little you can do if the government individually targets you. A nation-state has more than enough resources to overcome almost any measure you might take. But we do have some ways to make mass, pervasive surveillance and hacking more difficult—and thereby be safer from criminals, not just governments we may not trust.

Some of the measures are easy. We can keep our software up to date; unpatched operating systems and applications are a favorite way for criminals—and governments—to penetrate our devices.

Activists and others who want to challenge the people in power will face a sterner test. As the Intercept explained in an article titled “Surveillance Self-Defense Against the Trump Administration,” it is vital to encrypt mobile devices; use encrypted messaging apps such as Signal; move away from Facebook discussion groups for sensitive conversations; and much more.

But technology is not enough. As Snowden told European investigative journalists last month, they can’t win a surveillance arms race against the National Security Agency, the U.K.’s GCHQ, or other powerful government bodies. They have to lobby for privacy-protecting policies. So do everyday citizens who want to preserve liberty in this scary time.

The computer system that serves San Francisco’s Muni was hacked late last week, giving locals tens of thousands of free rides on the nation’s seventh-largest transit system. The ransom, according to correspondence between the San Francisco Examiner and the email address displayed on Muni employees’ hacked computer screens, was 100 Bitcoin, or about $74,000.

By Sunday, station ticket machines were up and running again, but the hackers indicated to Hoodline, a local news site, that they had compromised more than 2,000 computers in the Muni network in addition to agency-wide functions like payroll, email, and real-time bus locations. To cope, the transit agency was assigning routes to bus drivers via hand-written notes on bulletin boards, the Examiner reported.

The ransomware at work appears to be HDD Cryptor, also known as Mamba, which blocks access to compromised computers entirely. Its rapid takeover of Muni demonstrates—again—the extensive vulnerabilities of networked devices and “smart” infrastructure.