The hackers who dominate news coverage and popular culture -- malicious, adolescent techno-wizards, willing and able to do great harm to innocent civilians and society at large -- don't exist.

The perceived threat landscape is a warped one, which directs attention and resources toward battling phantoms rather than toward preventing much more common data-security problems. According to the Privacy Rights Clearinghouse, the loss or improper disposal of paper records, portable devices like laptops or memory sticks, and desktop computers has accounted for more than 1,400 data-breach incidents since 2005 -- almost half of all the incidents reported. More than 180,000,000 individual records were compromised in these breaches, including individuals' names, Social Security numbers, addresses, credit-card information and more. Compare that with the 631 incidents from the same period that the Clearinghouse assigns generically to "hacking or malware." Your private data is more likely to be put at risk by a factotum leaving a laptop on a train than by a wired teen with too much time on his hands.

Insider threats, otherwise known as frustrated grown-ups with real jobs, also constitute a significant challenge for information security. The Wall Street Journal recently reported on a survey in which 71 percent of IT managers and executives said insider threats present the greatest risk to their companies.

And the recent high-profile security breach at LinkedIn shows that one of the greatest risks to our personal security is ourselves: more than two-thirds of the leaked LinkedIn passwords were eight characters or fewer in length, and only one percent used the mix of upper- and lower-case characters, numbers, and symbols that makes passwords difficult to crack.
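The arithmetic behind that weakness is simple: a password's resistance to brute force grows exponentially with both its length and the size of its character set. A minimal sketch of the math (the character-set sizes and the attacker's guess rate below are illustrative assumptions, not figures from the LinkedIn breach):

```python
# Brute-force keyspace comparison for 8-character passwords.
# Assumptions (hypothetical, for illustration): the attacker must try
# every combination, and can make 10 billion guesses per second --
# a plausible rate for an offline attack against a fast, unsalted hash.

def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible passwords for a given length and alphabet."""
    return alphabet_size ** length

lower_only = keyspace(26, 8)               # lowercase letters only
full_mix = keyspace(26 + 26 + 10 + 32, 8)  # upper, lower, digits, symbols

GUESSES_PER_SECOND = 10_000_000_000

print(lower_only / GUESSES_PER_SECOND)  # ~21 seconds to exhaust
print(full_mix / GUESSES_PER_SECOND)    # ~6e15 keys: roughly a week
```

Length compounds the effect: under the same assumptions, four additional mixed-case characters multiply the keyspace by 94^4, pushing the attack from days to millennia.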

But these more serious threats don't seem to loom as large as hackers in the minds of those who make the laws and regulations that shape the Internet. It is the hacker -- a sort of modern folk devil who personifies our anxieties about technology -- who gets all the attention. The result is a set of increasingly paranoid and restrictive laws and regulations that limit our ability to communicate freely and privately online and to use and control our own technology, and that put users at risk of overzealous prosecution and invasive electronic search-and-seizure practices. The Computer Fraud and Abuse Act, the cornerstone of domestic computer-crime legislation, is overly broad and poorly defined. Since its passage in 1986, it has created a pile of confused caselaw and overzealous prosecutions. The Departments of Defense and Homeland Security manipulate fears of techno-disasters to garner funding and support for laws and initiatives, such as the recently proposed Cyber Intelligence Sharing and Protection Act, that could have horrific implications for user rights. In order to protect our rights to free speech and privacy on the internet, we need to seriously reconsider those laws and the shadowy figure used to rationalize them.

* * *

The hacker character in mainstream culture has evolved as our relationship with technology has changed. When Matthew Broderick starred in War Games in 1983, the hacker character was childish, driven by curiosity and benign self-interest, and sowed his mayhem largely by accident. Subsequent incarnations, like those in Hackers, Sneakers, GoldenEye, and Live Free or Die Hard, became more dangerous and more intentional in their actions, gleefully breaking into protected networks and machines and causing casual destruction incomprehensible to techno have-nots. The hacker in American film, almost always white, middle class, and male, is immature, socially alienated, vindictive, and motivated by selfish goals or personality problems. The plots of such films are built on apocalyptic techno-paranoia, reflecting a belief that hackers have supreme control over the technologies that make the world run.

News coverage parallels the pop culture frame. Basement-dwelling hackers remain a primary villain on the evening news and the front page, even at the cost of an accurate and rational portrayal of current events. "Hacking" is used as a catch-all term to describe almost any computer-related crime or "bad" action, no matter the skills or techniques involved. Coverage often confuses what could happen with what is actually happening, reporting on theoretical exploits of the type often presented at security conferences as if they were a clear and present danger. Recent media and government fixation on the prankster-protesters of Anonymous has stoked the fires of techno-paranoia and, as Yochai Benkler pointed out in a recent article in Foreign Affairs, has conflated modes of electronic civil disobedience with outright cybercriminality in ways that damage the cause of political speech online.

The hacker lurks in the network, a decentralized threat, able to cause harm far from his actual location. His relationship with technology is pathological; he is compulsive in his hacking and therefore cannot be reformed. Because he is socially alienated, he lacks the normal social checks on his behavior, and is instead stuck in a feedback loop with other hackers, each trying to outdo the other in juvenile mayhem on the public internet. Add to all this the hacker's superhuman ability to manipulate anything running code, and you have a terrifying modern boogeyman from whom society must be protected at all costs.

* * *

In the effort to protect society and the state from the ravages of this imagined hacker, the US government has adopted overbroad, vaguely worded laws and regulations that severely undermine internet freedom and threaten the Internet's role as a place of political and creative expression. In an effort to stay ahead of the wily hacker, laws like the Computer Fraud and Abuse Act (CFAA) focus on electronic conduct or actions, rather than on the intent of or actual harm caused by those actions. This leaves a wide range of seemingly innocuous digital activities open to treatment as criminal acts. Distrust of the hacker politics of Internet freedom, privacy, and access abets the development of ever-stricter copyright regimes and laws like the proposed Cyber Intelligence Sharing and Protection Act, which, if passed, would have disastrous implications for personal privacy online. The hacker folk devil as depicted in popular culture and news coverage is the target of and the justification for these laws and regulations. But rather than catching that phantom, these laws invite guilt by association, mistaking skill with computers for intent to harm. They snag individuals involved in non-criminal activities online, as happened in the case of Bret McDanel, who served 16 months in prison for sending a few emails, and leave the rest of us with legally crippled technology and a confused picture of our rights online.

Crafting governmental and corporate policy in reaction to a stereotyped social ghoul lurking in the tubes is ineffective at best, and actively malignant at worst. There are real threats in the online space, from the banal reality of leaving a laptop on the bus and sloppy personal security habits to the growing reality of inter-state cyberwar. However, focusing on the boys-in-the-basement hacker threat model drains attention and resources from discovering what and where the actual threats are. Taking down file lockers, criminalizing jailbreaking, modding, and terms-of-service violations, and casting legal aspersions on anonymous and pseudonymous speech online is distracting fear-mongering and wastes governmental and corporate resources. Recent court decisions, like the opinion handed down by the Ninth Circuit in US v. Nosal, work to narrow the scope of the CFAA, which gives hope to the idea that it is possible to regulate the Internet in a more reality-driven way.

In order to achieve that regulation, though, we must discard the hacker stereotype as a central social villain and legal driver. The past few years have seen the internet emerge as a central haven for political speech, domestically and internationally. The internet has been used to exchange ideas, organize protests, and overthrow dictators. We hold the right to free political speech dearly in this country, and, for better or for worse, the laws we pass regarding the regulation of the internet have a disproportionately large impact on the way this international resource operates. The question we must ask ourselves is this: Do we want the next Arab Spring regulated out of existence by our fear of hackers who don't even exist?
