As President Obama meets with his intelligence cabinet today, here's hoping that he asks a couple of pointed questions:

-- What does the National Counterterrorism Center actually do? Why is it staffed so heavily with junior people?

-- Is there too much intelligence? Whose job is it to act on hunches and run them through the system?

-- Who is responsible for rapidly upgrading the intelligence community's IT infrastructure?

-- How can the White House pressure Congress to do a better job of oversight?

-- If an ambassador or a policy maker senses something suspicious, why shouldn't he or she be held accountable for failing to flag it?

-- Why am I meeting with so many different people to figure out what went wrong? Does that give me an indication of the problem itself?

For the sake of the future of U.S. intelligence, let's ban the dot-connecting metaphor. The Christmas Day terrorist attempt was fundamentally not about a failure to connect the dots. The dots were already connected. An elaborate portrait was drawn, with chiaroscuro shading and an antique frame. All that was needed was a policy maker to act on the information -- someone whose job it is to look at the picture and decide how to change policy accordingly. This could have been the administrator of the FAA. It could have been the chief of station in Lagos. It could have been Mike Leiter, the director of the National Counterterrorism Center. It could have been a senior FBI agent working counterterrorism. It could have been a senior SIGINT analyst at the National Security Agency. It could have been a regional security officer in the Diplomatic Security Service. It could have been Janet Napolitano, or someone working for the TSA's intelligence division. All of them had plenty of information. No one felt compelled to shake the tree.

"In some ways, this case is worse than 9/11," said Amy Zegart, the highly regarded UCLA scholar of national security. Before 9/11, when an FBI agent suspected that two Al Qaeda terrorists were inside the United States, she made the decision to call the FBI's Bin Laden unit in New York and impress upon its head that the information in her possession was important. "In this case, no one seems to have even picked up the phone."

Before 9/11, Zegart says, agencies refused to share information even when they were sitting in the same room. Since 9/11, information sharing has become routine -- too routine, mere paper shuffling. What good is a State Department cable if its originator does not believe it is important? Why have continually updated terrorist watch lists that don't trigger anything in particular? When the National Security Agency, which routinely intercepts about 200 million communications per day, actually finds a diamond in the rough -- chatter about a Nigerian who is going to attempt an attack on an airplane -- what happens when everyone knows about it but no one feels compelled to act upon it?

One person who advises the administration on intelligence issues compared the process of producing intelligence nowadays to Gulliver, constrained by 1,001 nodes for dot collection. Fusion, for the sake of fusion, provides policy makers with no advantage.

The National Counterterrorism Center is responsible for all-source analysis. As envisioned by the 9/11 commission and then by Congress, its highly trained, experienced analysts would fuse data from all 16 intelligence agencies and produce "products" that policy makers would then use to make decisions. Critics worried that the NCTC's establishment would displace the decision makers themselves; they would rely on others to connect dots and tell them what to do. Centralizing intelligence analysis when fighting a broadly distributed and evolving enemy did not make sense; separating the function of high-level counterterrorism analysis from intelligence collectors -- the folks at the CIA's Counterterrorism Center -- was like a blueprint for a game of telephone.

There have been worthy initiatives from the ODNI -- science-based analytical guidelines, new joint-duty requirements for managers -- heck, former DNI Mike McConnell even managed to eliminate one very large and not very useful technical collection program.

But the DNI is not yet the enabler, the facilitator, of intelligence collection and production that Congress intended. And you can blame Congress for that. It has lost focus on intelligence reform; the professional staffs on the intelligence committees are talented, but many of their bosses don't have the time, or the interest, to learn about intelligence. And committee term limits in the House make it difficult for members to specialize.

Congress bears responsibility for making sure that the information spigot is filtered; for making sure that the departments and agencies it creates are working properly; for making sure that the DNI has the power and budget authority to back up the statutory obligations; for being forward-looking, and not just reactive. When Congress holds its hearings on the Christmas Day plot, here's hoping that it asks questions about its own oversight functions.

The intelligence community's information infrastructure remains outdated and duplicative. A bevy of talented chief technology officers and CIOs are trying to figure out what to keep and what to ditch, but it is an "absolute disaster" when compared to what private companies routinely build, said AEI's Norm Ornstein, who has been worried about the issue for years. Again, the problem isn't information sharing. It's that human beings can only do so much. The fact that the TIDE database isn't regularly, instantly cross-checked against the Border Patrol's APIS database or the State Department's visa and passport watch lists -- that the computers can't talk to one another and flag potential problems instantaneously -- cannot be blamed on a "need to know" culture. Google can do this; FedEx DOES this. Why can't the IC? Blame leadership for failing to prioritize information technology.
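To make the point concrete: the kind of automated cross-check described above is, at its core, simple. Here is a minimal sketch in Python, with invented database contents and a deliberately crude name-normalization step -- real systems like TIDE and APIS have far more complex records, and real matching must handle transliteration and partial data -- but it illustrates that flagging an overlap between two lists requires no analyst in the loop.

```python
def normalize(name: str) -> str:
    """Collapse case, punctuation, and spacing so trivial spelling
    variations don't defeat the match. (Real systems need far more
    robust matching, e.g. for transliterated names.)"""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def cross_check(watchlist, manifest):
    """Return every passenger on the manifest whose normalized name
    appears on the watchlist -- flagged instantly, automatically."""
    flagged = {normalize(name) for name in watchlist}
    return [p for p in manifest if normalize(p) in flagged]

# Invented example data -- these names and lists are hypothetical.
watchlist_entries = ["Adam K. Example", "Jane Q. Sample"]
flight_manifest = ["John Smith", "ADAM K EXAMPLE", "Mary Jones"]

hits = cross_check(watchlist_entries, flight_manifest)
print(hits)  # -> ['ADAM K EXAMPLE']
```

The hard part is not the comparison itself but the plumbing: getting agencies to expose their databases to one another's systems in real time. That is an engineering and leadership problem, not a "need to know" problem.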

Inevitably, because he is the director, Adm. Dennis Blair is receiving the lion's share of the blame. Blair is not popular at the White House, and he has picked several battles with the CIA that turned out to be battles he could not win. (First rule of institutional politics: don't pick battles you can't win!) Really, though, any DNI would be in Blair's position. He cannot de-layer the intelligence bureaucracy -- he hasn't the power to do so. He can't force the CIA to better integrate its operations with the national intelligence strategy. Hiring "better" people won't work; there are great people throughout the intelligence agencies who do incredible work. The problem, as one former CIA case officer explained to me, is this:

"Journalism is a lot like spying since it involves gathering information and talking to people. Imagine if you had to relay every word you published at The Atlantic through six or more people, each of whom edited what you had written. Some of them like to sit on things for a while, others like to wait overnight in order to sleep on it. Some are out for the afternoon and can't get to it until the next day. None of these people actually produce anything themselves, they just edit your work. Your work would eventually be published, but it would take a long time, and in some cases the final product would no longer resemble what you had originally written."

One wagers that, if there is reform to be had, Congress will insist that it consist of a new layer of bureaucracy. A Senior Overseer of Intelligence Prioritization, reporting directly to the Deputy Director of National Intelligence for Foreign and Domestic Intercepts and Threat Conditions. Why not solve this whole business with a few new business cards? It's easier than getting people to accept responsibility.

"It does not appear from public comments that the administration is forward leaning on intelligence improvements," Zegart said. "They're in a reactive posture."
