Facebook’s proposed solution was not necessarily to use social media less, but rather to use it differently. What followed over the next year was a series of steps designed to get people to use Facebook more “actively” — increasing the number of comments, while decreasing the number of stories and videos from professional publishers in the News Feed.

Each of the 143 participants completed a survey to determine mood and well-being at the study’s start, and shared screenshots of their iPhone battery screens to provide a week’s worth of baseline social-media usage data. Participants were then randomly assigned either to a control group, which maintained its typical social-media behavior, or to an experimental group that limited time on Facebook, Snapchat, and Instagram to 10 minutes per platform per day.

For the next three weeks, participants shared iPhone battery screenshots to give the researchers weekly tallies for each individual. With those data in hand, Hunt then looked at seven outcome measures including fear of missing out, anxiety, depression, and loneliness.

Participants who reduced their time on social sites saw a statistically significant decrease in depression and loneliness, according to the study. The control group did not report an improvement.

The study’s authors present this as a milestone. They conclude:

The results from our experiment strongly suggest that limiting social media usage does have a direct and positive impact on subjective well-being over time, especially with respect to decreasing loneliness and depression. That is, ours is the first study to establish a clear causal link between decreasing social media use, and improvements in loneliness and depression. It is ironic, but perhaps not surprising, that reducing social media, which promised to help us connect with others, actually helps people feel less lonely and depressed.

The study’s lead author, psychologist Melissa G. Hunt, told Science Daily that she did not recommend that people stop using social media. But limits can be helpful, she said.

Facebook didn’t respond to my request for comment. On the subject of time spent in its apps, the company has arguably already capitulated. In response to the Time Well Spent movement, the company voluntarily introduced in-app screen time limits earlier this year. (Well, it announced those limits, anyway. They still haven’t shipped, for reasons no one will tell me.) Apple and Google, which control Facebook’s key distribution channels, shipped screen time management features of their own.

While the study finds evidence that social media usage can make us depressed, it doesn’t offer any thoughts on why. Hunt has offered some theories in interviews, primarily the idea that seeing other people’s happiness can create negative comparisons with our own experiences. But if we are to better understand how to manage our relationship with social networks, we need to understand those mechanics much better. To know that social media often makes us feel lonely now seems like a given. Knowing why feels like an important next step.

The president is attempting to undermine confidence in the midterm election by repeatedly tweeting baseless claims of voter fraud. I find it all tremendously alarming. Jane Lytvynenko walks us through it:

Trying to imagine a logical explanation for this, from Joe Uchill, and coming up short:

French President Emmanuel Macron released an international agreement on cybersecurity principles Monday as part of the Paris Peace Forum. The original signatories included more than 50 nations, 130 private sector groups and 90 charitable groups and universities, but not the United States, Russia or China.

I’m not sure it’s fair to say, as Dustin Volz and Robert McMillan do, that Russia “largely skipped the midterms.” They were running an online influence campaign, and Facebook has continually removed pages related to it. But:

Voting largely came and went without major incident, according to U.S. officials and cybersecurity companies looking for evidence of Russian interference.

Several factors may have reduced Moscow’s impact. Clint Watts, a senior fellow with the Foreign Policy Research Institute, said the diffuse nature of congressional and state races makes them a harder target than a single presidential election.

Facebook’s relationship with law enforcement is growing closer, partly due to the threat of foreign interference in our elections, report Sarah Frier, Selina Wang, and Alyza Sebenius. Feels like this could be a double-edged sword:

Now, communication lines have started to open between Facebook and federal agencies including the Department of Homeland Security and the Federal Bureau of Investigation, according to the company. Facebook also established relationships with state election boards, so it could be alerted to problems as they occurred. Those connections are likely to strengthen ahead of the 2020 presidential election, when foreign interest in election manipulation may be higher. Twitter Inc., too, has strengthened its relationship with federal law enforcement agencies, seeking to protect against foreign influence.

YouTube CEO Susan Wojcicki has more to say about Article 13, the European Union’s proposed copyright directive. In this op-ed, she warns that if passed it would deprive the internet of the music video for “Despacito.”

“This video contains multiple copyrights, ranging from sound recording to publishing rights,” Wojcicki wrote. “Although YouTube has agreements with multiple entities to license and pay for the video, some of the rights holders remain unknown. That uncertainty means we might have to block videos like this to avoid liability under article 13. Multiply that risk with the scale of YouTube, where more than 400 hours of video are uploaded every minute, and the potential liabilities could be so large that no company could take on such a financial risk.”

In a novel collaboration, Facebook has invited French regulators to embed with the company and examine how it moderates content, Tony Romm and James McAuley report:

Under a six-month arrangement announced on Monday, French investigators will monitor Facebook’s policies and tools for stopping posts and photos that attack people on the basis of race, ethnicity, religion, sexuality or gender. From there, aides to French President Emmanuel Macron hope to determine “the necessary regulatory and legislative developments” to fight online hate speech, a government official said.

In recent months, France and its fellow European countries have adopted a much harder line against Facebook and its social-media peers, demanding they police their platforms more aggressively to stop a range of digital ills — including conspiracy theories, fake news and terrorist propaganda. Macron pressed Facebook chief executive Mark Zuckerberg on the issue when the two met in Paris earlier this year.

This story about the firing of Palmer Luckey caused a stir over the weekend. But as best I can tell, from my own reporting and others’, Luckey’s firing had little to do with his politics and much more to do with the fact that he lied to Facebook executives about his role in funding anti-Clinton memes.

A new Harris Poll finds that consumers’ trust in Facebook is at a low:

Facebook is the least trustworthy of all major tech companies when it comes to safeguarding user data, according to a new national poll conducted for Fortune, highlighting the major challenges the company faces following a series of recent privacy blunders.

Only 22% of Americans said that they trust Facebook with their personal information, far less than Amazon (49%), Google (41%), Microsoft (40%), and Apple (39%).

Last month, New York wrote about parents protesting Summit Learning, a personalized learning curriculum that has been championed by Facebook. (The company has donated engineers to the project.) Now a New York high school is joining the opposition, Susan Edelman reports:

Brooklyn teens are protesting their high school’s adoption of an online program spawned by Facebook, saying it forces them to stare at computers for hours and “teach ourselves.”

Nearly 100 students walked out of classes at the Secondary School for Journalism in Park Slope last week in revolt against “Summit Learning,” a web-based curriculum designed by Facebook engineers, and bankrolled by CEO Mark Zuckerberg and his wife Priscilla Chan.

The simplest explanation for these swelling run times is straightforward business. As a new study from the Pew Research Center demonstrates, YouTube has been quietly shifting its recommendation system to reward lengthy videos. YouTube’s algorithm is famously opaque. So to reverse-engineer it, Pew researchers took more than 170,000 “random walks” through YouTube over a period of six weeks, letting the site’s recommendations be their guide. They ended up watching over 300,000 unique videos, but just two patterns emerged in the recommended videos’ statistics. First, the recommendation algorithm drove viewers toward more and more popular creators. Second, the recommended videos grew consistently longer: nine and a half minutes at first, then 12, then 15.
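For the curious, the “random walk” approach Pew describes is simple to picture in code. This is only a minimal sketch of the general technique, not Pew’s actual tooling; the `get_recommendations` function is a hypothetical stand-in for whatever scraper or API call supplies a video’s recommendation list.

```python
import random

def random_walk(start_id, get_recommendations, steps=5, rng=random):
    """Follow recommendations for up to `steps` hops, starting at `start_id`.

    `get_recommendations(video_id)` is a hypothetical helper that returns
    the list of recommended video IDs for a given video. At each hop we
    pick one recommendation at random, mimicking a viewer who always
    clicks something from the sidebar. Researchers would record metadata
    (length, view count, channel) for every video visited and look for
    trends across many such walks.
    """
    path = [start_id]
    current = start_id
    for _ in range(steps):
        recs = get_recommendations(current)
        if not recs:
            break  # dead end: no recommendations to follow
        current = rng.choice(recs)
        path.append(current)
    return path
```

Repeating this many thousands of times from varied starting points, then aggregating statistics over the visited videos, is what let Pew observe that recommendations drifted toward longer, more popular content.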

I enjoyed this Q&A with the Times’ Max Fisher about why he turned to covering social media for his excellent newsletter with Amanda Taub, The Interpreter. (Its most recent edition, about what the midterms mean for the health of our democracy, is also highly recommended.)

Even after reporting with Amanda Taub on algorithm-driven violence in Germany and Sri Lanka, I didn’t quite appreciate this until I turned on Facebook push alerts this summer. Right away, virtually every gadget I owned started blowing up with multiple daily alerts urging me to check in on my ex, even if she hadn’t posted anything. I’d stayed away from her page for months specifically to avoid training Facebook to show me her posts. Yet somehow the algorithm had correctly identified this as the thing likeliest to make me click, then followed me across continents to ensure that I did.

It made me think of the old “Terminator” movies, except instead of a killer robot sent to find Sarah Connor, it’s a sophisticated set of programs ruthlessly pursuing our attention. And exploiting our most human frailties to do it.

Last week, ballot initiatives to improve the functioning of democracy fared very well. In Florida — a state divided nearly equally between right and left — more than 64 percent of voters approved restoring the franchise to 1.4 million people with felony convictions. In Colorado, Michigan and Missouri, measures to reduce gerrymandering passed. In Maryland, Michigan and Nevada, measures to simplify voter registration passed. “In red states as well as blue states,” Chiraag Bains of the think tank Demos says, “voters overwhelmingly sent the message: We’re taking our democracy back.”