The Facebook news thread

Facebook is doing some soul-searching. The social media giant acknowledges the possibility that social media can have negative ramifications for democracy. This comes after repeated criticism that it didn't do enough to prevent the spread of fake news that had the potential to impact the 2016 U.S. presidential election.

"Facebook was originally designed to connect friends and family – and it has excelled at that," writes Samidh Chakrabarti, Facebook's Civic Engagement Product Manager. "But as unprecedented numbers of people channel their political energy through this medium, it's being used in unforeseen ways with social repercussions that were never anticipated. In 2016, we at Facebook were far too slow to recognize how bad actors were abusing our platform. We're working diligently to neutralize these risks now."

This is a marked change in tone from the week of the 2016 election, when Facebook CEO Mark Zuckerberg said it's a "pretty crazy idea" that fake news could have influenced the poll. "There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news," Zuckerberg said in November 2016.

Since then Facebook has slowly shifted its view. Zuckerberg "is fast coming to terms with the power of his platform to cause harm," NPR's Aarti Shahani reported. In September, Zuckerberg wrote: "Calling that crazy was dismissive and I regret it. This is too important an issue to be dismissive." Facebook has been reluctant to wade into the business of sorting fact from fake news, though last year it introduced a system relying on third-party fact checkers to flag particularly egregious examples.

The platform also was the target of a concentrated influence campaign from Russian entities. According to Facebook, "Russian actors created 80,000 posts that reached around 126 million people in the US over a two-year period." Facebook says that a few years ago, it was easier to say that social media was clearly positive for democracy. It cited the Arab Spring – where many protests were organized via Facebook – as an example. Now it's less clear, says Chakrabarti. "If there's one fundamental truth about social media's impact on democracy it's that it amplifies human intent – both good and bad. I wish I could guarantee that the positives are destined to outweigh the negatives, but I can't."

Tech firms including Facebook have faced increasing scrutiny on Capitol Hill. Executives from Facebook, YouTube and Twitter appeared before a Senate committee last week to discuss the steps social media platforms are taking to combat the spread of extremist propaganda over the Internet.

With two questions, Facebook is deciding the future of news
Facebook is going to ask who you trust when it comes to news. That’s dangerous.
Ian Sherr, CNet, Jan 23 2018 5:00 AM

One day soon, Facebook may ask you two seemingly straightforward questions that could decide the future of news on your feed:

1. "Do you recognize the following websites?" (Yes/No)
2. "How much do you trust each of these domains?" (Entirely/A lot/Somewhat/Barely/Not at all)

These are, in fact, some of the actual questions written by teams at Facebook. The questions stem from a decision by Mark Zuckerberg, Facebook's CEO, who said last week that he's going to seek the wisdom of the crowd -- that is, the 2 billion monthly users of his service -- to determine which media organizations are writing honest and trustworthy stories worthy of appearing in your feed.

The world's largest social network, with a population greater than that of any country on Earth, by default won't consider facts, honesty or professionalism when judging news organizations. Instead, Zuckerberg and his team are going to survey random people, maybe some of your friends, maybe not, who'll decide what publications are most trustworthy. Whatever Facebook learns from us -- and a Facebook spokesman told me it won't make any of those details public -- will filter down into how often you see my stories in your feed. Yes, your ranting Uncle Ed may help determine whether you see the next big scoop from The New York Times or Wall Street Journal or CNN or Fox News.

"People who use Facebook have made clear that they want to see accurate, informative and relevant news on Facebook, as well as news from sources they trust," a Facebook spokesman told me. "The question was how to measure that. We could try to make that decision ourselves, but that's not something we were comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask the community and have their feedback determine the ranking. We decided that having the community determine which sources are broadly trusted would be most objective."

Welcome to Facebook's vision of journalism in the 21st century. No wonder many people are calling out Zuckerberg and saying, with a strong twang of irony, "What could go wrong?"

"Flick" supposedly comes from "frame-tick" -- so shouldn't the word be "frick"? The engineer misspelled the fricking word!

Facebook invents new unit of time called a flick
BBC, Jan 23 2018

Facebook engineer Christopher Horvath has invented a new unit of time called a flick. The flick has been designed to help developers keep video effects in sync, according to a description on the code-sharing site GitHub. A flick, derived from "frame-tick," is 1/705,600,000 of a second -- slightly longer than a nanosecond (about 1.42 ns).

A researcher at Oxford University said the flick wouldn't have much general impact but may help create better virtual reality experiences. Flicks are defined in the programming language C++, which is used to generate visual effects for film, television and other media. Flicks give programmers a way to measure the time between media frames without using fractions. Matt Hammond, lead research engineer at BBC Research and Development, said this can reduce errors such as stutters in graphics. "When the numbers used are not integers, errors can gradually creep into computer calculations. These errors can build up over time, eventually causing inaccuracies that become noticeable," he said.

Nick Bilton, a Vanity Fair correspondent and former New York Times columnist, says Facebook is in big trouble and Mark Zuckerberg knows it. Bilton noted, "During the past six months alone, countless executives who once worked for the company are publicly articulating the perils of social media on both their families and democracy. Chamath Palihapitiya, an early executive, said social networks 'are destroying how society works.' Sean Parker, its founding president, said 'God only knows what it’s doing to our children’s brains.' Just this weekend, Apple CEO Tim Cook said he won’t let his nephew on social media." Here is Bilton's essay:

'This is serious': Facebook begins its downward spiral
Facebook was always famous for the sign that hung in its offices, written in big red type on a white background, that said 'Move Fast and Break Things.' Every time I think about the company, I realize it has done just that -- to itself.

"Do you think Facebook's two-question survey is ridiculous?" Yes. "How much do you think the survey responses will enable Facebook to identify untrustworthy news sources?" Not at all. Thank you for asking.

In a series of tweets this week, the head of Facebook's News Feed defended the company's two-question survey that aims to cut down on the spread of fake news. Industry observers, journalists and other critics argued in recent days that the survey could be manipulated and that it would fail to accurately gauge the quality of news outlets. According to Adam Mosseri, a new sample of Facebook users will be surveyed each day, with only their responses incorporated into the company's evaluations of trustworthiness. It's unclear how many users will be surveyed and what other information will be pulled into the assessment.

The Facebook survey, which is part of the crowdsourcing initiative that Facebook chief executive Mark Zuckerberg unveiled last week, consists of two short questions with a choice of responses:

Do you recognize the following websites?
Yes
No

How much do you trust each of these domains?
Entirely
A lot
Somewhat
Barely
Not at all

Facebook has not disclosed how or where those questions would appear to users, and it declined to comment beyond Mosseri's tweets. Some people were startled by the simplicity of the survey. They also noted the potential for it to reward partisan news outlets with loyal audiences or punish niche outlets and start-ups. But Mosseri says the survey was designed to identify news outlets that are broadly recognized and trusted by their audiences.

I do not have a Facebook page and I have never wanted to have a Facebook page. "Here is a picture I took of myself. Here is another picture I took of myself. I had a cheese omelet for breakfast. I'm going to Cancun next month and you're not. LOL. LOL. LOL. OMG. ROFL." Sheesh!

Facebook CEO Mark Zuckerberg says recent changes to the site have reduced the amount of time users spend there — a development he says he expected and one he welcomes as good for both his business and the health of society at large. In a Facebook post on Wednesday, Zuckerberg says the social media company is working to encourage "meaningful connections between people rather than passive consumption of content." Early parts of that shift, including changes to video recommendations, went into effect last fall. As a result, he said, Facebook saw a roughly 5% decline in total time spent on the site in the last quarter of 2017. That works out to roughly 50 million hours per day that people are no longer spending on Facebook.

TechCrunch writer Josh Constine claims "big news outlets stupidly sold their soul to Facebook" and Facebook readers were "brainwashed" into getting all their news and information from the site. He adds, "Now Facebook is pushing into local news but publishers should be wary of making the same crooked deal. It might provide more exposure and traffic for smaller outlets today, but it could teach users they only need to visit Facebook for local news in the future." Constine's essay is quite cynical and alarmist but he makes some good points about the "Facebook news business" and why we shouldn't rely on it.

Actor Jim Carrey tweeted today that he plans to dump his Facebook stock and delete his Facebook page because, he said, the social network "profited" from Russian interference in US elections: "I’m dumping my @facebook stock and deleting my page because @facebook profited from Russian interference in our elections and they’re still not doing enough to stop it. I encourage all other investors who care about our future to do the same. #unfriendfacebook"

Facebook testified to Congress in October that Russian-backed content reached as many as 126 million Americans through its network during and after the 2016 presidential election. Facebook did not immediately respond to a request for comment on the matter.

In less than two years, Hun Sen's Facebook page has gotten 9,000,000 likes and is the third-most-active Facebook page in the world. Rainsy is accused of defamation for saying Hun Sen bought most of those likes. Now he's determined to find proof.