Cracking the Code

It's the friendly Facebook question which lets you share what you're thinking and what you've been up to. It's also the question that unlocks the details of your life and helps turn your thoughts into Facebook's profits.

"They are the most successful company arguably in human history at just gathering people's time and turning that time into money." - Reporter

On Monday night, Four Corners explores the world of Facebook and how your data is being mined to drive the huge success of the social media giant.

"Facebook's very well aware of our sentiment, our mood…it can put all that data together and start to understand who our ex's are, who our friends are, who our old friends are, who our new friends are, and that's how it really works." - Marketing Executive

Reporter Peter Greste examines the Facebook business model and shows why your private life is making them billions.

The program investigates how Facebook has the ability to track much of your browsing history, even when you're not logged on, and even if you aren't a member of the social network at all.

"Even if you close your account, even if you log out of all of your services, the way that they're set up, with their sharing buttons, they're still going to be able to build a profile for you. It's very difficult to opt out of Facebook's reach." - IT Security Consultant

The program also shows how the methods used to deliver targeted advertising drive what 'news' appears in your Facebook feed, and why you are unlikely to see anything that challenges your world view. This feedback loop is fuelling the rise and power of "fake news".

"We're seeing news that's tailored ever more tightly towards those kinds of things that people will click on, and will share, rather than things that perhaps are necessarily good for them." - Media Analyst

With more than 16 million Australian Facebook accounts, joining more than a billion other users, Four Corners investigates how much we are giving up to be part of the social network.

"If somebody was going to build a dossier on me based on what Facebook knows about me, what would it look like? I should be able to know that so that I can make informed decisions at how I'm going to use the platform." - Internet Privacy Advocate

Cracking the Code, reported by Peter Greste and presented by Sarah Ferguson, goes to air on Monday 10th April at 8.30pm EDT. It is replayed on Tuesday 11th April at 10.00am and Wednesday 12th at 11pm. It can also be seen on ABC News 24 on Saturday at 8.00pm AEST, ABC iview and at abc.net.au/4corners.

Transcript

Monday 10 April 2017 - Cracking the Code

SARAH FERGUSON: Welcome to Four Corners.

Since the social media juggernaut that is Facebook was launched 13 years ago, for many of us it's become an integral, almost inescapable part of our lives.

There are more than 16 million Facebook accounts in Australia.

Worldwide the company claims more than 1.8 billion users.

But how much do we know about how Facebook works - and how it turns our information into vast profits? Last year, almost 10 billion US dollars.

The closely guarded secret of its success is the complex algorithms Facebook uses to track our preferences, even our moods, to keep us online longer and consume the ads that have turned Facebook into the biggest advertising platform in history.

Tonight reporter Peter Greste sets out to crack the Facebook code…

PETER GRESTE, REPORTER: In 2014, Facebook scientists published the results of one of the biggest psychological experiments ever conducted.

They took almost 700,000 Facebook profiles and deliberately skewed their news feeds to be either more positive or more negative.

Then, they used the company's sophisticated algorithms to measure any shift in people's mood. Their findings? The more positive the feed, the happier Facebook users seemed to be.

The more negative, the more depressed they became. It proved the power of Facebook to affect what we think and how we feel.

DR SUELETTE DREYFUS, INFORMATION SYSTEMS EXPERT, UNIVERSITY OF MELBOURNE: Facebook has very cleverly figured out how to wrap itself around our lives.

It's the family album.

It's your messaging to your friends.

It's your daily diary.

It's your contact list.

It's all of these things wrapped around your life.

PETER GRESTE: This is the story of how one of the world's biggest and most powerful private corporations is turning our lives and our data into vast profits, and in ways we have no control over.

JOHN HERRMAN, JOURNALIST, NEW YORK TIMES: They are the most successful company arguably in human history at just gathering people's time and turning that time into money.

PETER GRESTE: Like his company, Facebook's founder hardly needs introducing. Mark Zuckerberg started the social media platform 13 years ago when he was just 19 as a site for Harvard undergraduate students.

MARK ZUCKERBERG: When we first launched, we were hoping for maybe 400 to 500 people.

Harvard didn't have a Facebook so that was the gap we were trying to fill, and now we're at 100,000 people so who knows where we are going next.

PETER GRESTE: Within its first month, more than half the students had joined, setting the trend for the membership explosion that followed.

Now, almost a quarter of the world's population has signed on. It is bigger than any country. Facebook is now a global colossus.

It is one of the world's most valuable corporations, worth over 400 billion dollars.

He's making decisions about your life on Facebook, what the rules are, and he's a benevolent dictator.

You can't say this is accountable governance or a participatory governance in any particular way.

PETER GRESTE: But almost two billion users still isn't enough for Facebook.

Mark Zuckerberg is aiming for the next billion.

There is a limit to how much they can grow in established markets like North America and Australia.

But the 32-year-old businessman sees huge potential in the developing world.

MARK ZUCKERBERG: There are a billion people in India who do not have access to the internet yet, and if what you care about is connecting everyone in the world then you can't do that when there are so many people who don't even have access to basic connectivity.

PROF. RAMESH SRINIVASAN, TECHNOLOGY, POLITICS AND SOCIETY, UCLA: There's a term that's being used ah by folks ah connected to Facebook and Google called the last billion where they're basically trying to figure out a way to spread internet access, but the internet that they're going to spread is an internet that's shaped by Facebook and Facebook's agendas.

That's actually part of the long game here.

REBECCA MACKINNON, INTERNET PRIVACY ADVOCATE: Most people in a lot of the developing world are accessing the internet through their mobile phones, and there are these programs known as zero rating, or Facebook Zero: when you get smart phones you get free data if you're using Facebook, and so people stay on Facebook.

They don't go anywhere else, and so their whole world on the internet becomes very much the same; they don't know any other kind of internet.

PETER GRESTE: Facebook is a free service, but that's because Zuckerberg has learned how to turn our data into dollars. Lots of dollars.

Last year his company earned 27 and a half billion US dollars - just under 16 dollars for each user - and he's buying even more internet real estate.

INTERVIEWER: Clearly there's one topic we have to start with. You bought WhatsApp for 19 billion dollars. Why did you do it and what does it mean?

MARK ZUCKERBERG: You know, you can look at the messaging apps that are out there, whether it's Kakao or Line or WeChat, and they're already monetizing at a rate of two to three dollars per person with pretty early efforts, and I think that shows if we do a pretty good job at helping WhatsApp to grow, that is just going to be a huge business.

PETER GRESTE: Facebook has WhatsApp. Facebook has Instagram.

Facebook has Oculus Rift - not necessarily mainstream, but these are very big corners of the internet.

Should we be concerned about a monopoly?

REBECCA MACKINNON: We should always be concerned about monopoly.

We should always be concerned about concentration of power.

We should always be concerned about that and we need to hold their feet to the fire at all times.

MARK ZUCKERBERG: Facebook is all about community, right.

What people all around the world are coming to do - connecting with friends and family, informing those communities.

PETER GRESTE: Facebook presents itself as a digital platform - a neutral stage upon which life plays out.

It says it is a company that develops digital technology - not social engineering.

For all the talk about community, Facebook is neither democratic nor transparent.

PROF. RAMESH SRINIVASAN: Any place we go to that is not truly open, that's not governed by us as users, that's not governed by sort of some democratic accountability, is actually a place that is not truly ours.

It's a place that we can use, it provides great value in many ways, don't get me wrong, to its users. But it is incorrect to see it as a neutral place.

JOHN HERRMAN: It can do things like a government, and indeed it has sort of inherited some government-like functions, but I don't think it passes the smell test to imagine that Facebook or any online platform is really democratic. They're not.

DR SUELETTE DREYFUS: If we tell the computer to look at two numbers and compare them and put the larger number on one side and the smaller one on the other then, with a series of steps we will be able to reorder it.

PETER GRESTE: To understand how Facebook works, we need to understand what goes on under the hood.

The engine that drives the system is built on algorithms - sets of instructions that Facebook's engineers use to determine what we see in our News Feed.

Dr Suelette Dreyfus, an information systems expert, demonstrates how a basic algorithm works.

DR SUELETTE DREYFUS: Typically, an algorithm might be for processing some data or doing some arithmetic, summing something for example, or it might be to try and recreate the decision-making process that we use in our human brain on a more sophisticated level.
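The compare-and-reorder procedure Dr Dreyfus describes is essentially a bubble sort. A minimal sketch in Python - purely an illustration of the idea, not anything from Facebook's systems:

```python
def bubble_sort(numbers):
    """Repeatedly compare adjacent pairs of numbers, moving the
    larger one to one side, until the whole list is reordered."""
    items = list(numbers)  # work on a copy
    for end in range(len(items) - 1, 0, -1):
        for i in range(end):
            if items[i] > items[i + 1]:
                # put the larger number on one side,
                # the smaller on the other
                items[i], items[i + 1] = items[i + 1], items[i]
    return items

print(bubble_sort([5, 1, 4, 2]))  # → [1, 2, 4, 5]
```

Each pass guarantees the largest remaining number "bubbles" to the end - a simple, mechanical series of steps of exactly the kind an algorithm is.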

PETER GRESTE: Facebook's algorithms were originally configured to help Harvard University students stay in touch with one another.

They exploited the way the students had a small group of close friends, and a wider, looser social circle.

The algorithms are now vastly more complex, but exactly how they work is a closely guarded commercial secret.

We do know that they are designed with one aim in mind - to keep us online for as long as possible.

DR SUELETTE DREYFUS: The algorithms are designed to be helpful and give us information that's relevant to us, but don't for a minute assume that the algorithms are just there to help us.

The algorithms are there to make a profit for Facebook.

PETER GRESTE: And that is Facebook's genius. It is a giant agency that uses its platform to deliver us advertising.

By tracking what we do, who we associate with and what websites we look at, Facebook is able to make sophisticated judgements not only about the stories we see, but also about the advertising that is likely to move us to spend.

We will probably always live in a world with old-fashioned display ads.

Times Square simply wouldn't be the same without them.

But these ads nudge us towards products with all the subtlety of a punch in the nose.

Facebook on the other hand uses the extraordinary amounts of data that it gathers on each and every one of us to help advertisers reach us with precision that we've never known before.

And it gives anybody in the business of persuasion power that is unprecedented.

Depending on what we post at any given moment, Facebook can figure out what we are doing and thinking, and exploit that.

ADAM HELFGOTT, MADHIVE DIGITAL MARKETING: Facebook's very well aware of you know our sentiment, our mood and how we talk to people and it can put all that data together and start to understand like who our exes are and who our friends are and who our old friends are who our new friends are and that's how it really works to incentivise another post.

PETER GRESTE: What you're saying is Facebook has the capacity to understand our moods?

ADAM HELFGOTT: Yes.

PETER GRESTE: Could that be used to influence our buying behaviours?

ADAM HELFGOTT: Of course it can be used to influence our behaviour in general, not just buying.

MEGAN BROWNLOW, MEDIA STRATEGIST: You can be incredibly hyper targeted.

Can I give you an example? We don't always act our age or according to our gender stereotypes.

A middle-aged woman might like rap music.

She is sick of getting ads for gardening gloves and weight loss.

So she posts on her Facebook that she likes Seth Sentry's Waitress Song.

Now she gets ads for a streaming music service - something she might actually buy.

PETER GRESTE: Adam Helfgott runs a digital marketing company in New York.

He uses a tool called Facebook Pixel.

Facebook gives it to advertisers to embed in their sites. They can track anybody who visits their site and target them with ads on Facebook.

ADAM HELFGOTT: Well, if you've ever logged into Facebook with any of your browsers, there's a good chance it'll know it's you.

You don't have to be logged in, you just have to have been there at some point in time, and if it's a brand new computer and you've never logged into Facebook, Facebook at that moment in time won't know it's you, but based upon their algorithms and your usage they'll figure it out.

PETER GRESTE: So, what you can then do is put this piece of script onto your website.

And then use Facebook data to find the people that looked at your website and then target ads to them.

ADAM HELFGOTT: That's correct.

PETER GRESTE: Through Facebook.

ADAM HELFGOTT: Yep.

PETER GRESTE: That feels a little bit creepy. I mean, are there privacy issues involved with that?

ADAM HELFGOTT: I mean, from a legal point of view there's no privacy issue. That's just, you know, the internet today, and the state of it, and using a product that generates a lot of revenue for Facebook.

PETER GRESTE: For advertisers it is a boon - giving them access to the most intimate details of our lives.

Megan Brownlow is a media strategist for PricewaterhouseCoopers in Sydney.

MEGAN BROWNLOW: When you change your status, for example, we might see something, a young woman changes her status to engaged.

Suddenly she gets ads for bridal services.

These sorts of things are clues about what her interests might really be.

The research on consumers shows they don't like advertising if it's not relevant to them.

If it actually is something that they want, they don't mind it so much.

This is actually not a bad thing.

PETER GRESTE: Nik Cubrilovic is a former hacker turned security consultant. He's been using his skills to investigate the way our data is tracked.

One day Cubrilovic made a discovery that startled the tech world.

He found that even if you're not logged on to Facebook - even if you're not a member - the company tracks and stores a huge amount of your browsing history.

And you can't opt out.

PETER GRESTE: If you don't like Facebook, if you don't like the kinds of things you're describing, just close your account?

NIK CUBRILOVIC, IT SECURITY EXPERT: It's very difficult to opt out of Facebook's reach on the web. Even if you close your account, even if you log out of all of your services the way that they're set up, with their sharing buttons, and so forth, they're still going to be able to build a profile for you.

And it's just not going to have the same overall information associated with it.

REBECCA MACKINNON: They don't even tell us clearly what they're doing. They tell us some things but it, it's not specific enough to really answer the question, if somebody was going to build a dossier on me based on what Facebook knows about me, what would it look like?

I should be able to know that so that I can make informed decisions at how I'm going to use the platform.

PETER GRESTE: Facebook is not just influencing what we buy.

It's changing the world we live in.

REBECCA MACKINNON: Sure, they want to kind of bring their service to everybody on the planet; from a commercial standpoint that's obviously a goal.

Whether it makes the world a better place is another question.

JOHN HERRMAN: Not only have you built this big business and this big social network, you now are, you know, possibly determining the course of world events.

PETER GRESTE: That's exactly what happened in the streets of Cairo.

In January 2011, millions gathered in the city demanding the resignation of the autocrat Hosni Mubarak.

It became known as the Facebook revolution.

The organizers used Facebook to rally vast crowds of protesters.

They were so effective that the government tried to shut down the internet.

It took just 18 days to topple Mubarak.

PROF. RAMESH SRINIVASAN: So what Facebook came to stand for, for several months I would say, or at least in the early days after the events of Tahrir Square and the Arab Spring, was a symbol of people's ability to organize and express and share information more widely.

It symbolised that so much so that I like to tell stories about how I could buy T-shirts in Tahrir Square which said Facebook, Tool of Revolution.

PETER GRESTE: I understand as well as anybody just how effective Facebook can be.

Three years ago, I was imprisoned in Egypt on trumped up terrorism charges.

My own family used Facebook as a way of organizing supporters and keeping them informed.

It became one of the most vital tools in the campaign that ultimately got me out of prison.

The Facebook page became a place where anybody could find the latest on our case.

The underlying algorithms helped push it to people who might have been interested in supporting our cause, even before they knew it existed.

Mohamed Soltan was also arrested for protesting. He was imprisoned in the same gaol as me.

MOHAMED SOLTAN, FORMER POLITICAL PRISONER: There was this medium where people just wanted to express themselves, because there was no other avenue in the public space to express themselves. Then they found this outlet, and an outlet to the outside world as well, where they would put how they feel about social justice issues, about just day-to-day inequalities that they've seen. And then there was the second phase of that, where they saw that there's quite a few of them that feel the same way about a lot of different things.

PETER GRESTE: It took a prolonged court case, a 500-day hunger strike and intense international pressure to get him released.

For him too, Facebook became an invaluable tool.

MOHAMED SOLTAN: Facebook, unlike other platforms and social media outlets, allowed for us to put out the reports, the medical reports. It allowed my family to share stories, and it established credibility.

JOHN HERRMAN: So Facebook provides this place that is almost ideal for finding like-minded people, whether that means finding people who live in a certain place who are interested in a certain thing or people who are in the thrall of a dangerous ideology.

PETER GRESTE: In Australia, Facebook has also become a powerful political tool for mainstream causes, and groups on the fringe.

BLAIR COTTRELL: They want to represent the great global international agenda that corrupts our people from the inside, builds mosques in our communities and floods our beautiful country with third world immigrants.

But is that what we want?

PETER GRESTE: Blair Cottrell leads a group called the United Patriots Front - a right-wing movement built on Facebook, that campaigns against Muslims and immigrants across Australia.

BLAIR COTTRELL: Facebook's been extremely effective for us.

That's indispensable to the development of our organisation.

Without it, we would probably be a separatist cult, where no one would be able to relate to us, because no one would actually hear us directly, they would only hear about us through established media corporations.

Islam can only pose a threat to our nation if our weak leadership, or rather lack of leadership, is allowed to continue.

PETER GRESTE: Facebook has helped turn a disparate group of individuals into a political force that some say is dangerous.

BLAIR COTTRELL: It gives us the ability to cut out the middleman, to go directly to the people, to the audience with our message, to speak directly to the Australian people which is a power that hitherto has only been held by established media corporations and anybody who speaks through such media corporations.

But now anybody has that power.

Anybody has access to that power.

PETER GRESTE: Some of the UPF's more inflammatory statements have been censored, and Cottrell has been prosecuted for staging a mock beheading that the group filmed and posted on Facebook.

Facebook removed some of their posts, including the original beheading video, and threatened to suspend the page.

BLAIR COTTRELL: Sometimes Facebook has removed or censored certain posts of ours because we've used the word Muslim, for example.

Not in a negative way at all. If we'd explained an incident or a point of view and we've used the word Muslim, sometimes that registers in Facebook's computer and they automatically delete it, for some reason.

I don't know if it's a person who deletes it or a computer but that can be a bit problematic.

We actually started altering the way we spelt Muslim in order to have our posts remain up when we were speaking about the Muslim people of the Islamic faith.

PETER GRESTE: Facebook has been criticized for the way it censors controversial posts.

Whenever someone flags a post as offensive, it gets sent to a human moderator who decides if it should be taken down.

The company says it reviews a hundred million pieces of content every month.

REBECCA MACKINNON: People are under a lot of pressure to review a great deal of content very quickly, and I certainly hear about what appear to be mistakes quite frequently, and some of them are kind of ridiculous. Like, at the end of 2015 a bunch of women named Isis had their accounts deactivated, because clearly somebody went and flagged them as being terrorists.

Facebook says it was to protect their privacy. They believed the women had been labelled terrorists, violating what Zuckerberg calls its community standards.

PROF. RAMESH SRINIVASAN: You can't have a common standard for 1.8 billion people.

Our diversity is actually our strength right.

Part of what makes us a global community is the reality that what forms a global community are our incredibly fundamental differences.

PETER GRESTE: In one infamous example, Facebook removed a post showing one of the most powerful images of the Vietnam war.

The photograph of a naked girl violated its community standards.

REBECCA MACKINNON: The community standards are developed by his staff.

The community didn't develop those standards.

They're called community standards, but they were developed by Facebook, and yes, they've had input here and there over time. They also get input from governments - recently a number of governments told them, you need to amend your community standards to be harder on extremist content.

And so they amended their community standards.

It's not like the community got together and developed these standards.

PETER GRESTE: Facebook is also transforming politics as we know it.

Politicians have been using social media for years of course, but in this last election political campaigners used big data to radically transform American politics.

In the words of some observers, they weaponized the data.

We're on our way to Washington DC to find out what impact Facebook has had on the fight for political power.

At the heart of political power is information.

That's why government security agencies go to extraordinary lengths to vacuum up data.

But increasingly it is also becoming the key to winning power.

ADAM SCHRADER, FORMER FACEBOOK JOURNALIST: I think that there's, you know, a legitimate argument that Facebook influenced the United States election results.

I think that Facebook and algorithms are partially responsible, if not, you know, the main reason why there's this shift toward hyper-partisan belief systems these days.

PETER GRESTE: When Donald Trump became the presidential frontrunner in last year's US election, few pundits predicted that he'd actually win.

One of the Trump campaign's secret weapons was an ability to research social media data in extraordinary detail.

It helped him understand and target his voters with a precision we've never seen before.

PATRICK RUFFINI, REPUBLICAN PARTY STRATEGIST: By using Facebook's ad targeting engine, for example, they know if some of those Independent voters have actually liked Republican pages, or liked the Bernie Sanders page, or liked a Donald Trump page, so you can spend money to target advertising specifically to those voters, and it is ultimately a much more reliable form of targeting than many of the other online vehicles out there.

PETER GRESTE: Political strategist Patrick Ruffini runs a company that mines big data for the Republican Party.

He produces social media maps that help them make sure their political messages hit their targets.

PATRICK RUFFINI: What it does give us is a much greater level of certainty and granularity and precision, down to the individual voter, down to the individual precinct, about how things are going to go.

It used to be we could survey eight hundred, a thousand registered voters nationwide, but you couldn't really make projections from that about how an individual state would go, or how an individual voter would ultimately go.

DONALD TRUMP: I Donald John Trump solemnly swear that I will faithfully execute the office of president of the United States.

PETER GRESTE: It is one thing to know your voters of course, and quite another to get them to change their minds. Facebook can help with that too.

DR SUELETTE DREYFUS: The ability to take the pools of big data that we've got and do really deep analysis of it to understand small groups of customers' preferences, can be applied in a political setting in a way that is potentially worrying because it allows politicians to potentially lie better.

NIK CUBRILOVIC: For instance, you can't make a political statement on television, without it being disclosed who it was being paid for.

Those same controls on the web are very lax. For instance, I could see a story saying a certain XYZ politician has done a great thing, produced on a completely different third-party news site.

I cannot know that that ad was placed by a political operation specifically targeting me, because that information is not being disclosed anywhere.

PROF. RAMESH SRINIVASAN: I am understanding of, yet troubled by, the data-driven advertising and targeted ads that occur, but I'm even more uncomfortable with the reality that our elections, and how our elections are structured and configured, can be hijacked by these forces that are not transparent to us.

PETER GRESTE: One of the most important parts of any democracy is news - almost half of all Americans get theirs from Facebook.

But the last US election also saw the explosion in fake news, turbocharged by sharing on Facebook.

JOHN HERRMAN: These things look like news, they function like news, they're shared like news, they don't match up with traditional ideas of what news is for and what it should do.

Facebook is in the middle of this, they are the company that can see all of this and make judgements about it, I think they would prefer not to have to do that.

PETER GRESTE: Adam Schrader is a journalist who used to edit stories for Facebook's Trending News section.

Part of his job was to filter out fake news.

ADAM SCHRADER: Essentially, we operated like a newsroom. It was structured like a newsroom: copy editors would make sure that the topics met standards, make sure that they were unbiased, and check facts.

There were often times that fake articles would appear and present themselves as possibly being a legitimate trending topic.

And our job was, you know, identifying those and, the original term was blacklisting.

PETER GRESTE: In the heat of the campaign, right-wing commentators accused the team of bias.

Facebook sacked the team and handed the job to an algorithm.

ADAM SCHRADER: An algorithm cannot do the job of a trained journalist.

They don't have the ability to reason. Artificial intelligence hasn't gotten to the point where it can really function like a human brain and determine what has news value, what is good for the public and what is not.

PETER GRESTE: Schrader says after the team was sacked, fake news really took off.

ADAM SCHRADER: Yeah after the trending news team was let go, there was a big problem with sensational or factually incorrect or misleading news sources and trending topics.

It was just a disaster.

MEGAN BROWNLOW: The more partisan news sources you consume, the less likely you are to believe fact checkers or experts.

And so, this can create some really dangerous, divisive believers in alternative facts.

PETER GRESTE: During last year's election campaign, a news site published a story alleging that Hillary Clinton and her campaign chairman John Podesta were running a child sex ring out of the Comet Ping Pong Pizza restaurant in Washington DC.

The story was widely shared on Facebook. It was utterly fake, but it gained so much traction that a 28-year-old man finally went to the restaurant armed with an assault rifle, a revolver and a shotgun to find and rescue the children.

He fired several shots into the restaurant before police arrested him. The story still circulates online and on Facebook.

PETER GRESTE: One study found that in the closing months of the US election, Facebook users shared the top 20 fake news stories over a million times more than the top 20 stories from major news outlets.

Fake stories are often written either for political advantage, or to make money.

DR SUELETTE DREYFUS: There are a lot of people out there who aren't journalists and aren't publishers who are publishing.

They don't have the same sense of obligation, so we are really in uncharted territory.

I think one of the most important things is that we actually need a big public debate about this, because it's changed the nature of news, and in doing so it's changed our reality as a society.

PETER GRESTE: Mark Zuckerberg initially dismissed the notion that fake news somehow skewed the election, but he is rolling out a system that allows the Facebook community to flag suspect stories.

ADAM SCHRADER: Mark Zuckerberg has said that, you know, he's not in the news publication business, right - like, they're not a media company.

But I think that's a mistake, a kind of denial, right.

So, they're definitely a media company and I think that they should try and treat themselves more as one in the future.

PETER GRESTE: As more and more consumers get their news from digital sources, and Facebook in particular, the old-fashioned world of newspapers and TV stations is collapsing.

Like most news businesses, the New York Times is struggling. Facebook sends plenty of readers to their stories, but most advertising dollars go to Facebook.

Newsrooms are shrinking along with the resources for serious journalism.

JOHN HERRMAN: You'll hear this from small publications and large ones, that their absolute audience is larger than it's ever been, and that surely has to mean something, but it certainly hasn't meant profits.

I don't think there's a major news company in the world that hasn't changed its operations around Facebook in a real way, and I mean that both in the way that it produces and approaches stories and the way that it makes money.

PETER GRESTE: If Facebook is playing an increasingly important role in how we understand the world, its social mood study showed it affects how we feel about it.

When its researchers explained how they manipulated the news feeds of some 700,000 users, they were criticized for playing with people's psychology without telling them.

And yet its algorithms do that every day. They give us stories they know we want to see, to keep us online, and help advertisers send us more ads.

DR ETHAN KROSS, PSYCHOLOGY DEPT., UNIVERSITY OF MICHIGAN: I don't think we can treat Facebook as benign.

I think it has enormous implications for how we experience our lives, and how we live our lives, and I think that's the, simultaneously, that's one of the things that makes that network and others like it so phenomenally interesting and important.

PETER GRESTE: But Mark Zuckerberg's plans would have Facebook do far more.

He wants its users to rely on the network for the way we organize society, discover the world and conduct politics.

He wants the community to inform Facebook, while Facebook watches over us.

MARK ZUCKERBERG: The philosophy of everything we do at Facebook is that our community can teach us what we need to do and our job is to learn as quickly as we can and keep on getting better and better and that is especially true when it comes to keeping people safe.

PETER GRESTE: In a post on his own profile in February, he outlined his plan for the company - a kind of manifesto for the central role he wants the platform to play in societies around the world.

MARK ZUCKERBERG: We're also gonna focus on building the infrastructure for the community, for supporting us, for keeping us safe, for informing us, for civic engagement and for inclusion of everyone.

JOHN HERRMAN: So it's a document that really felt like an attempt to take some responsibility but it wasn't apologetic.

It was fairly bold and it seems to suggest that the solution to Facebook's problems is more Facebook.

PETER GRESTE: Zuckerberg has great ambition. "Progress needs humanity to come together as a global community," he writes.

Facebook can help build it. It wants to 'help fight terrorism', while its news service can show us 'more diverse content'.

JOHN HERRMAN: He is behaving recently in ways more befitting of a politician than a CEO. There's a lot of speculation that he may run for office, and to my mind, if Facebook continues to be as successful as it is, not just through Facebook but through its other products, through Instagram and WhatsApp, if it continues to be so central in people's lives, he doesn't need to run for office, he will be presiding over a platform and a venue where people conduct a real portion of their lives.

PETER GRESTE: It seems there is almost no limit to Facebook's intrusion into our lives, for better or for worse.

MARK ZUCKERBERG: I want to thank all of you for joining us to hear more about some of the things we're working on at Facebook to help keep our communities safe.

PETER GRESTE: Zuckerberg convened what he called the Social Good Forum to help people whose Facebook posts indicate that they might be at risk of harming themselves or committing suicide.

MARK ZUCKERBERG: We're starting to do more proactive work.

Like when we use artificial intelligence to identify things that could be bad or harmful and then flag them so our teams can review them.

Or when someone shares a post that makes it seem as though they might want to harm themselves.

We can give them and their friends suicide prevention tools that they can share to get the help that they need.

DR SUELETTE DREYFUS: The ability to get big data and to gather data about us in all aspects of our lives creates a particular type of power among the people or organizations that have that data.

They can say, oh these are your daily habits.

There's been some research done that nearly half of what we do is just repeating patterns of what we did the day before.

From that, you can also predict potentially how people will behave in the near future as well. And that's perhaps a little bit concerning for people who care a lot about their privacy.

They range from little things, such as ad targeting giving away what your girlfriend bought you for your birthday and doesn't want you to know about, through to a 14-year-old boy who hasn't come out as gay yet, and his parents discovering the fact because of the advertising that's being targeted to him, through to potential future harms.

One of the problems in the privacy realm is that we only have one identity, and we can't take back what we've already handed over.

PETER GRESTE: Facebook is far more intimately involved in our lives than any company we've ever seen before.

REBECCA MACKINNON: Facebook has a responsibility to inform people of what is happening to their data and so then there can be a conversation also with "the community" about whether people agree that this is an appropriate use and right now they're not providing enough information for that conversation to take place.

NIK CUBRILOVIC: There's no take-back on private data.

The implications that are going to occur in five or 10 years' time, we need to protect against that now.

And to an extent, it almost feels like, I'm reminded of Einstein's letter to Eisenhower warning about the potential dangers of nuclear holocaust or whatever, with the dangers in uranium - not to say that it's that severe, but we are at the point now, where we know that there's danger, but we don't know the extent of it and we don't know the potential implications of it.

PETER GRESTE: Our lives are now measured in data.

What we look at, who we talk to, what we do is all recorded in digital form. In handing it to Facebook, we are making Mark Zuckerberg's company one of the most powerful in history.

And the question is, at what cost to ourselves?

SARAH FERGUSON: For the sake of transparency, the ABC also uses Facebook pixel as a tool for audience research.