On May 1st of this year The Australian published a since-retracted article claiming that Facebook could recognise, in real time, anxiety in teens and use this for advertising. Facebook disputed this, saying the aggregated data was "intended to help marketers understand how people express themselves... never used to target ads". Assume Facebook is telling the truth and the data was never used to target ads. What's stopping them?

Cartoon by Tam Charlwood

Let's backtrack to the story of a small UK-based research team. Michal Kosinski attended Cambridge University to do his PhD at the Psychometrics Centre. He began working with fellow student David Stillwell about a year after Stillwell had launched the MyPersonality app. The app allowed users to fill out different psychometric questionnaires, including a handful of questions from the classic Big Five personality questionnaire. Users received a "personality profile" and had the option to share their Facebook profile with researchers.

They expected a small sample drawn from friends and fellow students, but millions of people ended up using the app, handing over their likes and data. The app had gone viral.

The team used their data set to build profiles consisting entirely of likes, testing predictions against the psychometric results and the hard data users had provided. For example, men who "liked" the cosmetics brand MAC were slightly more likely to be gay; one of the best indicators of heterosexuality was "liking" Wu-Tang Clan. While each piece of such information is too weak to produce a reliable prediction on its own, with individual data points aggregated en masse, the model produces a chillingly precise doppelganger.

In 2012, Kosinski showed that on the basis of an average of 68 Facebook "likes" by a user, it was possible to predict their skin colour (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation to the Democratic or Republican party (85 percent). By 2014, he was able to evaluate a person better than the average work colleague could, merely on the basis of ten Facebook "likes"; 150 were enough to outdo what a person's parents knew about them, and 300 "likes" trumped their partner.
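To see why many weak signals compound into a confident prediction, here is a minimal sketch of the underlying idea: treat each "like" as a small nudge to the log-odds of a trait and sum the nudges, naive-Bayes style. The numbers are invented for illustration; this is not Kosinski's actual model.

```python
import math

def prob_from_likes(log_odds_contributions, prior_log_odds=0.0):
    """Aggregate weak per-like log-odds nudges into a single probability.

    Each contribution is assumed (hypothetically) to be the log of a
    likelihood ratio for one "like"; summing them and squashing through
    a sigmoid gives the combined prediction.
    """
    total = prior_log_odds + sum(log_odds_contributions)
    return 1.0 / (1.0 + math.exp(-total))

# A single weak signal barely moves the needle past a coin flip...
single_like = prob_from_likes([0.2])       # roughly 0.55

# ...but 68 such weak signals, taken together, approach certainty.
many_likes = prob_from_likes([0.2] * 68)   # well above 0.99
```

No individual like is informative, which is exactly the article's point: the precision comes purely from aggregation.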

The same analytics were used by Cambridge Analytica, first for Brexit, then by Ted Cruz, then by the eventual 45th President of the United States.

And this is just Facebook. Consider: Google knows when you're hungry by what you search for. Other companies know your typing style. They know that you're quick on the shift key and occasionally miss the space bar. Advertisers know this too, because the platforms they use package it up as audience data.

Facebook is sticky. It wants to keep you on the platform, and it wants you to scroll slowly. It can stop you dead in your tracks with a well-placed article (which, by the way, opens within the Facebook app).

Think about what you click on. Your Facebook isn't an echo chamber; you're smart. Hell, you're a JD! It's true, people don't want to read only what they agree with, they want to be challenged. This becomes salient when you look at your feed. You'll see articles that nudge you. Some will ask if you're progressive enough; others might show you a great injustice. You'll want to click. Facebook knows this because you know this. YouTube knows it too. Watch a video on vegetarianism and it will recommend a video on veganism. It's that nudge that keeps us interested, in the most classical sense of the word.

Anyone in an appropriate radius of Evvia will tell you it’s not their fault, it’s how customers choose to use their product. That's a facile argument. Autoplay of videos is not a user choice, it is the default. The default position is the stickiest. It is Sisyphus and his rock: Silicon Valley execs assure us, “he is happy”.

What does that mean for advertising? With real-time emotional data and accurate personality profiles, Facebook can populate your feed with exactly what you need, or rather, what it wants you to need. You can be served content so tailored that it may never be seen by anyone else.

Expanding this analysis to the political sphere, we face a challenge to the traditional orthodoxy. In the old days of television (the thing you stream to your laptop), political communication was open to the public and broadcast widely. That meant messaging in broad strokes. It also gave your opponent a right of reply. It was competitive. It kept everyone honest.

Facebook has the potential to bypass that standard procedure. Articles and ads can be targeted and pinpointed, eliminating the broad strokes of political messaging and leaving critics with only slivers to respond to.

As most readers will know, and some will learn come week 9 of consti, Australia has an implied freedom of political communication. McGinty v Western Australia shows that the free flow of communication and information is central to that freedom.

Curated, digital news delivery will hamper this. Applying the two-pronged test from Lange may no longer be enough: the majority in that case noted that its application must be to "what is necessary for the effective operation of the system of representative and responsible government".

One question cuts through the political noise: without equality of platform, how can we be sure citizens are well informed when they step into the voting booth? Using mass psychology and the data we hand them, large tech companies can go unfettered into creating a decidedly polemic political landscape. I don't know about you, but I'd give that an angry react.

A bold claim that Facebook and other websites may be violating an implied right (although they are under no constitutional obligation, are they? Really, the Australian Government is the only party constrained by the freedom, and does it have a "positive" obligation to step in?). In any case, one worth considering. Thank you for the article.