Learning Resources

The results of this report by the World Wide Web Foundation illustrate how algorithms have been delegated the function of curating content in a way that defines online users’ information diets and shapes what can best be called the online ‘public square’.

The way we experience the web today is largely through algorithms. Search algorithms determine the results we see. Targeting algorithms decide which ads we are shown. Algorithms on social media services select what content makes it to our news feeds — and what is hidden.

This role of curation gives tech companies a huge degree of power over our public discourse. Yet, the opaque nature of these algorithms means we have little comprehension of how they work and how they are affecting our information diets.

Seeking to better understand how these algorithms curate content, this research focuses on Facebook’s News Feed — one of the world’s most important algorithms, selecting content for Facebook’s nearly two billion users.

The researchers ran a controlled experiment based in Argentina: they set up six identical profiles following the same news sources on Facebook and observed which stories each profile received.

Findings

The results illustrated how the News Feed algorithm curates content in ways that define users’ information diets and shape the online ‘public square’:

Large gaps between the number of stories published and the number seen by the profiles: on average, the profiles were shown only one out of six posts from across all the pages they were following.

No exposure to certain types of stories: When algorithms curate content, critical social and political posts may not be shown to users. For example, during the period observed, none of the stories published about femicide were surfaced on the feeds of the monitored profiles, while stories about homicides were shown.

Different levels of exposure to different articles: Users with the exact same profile details, following the same news sites, were not exposed to the same set of stories.
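The two measurable findings above can be framed as simple set comparisons: an exposure rate (what fraction of published stories a profile actually saw) and an overlap score between feeds of identical profiles. The sketch below is purely illustrative, with made-up story IDs and profile names; it is not the study’s actual code or data.

```python
# Illustrative sketch (hypothetical data, not from the report):
# quantify exposure and feed divergence using set operations.

published = set(range(60))  # 60 hypothetical stories posted by the followed pages

# Stories each (identical) profile was actually shown — invented IDs.
feeds = {
    "profile_a": {0, 2, 5, 9, 14, 20, 31, 40, 47, 55},
    "profile_b": {1, 2, 7, 9, 18, 22, 31, 39, 48, 55},
}

# Exposure rate: fraction of published stories each profile saw.
for name, seen in feeds.items():
    rate = len(seen & published) / len(published)
    print(f"{name}: saw {rate:.0%} of published stories")  # ~17%, i.e. about 1 in 6

# Divergence between identical profiles: Jaccard overlap of the two feeds
# (1.0 would mean identical feeds; lower values mean different curation).
a, b = feeds["profile_a"], feeds["profile_b"]
overlap = len(a & b) / len(a | b)
print(f"feed overlap (Jaccard): {overlap:.2f}")
```

With these invented numbers, each profile sees roughly one in six published stories, and the two feeds overlap only partially, mirroring the shape (though not the actual figures) of the findings above.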

As we grow ever more dependent on digital platforms for information, the control they have over the public discourse is becoming a greater liability to our social and political systems. We must work to ensure people who use these platforms have more control over their information diets.

The World Wide Web Foundation envisions this report as a first step toward these ends, allowing the general public to better understand how platform algorithms work. In doing so, it hopes to trigger a conversation regarding the roles that online users, governments and platforms have in defining the values embedded in algorithms.

To explore this research and read the recommendations, download the full report: