Zeynep Tufekci: How Is Our Attention Packaged And Sold As A Commodity?

Why is it so easy to burn through an hour on YouTube or Facebook? Sociologist Zeynep Tufekci explains how advertising algorithms have turned our attention into a valuable commodity.

TUFEKCI: Everybody has 24 hours in the day. You sleep some, you work some, and what time you have free is one of the most important things you have. Getting your attention and putting something in front of you can change your opinions. It changes what you prioritize. It affects politics. It affects your social interactions. I think that in an age where you have too much information, the crucial resource is that which information consumes, which is attention. Your attention is being battled over and being packaged and sold.

RAZ: Attention is a commodity.

TUFEKCI: Absolutely. And we have a digital economy that is essentially based on making sure that we are not in control of our attention.

RAZ: On the show today - competing for your attention, ideas on the value of our awareness and why in an age of infinite distractions whoever can capture our attention holds a lot of power. Zeynep Tufekci explains from the TED stage.

(SOUNDBITE OF TED TALK)

TUFEKCI: Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says up next, and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched and infers that that must be what you're interested in, what you want more of and just shows you more. It sounds like a benign and useful feature, except when it isn't.
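The inference Tufekci describes (recommend what people with overlapping watch histories went on to watch) can be sketched as a toy co-occurrence recommender. All user names, video IDs, and the scoring rule here are hypothetical illustrations; YouTube's actual system is far more complex and not public:

```python
from collections import Counter

# Toy watch histories: user -> set of video IDs (all hypothetical).
histories = {
    "alice": {"v1", "v2", "v3"},
    "bob":   {"v2", "v3", "v4"},
    "carol": {"v1", "v3", "v5"},
}

def recommend(user, histories, k=1):
    """Suggest videos the user hasn't seen, scored by how much the
    watchers of those videos overlap with this user ("people like you")."""
    mine = histories[user]
    scores = Counter()
    for other, theirs in histories.items():
        if other == user:
            continue
        overlap = len(mine & theirs)      # similarity to "people like you"
        for video in theirs - mine:       # videos you haven't watched yet
            scores[video] += overlap
    return [video for video, _ in scores.most_common(k)]

print(recommend("alice", histories, k=2))
```

The feedback loop in the talk comes from running this repeatedly: each recommendation you watch updates your history, which pulls the next round of suggestions further in the same direction.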

So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I studied social movements, so I was studying it, too. And then I wanted to write something about one of his rallies so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

Well, you might be thinking, this is politics, but it's not. This isn't about politics. It's just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube, and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.

(SOUNDBITE OF MUSIC)

TUFEKCI: So what's going on here isn't that YouTube engineers are out to wreck the world, right? But they have set loose an algorithm that's optimized to grab your attention for as long as possible, to keep you on the site (their word for it is engagement) while YouTube serves the ads. And the algorithm has sussed out that humans, especially young people, are particularly susceptible to the idea that they're discovering a secret, that they're being told something edgier - right? - something more extreme, because it's kind of like, ooh, this is novel. I'm interested in this, right? It's sort of seducing you. It's sort of trying to play to your appetites. So the algorithm automatically plays more and more.

So if you just watch some political stuff, you end up with Alex Jones, who has all these horrible conspiracy theories. You watch some, you know, science stuff, and three recommended autoplays later you're in "the moon landing never happened." You watch something about Trump, and a little bit later, the algorithm is playing "the Holocaust never happened" stuff. So by optimizing for grabbing your attention, we have in effect, through YouTube's recommender algorithm, created this engine of extremism that is deployed globally.

RAZ: Zeynep, this is really bad.

TUFEKCI: I agree.

RAZ: Like, this is - this should never have been allowed to happen.

TUFEKCI: Well, I absolutely agree it is a big problem, but I'm an optimist, right? Just like we can deal with the other things that come with the 21st century, and just like, you know, the industrial revolution brought us a lot of good things, this is something we can deal with. We just have to say, look, let's make explicit what the problem is. Let's realize that our attention is a crucial resource. And it's an equal resource, right? Every human being on the planet only has so many hours. It doesn't matter if you're rich or not, right? You still have so many hours. It's one of the most human of things.

And we have to treat our attention and our time as the crucial resource it is and change. And we're being told a story by Silicon Valley that this is inevitable, this is good or that this stuff has to come in combination. You know, if you're going to use digital stuff, you're going to have your attention manipulated and sold. It's just not true. They package it this way, and we don't have to.

We finally are paying attention to the question of attention. The next step is, we need tools and regulations and industry self-regulation and individual awareness and all those things together to say, how do we grab back control of this most precious resource?

(SOUNDBITE OF MUSIC)

RAZ: Zeynep Tufekci - she's a sociologist at the University of North Carolina and a columnist for The New York Times. You can find all of her talks at ted.com.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.