In Defense of Millennials

It was announced recently that Air France is reviewing what to do with Joon, its boutique airline geared towards Millennials. With Air France admitting that the brand was “difficult to understand from the outset for customers, for employees, for markets and for investors,” there will undoubtedly be many who pile on an airline whose days seem numbered.

While this will be added to the list of misguided efforts to market to Millennials, Air France’s efforts were not without merit. Air travel is generally a horrible experience. Trying to create a niche brand focused on experience isn’t a bad idea. Neither is creating a product/service specifically for a segment of customers. But…

When setting out to create an experience-based brand for a specific segment, the full experience needs to resonate with that group. One of the aspects of airline travel that makes it so terrible is the other passengers. Presumably, Millennials would prefer an experience filled with other Millennials. In order to create this Millennial-only environment, Joon would need Millennials to be willing to pay more (like a brunch place selling $30 avocado toast) or make the experience terrible for other generations (like the Firefly Festival). Joon offered low fares (remember, Millennials are poor) and an all-around pleasant experience for everyone. So why wouldn’t a Baby Boomer booking a flight from Paris to Capri book this cool, new low-cost airline?

So was Joon really a Millennial’s dream when it comes to air travel? Or was it Ryanair with craft beer? They get points for trying to improve the air travel experience, but it was probably doomed to fail, as Millennials are not going to shell out more money to upgrade their air travel experience. Until this cohort gets out from under student loans and is no longer confronting skyrocketing housing costs, it seems unlikely that a Virgin-for-Millennials airline will take flight.

This also raises the question of whether Millennials are a homogeneous enough cohort to represent an actionable segment for businesses to market to. Of course, when we say “market to Millennials,” we are referring to the selfie-taking, brunch-loving, job-hopping hipsters society loves to hate. But the reality is that a lot of Millennials are currently leading normal, suburban lives, driving kids to school and commuting to office parks. Me lugging my two-year-old onto a Joon flight isn’t what anyone had in mind for the Millennial airline.

I recently published a list mocking the many lists of things Millennials have killed. Pretty far down on that list was the Toyota Scion. Although the ranking methodology put it low on the list, it does fall into my favorite category – failed attempts to market to Millennials. Scion was supposed to be a fun car brand that teens and twenty-somethings who wanted to eschew automotive norms would flock to. What it really ended up being was a value mobile that younger and older drivers warmed up to.

For anyone unfamiliar, Scion was the cube-like, futuristic car brand launched by Toyota in 2002, known for its transformer-inspired commercials. It was supposed to be cool and different, appealing to Gen Y (the term “Millennials” hadn’t been popularized yet) when we were barely old enough to drive. 2006 was Scion’s most successful year, with 173,000 cars sold.

So what happened? A few things. First, cars are a mass-market product. For the economics of a line of cars to work, it needs to appeal to the masses (like the Camry). Or if not all of the masses, a niche that represents a pretty big chunk of the masses (like the Mini Cooper). Or if not a sizable niche, then it had better command a high price point (like Jaguar). Scion found a niche, but it wasn’t large enough. If anything, Toyota succeeded at what it set out to do – the average age of a Scion buyer was 36. The cohort the brand appealed to was simply too small.

The other issue was that the economics didn’t work. The business model for Scion assumed that the MSRP alone would not sustain the car, so Millennials were supposed to add lots of features to it (a la Pimp My Ride). This never really panned out, and Scion became a value brand to its customers. That wouldn’t have been a problem if the car’s MSRP was enough to sustain it economically, but it was not.

Another issue with the Scion was that it was supposed to be cool. It sort of was for a time, but then Baby Boomers noticed this bargain-box on wheels. It wasn’t a gas guzzler, it got you from point A to point B, and it wouldn’t break the bank if you didn’t have Xzibit installing an Xbox in the trunk. So what happens when a 20-year-old sees their mom driving the same car that is supposed to make everything epic? They start to view the car as responsible and safe. Probably not what the makers of the epic commercial had in mind.

I referenced earlier that the niche that Scion captured wasn’t big enough. It also grew up. Millennials are moving out of their urban locales and need bigger cars to lug around their growing families. Gen Z behind them didn’t pick up the slack, and sales dipped.

It’s hard to draw any broad conclusions about designing products for Millennials based on the fact that Scion only lasted a little over a decade. One could even argue that it succeeded by getting Millennials into the Toyota family as they entered the workforce during the financial crisis. I do, however, think it warrants a moment’s hesitation before anyone who sells mass-market products, such as cars, designs a product for a niche market. It is important to make sure that the niche is large enough to sustain the economics of the product. Also – understand your value proposition – a fuel-efficient car at a low price point had tons of value to drivers of all ages during the financial crisis.

Recently, I published the dumbest list the internet has ever seen. All 82 things that Millennials have been accused of killing, ranked according to an appropriately dumb methodology. Topping that list was the beloved napkin. The innocent, dutiful paper product is having the life snuffed out of it by those malicious Millennials.

I’m a Millennial. I usually just use a paper towel when I’m at home. So I get it: we are probably not the paper napkin industry’s favorite segment. But why is this the number one thing that Millennials are accused of killing? It appears on almost every listicle that lays out the Millennials’ victims. There must be something about napkins and Millennials that makes us true mortal enemies.

In all of these articles, Millennials come off like Walter White, fully broken bad out of greed and laziness, while napkins are Hank, the heroic figure with an uncompromising moral code. This didn’t seem quite right. So I decided to do some sleuthing to get to the root of the accusation that Millennials have killed off napkins.

On the Case

Like any good Millennial detective, my hunt started with Google. I Googled “Millennials ruined napkins” to see if I could figure out what was really up, and the most prominent search result that wasn’t a “Millennials killed” listicle was from Business Insider. It referenced a Washington Post article, which in turn referenced a study conducted by Mintel. This study appears to be ground zero for the Millennial-Napkin caper.

Mintel is a market research firm. It makes sense that it would conduct research on Millennials, as marketers are very vocal about how Millennials won’t play by traditional marketing rules.

So Mintel must have uncovered some pretty conclusive data to link the death of paper napkins to Millennials, right? Well, not really. What Mintel does say is that only 56% of consumers have purchased paper napkins in the past six months, compared to 86% that have purchased paper towels. Also – according to a Marketing Director from Georgia Pacific, 15 years ago 6 in 10 households used paper napkins, compared to just 4 in 10 today. Notice that no one has even mentioned Millennials yet?

So paper napkins do appear to be on a downward trend. To get to the bottom of this, The Post reached out to a real live Millennial, who informed them that napkins aren’t on recent college graduates’ radars, but paper towels are! Fair enough.

They also reached out to someone who claims to have never used a paper napkin in their lives, bringing the perfect blend of snobbishness and an irrelevant opinion to an article about how people previously used paper napkins. The article then took us through many twists and turns with input from Martha Stewart on how we need to use linen or cloth napkins, and someone from Apartment Therapy who chimed in with thoughts about how in-home entertaining has become more informal over the years.

The article then turns to our marketing friend from Georgia Pacific, maker of several paper napkin brands. He informs us that paper napkin use has been on the decline for 20 years – a trend that would appear to indicate paper napkins have been slowly becoming less and less relevant to Americans’ lives over the past two decades.

He also mentions that Millennials are more likely to eat meals on the go, and just grab a paper towel if it happens to be on a kitchen counter or island. This makes sense. I’m seeing how you could indict a Millennial for this crime, but proving guilt beyond a reasonable doubt is still going to be a challenge. The Business Insider trail has gone cold. Back to Google.

Next up in the Google results is an article from Need A Mom NYC, which quoted and linked to an article in the New Republic about The Myth of the Millennial as Cultural Rebel (great article, by the way). It did mention the accusation that Millennials killed napkins, but what did it reference? Our original Business Insider article! The trail has gone cold again. Back to Google.

The next several tabs of Google searches are all just listicles. Well, listicles are how we got here in the first place. Let’s go back to the 11 “Millennials are killing” lists that got us started. Guess where each and every single one of them linked. Either our original Business Insider article, the Washington Post article, or both. Credit to MarketWatch, which also linked to an article about how Millennials are eating less at home. For whatever reason, they also decided Martha Stewart was a good person to weigh in on Millennials, with jabs such as “they don’t have the initiative to go out and find a little apartment and grow a tomato plant on the terrace.” Because every young adult has a terrace; we are just too lazy to grow vegetables on ours.

Has a crime even been committed?

I think I’ve done enough chasing my tail to determine that the Millennial/Napkin conspiracy is based solely on the research conducted by Mintel. Remember, they never even linked their study to Millennials.

The paper napkin industry has been in steady decline for several decades now. Millennials have come of age, entered the workforce, started living on their own, and begun making their own purchase decisions during this same time. Over this same period, the workday has gotten longer, people are cooking at home less frequently, and we all seem more likely to eat on the go. These things all correlate nicely, but that doesn’t necessarily mean there is something inherent about Millennials that has caused them to reject napkins. We didn’t volunteer to work 60-hour weeks just to make ends meet!

So what really happened here? Likely, Mintel was hired by a paper products company to conduct research on consumer trends in their industry. They noticed the trend of declining napkin use. The Washington Post saw the study, noticed it correlates nicely with the rise of Millennials, and put together an article seeking to explain why napkin use is in decline. Like Mintel’s report, The Post didn’t even mention Millennials in its headline. This, of course, did not prevent Business Insider from sniffing out another good “Millennials killed” story. Then, the cottage industry of listicles about things Millennials have killed scooped up the story and added it to their lists.

So that is how the infamous Millennial murderers got their bad name. An industry slips into decline. There is a sliver of data suggesting the trend correlates with Millennials coming of age. Anecdotal evidence is applied, and the court of public opinion is whipped into a frenzy, ready to convict an entire generation for the murder of something that was always just a dead tree anyway.

As Ranker astutely pointed out, aren’t the companies that make paper napkins the same ones that make paper towels?

It seems as if Millennials have created a cottage industry of sorts. It has become lucrative to write about all the things Millennials are killing. So, in true Millennial fashion, I am here to kill the lists about Millennials killing things.

Dumb lists deserve dumb ranking systems.

What have I decided to do? Create a list of course. If you believe in Nate Silver’s approach to aggregating polls, as I do, it would seem that the appropriate thing to do here is to use a dumb version of his approach. I have aggregated 11 “Millennials are killing” lists from a variety of sources, some serious, some less so.

Once I combined things that are really the same thing (“Applebee’s” is the same as “chain restaurants”), this gave me 82 unique things that Millennials have been accused of killing. I then ranked each thing that met its demise at the hands of a Millennial based on how many lists it appeared on, and assigned point values based on where in each list it appeared.
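For the curious, the aggregation described above can be sketched in a few lines of Python. The list contents, the synonym mapping, and the inverse-rank weighting here are my own toy assumptions for illustration, not the actual source lists or point values used for the real ranking.

```python
# A hypothetical sketch of the ranking methodology: combine duplicate items,
# count how many lists each thing appears on, and weight earlier positions
# more heavily. The data and weights below are made up for illustration.

from collections import defaultdict

# Each source list is ordered, most prominent "victim" first.
source_lists = [
    ["napkins", "Applebee's", "golf"],
    ["napkins", "department stores", "chain restaurants"],
    ["golf", "napkins", "diamonds"],
]

# Collapse synonyms so "Applebee's" and "chain restaurants" count as one thing.
synonyms = {"Applebee's": "chain restaurants"}

scores = defaultdict(float)
appearances = defaultdict(int)

for lst in source_lists:
    for position, item in enumerate(lst):
        item = synonyms.get(item, item)
        appearances[item] += 1
        # Earlier positions earn more points (a simple inverse-rank weight).
        scores[item] += 1.0 / (position + 1)

# Rank primarily by number of lists appeared on, breaking ties by score.
ranking = sorted(scores, key=lambda i: (appearances[i], scores[i]), reverse=True)
print(ranking[0])  # napkins tops this toy example
```

Any scoring scheme of this shape would do; the key design choice is ranking first by breadth (how many lists mention a thing) so that one list’s quirky ordering can’t dominate the aggregate.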

Yes, that’s right. You just read a list of 82 things that Millennials are accused of ruining.

Dumb lists deserve dumb analysis.

So I have my master list of things Millennials have either killed or ruined. Naturally, my inclination is to group these “things” into categories that explain why it is that Millennials have killed them:

- Behavioral Change – Millennials behave differently than previous generations
- Financial – Millennials are poorer than previous generations, or something costs more now
- Taste Changes – Millennials’ tastes differ from those of previous generations
- Societal Shift – society has changed, not just Millennials
- Technology Advancements – changes in technology are the reason Millennials’ behavior differs from previous generations
- Political Opinion – it is really just a matter of political opinion that Millennials have ruined a particular thing, usually because of the way they voted
- Misguided Marketing – a thing was created to please Millennials and it failed
- External Forces – some other factor has caused a shift in consumption

Categories of things Millennials are accused of killing

So what does this tell us? Well, for starters Millennials do exhibit some behavior that has upended industries, products or social norms. This isn’t unique to Millennials. Remember how the Greatest Generation used to eat real cheese from cows and goats? Baby Boomers then came along and started buying gelatinous substances shaped like cheese. The rise of fake cheese singles represented a shift in behavior. Well, Millennials are no different.

There are also a large number of these that are financially related. Why aren’t Millennials buying houses? Is it because they don’t believe in home ownership? They’d rather be brunching? No. Is it some combination of Millennials being poor and home prices skyrocketing to levels significantly higher than they were for previous generations, even when accounting for inflation? Yes.

There are also several that reflect society, technology or other external factors playing a role. Nobody shops at department stores anymore, but because their demise coincided with the rise of “Millennials killed everything” lists, Millennials get the blame. This is especially ironic when you consider the fact that Jeff Bezos isn’t even a Millennial.

There are two categories that are especially dumb. The first is political opinion. Millennials get blamed by both sides of the political spectrum for not voting in line with whichever blogger’s political affiliation. To be fair, these accusations were mostly relegated to fringe blogs, but they made it onto lists from Mashable and others, so we’re counting them. Basically, a blogger is mad about where America or the American Dream is headed, applies a heavy dose of confirmation bias to the way they already feel about Millennials, and decides the generation is to blame for the country becoming too progressive or not progressive enough.

Also especially dumb are the misguided marketing efforts: items like the McWrap and the Toyota Scion, which were created for and marketed to Millennials and really just missed the mark. McDonald’s figured out that Millennials weren’t buying Big Macs because they were health conscious, so it created a healthier alternative for them. The only issue is that health-conscious customers aren’t going anywhere near a McDonald’s, so they aren’t going to try some carb-reduced chicken sandwich.

Please, no more lists

So there you have it. The dumbest thing you’ve ever read. If this does somehow lead to the demise of the lists of things Millennials have killed, I will proudly own up to that one. Until then, I will continue to roll my eyes every time someone talks about all the things that Millennials have killed.

My last Facebook post was on June 19. My last tweet was on May 4. My last Instagram post was on April 6. I think it is fair to say that I am no longer a contributor to social media.

This was part conscious decision, part lack of interest. After reading Jaron Lanier’s terrific book Ten Arguments for Deleting Your Social Media Accounts Right Now, I really wanted nothing to do with Facebook. I learned that even by trying to combat the “evils” I saw on Facebook, I was merely feeding its “BUMMER” algorithm, designed to make us angrier and angrier. It’s modern-day cable news: rage is the business model.

I also found Facebook less and less enjoyable. When I switched Twitter back to a reverse-chronological timeline, it became much less addictive.

Once I stopped posting, I was no longer chasing the dopamine rush that comes along with likes, retweets and comments. There was no expectation of a retweet, so I had no desire to check my phone for notifications. I consider all of this to be a net-positive in my life.

This does, however, create a dilemma for someone who is trying to write semi-consistently about the impacts of social media. I am not a social scientist, so I am not conducting my own experiments or surveys. I could peruse the zeitgeist for social media trends and discuss those, but a lot of that feels like regurgitating someone else’s work, which does not interest me.

So I am going to broaden my focus, for now at least. I still plan to discuss social media, but less exclusively. I’m curious about things that didn’t exist prior to social media – pop-ups like Color Factory, The Museum of Pizza, and SantaCon – would these exist without social media? The business model of rage interests me as well – cable news networks invented it, but social networks perfected it. I also often find myself wondering if social media is to blame for my declining eyesight.

Like any good millennial, I have spent plenty of time glued to my smartphone. I have also walked around midtown Manhattan at lunch time on a weekday, and I know that it isn’t just one generation that can’t peel their eyes away from their phones. There is something about our phones that has taken hold of us, limiting our own free will to put our phones down.

We do not yet fully understand what our smartphones or social media are doing to our brains. Without understanding the problem, we cannot completely solve it. I do believe, however, that there is one remedy we can all employ immediately to lessen social media’s influence over us.

LIMIT YOUR NOTIFICATIONS.

This is where the addictive nature of smartphones crosses paths with the addictive nature of social media. Our smartphones provide us with immediate access to limitless information. This is a good thing. It does not, however, mean that all of this information needs to be pushed to us in real time. Every single news update, sports score, like, or retweet does not need to interrupt our daily lives. This, of course, requires that we define our lives as our interactions with the real world around us, of which our online interactions are a subset.

Since some of the time we spend engaging with our phones and social media is a valid part of our real lives, some of the intrusions we experience are warranted. Therefore, I propose a hierarchy (see image below) for how to approach the notification settings on your smartphone. Yes, you are in control of the notifications you receive from your phone. The hierarchy includes four levels, each of which is designed to match the intrusiveness of the notification with the importance of the information being delivered.

Level 1

First, if the failure to receive a notification will result in damage to your career or real-world relationships, then it should cause your phone to ring or vibrate. This should be limited to texts and phone calls for most of us. Career-related notifications may be required for some. This does not include emails. The people who are capable of sending you an “urgent” email should also have your phone number.

Additionally, if the failure to reply to a DM or tag on Instagram or any other social media platform in real-time is going to damage a real-world relationship, you may need to have a conversation with this friend or reconsider the friendship.

Level 2

The second level in the hierarchy is notifications that appear on your screen while it is locked. This should be limited to the same apps as the previous level, with the addition of a single news source. No social media. I’d really argue against email here too, but we all have jobs and bosses with different expectations. If you can – or are in a position to change this for others – turn off email.

Level 3

The third level is unlocked banner notifications. These appear at the top of your screen when your phone is unlocked. I prefer these to banners that appear when your phone is locked because they don’t entice you to pick up your phone. I also like the fact that they are accompanied by a few lines of information, so the notification isn’t a complete mystery. If it is major breaking news, you can click on it. If there is an injury to someone on your fantasy football team, it can probably wait (or not, your call).

At this level, in addition to email and text, consider adding a few more news sources, including sports or entertainment. I also let LinkedIn notifications come through, because I usually ignore them and can often glean as much information as I need from the banners. The reason I recommend keeping some limitations is that these notifications can lead us into a common smartphone trap: you open your phone to send a text, and the next thing you know you are in a political argument with your cousin’s friend on Facebook.

Level 4

The next level of notifications is the red dot in the corner of the app on your screen. This is where our social media addictions kick in. We crave that red dot. Maybe someone liked our post. Maybe they commented. Maybe my friend’s uncle, who I totally just owned with a comment on his political rant, has responded. Maybe none of these things are important, and they just induce a dopamine rush that we crave.

The general rule that I employ at this level is that any non-social media app can display these notifications. This doesn’t mean that you need to enable these notifications for all apps, but if an app may have significant information, and you otherwise wouldn’t open it regularly, these notifications can be useful.

Beyond the hierarchy

Social media apps are noticeably missing from all four levels of the hierarchy. There is of course a fifth level to the hierarchy, which consists of apps that never display a notification. These should include all social media apps, or any app that we find ourselves unable to resist (gaming apps apply here too). The reason they should never display a notification is that we should make conscious decisions about when we access these social networks. We shouldn’t let algorithms decide when and how we are pulled in. Social media companies have many brilliant people working with sophisticated technology, data, and algorithms, all designed to “increase engagement.” This means they want you to use their platforms more. Given the limited number of hours in the day, they are competing with the time we spend dealing with the real world around us. It also means we will inevitably reach a point at which we lose control over how frequently we access their platforms. Notifications are a major tool they use to get us to open their apps.

I will close with the personal recommendation that you consider deleting several social media apps from your phone. Maybe not all of them, but try a week without the apps you find yourself reflexively opening up in moments of boredom. Those that you open up without consciously thinking about it. For me it was first Facebook and then Twitter. I’m still on both social networks (although I try to limit Facebook to once a week), but just not on my phone. I can’t recommend it enough.

Several years ago, Radiolab had an episode about addiction called The Fix. It featured Dr. Anna Rose Childress, who discussed the idea that those who suffer from addiction are simply more receptive to rewards from the pleasure center, or, “they are the fittest of the fit in evolutionary terms”. The theory is that we, as humans, evolved to be very responsive to the things that trigger the pleasure center, because these are the things that kept us alive and helped us find a mate. Sweet fruit is safer to eat than bitter berries. Humans who were driven to get up first thing in the morning and beat everyone else to the orange tree after tasting its sweet fruit were more likely to survive than those who would eat whatever berries they found on the ground. Those who were most motivated by the feeling of being hugged by a loved one would work harder to find a mate. So as a species, we evolved to be responsive to our pleasure center.

This is a feature that is no longer crucial to survival in the modern world. We go to a grocery store for our food, where everything is safe to eat. But this hasn’t changed the fact that we are still very responsive to our pleasure center, some more than others. So when a drug is able to trigger that pleasure center, we are also very responsive – and continue to seek it out. Those who are most susceptible to drug addiction are those who are most attuned to their pleasure center. Historically, they would have been the individuals who worked the hardest for the best food, were best at finding a sexual partner, or built the safest house. Up until recently, humans were rewarded for being highly receptive to the rewards of the pleasure center. Now, as Jad Abumrad put it, “it is a weakness born of a strength”.

Another Evolutionary Trait

In a study published in Frontiers in Psychology, Samuel Veissière and Moriah Stendel, both of McGill University’s Department of Psychiatry, argue that evolution plays a similar role in explaining our behavior on social media. Historically, it was important for humans to constantly watch and be watched by others to pick up on social cues and cultural norms. By constantly monitoring those around us, we learned how to interact with others, forming bonds that could save our lives. We respond strongly to feedback on our interactions because it helps confirm we are behaving according to societal norms. This is how we formed groups that were able to act as one (culture), overtaking groups that could not behave cohesively. Jonathan Haidt does an excellent job of explaining why this is important in The Righteous Mind, using a team of rowers acting as one as his metaphor.

These evolutionary tendencies have kicked into overdrive with access to social media. Like those who are extremely attuned to rewards from the pleasure center, many of us constantly crave the ability to monitor our friends’ every move, and to have our friends monitor and approve our actions. This is simply evolution. Those of us that are “the fittest of the fit in evolutionary terms” are more likely to crave the ability to monitor friends and have friends affirm our actions in the form of likes and retweets. Historically, this was a trait that kept us alive. Today, it is a trait that keeps us glued to social media – with other possible consequences we are only beginning to understand.

This is a step towards understanding what social media is doing to our brains. If we can understand the reasons why we are drawn to social media, we can begin to understand the degree to which it is impacting our brains. It is important not to draw sweeping conclusions. This does not mean that entire generations are addicted to social media or do not understand how to carry on meaningful relationships in the real world. It does, however, lead me closer to the conclusion that social media platforms are exploiting our evolutionary behaviors to increase our usage.

The definitive answer: this is not something you can cover in a single blog post. That doesn’t mean I didn’t initially set out to do it. Then I came across this Business Insider piece, which made me realize that there are too many moving parts here to write a single definitive post about whether social media is addictive.

First, a better question might be: can social media be addictive? Can alcohol be addictive? Yes. Is it addictive for everyone? No.

While this article has pushed me more towards the center in the debate over whether social media is addictive, I still believe that it can be. There is a lot that we don’t know, and we need to stay vigilant on this topic, especially with regard to what it can do to the minds of young people.

The author of this article ultimately comes to the conclusion that we aren’t addicted to social media. I take no issue with this conclusion, especially when considering the definition of addiction in the clinical sense. However, there are several aspects of this article that I take issue with, which I believe only muddy the waters of an already complex topic.

Conflation of smartphones and social media – I will admit this is a very difficult thing to avoid, but it does need to be confronted. There are elements of our smartphones – the constant buzzing, notifications, bright screens – that might be addictive. There are also elements of social media – FOMO, the feeling of being connected, sharing – that might be addictive.

Correlation Does Not Equal Causation – anyone who has been to college has heard this statement. Some people take it to mean that correlation never implies causation. But just because a causal link hasn’t been established doesn’t mean one could never be; it may also be that both things are caused by a common factor. Just because we don’t understand the relationship between the inverted yield curve and recessions does not mean we should ignore it. Here, the statement was invoked regarding the rise of smartphones correlating with an increase in teen depression. Just because we don’t have a causal link established yet does not mean it isn’t a correlation worth looking into further (to be fair, the author never says that we should stop looking into it).

Setting the bar too high – The article makes heavy use of quotes and research from Dr. Andrew Przybylski. He criticizes, very fairly, many studies that attempted to prove that social media is addictive for having sample sizes that are too small. He then goes on to say that screen time is not harmful for the majority of teens. A few issues with this. First, aren’t we reading an article about whether social media is addictive, not whether screen time is harmful? Also, not every teen in the world needs to be addicted to social media for it to be considered addictive. Alcohol doesn’t harm everyone who uses it, but it can be very harmful.

Where I ultimately land after reading this article is that alcohol is a great metaphor for social media. It should be used in moderation. It can enhance experiences and relationships, and help awkward teens come out of their shells. You shouldn’t drive (or walk) while using it. It can impact those who are predisposed to depression or anxiety more acutely. It can make reasonable people behave unreasonably. You shouldn’t use it every day. And ultimately, I still believe that people can become addicted to it. Maybe not everyone.

The bulk of my concerns about social media can be broken into two fundamental areas: its psychological impacts on us and the centralization of our data. Almost everything else (and most of what you’ll read about on this blog) is an offshoot of these fundamental concerns. What follows is a brief summary of both.

I think the honest answer to all of these questions is that we don’t know yet. Researchers have established links between unhappiness and social media use, but we don’t really understand the extent to which social media impacts us. We are also still in the early days of social media, so its long-term psychological effects can’t yet be measured.

Centralization of our data

This might seem like a two-parter, but it’s really two sides of the same coin. Due to network effects, the world of social media has always been destined to consolidate to a few platforms. These are Facebook, Instagram, WhatsApp, Twitter, and Snapchat – at least in the US. Because so much of our online congregation is concentrated on these platforms, these firms now have outsized power and are essentially monopolies. And a monopoly has little incentive to fight back against abusive trolls or fake news.

A monopoly is also in a position to collect vast amounts of data on its users, which it can then sell to the highest bidder. When we learn that a company is abusing our data, it might create a brief PR storm, but that soon blows over. The truth is that social media users do not have the ability to choose an alternative provider that does not abuse their data.

Steven Johnson wrote an excellent piece in the New York Times in early 2018 about blockchain and the benefits of a decentralized internet. The (long) article is well worth a read. It speaks to the benefits of decentralization, verification, open protocols and the lack of an “owner”. Why bring up an article on blockchain in a blog about social media? Look at this quote:

The true believers behind blockchain platforms like Ethereum argue that a network of distributed trust is one of those advances in software architecture that will prove, in the long run, to have historic significance.

The Bitcoin bubble has become a distraction from the true significance of the blockchain. If we look past the speculative bubble, we can see the potential of a “network of distributed trust”. This could lead to a decentralized and democratized version of the internet we know today. It would keep our identities from being housed in Facebook’s or Google’s walled gardens. We could have an identity based on open protocols, one we could take from platform to platform as we please. And there would be penalties for abusing our data.

It is helpful to understand the two layers of the internet. The first is based on open protocols developed in the 1970s that still exist today – email and web browsing still run on them. This layer is decentralized. The second layer consists of the platforms we use to access the internet today – private companies such as Facebook or Twitter. This layer is proprietary and highly centralized.

Johnson argues that keeping smartphones away from kids and government regulation are commendable, but will not cure all of society’s ills. This belief – that there is no silver bullet to protect us from social media – is a fundamental reason for the existence of this blog.

Johnson paints a vision for a decentralized future with open protocols overtaking the highly lucrative private platforms that exist today. Blockchain, after all, has shown us that it is possible for everyone to agree on the contents of a database without the database having an “owner”.

I hope he is right. But his vision requires the success of swashbuckling punk rockers driven purely by a mission to restore the internet to its original utopian vision – and it requires them to turn a blind eye to the gobs of money that will be thrown at them to keep the internet closed.

Until then, the social media platforms we use will remain closed and highly centralized. Our data lives with these private companies, which earn their revenue by harvesting and selling it. While some are perfectly comfortable with Fortune 500 companies selling our personal data, it is important to remember that these private companies are vulnerable to attacks, and our personal information can end up in shadier hands.

This all leaves me thinking about a very prescient tweet I saw once, whose author I would love to credit but cannot remember:

The only businesses that refer to their customers as “users” are tech companies and drug dealers

This is what drives this blog. There are many questions about what social media is doing to us. Most of the efforts to understand social media have to do with monetizing our attention. How can businesses advertise to users as they spend time on these platforms? And how can the social media platforms monetize the time we spend using their technology? It’s fine that many smart people are dedicated to answering these questions – because if they weren’t, bad actors would fill that void.

However, conversations about what social media does to our brains seem limited to fragmented academic studies. Discussions of social media’s centralization and its impact on society have become more mainstream recently, but they do not seem to have changed the fundamentals of how social media platforms operate.

Those of us that seek to understand what is happening to us as individuals and society as a whole will never come to a satisfying conclusion. We seem destined to only uncover more questions. Questions I am happy to continue asking.

The bulk of social media companies were founded and are headquartered in the United States. In the US, we believe strongly in our freedom of speech and go to great lengths to protect it. It is not surprising, then, that freedom of speech comes up frequently in discussions about how companies should police trolls on their social networks.

Trolls are an issue across pretty much every social media platform. Wikipedia defines an internet troll as “Someone who posts inflammatory, extraneous, or off-topic messages in an online community, such as a forum, chat room, or blog, with the primary intent of provoking readers into an emotional response or of otherwise disrupting normal on-topic discussion.” While this definition makes internet trolls seem like a mere distraction, trolling can take on many forms, often devolving into misogynistic, homophobic, racist, and otherwise ethnocentric harassment. It is nearly universally accepted that internet trolls are bad, but approaches to stamping them out vary.

For instance, Facebook and Instagram take a more aggressive approach than Twitter or Reddit. Victims of harassment on the former two platforms might argue to the contrary, but Facebook does at least require you to use your real identity. On the other side, Twitter – and of course the wild west of the internet, Reddit – willfully allow users to exist in a cloak of anonymity.

Reddit and Twitter have long viewed themselves as bastions of free speech. Free speech is an important right, but as we all learned in school, we cannot yell “fire” in a crowded theater. Freedom of speech has its limits. When it comes to speech on Reddit, pretty much anything goes. Reddit is composed of communities called subreddits, which are “monitored” by redditors – ordinary users of the site. The term “monitor” applies very loosely: redditors merely upvote or downvote a comment, which determines how much visibility it gets. As a result, a subreddit filled with hateful redditors frequently has hateful comments bubbling up to the top and going viral. This all happens in plain sight with no intervention from the powers that be within Reddit.
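To make the visibility mechanism concrete, here is a toy sketch – emphatically not Reddit’s actual ranking algorithm, just the simplest vote-driven ordering – showing how a community’s votes alone decide which comments surface, with no editorial judgment anywhere in the loop:

```python
# Toy model of vote-driven visibility (a simplification, not Reddit's
# real ranking code): each comment's score is upvotes minus downvotes,
# and the highest-scoring comments rise to the top of the thread.

def rank_comments(comments):
    """Sort comments by net score (upvotes - downvotes), highest first."""
    return sorted(comments, key=lambda c: c["ups"] - c["downs"], reverse=True)

# Hypothetical thread: if the community upvotes hateful content,
# the mechanism dutifully promotes it.
thread = [
    {"text": "thoughtful reply", "ups": 12, "downs": 3},   # score 9
    {"text": "hateful comment", "ups": 40, "downs": 5},    # score 35
    {"text": "off-topic joke", "ups": 7, "downs": 6},      # score 1
]

for comment in rank_comments(thread):
    print(comment["text"], comment["ups"] - comment["downs"])
```

The point of the sketch is that nothing in the ranking asks what a comment says – only how the crowd voted on it.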

Some will argue that trolls have always existed and that the trolls on social media platforms are nothing to worry about. But in earlier times, trolls had to show their faces and use their actual voices to harass someone. That type of behavior invited well-deserved shame and obviously didn’t scale well. We now live in a world where an army of Twitter eggs (users with no avatar) can say whatever they want to anyone they want with no repercussions, because they are free to exist anonymously on the internet. Twitter can suspend an individual account, but how hard is it to create a new one with a different email?

Twitter deserves credit for at least trying to take on the trolls. It recently revised its approach, citing a 4% decrease in reports of abuse. Its new approach, referred to by many as “out of sight, out of mind”, decreases the visibility of tweets from users who display troll-like behavior, such as signing up for multiple accounts at once or repeatedly tagging users who don’t follow them back. As I’ve argued before, there is no silver bullet for many of the problems that have accompanied the rise of social media. It’s great that Twitter is trying a new tactic, from which we will likely learn more about policing trolls, but it is not going to end trolling on the platform. If Jack Dorsey and his team at Twitter are truly dedicated to furthering free speech – and I believe they are – they would be wise to stay vigilant in their pursuit of the trolls.

Good intentions aside, the success of this effort really hinges on the business case. If Twitter believes, and its investors agree, that curtailing trolls is good for business, then there is hope. The problem is that it would be difficult for Twitter to effectively police trolls on its platform without impacting the free speech of all of its users. Difficult, but not impossible. More accurately, expensive.

The issue is that an algorithm can’t solve it all – though an algorithm is often the first, second, and third approach of most Silicon Valley companies. Think back to school: the school had rules about how to behave and what language we could use, but in the cafeteria it relied on monitors – real people – to ensure we adhered to those rules. The same approach would be required for Twitter and other social media companies to rid their sites of trolls.

The problem with real people is that they are expensive. Deploying them en masse to stamp out trolls is not conducive to the kind of margins enjoyed by large tech firms and demanded by their investors. Policing the trolls would need to show not only an impact on abuse, but also higher revenues for the company. If Twitter sees that fewer trolls leads to more users and greater engagement, it will conclude that trolls are bad for business. If it concludes that the presence of trolls is not keeping users away, then policing them will only increase costs and hurt Twitter’s bottom line. Until the link between trolls and the bottom line is established, attempts to stamp them out will be mere PR fodder.