Verified accounts turning themselves into bots, millions of fake likes and comments, a dirty world of engagement trading inside Telegram groups. Welcome to the secret underbelly of Instagram.

In late February, an Instagram account called Viral Hippo posted a photo of a black square. There was nothing special about the photo, or the square, and certainly not the account that posted it. And yet within 24 hours, it amassed over 1,500 likes from a group that included a verified model followed by 296,000 people, a verified influencer followed by 228,000, a bunch of fitness coaches, some travel accounts, and various small businesses. “I really love this photo,” one commented.

The commenter wasn’t a bot; nor were any of the accounts that liked the black square. But their interest in it wasn’t genuine. These were real people, but not real likes — none of them clicked on the like button themselves. Instead, they used a paid service that automatically likes and comments on other posts for them.
…
Renee DiResta, policy lead at Data for Democracy, believes this level of manipulation is deeply corrosive. “The risk of realizing that the internet is massively manipulated is that the cognitive overhead to process even the most basic interactions increases, suspicion increases, polarization potentially increases,” she told BuzzFeed News. “The engagement is fundamentally manipulated; the content you’re seeing, you’re seeing because someone gamed an algorithm; the products people are pushing to you are inauthentic and most of the comments under them are fake. That’s not the system that we want to live in.”

By: Karel Donk's Blog » No Mark Zuckerberg, I don’t need your help https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-141895
Thu, 12 Apr 2018 15:08:38 +0000
[…] that’s unique to Facebook; most tech companies are guilty of it these days, as I discussed in a previous post in more detail. In that post I also mentioned some of the psychological tools that they have in their arsenal and […]
By: Karel Donk's Blog » On gender pronouns, gender identity and transgender people https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-138844
Tue, 20 Mar 2018 00:10:50 +0000
[…] I know that it’s absolutely possible to value the truth and choose to live a life based on the truth while also being tolerant of other people, their views and their way of life. The way I see it is that every living organism in the universe — including human beings, obviously — has a right to life, granted to it by the universe. Like I wrote before: […]
By: Karel Donk https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-138837
Mon, 19 Mar 2018 14:30:58 +0000
An article in The Guardian gives some rare insight into how technology is being used to persuade people and manipulate their minds to get a desired outcome. It’s being used in cyberwarfare by intelligence agencies, for example in this case to influence the outcome of elections.

It was Bannon’s interest in culture as war that ignited Wylie’s intellectual concept. But it was Robert Mercer’s millions that created a firestorm. Kogan was able to throw money at the hard problem of acquiring personal data: he advertised for people who were willing to be paid to take a personality quiz on Amazon’s Mechanical Turk and Qualtrics, at the end of which Kogan’s app, called thisisyourdigitallife, gave him permission to access their Facebook profiles. And not just theirs, but their friends’ too. On average, each “seeder” – the people who had taken the personality test, around 320,000 in total – unwittingly gave access to at least 160 other people’s profiles, none of whom would have known or had reason to suspect.
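The scale implied by those figures is worth making explicit. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
seeders = 320_000    # people who took the personality quiz
friends_each = 160   # minimum friend profiles exposed per seeder

total_profiles = seeders * friends_each
print(f"{total_profiles:,}")  # 51,200,000 — "millions of profiles" is an understatement
```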

What the email correspondence between Cambridge Analytica employees and Kogan shows is that Kogan had collected millions of profiles in a matter of weeks. But neither Wylie nor anyone else at Cambridge Analytica had checked that it was legal. It certainly wasn’t authorised. Kogan did have permission to pull Facebook data, but for academic purposes only. What’s more, under British data protection laws, it’s illegal for personal data to be sold to a third party without consent.

“Facebook could see it was happening,” says Wylie. “Their security protocols were triggered because Kogan’s apps were pulling this enormous amount of data, but apparently Kogan told them it was for academic use. So they were like, ‘Fine’.”

Kogan maintains that everything he did was legal and he had a “close working relationship” with Facebook, which had granted him permission for his apps.

Cambridge Analytica had its data. This was the foundation of everything it did next – how it extracted psychological insights from the “seeders” and then built an algorithm to profile millions more.

For more than a year, the reporting around what Cambridge Analytica did or didn’t do for Trump has revolved around the question of “psychographics”, but Wylie points out: “Everything was built on the back of that data. The models, the algorithm. Everything. Why wouldn’t you use it in your biggest campaign ever?”

…

Is what Cambridge Analytica does akin to bullying?

“I think it’s worse than bullying,” Wylie says. “Because people don’t necessarily know it’s being done to them. At least bullying respects the agency of people because they know. So it’s worse, because if you do not respect the agency of people, anything that you’re doing after that point is not conducive to a democracy. And fundamentally, information warfare is not conducive to democracy.”

…

“Facebook has denied and denied and denied this,” Dehaye says when told of the Observer’s new evidence. “It has misled MPs and congressional investigators and it’s failed in its duties to respect the law. It has a legal obligation to inform regulators and individuals about this data breach, and it hasn’t. It’s failed time and time again to be open and transparent.”

…

The Facebook data is out in the wild. And for all Wylie’s efforts, there’s no turning the clock back.

Tamsin Shaw, a philosophy professor at New York University, and the author of a recent New York Review of Books article on cyberwar and the Silicon Valley economy, told me that she’d pointed to the possibility of private contractors obtaining cyberweapons that had at least been in part funded by US defence.

She calls Wylie’s disclosures “wild” and points out that “the whole Facebook project” has only been allowed to become as vast and powerful as it has because of the US national security establishment.

Fogg speaks openly of the ability to use smartphones and other digital devices to change our ideas and actions: “We can now create machines that can change what people think and what people do, and the machines can do that autonomously.”

…

While Fogg emphasizes persuasive design’s sunny future, he is quite indifferent to the disturbing reality now: that hidden influence techniques are being used by the tech industry to hook and exploit users for profit. His enthusiastic vision also conveniently neglects to include how this generation of children and teens, with their highly malleable minds, is being manipulated and hurt by forces unseen.

…

Ramsay Brown, the founder of Dopamine Labs, says in a KQED Science article, “We have now developed a rigorous technology of the human mind, and that is both exciting and terrifying. We have the ability to twiddle some knobs in a machine learning dashboard we build, and around the world hundreds of thousands of people are going to quietly change their behavior in ways that, unbeknownst to them, feel second-nature but are really by design.”
Programmers call this “brain hacking,” as it compels users to spend more time on sites even though they mistakenly believe it’s strictly due to their own conscious choices.

…

While social media and video game companies have been surprisingly successful at hiding their use of persuasive design from the public, one breakthrough occurred in 2017 when Facebook documents were leaked to The Australian. The internal report crafted by Facebook executives showed the social network boasting to advertisers that by monitoring posts, interactions, and photos in real time, the network is able to track when teens feel “insecure,” “worthless,” “stressed,” “useless” and a “failure.” Why would the social network do this? The report also bragged about Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.”

Persuasive technology’s use of digital media to target children, deploying the weapon of psychological manipulation at just the right moment, is what makes it so powerful.

With even the gentlest caress of the metaphorical dial, Facebook changes what its users see and read. It can make our friends’ photos more or less ubiquitous; it can punish posts filled with self-congratulatory musings and banish what it deems to be hoaxes; it can promote video rather than text; it can favour articles from the likes of the New York Times or BuzzFeed, if it so desires. Or if we want to be melodramatic about it, we could say Facebook is constantly tinkering with how its users view the world – always tinkering with the quality of news and opinion that it allows to break through the din, adjusting the quality of political and cultural discourse in order to hold the attention of users for a few more beats.
…
Facebook likes to boast about the fact of its experimentation more than the details of the actual experiments themselves. But there are examples that have escaped the confines of its laboratories. We know, for example, that Facebook sought to discover whether emotions are contagious. To conduct this trial, Facebook attempted to manipulate the mental state of its users. For one group, Facebook excised the positive words from the posts in the news feed; for another group, it removed the negative words. Each group, it concluded, wrote posts that echoed the mood of the posts it had reworded. This study was roundly condemned as invasive, but it is not so unusual. As one member of Facebook’s data science team confessed: “Anyone on that team could run a test. They’re always trying to alter people’s behaviour.”
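Under one reading of the setup described above, the two experimental conditions amount to filtering a feed against sentiment word lists. This is a minimal sketch with made-up word lists and posts, purely for illustration — not Facebook’s actual method:

```python
# Hypothetical word lists and posts, for illustration only.
POSITIVE = {"great", "happy", "love"}
NEGATIVE = {"sad", "awful", "terrible"}

def filter_feed(posts, suppressed):
    """Keep only posts containing no word from the suppressed set,
    mimicking one reading of the two experimental conditions."""
    return [p for p in posts if not (set(p.lower().split()) & suppressed)]

feed = ["such a great day", "this is terrible news", "happy to be here"]
print(filter_feed(feed, NEGATIVE))  # positivity-skewed condition
print(filter_feed(feed, POSITIVE))  # negativity-skewed condition
```

The study then compared what each group went on to post, concluding the suppressed mood was echoed back.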

There’s no doubting the emotional and psychological power possessed by Facebook – or, at least, Facebook doesn’t doubt it. It has bragged about how it increased voter turnout (and organ donation) by subtly amping up the social pressures that compel virtuous behaviour. Facebook has even touted the results from these experiments in peer-reviewed journals: “It is possible that more of the 0.60% growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook,” said one study published in Nature in 2012. No other company has made such claims about its ability to shape democracy like this – and for good reason. It’s too much power to entrust to a corporation.
…Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behaviour can be altered, without our even being aware of the hand guiding us, in a superior direction.

By: Karel Donk https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-119075
Fri, 06 Jan 2017 13:48:42 +0000
Could 2017 be a turning point for online advertising? Here are two articles that caught my attention that highlight some of the problems I discussed above.

Upon further reflection, it’s clear that the broken system is ad-driven media on the internet. It simply doesn’t serve people. In fact, it’s not designed to. The vast majority of articles, videos, and other “content” we all consume on a daily basis is paid for — directly or indirectly — by corporations who are funding it in order to advance their goals. And it is measured, amplified, and rewarded based on its ability to do that. Period. As a result, we get…well, what we get. And it’s getting worse.

That’s a big part of why we are making this change today.
We decided we needed to take a different — and bolder — approach to this problem. We believe people who write and share ideas should be rewarded on their ability to enlighten and inform, not simply their ability to attract a few seconds of attention.

However, there are other things about ads that I do mind enormously and most of them are due to the ad networks themselves. I don’t like the overhead of a whole other website being embedded into an iframe. I don’t like the total irrelevancy of much of the ad content. It could be tailored to my browsing habits, but then I’m not overly fond of the tracking. Oh – and I definitely don’t like being served either malware or really obtrusive behaviours such as ads viewed on iOS redirecting me to the app store in an attempt to have me download Clash of Clans. None of these things are fun and they’re all directly attributable to the way networks run.

By: Karel Donk https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-118389
Tue, 20 Dec 2016 13:58:27 +0000
Yes, education as we know it today is mind manipulation; it’s forced upon children worldwide at an early age and teaches them to become obedient and passive slaves. It programs them with a very distorted view of reality. I have lots more details on this in my post “Statism: A System for your Enslavement”. Look for the heading on education there.
This article is me sharing my knowledge and other information with everyone. I don’t force anyone to read and believe it, and as far as I can see I didn’t use any manipulative tactics to try to convince anyone that what I’m saying is true. I do hope that it gets people to think and question things, and if they find that it makes sense after their own independent consideration, to change their behavior accordingly so that we can all benefit.
By: Musashi https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-118382
Tue, 20 Dec 2016 12:33:07 +0000
I’m a little confused. At the top of your article you say “mind manipulation” and put “education” in parentheses, leading me to believe that you thought all education was mind manipulation and wrong (and this article is a form of education, so it would be manipulative). But in other parts of the article you say that it’s only manipulation if you’re not being “clinically objective” and are using some of the manipulation tactics you listed (in which case you may not be manipulating by writing this article). Just wondering if you could clarify so we’re on the same page.
By: Karel Donk https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-118282
Sat, 17 Dec 2016 12:18:10 +0000
I’m not telling people to manipulate their own thoughts. I’m sharing information (without using manipulative tactics and exploits) and leaving it up to people to think about it and decide what they want to do.
By: Musashi https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-118266
Sat, 17 Dec 2016 02:56:11 +0000
Yes, but manipulate their thoughts according to how you’re suggesting they do. Telling people to manipulate their own thoughts is a way of manipulating people by telling them not to be manipulated, and to form their own opinions. Are you not manipulating me out of holding the view that I should be manipulated?
By: Karel Donk https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-118225
Thu, 15 Dec 2016 21:46:15 +0000
An article on Forbes, “Reddit For Sale: How We Made Viral Fake News For $200”, details how easy it is to manipulate people online these days, especially through social media. Make sure to check the GCHQ handbook mentioned for tactics used by intelligence agencies.

Our first port of call was to go back to Mike and without hesitation he offered upvotes in two flavours: automated or organic. Automated votes use a bot along with purchased accounts, but their effectiveness “can’t be guaranteed”.

Organic votes would be manually logged by human labour somewhere inside Pakistan. The cost? $40 would get you more than 200 organic upvotes. Given how many upvotes are needed to reach the number one spot on big subreddits – easily in the thousands – 200 would not be enough.
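At the quoted rate the economics are straightforward. A rough estimate, assuming the price scales roughly linearly (the article doesn’t say it does), with an illustrative target of 2,000 votes:

```python
price_usd = 40.0
upvotes_bought = 200           # "more than 200", so this is an upper bound on cost
cost_per_vote = price_usd / upvotes_bought  # at most $0.20 each

votes_for_front_page = 2_000   # illustrative; the article says "easily in the thousands"
estimate = votes_for_front_page * cost_per_vote
print(f"${estimate:.0f}")      # roughly $400 for a front-page push
```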

…

We needed something to swing our new axe at, so for inspiration we took a few tips from GCHQ. A handbook they had commissioned to help train ‘Cyber Magicians’ to ‘discredit, dissuade, deceive, disrupt, delay, deny, degrade, and deter’ online conversations proved to be incredibly useful…

The document suggested that ‘disruptive’ communication would utilise ‘obedience’, because an individual will often comply with an authority figure. So we made up a character; Espen Lunde, a Norwegian Professor at the invented Bergen School of Economics. We could hide our political manipulations behind this completely fictional character.

Silicon Valley distrusts marketing, demeans it, and devalues the people who practice it.

On any given day in tech, you might:

– Hear a well-respected startup investor deride marketing as “what you do when your product or service sucks.”

– Watch another leading investor advise their portfolio companies to “ignore marketing and PR for a long time,” and lump marketing in with “ads” and “PR” as an “inorganic means to maintain your growth.”

These “influencers,” as they’re known, are media properties unto themselves, turning good looks and taste into an income stream: Brands pay them to feature their wares.

…

I would do everything possible within legal bounds to amass as many followers as I could. […] The ultimate goal: to persuade someone, somewhere, to pay me cash money for my influence.

…

That night, I signed up for a service recommended to me by Socialyte called Instagress. It’s one of several bots that, for a fee, will take the hard work out of attracting followers on Instagram. For $10 every 30 days, Instagress would zip around the service on my behalf, liking and commenting on any post that contained hashtags I specified. […] I also wrote several dozen canned comments—including “Wow!” “Pretty awesome,” “This is everything,” and, naturally, “[Clapping Hands emoji]”—which the bot deployed more or less at random. In a typical day, I (or “I”) would leave 900 likes and 240 comments. By the end of the month, I liked 28,503 posts and commented 7,171 times.

Most committed influencers, including Socialyte’s clients, use bots in one way or another, but it should be said that this is an ethical gray area. Instagram doesn’t explicitly ban bots, but its terms of service do prohibit sending spam, which, when viewed in a certain light, is exactly what I was doing. On the other hand, except for a single user who somehow had me pegged and accused me of being a bot, nobody with whom I interacted seemed to mind the extra likes or comments. In fact, most of them would respond immediately with comments of their own. “Thanks dude!” they’d say. Or they’d simply give me a “[Praying Hands emoji].” I got hundreds of comments like this.
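The numbers in the excerpt are roughly self-consistent, which is itself telling about how mechanical the activity was. A quick check:

```python
likes_month, comments_month, days = 28_503, 7_171, 30

print(round(likes_month / days))     # ≈ 950 likes per day (the author says ~900 on a typical day)
print(round(comments_month / days))  # ≈ 239 comments per day (matching the stated 240)
```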

Misleading people into thinking you were personally liking and commenting on their posts is “an ethical gray area”? Seriously? And that’s apart from the completely fake lifestyle that’s being promoted as genuine. Note how all of this apparently fits within the “legal bounds” too. Another example showing that just because something is legal doesn’t mean it’s morally right.

By: Karel Donk https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-117744
Mon, 05 Dec 2016 22:01:43 +0000
It’s not much use for the long term to ask people to be less manipulative and more transparent; they have to understand why it’s better. For that you need a good moral foundation to build on. And so far respect for the right to life is the best one I’ve seen, and it’s universal. It’s basically what true love is, as I wrote about in my post “The Difference between Love and Lust.”

This is not an easy concept for people to wrap their heads around, especially not in the current societies around the world where everyone has been trained from early childhood to accept the opposite. But unfortunately there are no easy fixes to this mess that we’re in. We have to tell it like it is, and people will have to think about it and ultimately convince themselves to change.

By: Tim vV https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-117734
Mon, 05 Dec 2016 13:29:09 +0000
Changes are occurring. In the Netherlands transparency is a growing trend. This would not have happened if it weren’t for the manipulative tricks. People are becoming smarter. Eventually businesses will acknowledge the benefits of becoming more genuine. I think, eventually, transparency will be key to a successful business.

However, I think it’s too idealistic to tie this to ‘everyone’s right to life’. In my case it would create the opposite effect. I would just stick to pointing out the manipulative tricks and their negative effect on society, or something along those lines. You could use my first paragraph as a motive for people to take this step.

By: Karel Donk https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-117656
Sat, 03 Dec 2016 22:49:12 +0000
The reason why the world is such a mess right now is precisely because people are playing games with each other. Consequently people get tricked, misled, hurt and even die, while others win (the game) and benefit/profit. I don’t think this is a sustainable way of living for the long term, and things will progressively become worse. We can change if we want to. Like Jacque Fresco said, if you think we can’t change the world, it just means you’re not one of those that will.
By: Adnan https://blog.kareldonk.com/persuasion-influencing-and-selling-are-mind-manipulation/#comment-117652
Sat, 03 Dec 2016 20:07:55 +0000
This is an interesting perspective on persuasion, though not a helpful one.

As with computer games, one can’t play a game while objecting to its rules. Persuasion is what people do 24/7 – consciously and unconsciously – and it’s not all bad, nor should it all be avoided. Parents raising their kids, teachers painting science in an alluring light to get students to subscribe and commit, and the list goes on.

Persuasion is a two-way street: you can be aware of it, and everyone should be, so you don’t fall victim to bad manipulation. But suggesting that the situation can be reversed is giving people false hope for something that clearly isn’t going anywhere.