Education Technology and the New Behaviorism

Audrey Watters

Perhaps it’s no surprise that there was so much talk this year about education, technology, and emotional health. I mean, 2017 really sucked, and we’re all feeling it.

As support services get axed and the social safety net becomes threadbare, our well-being – our economic and emotional well-being – becomes more and more fragile. People are stressed out, and people are demoralized, and people are depressed. People are struggling, and people are vulnerable, and people are afraid. And “people” here certainly includes students.

All the talk of the importance of “emotion” in education reflects other trends too. It’s a reaction, I’d say, to the current obsession with artificial intelligence and a response to all the stories we were told this year about robots on the cusp of replacing, out-“thinking,” and out-working us. If indeed robots will excel at those tasks that are logical and analytical, schools must instead develop in students – or so the story goes – more “emotional intelligence,” the more “human” capacity for empathy and care.

There has been plenty of speculation in the past few years that the latest law governing K–12 education in the US, ESSA (the Every Student Succeeds Act, signed into law by President Obama in December 2015), will promote programs and assessments that focus on “social and emotional learning,” not simply on the traditional indicators of schools’ performance – academics and test scores. Schools should “foster safe, healthy, supportive, and drug-free environments that support student academic achievement,” the law says, which might include “instructional practices for developing relationship-building skills, such as effective communication” or “implementation of school-wide positive behavioral interventions and supports.”

Of course, when all these narratives about “social emotional learning” get picked up by education technologists and education entrepreneurs, they don’t just turn into policy mandates or even TED Talks. They turn into products.

“Every great cause begins as a movement, becomes a business, and eventually degenerates into a racket.” – Eric Hoffer

Hacking the Brain

The current (and long-running, let’s be honest) fascination with AI is deeply intertwined with science and speculation about the functioning of the brain and the possibilities for that organ’s technological improvement. There’s a science fiction bent as well and a certain millennialism, an AI apocalypticism, to much of the invocation of “hacking the brain” – a religious fantasy about the impending merger of “man and machine.”

While new “neurotechnologies” are primarily being developed to help those with disabilities regain speech, sight, and mobility, there is still plenty of talk about “linking human brains to computers” as consumer-oriented “enhancements” that everyone will want to pursue. A “symbiosis with machines,” as Bryan Johnson puts it. He’s put $100 million of his own money into his company Kernel (which I guess means we’re supposed to believe it’s a real, viable thing) with the promise of developing computer chip implants that will bolster human intelligence. Elon Musk – a totally reliable person when it comes to predictions about the future of science and business – has founded a company called Neuralink that does something similar. It too will link human brains to machines. (There’s a cartoon explainer, which I guess means we’re supposed to believe it’s a real, viable thing.) In eight to ten years’ time, Musk assures us, brain implants will enable telepathy.

(I’m keeping track of all these predictions. It isn’t simply that folks get it right or get it wrong. It’s that the repetition of these stories, particularly when framed as inevitabilities, shapes our preparations for the future. The repetition shapes our imaginations about the future.)

“Neuroeducation Will Lead to Big Breakthroughs in Learning,” Singularity Hub proclaimed this fall. Singularity Hub is a publication run by Singularity University, a for-profit (non-accredited) school founded by Ray Kurzweil, one of Silicon Valley’s biggest evangelists for the notion we’ll soon be able to upload human minds into computers. We’re on the cusp of being able to upload skills directly into the brain, Futurism.com wrote this spring. (Futurism.com is a website run by the New York Chapter Leader of Singularity University.) All these promises, if kept, would indeed make for breakthroughs in teaching and learning. “If kept,” of course, is the operative phrase there.

If kept. If possible. If ethical or desirable, even.

There are promises about “brain hacking pills” that will speed up learning. (Well, it turns out, they don’t actually work.) There are promises about “brain zapping” to speed up learning. (What researchers understand about the effectiveness of this procedure is pretty preliminary.) There are promises about “brain training” exercises that will keep the brain “fit” as one ages. (The research is iffy at best.)

In October, Edsurge wrote about BrainCo, a company that says its devices can monitor students’ brainwave activity in order to ascertain who’s paying attention in class. A few weeks later, Edsurge wrote about InteraXon, a company that says its devices can monitor students’ brainwave activity “for meditation and discipline.” An ed-tech trend in the making, investors hope.

Don’t let the dearth of data fool you though. Many people find all this a compelling story, data or no data – a long-running fantasy about “Matrix-style learning”. And when the story is accompanied with colorful images from fMRIs, it all seems to be incredibly persuasive.

It’s incredibly dangerous too, as Stirling University’s Ben Williamson cautions, as the kind of control that these devices promise should raise all sorts of questions about students’ civil rights and “cognitive liberties.” Williamson argues,

Neuroenhancement may not be quite as scientifically and technically feasible yet as its advocates hope, but the fact remains that certain powerful individuals and organizations want it to happen. They have attached their technical aspirations to particular visions of social order and progress that appear to be attainable through the application of neurotechnologies to brain analytics and even neuro-optimization. As STS researchers of neuroscience Simon Williams, Stephen Katz & Paul Martin have argued, the prospects of cognitive enhancement are part of a ‘neurofuture’ in-the-making that needs as much critical scrutiny as the alleged ‘brain facts’ produced by brain scanning technologies.

Marketing “Mindsets”

While the brainwave monitoring headsets hyped by some in ed-tech might seem too far-fetched and too futuristic for schools to adopt, they are being positioned as part of another trend that might make them seem far more palatable: “social emotional learning” or SEL.

I wrote about one such behavior management company, Hero K12, along with its better-known competitor ClassDojo, in an article in The Baffler. ClassDojo, which is also used by teachers and schools to manage student behavior, boasts that it’s been adopted in 90% of schools – a statistic that cannot be verified since this sort of data is not tracked by anyone other than the company making the claim. With this popularity, ClassDojo has done a great deal to introduce and promote “growth mindsets” and “mindfulness” to educators. (“To the masses,” as Edsurge puts it.)

These apps – Hero K12, ClassDojo, and other types of behavior management products – claim that they help develop “correct behavior.” But what exactly does “correct behavior” entail? And what does it mean (for the future of civic participation, if nothing else) if schools entrust this definition to for-profit companies and their version of psychological expertise? As Ben Williamson observes, “social-emotional learning is the product of a fast policy network of ‘psy’ entrepreneurs, global policy advice, media advocacy, philanthropy, think tanks, tech R&D and venture capital investment. Together, this loose alliance of actors has produced shared vocabularies, aspirations, and practical techniques of measurement of the ‘behavioural indicators’ of classroom conduct that correlate to psychologically-defined categories of character, mindset, grit, and other personal qualities of social-emotional learning.” These indicators encourage behaviors that are measurable and manageable, Williamson contends, but SEL also encourages characteristics like malleability and compliance – and all that fits nicely with the “skills” that a corporate vision of education would demand from students and future workers.

Classroom Management and Persuasive Technologies

In that Baffler article, I make the argument that behavior management apps like ClassDojo’s are the latest manifestation of behaviorism, a psychological theory that has underpinned much of the development of education technology. Behaviorism is, of course, most closely associated with B. F. Skinner, who developed the idea of his “teaching machine” when he visited his daughter’s fourth grade class in 1953. Skinner believed that a machine could provide a superior form of reinforcement to the human teacher, who relied too much on negative reinforcement, punishing students for bad behavior, rather than on the kind of positive reinforcement that had worked so well in training his pigeons.

As Skinner wrote in his book Beyond Freedom and Dignity, “We need to make vast changes in human behavior…. What we need is a technology of behavior.” Teaching machines, he argued, would be one such technology.

By arranging appropriate ‘contingencies of reinforcement,’ specific forms of behavior can be set up and brought under the control of specific classes of stimuli. The resulting behavior can be maintained in strength for long periods of time. A technology based on this work has already been put to use in neurology, pharmacology, nutrition, psychophysics, psychiatry, and elsewhere.

The analysis is also relevant to education. A student is ‘taught’ in the sense that he is induced to engage in new forms of behavior and in specific form upon specific occasions. It is not merely a matter of teaching him what to do; we are as much concerned with the probability that appropriate behavior will, indeed, appear at the proper time – an issue which would be classed traditionally under motivation.

Skinner was unsuccessful in convincing schools in the 1950s and 1960s that they should buy his teaching machines, and some people argue that his work has fallen completely out of favor, invoked only when deriding something as a “Skinner Box.” But I think there’s been a resurgence in behaviorism. Its epicenter isn’t Harvard, where Skinner taught. It’s Stanford. It’s Silicon Valley. And this new behaviorism is fundamental to how many new digital technologies are being built.

There’s a darker side still to this as I argued in the first article in this very, very long series: this kind of behavior management has become embedded in our new information architecture. It’s “fake news,” sure. But it’s also disinformation plus big data plus psychological profiling and behavior modification. The Silicon Valley “nudge” is a corporate nudge. But as these technologies are increasingly part of media, scholarship, and schooling, it’s a civics nudge too.

Those darling little ClassDojo monsters are a lot less cute when you see them as part of a new regime of educational data science, experimentation, and “psycho-informatics.”

Personalized Learning and the Nudge

In May, The Australian obtained a 23-page document from Facebook’s Australian office detailing how the company had offered to advertisers the ability to target some 6.4 million young people (some as young as 14) during moments of emotional vulnerability – when they felt “worthless,” “insecure,” “defeated,” “anxious,” “silly,” “useless,” “stupid,” “overwhelmed,” “stressed,” or “a failure.” Facebook denied the reporting, stating that The Australian article was misleading and insisting that the company “does not offer tools to target people based on their emotional state.” Of course, five years ago, the company did conduct a mass experiment on the manipulation of users’ emotions. It published those findings in an academic journal.

Facebook is the largest social network in the world. As of June, it boasted some 2 billion active monthly users. The manipulation of users’ social and emotional experiences should not be minimized or dismissed. And for educators, it’s important to recognize that interest in social and emotional experience and behavioral design is not just something that happens on the Facebook platform (or with other consumer-facing technologies).

“More personal means more equitable and just,” the Chan Zuckerberg Initiative’s Jim Shelton insisted in a Medium essay. And I do not doubt that the Chan Zuckerberg Initiative and Mark Zuckerberg and Facebook all believe that. They believe that they have our best interests at heart, and they will guide us – algorithmically, of course – to “good” academics and “good” thoughts and “good” feelings and “good” behavior, defining and designing, of course, what “good” looks like.