In the recent film “Robot & Frank,” an elderly father, played by the fine actor Frank Langella, is slipping into dementia, so his adult son and daughter debate how best to help him: place him in an assisted-care facility, says the daughter; get him a domestic robot, a caregiver that cooks, cleans up, and converses with Frank, says the son.

The son wins and brings a robot to his Dad’s home to begin caregiving. The sharp tensions between Frank and the mechanical caregiver dissolve as Frank realizes that, with the robot’s aid, he can resume his previous career as a cat burglar. With this comedic story line dominating the film, the serious moments of Frank realizing that he will no longer be the person he was (Langella captures those emotions without saying a word) are lost. Thus, what could have been an insightful film, a study of the crushing consequences of dementia on a person and a family, gets twisted by the writers’ failure to decide whether they were making a comedy or a serious film.

But that film is not the point of this post.

The point is that while there are tasks robots can do to help the infirm elderly, ill patients, and students, the connection between a machine and a human being cannot replicate the fundamental cognitive and emotional bonds between humans that sustain caregiving, doctor-patient, and teacher-student relationships.

Sherry Turkle, a professor at the Massachusetts Institute of Technology, has written often about machine-human interactions in articles and books (see here). She has raised questions about robots as caregivers and was called a “technology skeptic” by one writer. She responded in a letter to the editor of the New York Times:

I had written that after a 72-year-old woman named Miriam interacted with a robot called Paro, Miriam “found comfort when she confided in her Paro.”

But I still believe that robots are inappropriate as caregivers for the elderly or for children. The robots proposed as “caring machines” fool us into thinking they care about us. Maintaining eye contact, remembering our names, responding to verbal cues — these are things that robots do to simulate care and understanding.

So, Miriam — a woman who had lost a child — was trying to make sense of her loss with a machine that had no understanding or experience of a human life. That robot put on a good show. And we’re vulnerable: People experience even pretend empathy as the real thing. But robots can’t empathize. They don’t face death or know life. So when this woman took comfort in her robot companion, I didn’t find it amazing. I felt we had abandoned Miriam.

Being part of this scene was one of the most wrenching moments in my years of research on sociable robotics. There were so many people there to help, but we all stood back, outsourcing the thing we do best — understanding each other, taking care of each other.

Now consider robots and teaching. There are tasks that robots can do to help teachers teach and children learn (see here, here, here, and here). But these tasks, as important as they may be in helping homebound students or grading simple five-paragraph essays, do not add up to the core of teaching: the emotional and cognitive bonds that grow over time between teachers and students, and that are the basis for learning not only what is taught in the classroom but also learning, close and personal, the virtues (beg pardon for using an outdated word) of character: trustworthiness, respect, fairness, reliability, and loyalty. And, yes, the flip side of those virtues can be learned from a few teachers as well. That is the personal side of teaching that “social robotics” cannot capture. In short, teaching is far more than seeing children and youth as brains on sticks.

David Kirp, a professor of public policy at the University of California, Berkeley, made a similar point in a recent op-ed piece:

Every successful educational initiative of which I’m aware aims at strengthening personal bonds ... The best [schools] ... create intimate worlds where students become explorers and attentive adults are close at hand ... The process of teaching and learning is an intimate act that neither computers nor markets can hope to replicate. Small wonder, then, that the business model hasn’t worked in reforming the schools — there is simply no substitute for the personal element.

14 responses to “Robo-teachers?”

Having studied this rather extensively (this being human-machine interaction and education), I want to toss a couple of obstacles in front of Larry’s argument (and, by extension, Turkle’s and Kirp’s). In the ideal abstract world of education and caregiving, it’s obviously hard to argue against fully present, fully engaged humans as the best teachers. But as we all know, from observation and anecdote, fully engaged and present is not what most humans are most of the time. In fact, horrendous as it may sound, sometimes actual humans act in vindictive and hurtful ways (in addition to often exhibiting robotic absence through fatigue, distraction, or fear).

My grandmother, while suffering through Alzheimer’s in a residential treatment home, had her arm and collarbone broken by attendants who apparently took personally her verbal attacks (which actually were pretty mild in comparison to others with the same condition) and her tendency to drift away down the hallway without warning. A robotic assistant in these instances would have been preferable to a human unable to quell their own anger in a professional manner (and please don’t dismiss this as an exception or as a simple skill-set problem — I see it as directly connected to the issue of empathy discussed in the post).

Finally, Professor Turkle having written extensively on psychoanalysis, I would think that she might be more attentive to other aspects of her previous writings (as opposed to the most recent book, where she seems to retreat into a quasi-Luddite position — understandable, but negated by her earlier work and the work of others in critical theory). If the illusion of compassion satisfies the elderly patient, then who is to deny the phantasy of contact through a perhaps equally illusionary notion of human presence?
Psychoanalysis and other less esoteric psychotherapies would suggest that the phantasy can have the solidity of what commonly passes for reality and that its effects are not simply a negation of the real of being human. If a robotic worker, teacher, counselor, what-have-you will be patient and attentive without projecting anger, envy, or fear back at me, perhaps that is preferable in a range of one-to-one interactions. I wouldn’t want to be so quick to close the door on that possible future. Just a thought for an early Sunday.

Thanks, Dan, for taking the time to comment. I am sorry to hear of your grandmother’s treatment in the residential treatment home. I do not dismiss your experience. In fact, I do argue that there are certain tasks and certain situations where “social robotics” may be the best option for physical and psychological reasons. Nor do I “close the door on that possible future.” I do accept a future filled with robots in helping professions, especially in one-to-one settings. What I do not accept, however, is the replacement of knowledgeable professionals that the current unrestrained, unalloyed rush to (and crush on) technological wizardry encourages. With increasing access to smart machines, wearables, and robots, fueled by an insistent rhetoric of reducing complicated tasks to algorithm-laced software, raising these issues of machine-human interactions is, in my opinion, essential.

My current work is providing distance-learning services employing immersive training for developing decision making in future managers. This seems a world away from the care of young children and adolescents, I readily admit. Yet is it? What I see in my ‘industry’ is the appeal of automated education as a cost-effective way to provide education in many developing economies. Likewise, having worked in the care industry in the UK, I know the economic reality is the art of the doable; if artificial intelligence raises the overall standard of care, allowing fewer but substantially better trained and motivated (better paid) human carers to concentrate on patients where and when they are needed, then I also see that as an improvement. That was my aim 10 years ago in developing nursing management systems that employed sensors to help determine when an ad-hoc home intervention was necessary and how soon (a wet bed, a bandage needing replacing, an increase in temperature and reduced activity indicating an infection, and so forth), rather than routinised and frequently unnecessary intervention (“wake up, it’s time for your sleeping pill”) or what one might call mass-production-type care.
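The sensor-driven triage described above amounts to a set of rules mapping readings to prioritised alerts. A minimal sketch in Python, where the sensor names, thresholds, and priorities are all invented for illustration (the actual nursing systems mentioned are not described in that detail):

```python
# Toy rule-based triage for home-care sensor readings.
# Field names and thresholds are hypothetical examples, not the real system.

def assess(reading):
    """Return a sorted list of (priority, reason) alerts; lower number = more urgent."""
    alerts = []
    # A wet bed warrants a scheduled visit.
    if reading.get("bed_moisture", 0.0) > 0.5:
        alerts.append((2, "wet bed - schedule home visit"))
    # Raised temperature plus reduced activity suggests infection: visit soon.
    if reading.get("temperature_c", 37.0) > 38.0 and reading.get("activity", 1.0) < 0.3:
        alerts.append((1, "possible infection - visit soon"))
    return sorted(alerts)

print(assess({"temperature_c": 38.5, "activity": 0.2}))
```

The point of such rules is exactly the one made in the comment: interventions happen when indicated, not on a fixed mass-production schedule.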
Likewise far too much teaching tends to be mass production rather than bespoke and you do point this out.
Right now I am working on integrated software for determining indicators of student attitudes, aptitudes, preferences for modes of learning, learning difficulties, and so forth, that will help the educator determine which of their students need an intervention, and with what priority (and when). When one has thousands of students remotely taking a subject in East as well as West Africa, rather than in a cosy classroom face to face with a highly trained and motivated teacher, this becomes a necessity.
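The kind of indicator-based prioritisation described here can be illustrated with a short sketch. The indicator fields and weights below are made up for the example; the real software is not specified in the comment:

```python
# Illustrative sketch: ranking remote students for teacher intervention
# by simple indicator scores. Fields and weights are hypothetical.

def priority(student):
    # Higher score = more urgent; the weights are arbitrary examples.
    return (student["days_inactive"] * 1.0
            + student["failed_quizzes"] * 2.0
            + (3.0 if student["flagged_difficulty"] else 0.0))

students = [
    {"name": "A", "days_inactive": 1, "failed_quizzes": 0, "flagged_difficulty": False},
    {"name": "B", "days_inactive": 5, "failed_quizzes": 2, "flagged_difficulty": True},
    {"name": "C", "days_inactive": 3, "failed_quizzes": 1, "flagged_difficulty": False},
]

queue = sorted(students, key=priority, reverse=True)
print([s["name"] for s in queue])  # most urgent first
```

The scoring is deliberately transparent: the software surfaces a queue, and the human educator decides what the intervention actually is.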
I agree with you, but would like to add that technology can and should be developed to supplement and empower the professional (whether a carer or an educator) rather than replace them.