From TIME's November 4, 2019 special story on robots and therapy, these two paragraphs caught my attention:
“One thing we are learning as robots join us on this planet is that, just as there are situations when it’s easier for people to bond with an animal than another person (hence the value of pet therapy) there are also situations in which some people feel more comfortable bonding with an artificially intelligent companion than a human one.
It’s not uncommon to feel gratitude or warmth toward a person or a thing that helped you through a difficult situation. US troops deployed in Iraq and Afghanistan who were issued an IED-finding robot grew very attached to the machines, naming them, awarding them medals, and becoming distraught when they were damaged beyond repair in combat. It appears that a tool that comforts people through the upheavals of aging may elicit a similar response.”
This points to the one-way therapeutic value of talking to what is, in effect, an inanimate, unfeeling object. Turkle's early work on this subject was prescient; I recall it from when Paro was all the rage:
“People are capable of the higher standard of care that comes with empathy,” writes psychologist and MIT professor Sherry Turkle in her 2011 book Alone Together: Why We Expect More From Technology and Less From Each Other. A robot, in contrast, “is innocent of such capacity.” Upon seeing an elderly research subject speak warmly to a robotic baby seal named Paro—designed as a therapy tool for people with dementia—Turkle writes:
“Paro took care of Miriam’s desire to tell her story—it made a space for that story to be told—but it did not care about her or her story. This is a new kind of relationship, sanctioned by a new language of care. Although the robot understood nothing, Miriam settled for what she had.”
Is it better to be alone and end up talking with a robot that doesn't really care for you? Or is it better to simply settle for being alone?
I think the former has merit, especially if we consider the ethics of how the unempathetic product or service came to be; that context at least frames the intent behind it all as meaningful or simply meaningless.
2 Comments
While I’m certainly not a cheerleader for so-called ‘affective robots’ I do think that there’s an important role in our lives for agents (whether people or things) that don’t care for us. This is why people tell their deepest secrets to strangers (the “strangers on the train” phenomenon). Product designers – particularly designers of smart speakers / virtual personal assistants – mostly misunderstand the meaning of empathy, and load up their products with heart-warming phrases. But empathy is about listening, not speaking, and strangers – innocent of your back-story – often do a better job of listening than does your spouse or other near ones. So, I disagree with Sherry that this is a new kind of relationship. On the contrary it’s a very old kind of relationship, and one that’s necessary – just as necessary as is being alone without a robot.
Thanks for this different viewpoint, Professor Cassell.
> Empathy is about listening, not speaking.
That is gold. And your early research work on conversing with avatars remains stuck in my mind all these years later. Lovely.