From TIME November 4, 2019 special story on robots and therapy, these two paras caught my attention:
“One thing we are learning as robots join us on this planet is that, just as there are situations when it’s easier for people to bond with an animal than another person (hence the value of pet therapy) there are also situations in which some people feel more comfortable bonding with an artificially intelligent companion than a human one.
It’s not uncommon to feel gratitude or warmth toward a person or a thing that helped you through a difficult situation. US troops deployed in Iraq and Afghanistan who were issued an IED-finding robot grew very attached to the machines, naming them, awarding them medals, and becoming distraught when they were damaged beyond repair in combat. It appears that a tool that comforts people through the upheavals of aging may elicit a similar response.”
This points to the one-way therapeutic value of talking to what is, in effect, an inanimate, unfeeling object. Turkle's early work on this subject was prescient; I recall it from when Paro was all the rage:
“People are capable of the higher standard of care that comes with empathy,” writes psychologist and MIT professor Sherry Turkle in her 2011 book Alone Together: Why We Expect More From Technology and Less From Each Other. A robot, in contrast, “is innocent of such capacity.” Upon seeing an elderly research subject speak warmly to a robotic baby seal named Paro—designed as a therapy tool for people with dementia—Turkle writes: “Paro took care of Miriam’s desire to tell her story—it made a space for that story to be told—but it did not care about her or her story. This is a new kind of relationship, sanctioned by a new language of care. Although the robot understood nothing, Miriam settled for what she had.”
Is it better to be alone and end up talking with a robot that doesn’t really care for you? Or is it better to just settle with being alone?
I think the former has merit, especially if we consider the ethics behind how the unempathetic product or service came to be; the intent behind its creation, at least, can make the relationship meaningful rather than simply meaningless.