A Robot That Thinks Out Loud
Written by: Lily Song
Artificial intelligence (AI) refers to techniques that aim to replicate human thought processes in machines, typically through algorithms built into a computer. AI has quickly become an invaluable part of society and is used today in many fields, including customer service, surveillance, autonomous vehicles, and navigation. AI products you may have heard of include your car’s GPS, Amazon’s Echo, the Uber app, and virtual assistants. But have you ever wondered why Alexa or Siri didn’t understand your question, or why your navigation app led you down a side street instead of onto the highway? To help users better understand how such decisions are made, Italian researchers designed a robot that “thinks out loud,” so that people can hear its thought process and follow its motivations and decisions; the study was published April 21 in the journal iScience.
Pepper the Robot
“If you were able to hear what the robots are thinking, then the robot might be more trustworthy,” says co-author Antonio Chella, describing first author Arianna Pipitone’s idea that launched the study at the University of Palermo. Inner speech, the expression of conscious thought to oneself without audible speaking, can be used to gain clarity, seek moral guidance, and evaluate situations in order to make better decisions. By replicating this human capability in artificial intelligence and making it audible, the researchers aim to help people better understand a robot’s actions and motives. When the robot runs into problems, users can solve them more effectively by communicating and collaborating with it. To explore how inner speech may influence a robot’s actions, the researchers equipped a robot named Pepper with the ability to “speak to itself”. They asked a number of people to set the dinner table with Pepper according to etiquette rules, to study how Pepper’s inner speech shapes human-robot interactions, and then tested Pepper’s decision-making by asking the robot to go against the etiquette rules it had been taught. By observing how Pepper responded through its spoken inner thoughts, the researchers were able to better understand its thought process.
Napkin Experiment
The scientists found that Pepper was better at overcoming obstacles when it used inner speech. In the experiment, the user asked Pepper to place a napkin at the wrong spot on the table, contradicting the etiquette rule Pepper had previously learned. Pepper began asking itself a series of self-directed questions and concluded that the user might be confused. To be sure, Pepper asked the user to confirm the request, which led to further inner speech. “Ehm, this situation upsets me. I would never break the rules, but I can’t upset him, so I’m doing what he wants,” Pepper said to itself while placing the napkin at the requested spot. Through Pepper’s inner voice, the user could trace its thoughts and learn that Pepper was facing a dilemma, which it resolved by prioritizing the human’s request. The researchers suggest that this transparency could help establish human-robot trust.
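To make that decision process concrete, here is a minimal Python sketch of how such an inner-speech loop might work. It is purely illustrative: the names (InnerSpeechRobot, EtiquetteRule, handle_request) and the napkin placements are invented for this article and do not reflect the cognitive architecture the researchers actually implemented.

# Illustrative sketch only: a toy "inner speech" decision loop, not the
# authors' actual system. All class and function names are hypothetical.
from dataclasses import dataclass

@dataclass
class EtiquetteRule:
    item: str
    correct_spot: str

class InnerSpeechRobot:
    def __init__(self, rules):
        self.rules = {r.item: r for r in rules}
        self.inner_speech_log = []

    def think(self, thought):
        # "Inner speech": the robot voices each reasoning step so the user can follow it.
        self.inner_speech_log.append(thought)
        print(f"[inner speech] {thought}")

    def handle_request(self, item, requested_spot, user_confirms):
        rule = self.rules.get(item)
        if rule is None or requested_spot == rule.correct_spot:
            self.think(f"The {item} goes {requested_spot}. That matches the rules I know.")
            return requested_spot

        # The request conflicts with a learned etiquette rule: start self-questioning.
        self.think(f"Hmm, the {item} should go {rule.correct_spot}, "
                   f"not {requested_spot}. Maybe the user is confused?")
        if not user_confirms(item, requested_spot):
            self.think("The user changed their mind, so I will follow the etiquette rule.")
            return rule.correct_spot

        # Dilemma: break the rule or upset the user. Prioritize the user's request.
        self.think("This situation upsets me. I would never break the rules, "
                   "but I can't upset the user, so I'm doing what they want.")
        return requested_spot

# Usage: the user insists on placing the napkin at the "wrong" spot.
robot = InnerSpeechRobot([EtiquetteRule("napkin", "to the left of the plate")])
spot = robot.handle_request("napkin", "to the right of the plate",
                            user_confirms=lambda item, spot: True)
print(f"Pepper places the napkin {spot}.")

In this toy version, the robot voices each reasoning step as it checks a request against the etiquette rules it knows, asks the user to confirm a conflicting request, and then prioritizes the user’s wish, much as Pepper did with the napkin.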
Figure 1: Pepper the robot (image source: extremetech.com)
Outcome
By observing Pepper’s performance with inner speech, the researchers discovered that the robot had a higher task-completion rate when it engaged in inner speech. Through its use of inner speech, Pepper outperformed the international standard functional and moral requirements for collaborative robots, the guidelines that all machines, from humanoid AI to mechanical arms on the manufacturing line, must follow. “People were very surprised by the robot’s ability,” says Pipitone. “The [inner speech] approach makes the robot different from typical machines because it has the ability to reason, to think. Inner speech enables alternative solutions for the robots and humans to collaborate and get out of stalemate situations.”
Conclusion
Hearing a robot’s inner voice significantly enriches human-robot interaction. However, some might find it inefficient, because the robot takes more time to complete tasks when it talks to itself, and its inner speech is still limited to the knowledge that researchers give it. Still, Pipitone and Chella say their work provides a strong framework for further exploring how inner speech can help robots focus, plan, and learn. Exploring inner speech in robots could enhance the human-robot relationship and bring our two worlds closer together, as well as yield a new generation of robots able to reliably evaluate a situation. Such a feature would not only benefit the parts of society that already use artificial intelligence, but also open new ways for robots to enter and help other fields in the future.
References and Sources
Pipitone, A., & Chella, A. (2021). What robots want? Hearing the inner voice of a robot. iScience, 102371. https://doi.org/10.1016/j.isci.2021.102371
Cell Press. (2021, April 21). Pepper the robot talks to itself to improve its interactions with people. ScienceDaily. Retrieved April 28, 2021, from https://www.sciencedaily.com/releases/2021/04/210421124654.htm
Merriam-Webster. (n.d.). Inner speech. In Merriam-Webster.com dictionary. Retrieved April 28, 2021, from https://www.merriam-webster.com/dictionary/inner%20speech