You might think that critical comments from robots, machines that only say what they have been programmed to say and have no consciousness or feelings of their own, would be easy to brush off. You might even laugh at people who feel affected by such comments. However, new research from Pittsburgh, Pennsylvania, suggests otherwise: the researchers found that trash-talking robots actually make human beings perform worse and feel worse.
The study comes from the Robotics Institute at Carnegie Mellon University (CMU). Fei Fang, a computer scientist at CMU, said the investigation is one of the first studies of human-robot interaction in an environment where "home assistants" are not cooperating.
The study included 40 participants, who each played a game called Guards and Treasures 35 times with a robot named Pepper. The game is a form of Stackelberg game, with a defender and an attacker, and is intended to teach rationality.
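To make the defender-attacker setup concrete, here is a minimal, hypothetical sketch of a two-target Stackelberg security game in Python. This is an illustration of the general concept, not the actual Guards and Treasures game used in the study; the target values and function names are invented for the example. The defender commits to a coverage strategy first, and the attacker then picks the most profitable uncovered target.

```python
def attacker_best_response(coverage, values):
    """Attacker picks the target with the highest expected payoff,
    given the defender's coverage probability on each target."""
    expected = [(1 - c) * v for c, v in zip(coverage, values)]
    return max(range(len(values)), key=lambda i: expected[i])

def defender_best_coverage(values, steps=100):
    """Grid-search the defender's coverage of target 0 (target 1 gets
    the remaining coverage), assuming the attacker best-responds."""
    best_c, best_loss = 0.0, float("inf")
    for s in range(steps + 1):
        c0 = s / steps
        coverage = [c0, 1 - c0]
        target = attacker_best_response(coverage, values)
        # Defender's expected loss: value of the attacked target
        # weighted by the chance it is left uncovered.
        loss = (1 - coverage[target]) * values[target]
        if loss < best_loss:
            best_c, best_loss = c0, loss
    return best_c, best_loss

if __name__ == "__main__":
    values = [10, 6]  # hypothetical treasure values at the two targets
    c0, loss = defender_best_coverage(values)
    print(f"cover target 0 with prob {c0:.2f}, expected loss {loss:.2f}")
```

Note how the defender's best strategy spreads coverage so that neither target is an obviously better attack; playing such games repeatedly is what "teaching rationality" means here, and the study measured whether praise or insults changed how rationally people played.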
Participants insulted by the robot did not reach the scores of those who were praised; the praised participants all improved in rationality over the course of the test. Players paired with a critical robot also came away with a more negative view of it.
Verbal abuse doled out by Pepper included remarks such as "You are a terrible player" and "Your play throughout the game has become confused."
The results of this small-scale study match previous research showing that trash talk really can hurt gameplay. This time, however, the talk is coming from an artificial intelligence.
Researchers need to understand how humans react to these seemingly personable machines as human-robot interactions become more frequent. Programmers need to know how best to handle disagreements when designing robots for situations where a robot might think it knows better than us, such as giving directions from A to B or recommending a purchase in a shop.
The researchers behind the study say they still need to examine the non-verbal cues given out by robots.
The scientists reported that some of the study participants were "technically sophisticated." Even so, those participants were still affected by the remarks, despite being fully aware that the statements putting them off came from a pre-programmed robot.
The research has yet to be published in a peer-reviewed journal, but it has been presented at the IEEE International Conference on Robot & Human Interactive Communication in India.
As human beings become more dependent on robots and AI assistants such as Alexa, knowing how people react to negative feedback will be incredibly important in making future interactions smoother and more productive.
Or, at the very least, it will prepare us for life under our future robot overlords. In other words, the study shows that Pepper's smack talk affected the humans enough to dent their confidence and decrease their scores.