Does Removing Stereotype Priming Remove Bias? A Pilot Human-Robot Interaction Study
Robots capable of participating in complex social interactions have shown great potential in a variety of applications. As these robots grow more popular, it is essential to continuously evaluate the dynamics of the human-robot relationship. One factor shown to influence this critical relationship is the human projection of stereotypes onto social robots, a practice that implicitly affects both developers and users of this technology. In this research, we therefore investigated how participants' perceptions of a robot interaction change when stereotype priming is removed, which has not been common practice in similar studies. Given prevailing stereotypes about emotional expression among ethnic groups, especially in the U.S., this study specifically examined the impact that a robot's "skin color" could have on human perception of the robot's emotionally expressive behavior. A between-subjects experiment with 198 participants was conducted. The results showed no significant differences in overall emotion classification or intensity ratings across the different robot skin colors. These results support our hypothesis that when individuals are not primed with information related to human stereotypes, they evaluate robots based on functional attributes rather than stereotypical ones. This provides some confidence that robots, if designed carefully, can be used as a tool to override stereotype-based biases associated with human behavior.
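The abstract does not state which statistical procedures were used. The following Python sketch shows one plausible way such a between-subjects comparison could be analyzed, assuming a chi-square test on emotion-classification counts and a one-way ANOVA on intensity ratings; the data, condition labels, and test choices are illustrative assumptions, not taken from the paper.

```python
# Hypothetical analysis sketch for a between-subjects comparison across
# robot skin-color conditions. Tests and data are illustrative assumptions.
import numpy as np
from scipy import stats

# Contingency table of emotion labels chosen per condition
# (rows: skin-color condition, columns: emotion category). Counts are made up.
classification_counts = np.array([
    [20, 15, 10, 5],   # condition A
    [18, 16, 11, 6],   # condition B
    [21, 14, 9, 7],    # condition C
])
chi2, p_class, dof, _ = stats.chi2_contingency(classification_counts)
print(f"Classification: chi2={chi2:.2f}, dof={dof}, p={p_class:.3f}")

# Intensity ratings (e.g., 1-7 Likert) per condition; one-way ANOVA across groups.
rng = np.random.default_rng(0)
intensity_a = rng.integers(1, 8, size=66)
intensity_b = rng.integers(1, 8, size=66)
intensity_c = rng.integers(1, 8, size=66)
f_stat, p_intensity = stats.f_oneway(intensity_a, intensity_b, intensity_c)
print(f"Intensity: F={f_stat:.2f}, p={p_intensity:.3f}")

# p-values above a conventional alpha (e.g., 0.05) would be consistent with
# the reported finding of no significant differences between skin colors.
```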