TY - JOUR
T1 - The impact of intergroup bias on trust and approach behaviour towards a humanoid robot
AU - Deligianis, Christopher
AU - Stanton, Christopher
AU - McGarty, Craig
AU - Stevens, Catherine J.
PY - 2017
Y1 - 2017
N2 - As robots become commonplace, and for successful human-robot interaction to occur, people will need to trust them. Two experiments were conducted using the "minimal group paradigm" to explore whether social identity theory influences trust formation and impressions of a robot. In Experiment 1, participants were allocated to either a "robot" or "computer" group, and then they played a cooperative visual tracking game with an Aldebaran Nao humanoid robot as a partner. We hypothesised that participants in the "robot group" would demonstrate intergroup bias by sitting closer to the robot (proxemics) and trusting the robot's suggested answers more frequently than their "computer group" counterparts. Experiment 2 used an almost identical procedure with a different set of participants; however, all participants were assigned to the "robot group" and three different levels of anthropomorphic robot movement were manipulated. Our results suggest that intergroup bias and humanlike movement can significantly affect human-robot approach behaviour. Significant effects were found for trusting the robot's suggested answers with respect to task difficulty, but not for group membership or robot movement.
AB - As robots become commonplace, and for successful human-robot interaction to occur, people will need to trust them. Two experiments were conducted using the "minimal group paradigm" to explore whether social identity theory influences trust formation and impressions of a robot. In Experiment 1, participants were allocated to either a "robot" or "computer" group, and then they played a cooperative visual tracking game with an Aldebaran Nao humanoid robot as a partner. We hypothesised that participants in the "robot group" would demonstrate intergroup bias by sitting closer to the robot (proxemics) and trusting the robot's suggested answers more frequently than their "computer group" counterparts. Experiment 2 used an almost identical procedure with a different set of participants; however, all participants were assigned to the "robot group" and three different levels of anthropomorphic robot movement were manipulated. Our results suggest that intergroup bias and humanlike movement can significantly affect human-robot approach behaviour. Significant effects were found for trusting the robot's suggested answers with respect to task difficulty, but not for group membership or robot movement.
KW - human-robot interaction
KW - group identity
KW - trust
KW - compliance
UR - http://handle.westernsydney.edu.au:8081/1959.7/uws:45205
U2 - 10.5898/JHRI.6.3.Deligianis
DO - 10.5898/JHRI.6.3.Deligianis
M3 - Article
SN - 2163-0364
VL - 6
SP - 4
EP - 20
JO - Journal of Human-Robot Interaction
JF - Journal of Human-Robot Interaction
IS - 3
ER -