Using Augmented-Reality for Visualizing a Social Robot’s Internal State

This is a Master's thesis from KTH/School of Electrical Engineering and Computer Science (EECS)

Abstract: Humans are very good at conveying when something is lost or misinterpreted in communication, using social cues such as facial expressions or changes in prosody. However, these methods are usually not available to most robots, which are constrained in both appearance and vocal expressiveness. This is also one of the key factors limiting the efficiency of Human-Robot Interaction (HRI). In this project, we explore a novel paradigm for enhancing the perception of a robot's internal states using augmented reality (AR). A series of visualization interfaces augmenting the environment, the robot, or the target object are implemented and evaluated through a user study. We found that AR visualization improved efficiency and motion predictability over a control condition with no visualization. The project demonstrates not only the potential of AR visualization as a bridge coordinating human and robot, but also a promising future for applications visualizing a robot's internal states.
