Evaluating Generated Co-Speech Gestures of an Embodied Conversational Agent (ECA) through Real-Time Interaction

This is a Master's thesis from KTH/School of Electrical Engineering and Computer Science (EECS)

Abstract: The gestures of Embodied Conversational Agents (ECAs) can enhance human perception along many dimensions during interaction. In recent years, data-driven gesture generation for ECAs has attracted considerable research attention, and methods have been continuously improved. When evaluating ECAs that generate gestures from hand-crafted rules, researchers have typically conducted user studies based on human-agent interaction. However, when evaluating ECAs whose gestures are generated by data-driven methods, participants are often asked to watch prerecorded videos, which cannot adequately capture human perception during interaction. To address this limitation, we pursued two main research objectives: first, to explore a workflow for assessing data-driven gesturing ECAs through real-time interaction; second, to investigate whether gestures affect an ECA's human-likeness, animacy, and perceived intelligence, as well as humans' focused attention on the ECA. In our user study, participants interacted with two ECAs under two experimental conditions: with and without hand gestures. We collected both subjective data from participants' self-report questionnaires and objective data from a gaze tracker. To our knowledge, this study represents the first attempt to evaluate data-driven gesturing ECAs through real-time interaction and the first experiment using gaze tracking to examine the effect of ECA gestures. The eye-gaze data indicated that when an ECA can generate gestures, it attracts more attention to its body.
