Using Imitation Learning for Human Motion Control in a Virtual Simulation

This is a Master's thesis from Karlstads universitet/Institutionen för matematik och datavetenskap (from 2013); Karlstads universitet/Avdelningen för datavetenskap

Abstract: Test Automation is becoming an increasingly vital part of the software development cycle, as it aims to lower the cost of testing and allow for higher test frequency. However, automating manual tests can be difficult, as they tend to require complex human interaction. In this thesis, we aim to solve this by using Imitation Learning as a tool for automating manual software tests. The software under test consists of a virtual simulation connected to a physical input device in the form of a sight. The sight can rotate on two axes, yaw and pitch, both of which require human motion control. Based on this, we use a Behavioral Cloning approach with a k-NN regressor trained on human demonstrations. The model's resemblance to the human is evaluated by comparing the state paths taken by the model and the human. Task performance is measured with a score based on the time taken to stabilize the sight while pointing at a given object in the virtual world. The results show that a simple k-NN regression model using high-level states and actions, trained on limited data, can imitate human motion well. The model tends to be slightly faster than the human on the task while maintaining realistic motion. It also shows signs of human errors, such as overshooting the object at higher angular velocities. Based on these results, we conclude that using Imitation Learning for Test Automation can be practical for specific tasks where capturing human factors is important. However, further exploration is needed to identify the full potential of Imitation Learning in Test Automation.
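The core of the approach described in the abstract is a straightforward supervised mapping from observed states to actions. Below is a minimal, hypothetical sketch of such a Behavioral Cloning setup with a k-NN regressor. The specific state and action features (aim error angles and angular velocities as state, yaw/pitch rate commands as action), the example values, and the use of scikit-learn are assumptions for illustration only and are not taken from the thesis.

```python
# Minimal Behavioral Cloning sketch with a k-NN regressor.
# State/action features below are illustrative assumptions, not the thesis's actual features.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Human demonstrations: each row is a high-level state observed during a run.
# Assumed columns: [yaw_error, pitch_error, yaw_velocity, pitch_velocity]
demo_states = np.array([
    [ 0.40,  0.10,  0.00, 0.00],
    [ 0.25,  0.08,  0.30, 0.05],
    [ 0.10,  0.03,  0.35, 0.06],
    [-0.02,  0.00,  0.10, 0.01],   # slight overshoot past the target
])
# Corresponding human actions (assumed): [yaw_rate_cmd, pitch_rate_cmd]
demo_actions = np.array([
    [ 0.35, 0.06],
    [ 0.30, 0.05],
    [ 0.12, 0.02],
    [-0.05, 0.00],
])

# Behavioral Cloning: fit a supervised regressor from states to actions.
policy = KNeighborsRegressor(n_neighbors=3, weights="distance")
policy.fit(demo_states, demo_actions)

# At test time the simulation queries the policy with the current state
# and applies the predicted yaw/pitch commands to the sight.
current_state = np.array([[0.18, 0.05, 0.25, 0.04]])
yaw_cmd, pitch_cmd = policy.predict(current_state)[0]
print(f"yaw rate: {yaw_cmd:.3f}, pitch rate: {pitch_cmd:.3f}")
```

Because k-NN predicts by averaging the nearest recorded human actions, the resulting policy naturally reproduces human-like trajectories, including human errors such as overshoot, which matches the behaviour reported in the abstract.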
