Audiovisual Cross-Modality in Virtual Reality

This is a Bachelor's thesis from KTH/School of Electrical Engineering and Computer Science (EECS)

Abstract: What happens when we see an object of a certain material but the sounds it makes come from another material? While this is an interesting question, the area is under-researched. Although there has been some previous research in the field, the visuals have been represented using textures on simple shapes such as cubes or spheres. Since this is not how humans experience materials in the real world, there is a possibility that the existing research is not generalizable or ecologically valid. We wanted to see what would happen if this type of test were performed using 3D models that looked like real-life objects that most people would be familiar with. To test this, we gathered impact sounds and 3D models representing nine different materials and created a virtual reality program that allowed us to test all possible combinations of sounds and visuals. These tests were performed with 15 participants, who selected which material they believed each audiovisual combination represented. Our results showed a higher tendency to rely on audio cues for material perception compared to previous tests. This is interesting since we increased the visual fidelity while the audio quality was comparable to that of the previous tests. One theory is that the increase in visual fidelity makes the visuals so much clearer that participants started focusing more on trying to understand the audio.