Finger-tracking-based interactions in Augmented Reality: Exploring the usability of finger-tracking-based interactions and the effects of multimodal feedback in mobile augmented reality applications
Summary
Recent years have seen a rise in mobile Augmented Reality applications. Increasingly powerful mobile phones have not only brought us practical applications, such as direction overlays on a map, but have also made way for mobile AR gaming. However, many current applications still rely on touchscreen gestures for interaction. In this thesis we delve into finger-tracking-based gestures with the purpose of delivering a more pleasurable and immersive experience to mobile phone users. In our first experiment we compare a finger-tracking-based implementation to a touchscreen-based implementation in a mobile AR board game, featuring both physical and virtual objects, to evaluate both performance and enjoyment. Based on the findings and issues that emerged from this experiment, we took a closer look at the intricacies of finger-tracking and touchscreen interactions in an attempt to enhance the performance of our finger-tracking-based system. The outcome of this second experiment suggested that our finger-tracking-based system, and others like it, could be further improved by adding additional feedback. A third experiment was therefore designed to study the effects of multimodal feedback on performance and user perception. With the phone as our only source of feedback, we tested combinations of visual, auditory and remote haptic feedback, delivered either constantly or temporarily. The results showed that multimodal feedback in general, and constant visual cues combined with temporary haptic cues in particular, can increase user responsiveness when transitioning between interactions. Finally, we show remote haptic feedback to be a preferable method of feedback in terms of usage, performance and preference.