Show simple item record

dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: Yumak, Z.
dc.contributor.advisor: Volk, A.
dc.contributor.author: Bogaers, A.P.S.
dc.date.accessioned: 2021-01-27T19:00:15Z
dc.date.available: 2021-01-27T19:00:15Z
dc.date.issued: 2020
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/38684
dc.description.abstract: While audio-driven face and gesture motion synthesis has been studied before, to our knowledge no research has yet addressed the automatic generation of musical gestures for virtual humans. Existing work focuses either on generating the precise 3D finger movements required to play an instrument or on expressive musical gestures derived from 2D video data. In this paper, we propose a music-driven piano performance generation method using 3D motion capture data and recurrent neural networks. Our results show that it is feasible to automatically generate expressive musical gestures for piano playing using various audio and musical features. However, it is not yet clear which features work best for which type of music. Our future work aims to test further with other datasets, deep learning methods, and musical instruments, using both objective and subjective evaluations.
dc.description.sponsorship: Utrecht University
dc.format.extent: 2565478
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: Music-Driven Animation Generation of Expressive Musical Gestures
dc.type.content: Master Thesis
dc.rights.accessrights: Open Access
dc.subject.keywords: music-driven animation; audio-driven animation; virtual characters; musical gestures; neural networks; music-driven gestures; gesture animation; expressive gestures; LSTM
dc.subject.courseuu: Game and Media Technology
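The abstract describes a recurrent network (the keywords name an LSTM) that maps per-frame audio features to 3D motion. As a rough illustration of that idea only, the sketch below runs a single numpy LSTM cell over a sequence of audio-feature frames and regresses a pose vector per frame. All dimensions, feature choices, and the output layout (joints × rotation channels) are illustrative assumptions, not the thesis's actual architecture or data.

```python
import numpy as np

# Hypothetical dimensions (assumptions, not from the thesis):
# 26 audio features per frame in, 51 values out
# (e.g. 17 joints x 3 rotation channels), 100 frames.
F_IN, HID, F_OUT, T = 26, 32, 51, 100

rng = np.random.default_rng(0)

# LSTM parameters for the four gates (input, forget, cell, output),
# stacked so one matrix product computes all gates at once.
W = rng.normal(0.0, 0.1, (4 * HID, F_IN + HID))
b = np.zeros(4 * HID)
W_out = rng.normal(0.0, 0.1, (F_OUT, HID))  # linear readout to pose space

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(audio_feats):
    """Map a (T, F_IN) audio-feature sequence to (T, F_OUT) pose frames."""
    h = np.zeros(HID)  # hidden state
    c = np.zeros(HID)  # cell state
    poses = []
    for x in audio_feats:
        z = W @ np.concatenate([x, h]) + b
        i, f, g, o = np.split(z, 4)          # gate pre-activations
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        poses.append(W_out @ h)              # one pose vector per audio frame
    return np.stack(poses)

features = rng.normal(size=(T, F_IN))  # stand-in for MFCC/chroma-style frames
motion = lstm_forward(features)
print(motion.shape)  # (100, 51)
```

In a trained system the weights would of course be learned from paired audio and motion-capture data; this sketch only shows the sequence-to-sequence shape of the mapping.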

