Musical parameters and synchronization in instrumental music AMVs
AMVs, or anime music videos, have been created since the 1980s and are still being made by fans in the West who want to express their love for anime, or Japanese animation. The website animemusicvideos.org, established in 2000, became a central hub where AMV creators could upload their videos and share them with other fans. The site hosts an annual competition, the Viewer’s Choice Awards, with various categories to which fans can submit their creative work. In 2004, the contest added the category ‘best use of instrumental music’, aimed at AMVs with instrumental soundtracks. In most AMVs, the visuals are synchronized to the lyrics, which serve as a guide and make editing easier. Instrumental music usually has no lyrics and therefore seems more difficult to synchronize images to, as ‘pure’ music often lacks the fixed meaning and reference that lyrical music provides. The instrumental music category still exists today, so how are the images connected to the music? If there are no lyrics, which musical parameters serve as a ‘guide’ for the AMV? In this thesis, I seek answers to these questions by analysing the AMVs ‘Dentelle’ and ‘You Make Me Smile’, which won the ‘best use of instrumental music’ category in 2018 and 2019 respectively. It turns out that both AMVs use instrumental music with some lyrical references. ‘Dentelle’ is accompanied by Saint-Saëns’ famous tone poem Danse Macabre, whose title and theme offer some reference points. ‘You Make Me Smile’ is edited to the song ‘Passionfruit’, which contains samples of spoken words. Most of the music is nonetheless instrumental, and the analysis shows that it is also possible to synchronize images to other musical parameters, such as mood, tempo, sound, timbre, texture, melody, looping and phrasing.