
dc.rights.license: CC-BY-NC-ND
dc.contributor.advisor: Gauthier, David
dc.contributor.author: Gelderen, Beryl van
dc.date.accessioned: 2024-10-18T00:02:43Z
dc.date.available: 2024-10-18T00:02:43Z
dc.date.issued: 2024
dc.identifier.uri: https://studenttheses.uu.nl/handle/20.500.12932/47993
dc.description.abstract: Popular music streaming platforms do not typically display any audio-derived information other than track duration. This limits users' ability to guide their own exploration of music beyond listening to each song recommended by the algorithm or making decisions based on their knowledge of the artist or other metadata. This work proposes visual thumbnails generated from audio features to facilitate music discovery: SoundShapes. SoundShapes visualise the mood and timbre characteristics of the audio. Mood is approximated on the valence-arousal scale; timbre is represented by the instruments used and an abstract representation of the genre. Feature extraction combines several methods, including low-level signal processing and a convolutional neural network. A prototype user interface was built and used to evaluate the SoundShapes. Although the sample was small and not representative of a wider user population, the results of the evaluation show potential for wider user acceptance.
dc.description.sponsorship: Utrecht University
dc.language.iso: EN
dc.subject: Music Discovery through Visualisation of Audio Features
dc.title: Music Discovery through Visualisation of Audio Features: Introducing SoundShapes
dc.type.content: Master Thesis
dc.rights.accessrights: Open Access
dc.subject.courseuu: Applied Data Science
dc.thesis.id: 40351
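
The abstract mentions low-level signal processing among the feature-extraction methods. The sketch below shows what such extraction can look like, assuming the librosa library; the concrete feature set, the valence-arousal mapping, and the convolutional neural network used in the thesis are not specified here, so this is illustrative only and not the author's actual pipeline.

    # Minimal sketch of low-level audio feature extraction (assumption: librosa),
    # computing a few descriptors commonly used as rough mood/timbre proxies.
    import librosa
    import numpy as np

    def extract_low_level_features(path: str) -> dict:
        # Load the audio as a mono signal.
        y, sr = librosa.load(path, mono=True)
        # Tempo is sometimes used as a rough arousal proxy.
        tempo, _ = librosa.beat.beat_track(y=y, sr=sr)
        # RMS energy approximates perceived loudness.
        rms = float(np.mean(librosa.feature.rms(y=y)))
        # Spectral centroid relates to perceived brightness (a timbre cue).
        centroid = float(np.mean(librosa.feature.spectral_centroid(y=y, sr=sr)))
        # Mean MFCCs give a compact timbre summary.
        mfcc_mean = np.mean(librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13), axis=1)
        return {
            "tempo": float(tempo),
            "rms": rms,
            "spectral_centroid": centroid,
            "mfcc_mean": mfcc_mean.tolist(),
        }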

