dc.rights.license | CC-BY-NC-ND
dc.contributor.advisor | Gauthier, David
dc.contributor.author | Gelderen, Beryl van
dc.date.accessioned | 2024-10-18T00:02:43Z
dc.date.available | 2024-10-18T00:02:43Z
dc.date.issued | 2024
dc.identifier.uri | https://studenttheses.uu.nl/handle/20.500.12932/47993
dc.description.abstract | Popular music streaming platforms typically do not display any audio-derived information other than track duration. This limits users’ ability to guide their own exploration of music beyond listening to each song recommended by the algorithm or making decisions based on their knowledge of the artist or other metadata. This work proposes visual thumbnails generated from audio features to facilitate music discovery: SoundShapes.
SoundShapes visualise the mood and timbre characteristics of the audio. Mood is approximated on the valence-arousal scale; timbre is represented by the instruments used and by an abstract representation of the genre. These features were extracted using a combination of methods, including low-level signal processing and a convolutional neural network.
A prototype user interface was built and used to evaluate the SoundShapes. Although the sample was small and not representative of a wider user population, the evaluation results show potential for wider user acceptance.
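The abstract names low-level signal processing as one of the feature-extraction methods. The sketch below illustrates what such a step could look like, assuming the librosa library and a handful of common descriptors (MFCCs, spectral centroid, RMS energy, zero-crossing rate); these choices are illustrative assumptions, not taken from the thesis itself.

```python
# Minimal sketch of low-level audio feature extraction; the use of librosa and
# the specific descriptors below are assumptions, not taken from the thesis.
import numpy as np
import librosa

def extract_low_level_features(path: str) -> dict:
    """Summarise a track with a few common low-level audio descriptors."""
    y, sr = librosa.load(path, sr=22050, mono=True)

    # Timbre-related descriptors: MFCCs and spectral centroid.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)

    # Simple energy/brightness proxies that could feed an arousal estimate.
    rms = librosa.feature.rms(y=y)
    zcr = librosa.feature.zero_crossing_rate(y)

    # Aggregate frame-level values into one summary vector per track.
    return {
        "mfcc_mean": np.mean(mfcc, axis=1),
        "spectral_centroid_mean": float(np.mean(centroid)),
        "rms_mean": float(np.mean(rms)),
        "zero_crossing_rate_mean": float(np.mean(zcr)),
    }
```

A convolutional-neural-network step, such as instrument or genre recognition, would presumably operate on spectrogram inputs rather than on these summary statistics.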
dc.description.sponsorship | Utrecht University
dc.language.iso | EN
dc.subject | Music Discovery through Visualisation of Audio Features
dc.title | Music Discovery through Visualisation of Audio Features: Introducing SoundShapes
dc.type.content | Master Thesis
dc.rights.accessrights | Open Access
dc.subject.courseuu | Applied Data Science
dc.thesis.id | 40351