How potential users of music search and retrieval systems describe the semantic quality of music
Author(s) -
Lesaffre Micheline,
Voogdt Liesbeth De,
Leman Marc,
Baets Bernard De,
Meyer Hans De,
Martens Jean-Pierre
Publication year - 2008
Publication title -
Journal of the American Society for Information Science and Technology
Language(s) - English
Resource type - Journals
eISSN - 1532-2890
pISSN - 1532-2882
DOI - 10.1002/asi.20731
Subject(s) - music and emotion , psychology , cognitive psychology , computer science , musical composition , music education
A large-scale study was set up to clarify the influence of demographic and musical background on the semantic description of music. Our model for rating high-level music qualities distinguishes between affective/emotive, structural, and kinaesthetic descriptors. The focus was on understanding the most important attributes of music in view of developing efficient search and retrieval systems. We emphasized who the users of such systems are and how they describe their favorite music. Particular attention went to inter-subjective similarities among listeners. The results of our study suggest that gender, age, musical expertise, active musicianship, broadness of taste, and familiarity with the music all influence the semantic description of music.