Contextual information about the listener is only slowly being integrated into music retrieval and recommendation systems. At the same time, given the enormous rise in mobile music consumption and the many sensors integrated into today's smart-phones, an unprecedented source of diverse user context data is becoming available.
Using a smart-phone application developed to monitor contextual aspects of users while they listen to music, we collected contextual data on the listening events of 48 users. About 100 different user features were recorded for each event, in addition to music meta-data.
In this paper, we analyze the relationship between aspects of the user context and music listening preferences. The goals are to assess (i) whether user context factors allow predicting the song, artist, mood, or genre of a listened track, and (ii) which contextual aspects are most promising for an accurate prediction. To this end, we investigate various classifiers to learn relations between user context aspects and music meta-data. We show that the user context allows predicting artist and genre to some extent, but can hardly be used for song or mood prediction. Our study further reveals that the level of listening activity has little influence on prediction accuracy.
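To make the experimental framing concrete, the following is a minimal sketch, not the authors' implementation, of how such a context-to-genre classification task could be set up. The data here are synthetic placeholders standing in for the roughly 100 recorded user features and genre labels, and the random forest is merely one plausible choice among the "various classifiers" mentioned above.

```python
# Hedged sketch: predicting the genre of a listened track from
# user-context features with an off-the-shelf classifier.
# All feature values and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-in for the collected data: one row per listening
# event, ~100 context features (time, device sensors, etc.), genre target.
n_events, n_features = 1000, 100
X = rng.normal(size=(n_events, n_features))   # user-context features
y = rng.integers(0, 5, size=n_events)         # genre labels (5 classes)

# One plausible classifier; the paper does not prescribe this choice.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)    # 10-fold cross-validation
print(f"mean accuracy: {scores.mean():.3f}")
```

On real data, per-feature importance scores from such a model could also hint at which contextual aspects are most predictive, in line with goal (ii).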