Profiling Based on Music and Physiological State

  • Conference paper
  • In: Enterprise Interoperability VII

Abstract

Technology is driven by the objective of improving our lives, attending to our needs and supporting our work and daily activities. These guidelines are important for steering technological development in many areas, and the most important outcome of such an enabler is the enhancement of users' wellbeing. It is nevertheless questionable how such technologies actually make us feel better. One can ask whether environmental conditions can be modified to enhance people's wellbeing, and in which direction. One of the methods adopted in this work explores that idea: whether a person's physiological state can be used to deliver adequate sensory stimulation that can then be usefully applied. Another question considered in this work is whether such collected data can be used to build a user's musical playlists that match the user's physiological and psychological state with the stimuli evoked by the music he or she is listening to.
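The matching idea described above can be sketched as a nearest-neighbour lookup in a valence-arousal space. The following Python fragment is purely illustrative and is not the paper's implementation: the track annotations, the signal-to-state mapping, and all function names are assumptions made for the example, and a real system would learn both from data.

```python
import math

# Hypothetical (valence, arousal) annotations for a small music library,
# each value in [-1, 1]. Real annotations would come from listener studies
# or a trained music-emotion model.
LIBRARY = {
    "calm_piano": (0.4, -0.6),
    "upbeat_pop": (0.8, 0.7),
    "dark_ambient": (-0.5, -0.3),
    "energetic_rock": (0.2, 0.9),
}

def estimated_state(heart_rate, skin_conductance):
    """Map raw physiological readings to a crude (valence, arousal) point.

    Illustrative only: arousal is assumed to rise with heart rate (bpm)
    and skin conductance (microsiemens); valence is left neutral, since
    it is hard to infer from these two signals alone.
    """
    arousal = min(1.0, max(-1.0, (heart_rate - 70) / 50 + skin_conductance / 10))
    return (0.0, arousal)

def pick_track(state, library=LIBRARY):
    """Return the track whose annotation is closest (Euclidean) to the state."""
    return min(library, key=lambda track: math.dist(state, library[track]))

# An agitated user (high heart rate and conductance) is matched to the
# most arousing track in the library.
print(pick_track(estimated_state(heart_rate=95, skin_conductance=6)))
```

The same distance-based lookup also supports the opposite policy (picking the *farthest* track to steer the user toward a different state), which is a design choice rather than a property of the representation.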



Acknowledgments

The research leading to this result has received funding from the European Union 7th Framework Programme (FP7-ICT) under grant agreement of the OSMOSE—OSMOsis applications for the Sensing Enterprise, project ID nr. 610905 (http://www.osmose-project.eu/). It has also received funding from the Portuguese-Serbian bilateral research initiative specifically related to the project entitled: Context-aware Smart Cyber—Physical Ecosystems.

Corresponding author

Correspondence to João Sarraipa.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Gião, J., Sarraipa, J., Francisco-Xavier, F., Ferreira, F., Jardim-Goncalves, R., Zdravkovic, M. (2016). Profiling Based on Music and Physiological State. In: Mertins, K., Jardim-Gonçalves, R., Popplewell, K., Mendonça, J. (eds) Enterprise Interoperability VII. Proceedings of the I-ESA Conferences, vol 8. Springer, Cham. https://doi.org/10.1007/978-3-319-30957-6_10

  • DOI: https://doi.org/10.1007/978-3-319-30957-6_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-30956-9

  • Online ISBN: 978-3-319-30957-6

  • eBook Packages: Engineering (R0)
