The GEMS is the first model and instrument specifically designed to capture the richness of musically evoked emotions. It emerged from a broader research effort to understand musically evoked emotions and is based on multiple studies spanning a wide range of music and listener samples.
The GEMS model comprises nine categories of musical emotions (see video below). This domain-specific model accounts for ratings of music-evoked emotions more powerfully than multi-purpose scales based on non-musical areas of emotion research. We also showed that the experience of these musical emotions tends to activate distinct emotive brain sites.
As a corollary to the model, we developed the Geneva Emotional Music Scale (GEMS). The GEMS comprises 9 scales and 45 emotion labels and is now frequently used in studies on music and emotion. Shorter scales, the GEMS-25 and the GEMS-9, have also been developed (see below).
In the video, Prof. Marcel Zentner explains the nine dimensions of the GEMS model.
The full-length TEDxInnsbruck talk shows how characterising great composers through the emotions their music evokes provides an accessible introduction to the world of classical music.
Marcel Zentner: “9 great composers explained in 9 emotions”
TEDxInnsbruck, 25 September 2021
There are currently three versions of the GEMS. The GEMS-45 contains 45 labels that proved to be consistently chosen for describing musically evoked emotive states across a relatively wide range of music and listener samples. Moreover, we found that these states can be grouped into 9 different categories. Thus, the GEMS is composed of nine emotional scales, which in turn condense into three “superfactors”.
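The hierarchical structure described above (45 emotion labels condensing into 9 scales, which in turn condense into 3 superfactors) can be sketched as a simple scoring routine. This is an illustrative sketch only: the item ratings are invented, the item-to-scale grouping is not the official GEMS key, and the scale and superfactor names are taken from Zentner et al. (2008).

```python
from statistics import mean

# Hypothetical item ratings (e.g., on a 1-5 scale) for one listener,
# grouped by GEMS scale. In the real GEMS-45, each of the nine scales
# aggregates several of the 45 emotion labels; the three-item groups
# below are purely illustrative.
ratings = {
    "Wonder":            [4, 5, 4],
    "Transcendence":     [3, 4, 3],
    "Tenderness":        [5, 4, 4],
    "Nostalgia":         [2, 3, 3],
    "Peacefulness":      [4, 4, 5],
    "Power":             [2, 2, 1],
    "Joyful activation": [3, 3, 4],
    "Tension":           [1, 2, 1],
    "Sadness":           [2, 1, 2],
}

# Scale score = mean of the item ratings belonging to that scale.
scale_scores = {scale: mean(items) for scale, items in ratings.items()}

# The nine scales condense into three superfactors
# (groupings as reported in Zentner et al., 2008).
superfactors = {
    "Sublimity": ["Wonder", "Transcendence", "Tenderness",
                  "Nostalgia", "Peacefulness"],
    "Vitality":  ["Power", "Joyful activation"],
    "Unease":    ["Tension", "Sadness"],
}
superfactor_scores = {
    name: mean(scale_scores[s] for s in scales)
    for name, scales in superfactors.items()
}
```

Averaging is the simplest plausible aggregation; the published scale instructions (see below) define the authoritative item key and scoring procedure.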
Shorter scales, the GEMS-25 and the GEMS-9, have also been developed. As demonstrated in our research (Zentner et al., 2008), the GEMS accounts for ratings of music-evoked emotions more powerfully than multi-purpose scales based on non-musical areas of emotion research.
Zentner, M., Grandjean, D., & Scherer, K. (2008). Emotions evoked by the sound of music: Characterization, classification, and measurement. Emotion, 8, 494-521. https://doi.org/10.1037/1528-3542.8.4.494
Jacobsen, P.-O.*, Strauss, H.*, Vigl, J., Zangerle, E., & Zentner, M. (2024). Measuring aesthetic music-evoked emotions in a minute or less: Comparison of the GEMS-45 and the GEMS-9. Musicae Scientiae. https://doi.org/10.1177/10298649241256252
Strauss, H., Vigl, J., Jacobsen, P.-O., Bayer, M., Talamini, F., Vigl, W., Zangerle, E., & Zentner, M. (2024). The Emotion-to-Music Mapping Atlas (EMMA): A systematically organized online database of emotionally evocative music excerpts. Behavior Research Methods. Advance online publication. https://doi.org/10.3758/s13428-024-02336-0
If you wish to use the GEMS for academic research purposes in a university environment, please follow the instructions provided below:
1.
Download the User Agreement
Please download and print the User Agreement.
3.
Receipt of the Scale Instructions
Upon confirmation of your request, you will receive an email with the scale instructions for conducting and evaluating the GEMS.