Affective Representation and Consistency Across Individuals' Responses to Affective Videos

  • Received : 2023.01.09
  • Accepted : 2023.02.05
  • Published : 2023.09.30

Abstract

This study examined affective representation and response consistency across individuals using affective videos, naturalistic stimuli that induce emotional experiences most similar to those of daily life. Multidimensional scaling was conducted to investigate whether the various affective representations induced by the video stimuli are located along the core affect dimensions. A cross-participant classification analysis was also performed to verify whether the video stimuli are classified well across participants. Additionally, a recently developed intersubject correlation analysis was conducted to assess the consistency of affective representations across participants' responses. Multidimensional scaling revealed that the video stimuli are represented well along the valence dimension, partially supporting Russell's (1980) core affect theory. The classification results showed that affective conditions were successfully classified across participants' responses. Moreover, the intersubject correlation analysis showed that the consistency of affective representations to the video stimuli differed by affective condition. These findings suggest that affective representations and the consistency of individual responses to affective videos vary across affective conditions.
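The intersubject correlation analysis named in the abstract is commonly computed as a leave-one-out procedure: each participant's response series is correlated with the average series of all remaining participants, and these correlations are averaged per stimulus. The following is a minimal illustrative sketch in NumPy, not the authors' exact pipeline; the array layout and function name are assumptions.

```python
import numpy as np

def leave_one_out_isc(ratings):
    """Leave-one-out intersubject correlation (ISC).

    ratings: (n_subjects, n_timepoints) array of continuous affect
    ratings for one video. Each subject's series is correlated with
    the mean series of all remaining subjects; the mean of those r
    values summarizes response consistency for that video.
    """
    n_subjects = ratings.shape[0]
    rs = []
    for i in range(n_subjects):
        # Average response of everyone except subject i
        others_mean = np.delete(ratings, i, axis=0).mean(axis=0)
        r = np.corrcoef(ratings[i], others_mean)[0, 1]
        rs.append(r)
    return float(np.mean(rs))

# Toy example: 5 subjects rating a 100-sample video, all tracking a
# shared underlying signal plus individual noise.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 4 * np.pi, 100))
data = signal + 0.5 * rng.standard_normal((5, 100))
isc = leave_one_out_isc(data)  # closer to 1 when responses are consistent
```

Comparing ISC values computed this way between affective conditions is one way to test whether response consistency differs by condition, as the abstract reports.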

Acknowledgement

This work was supported by the National Research Foundation of Korea BK21 FOUR program (Department of Psychology, Jeonbuk National University) (No. 4199990714213).

References

  1. Bach, D. R., Friston, K. J., & Dolan, R. J. (2010). Analytic measures for quantification of arousal from spontaneous skin conductance fluctuations. International Journal of Psychophysiology, 76(1), 52-55. DOI: 10.1016/j.ijpsycho.2010.01.011
  2. Baucom, L. B., Wedell, D. H., Wang, J., Blitzer, D. N., & Shinkareva, S. V. (2012). Decoding the neural representation of affective states. Neuroimage, 59(1), 718-727. DOI: 10.1016/j.neuroimage.2011.07.037
  3. Berlyne, D. E. (1960). Conflict, arousal, and curiosity. NY: McGraw-Hill.
  4. Bigand, E., Vieillard, S., Madurell, F., Marozeau, J., & Dacquet, A. (2005). Multidimensional scaling of emotional responses to music: The effect of musical expertise and of the duration of the excerpts. Cognition & Emotion, 19(8), 1113-1139. DOI: 10.1080/02699930500204250
  5. Bimler, D., & Kirkland, J. (2001). Categorical perception of facial expressions of emotion: Evidence from multidimensional scaling. Cognition & Emotion, 15(5), 633-658. DOI: 10.1080/02699930126214
  6. Camras, L. A. (1980). Children's understanding of facial expressions used during conflict encounters. Child Development, 51, 879-885.
  7. Catz, O., Kampf, M., Nachson, I., & Babkoff, H. (2009). From theory to implementation: Building a multidimensional space for face recognition. Acta Psychologica, 131(2), 143-152. DOI: 10.1016/j.actpsy.2009.03.010
  8. Codispoti, M., Bradley, M. M., & Lang, P. J. (2001). Affective reactions to briefly presented pictures. Psychophysiology, 38(3), 474-478. DOI: 10.1111/1469-8986.3830474
  9. Dauer, T., Nguyen, D. T., Gang, N., Dmochowski, J. P., Berger, J., & Kaneshiro, B. (2021). Inter-subject correlation while listening to minimalist music: A study of electrophysiological and behavioral responses to Steve Reich's Piano Phase. Frontiers in Neuroscience, 15. DOI: 10.3389/fnins.2021.702067
  10. Ekman, P. (1993). Facial expression and emotion. American Psychologist, 48(4), 384-392. DOI: 10.1037/0003-066X.48.4.384
  11. Ekman, P. E., & Davidson, R. J. (1994). The nature of emotion: Fundamental questions. Oxford University Press.
  12. Fredrickson, B. L. (2000). Cultivating positive emotions to optimize health and well-being. Prevention & Treatment, 3(1), 1. DOI: 10.1037/1522-3736.3.1.31a
  13. Gabert-Quillen, C. A., Bartolini, E. E., Abravanel, B. T., & Sanislow, C. A. (2015). Ratings for emotion film clips. Behavior Research Methods, 47(3), 773-787. DOI: 10.3758/s13428-014-0500-0
  14. Gilman, T. L., Shaheen, R., Nylocks, K. M., Halachoff, D., Chapman, J., Flynn, J. J., Matt, L. M., & Coifman, K. G. (2017). A film set for the elicitation of emotion in research: A comprehensive catalog derived from four decades of investigation. Behavior Research Methods, 49(6), 2061-2082. DOI: 10.3758/s13428-016-0842-x
  15. Golland, Y., Hakim, A., Aloni, T., Schaefer, S., & Levit-Binnun, N. (2018). Affect dynamics of facial EMG during continuous emotional experiences. Biological Psychology, 139, 47-58. DOI: 10.1016/j.biopsycho.2018.10.003
  16. Gomez, P., Stahel, W. A., & Danuser, B. (2004). Respiratory responses during affective picture viewing. Biological Psychology, 67(3), 359-373. DOI: 10.1016/j.biopsycho.2004.03.013
  17. Gosselin, P., & Simard, J. (1999). Children's knowledge of facial expressions of emotions: Distinguishing fear and surprise. Journal of Genetic Psychology, 160(2), 181-193. DOI: 10.1080/00221329909595391
  18. Gravetter, F. J., & Wallnau, L. B. (2004). Statistics for the behavioral sciences. Belmont, CA: Thomson Wadsworth.
  19. Hasson, U., Nir, Y., Levy, I., Fuhrmann, G., & Malach, R. (2004). Intersubject synchronization of cortical activity during natural vision. Science, 303(5664), 1634-1640. DOI: 10.1126/science.1089506
  20. Haynes, J. D., & Rees, G. (2006). Decoding mental states from brain activity in humans. Nature Reviews Neuroscience, 7(7), 523-534.
  21. Izard, C. E. (2007). Basic emotions, natural kinds, emotion schemas, and a new paradigm. Perspectives on Psychological Science, 2(3), 260-280. DOI: 10.1111/j.1745-6916.2007.00044.x
  22. Jenkins, L. M., & Andrewes, D. G. (2012). A new set of standardised verbal and non-verbal contemporary film stimuli for the elicitation of emotions. Brain Impairment, 13(2), 212-227. DOI: 10.1017/BrImp.2012.18
  23. Jang, J., Kim, H., & Kim, J. (2023). Consistency between individuals of affective responses for multiple modalities based on behavioral and physiological data. Science of Emotion and Sensibility, 26(1), 43-54. DOI: 10.14695/KJSOS.2023.26.1.43
  24. Kashdan, T. B., Gallagher, M. W., Silvia, P. J., Winterstein, B. P., Breen, W. E., Terhar, D., & Steger, M. F. (2009). The curiosity and exploration inventory-II: Development, factor structure, and psychometrics. Journal of Research in Personality, 43(6), 987-998. DOI: 10.1016/j.jrp.2009.04.011
  25. Kim, I., Jang, J., Kim, H., & Kim, J. (2022). Measuring consistency of affective responses to ASMR stimuli across individuals using intersubject correlation. Korean Journal of Cognitive and Biological Psychology, 34(2), 121-133. DOI: 10.22172/cogbio.2022.34.2.007
  26. Kim, J. (2021). Representation of facial expressions of different ages: A multidimensional scaling study. Science of Emotion and Sensibility, 24(3), 71-80. DOI: 10.14695/KJSOS.2021.24.3.71
  27. Kim, J., Shinkareva, S. V., & Wedell, D. H. (2017). Representations of modality-general valence for videos and music derived from fMRI data. NeuroImage, 148, 42-54. DOI: 10.1016/j.neuroimage.2017.01.002
  28. Kim, J., & Wedell, D. H. (2016). Comparison of physiological responses to affect eliciting pictures and music. International Journal of Psychophysiology, 101, 9-17. DOI: 10.1016/j.ijpsycho.2015.12.011
  29. Kim, J., Weber, C. E., Gao, C., Schulteis, S., Wedell, D. H., & Shinkareva, S. V. (2020). A study in affect: Predicting valence from fMRI data. Neuropsychologia, 143, 107473. DOI: 10.1016/j.neuropsychologia.2020.107473
  30. Ladinig, O., & Schellenberg, E. G. (2012). Liking unfamiliar music: Effects of felt emotion and individual differences. Psychology of Aesthetics, Creativity, and the Arts, 6(2), 146. DOI: 10.1037/a0024671
  31. Li, X., Zhu, Y., Vuoriainen, E., Ye, C., & Astikainen, P. (2021). Decreased intersubject synchrony in dynamic valence ratings of sad movie contents in dysphoric individuals. Scientific Reports, 11(1), 1-13. DOI: 10.1038/s41598-021-93825-1
  32. Lindquist, K. A., Satpute, A. B., Wager, T. D., Weber, J., & Barrett, L. F. (2016). The brain basis of positive and negative affect: Evidence from a meta-analysis of the human neuroimaging literature. Cerebral Cortex, 26(5), 1910-1922. DOI: 10.1093/cercor/bhv001
  33. Maffei, A. (2020). Spectrally resolved EEG intersubject correlation reveals distinct cortical oscillatory patterns during free-viewing of affective scenes. Psychophysiology, 57(11), e13652.
  34. Maffei, A., & Angrilli, A. (2019). E-MOVI-experimental MOVies for induction of emotions in neuroscience: An innovative film database with normative data and sex differences. PLoS ONE, 14(10), e0223124. DOI: 10.1371/journal.pone.0223124
  35. Najafi, M., Kinnison, J., & Pessoa, L. (2017). Dynamics of intersubject brain networks during anxious anticipation. Frontiers in Human Neuroscience, 11, 552. DOI: 10.3389/fnhum.2017.00552
  36. Nastase, S. A., Gazzola, V., Hasson, U., & Keysers, C. (2019). Measuring shared responses across subjects using intersubject correlation. Social Cognitive and Affective Neuroscience, 14(6), 669-687. DOI: 10.1093/scan/nsz037
  37. Noordewier, M. K., & Breugelmans, S. M. (2013). On the valence of surprise. Cognition & Emotion, 27(7), 1326-1334. DOI: 10.1080/02699931.2013.777660
  38. Nummenmaa, L., Glerean, E., Viinikainen, M., Jaaskelainen, I. P., Hari, R., & Sams, M. (2012). Emotions promote social interaction by synchronizing brain activity across individuals. Proceedings of the National Academy of Sciences, 109(24), 9599-9604. DOI: 10.1073/pnas.1206095109
  39. Pereira, C. S., Teixeira, J., Figueiredo, P., Xavier, J., Castro, S. L., & Brattico, E. (2011). Music and emotions in the brain: Familiarity matters. PLoS ONE, 6(11), e27241. DOI: 10.1371/journal.pone.0027241
  40. Plutchik, R. (1980). A general psychoevolutionary theory of emotion. In Theories of emotion (pp. 3-33). Academic Press.
  41. Plutchik, R. (2003). Emotions and life: Perspectives from psychology, biology, and evolution. American Psychological Association.
  42. Rottenberg, J., Ray, R. D., & Gross, J. J. (2007). Emotion elicitation using films. In J. A. Coan & J. J. B. Allen (Eds.), Handbook of emotion elicitation and assessment (pp. 9-28). Oxford University Press.
  43. Roy-Charland, A., Perron, M., Beaudry, O., & Eady, K. (2014). Confusion of fear and surprise: A test of the perceptual-attentional limitation hypothesis with eye movement monitoring. Cognition and Emotion, 28(7), 1214-1222. DOI: 10.1080/02699931.2013.878687
  44. Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161-1178. DOI: 10.1037/h0077714
  45. Russell, J. A., & Bullock, M. (1985). Multidimensional scaling of emotional facial expressions: Similarity from preschoolers to adults. Journal of Personality and Social Psychology, 48(5), 1290.
  46. Sachs, M. E., Habibi, A., Damasio, A., & Kaplan, J. T. (2018). Decoding the neural signatures of emotions expressed through sound. Neuroimage, 174, 1-10. DOI: 10.1016/j.neuroimage.2018.02.058
  47. Schmuckler, M. A. (2001). What is ecological validity? A dimensional analysis. Infancy, 2(4), 419-436. DOI: 10.1207/S15327078IN0204_02
  48. Shinkareva, S. V., Wang, J., & Wedell, D. H. (2013). Examining similarity structure: Multidimensional scaling and related approaches in neuroimaging. Computational and Mathematical Methods in Medicine, 2013. DOI: 10.1155/2013/796183
  49. Simons, R. F., Detenber, B. H., Cuthbert, B. N., Schwartz, D. D., & Reiss, J. E. (2003). Attention to television: Alpha power and its relationship to image motion and emotional content. Media Psychology, 5(3), 283-301. DOI: 10.1207/S1532785XMEP0503
  50. Simony, E., Honey, C. J., Chen, J., Lositsky, O., Yeshurun, Y., Wiesel, A., & Hasson, U. (2016). Dynamic reconfiguration of the default mode network during narrative comprehension. Nature Communications, 7(1), 1-13. DOI: 10.1038/ncomms12141
  51. Peelen, M. V., Atkinson, A. P., & Vuilleumier, P. (2010). Supramodal representations of perceived emotions in the human brain. Journal of Neuroscience, 30(30), 10127-10134. DOI: 10.1523/JNEUROSCI.2161-10.2010
  52. Teigen, K. H., & Keren, G. (2003). Surprises: Low probabilities or high contrasts? Cognition, 87(2), 55-71. DOI: 10.1016/s0010-0277(02)00201-9
  53. Uhrig, M. K., Trautmann, N., Baumgartner, U., Treede, R.-D., Henrich, F., Hiller, W., & Marschall, S. (2016). Emotion elicitation: A comparison of pictures and films. Frontiers in Psychology, 7(180), 1-12. DOI: 10.3389/fpsyg.2016.00180
  54. Westermann, R., Spies, K., Stahl, G., & Hesse, F. W. (1996). Relative effectiveness and validity of mood induction procedures: A meta-analysis. European Journal of Social Psychology, 26(4), 557-580.
  55. Zupan, B., & Eskritt, M. (2020). Eliciting emotion ratings for a set of film clips: A preliminary archive for research in emotion. The Journal of Social Psychology, 160(6), 768-789. DOI: 10.1080/00224545.2020.1758016