Detecting emotions with EEG patterns

The holy grail of consumer research is figuring out how to accurately measure emotions. Emotions are complex, involving physiological, cognitive and social components that combine to create the subjective experience. Consumer decision making is driven by emotions (Schwarz, 2000) that signal us to approach or avoid things in our environment, based on how rewarding we expect them to be (Damasio et al., 1996).

Common consumer research approaches to measuring emotion all have their weaknesses (self-reporting, autonomic measures, facial coding, facial electromyography, fMRI).* Electroencephalography (EEG) is a promising tool: it measures real-time changes in voltage caused by brain activity, has good temporal resolution, and is less invasive than fMRI. Frontal alpha asymmetry is a commonly used metric in EEG consumer research. Greater left than right frontal activity, indexed by alpha asymmetry, was originally thought to reflect positive valence in the brain (Davidson, 1992), but research on anger revealed it to be a measure of approach motivation rather than positivity (Harmon-Jones, 2007). However, this asymmetry is affected by trait factors as well as current emotional state (Coan & Allen, 2003), and it is unclear how accurate the measure is on a second-by-second basis.
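To make the metric concrete, here is a minimal sketch of how a frontal alpha asymmetry score is commonly computed: the difference of log-transformed alpha-band (8–13 Hz) power between a right and a left frontal electrode (conventionally F4 and F3). The sampling rate and the simulated signals below are invented for illustration, not taken from the study.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(x, fs):
    """Summed power spectral density in the alpha band (8-13 Hz), via Welch's method."""
    freqs, psd = welch(x, fs=fs, nperseg=fs)
    band = (freqs >= 8) & (freqs <= 13)
    return psd[band].sum()

def frontal_alpha_asymmetry(left, right, fs):
    """ln(right alpha power) - ln(left alpha power).

    Because alpha power is inversely related to cortical activity, a more
    positive score indicates relatively greater left frontal activity,
    conventionally read as approach motivation."""
    return np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))

# Simulated 4 s of data at 256 Hz: stronger 10 Hz alpha on the right channel
# ("F4") than the left ("F3"), i.e. relatively greater left frontal activity.
fs = 256
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
f3 = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
f4 = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

faa = frontal_alpha_asymmetry(f3, f4, fs)
print(faa)  # positive, since right-channel alpha power exceeds left
```

Note that a single score like this reflects trait baselines as well as momentary state, which is exactly the interpretive problem raised above.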

We sought to use EEG pattern recognition to detect positive and negative emotional responses in the brain. The first challenge was eliciting genuine positive and negative emotions in a reliable and consistent way. We used several methods, including emotive pictures, emotional face images and short video clips, mood induction statements, emotive music, and pure major, minor and dissonant tones. We also created our own database of genuine facial emotional expressions for use in the study.**

Participants viewed these stimuli while their EEG was measured. A classifier then learned which EEG features (power and coherence) corresponded to positive and negative emotions in each individual. For each stimulus type, we randomly selected 70% of each individual’s data from the positive and negative conditions, respectively, and a pattern recognition algorithm learned to classify these data into two categories (positive and negative). We then used the remaining 30% of the data to predict whether the participant was viewing or listening to positive or negative stimuli; the result is the percentage accuracy of this prediction.

EEG power alone had very poor predictive power, hardly better than chance. In contrast, power and coherence together had excellent predictive power. The stimuli that produced the most accurate predictions were pure tones (95% accuracy), and the stimuli that produced the least accurate predictions were emotional images. This is unsurprising, as tones are pure and unimpeded by ‘noise’, while images vary in their content. Music was also highly predictive (93%), followed by the emotional faces (92%), the mood induction statements (91%) and, finally, the short video clips of emotional faces (87%).
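The shape of this pipeline can be sketched in code. The article does not name the classification algorithm, so the toy version below uses a simple nearest-centroid rule, simulated two-channel trials, and invented parameters (sampling rate, trial counts) purely to illustrate the procedure: extract power and coherence features per trial, train on a random 70% of trials, and score accuracy on the held-out 30%.

```python
import numpy as np
from scipy.signal import welch, coherence

rng = np.random.default_rng(42)
fs, n_samp = 128, 512  # invented: 4 s trials at 128 Hz

def make_trial(shared_gain):
    """Two-channel trial; `shared_gain` scales a common 10 Hz component,
    which raises both alpha power and inter-channel coherence."""
    t = np.arange(n_samp) / fs
    common = np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi))
    ch1 = shared_gain * common + rng.standard_normal(n_samp)
    ch2 = shared_gain * common + rng.standard_normal(n_samp)
    return ch1, ch2

def features(ch1, ch2):
    """Feature vector: alpha-band power of each channel plus their alpha-band coherence."""
    f, p1 = welch(ch1, fs=fs, nperseg=128)
    _, p2 = welch(ch2, fs=fs, nperseg=128)
    _, cxy = coherence(ch1, ch2, fs=fs, nperseg=128)
    band = (f >= 8) & (f <= 13)
    return np.array([p1[band].sum(), p2[band].sum(), cxy[band].mean()])

# Simulated "positive" (strong shared alpha) vs "negative" (weak) trials
X = np.array([features(*make_trial(g)) for g in [2.0] * 40 + [0.3] * 40])
y = np.array([1] * 40 + [0] * 40)

# Random 70/30 split of the trials
idx = rng.permutation(len(y))
split = int(0.7 * len(y))
train, test = idx[:split], idx[split:]

# Minimal classifier: assign each held-out trial to the nearer class centroid
mu = {c: X[train][y[train] == c].mean(axis=0) for c in (0, 1)}
pred = np.array([min(mu, key=lambda c: np.linalg.norm(x - mu[c])) for x in X[test]])
acc = (pred == y[test]).mean()
print(f"held-out accuracy: {acc:.0%}")
```

Including the coherence feature alongside per-channel power mirrors the study's finding that the two feature types together were far more predictive than power alone.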

Although this was a pilot study with a very small sample, it showed that EEG pattern recognition is a promising method for measuring individuals’ emotional responses to visual and auditory stimuli. Its accuracy depends on the validity of the trained algorithms and their ability to generalise to new data. Further research will focus on whether this powerful method can be used to accurately predict how people feel towards images and videos of products, people, brands and concepts.

* Self-reporting of emotion is a common approach with many pitfalls. Autonomic measures include heart rate and skin conductance, which are good for measuring the physiological intensity of emotion but fail to specify whether the emotion is positive or negative. Facial coding and facial electromyography (EMG) capture only the expression of emotion. Functional Magnetic Resonance Imaging (fMRI) is extremely expensive, has poor temporal resolution, and has ecological validity issues (participants are placed in a small tube in a noisy magnet and must keep still).

** Most emotional face databases are not available for commercial use, and they rely on acted emotional expressions. We used a variety of tasks to elicit Ekman and Friesen’s (1971) six basic emotions (happiness, sadness, disgust, surprise, anger and fear) while recording 20 participants’ facial expressions. We chose still images and short clips of the participants showing genuine emotion, and had 90 participants rate the images on valence and arousal to ensure they adequately represented the intended emotions. Capturing the most intense and convincing emotional responses in short video clips was intended to maximise ecological validity. The resulting database of still images and short clips was used to elicit emotions in the EEG study.


References:

Coan, J. A., & Allen, J. J. (2003). Frontal EEG asymmetry and the behavioral activation and inhibition systems. Psychophysiology, 40, 106–114.

Damasio, A. R., Everitt, B. J., & Bishop, D. (1996). The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences, 351(1346), 1413–1420.

Davidson, R. J. (1992). Anterior cerebral asymmetry and the nature of emotion. Brain and Cognition, 20, 125–151.

Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17(2), 124–129.

Harmon-Jones, E. (2007). Trait anger predicts relative left frontal cortical activation to anger-inducing stimuli. International Journal of Psychophysiology, 66, 154–160.

Schwarz, N. (2000). Emotion, cognition, and decision making. Cognition and Emotion, 14, 433–440.

