The study aimed to develop an emotion recognition algorithm based on EEG data. Although studying emotions is common practice in neuromarketing, most existing models focus on separate characteristics of an emotional process, mainly emotional engagement and valence. More sophisticated methods of measuring precise emotional states could help companies better understand customers, their behavior, and how different segments of a target audience perceive commercials. The emotional component is particularly important in the perception of brands, logos, packaging, etc. Knowing which combination of emotional reactions is the key to consumers’ hearts could be a useful finding for commercial production.
In this study, we aimed to build a predictive model estimating the salience of eight specific emotional responses that could be useful in assessing the perception of advertising stimuli. We hope the results will reveal the emotional pattern of a stimulus and support a complex, differentiated understanding of potential customers’ experience.
To identify specific patterns of emotional responses, we first selected the emotions that seemed worth studying from the perspective of our everyday commercial studies. Second, we pre-selected video stimuli eliciting those emotions: varied video content was chosen manually from different sources, and the relevance of each stimulus to the expected affective reaction was validated. During a pilot session, we asked participants to assess and describe their emotional states, and we then merged all videos into eight categories according to the collected responses: surprise, tenderness, joy, laughter, boredom, anger, sadness, and fear. The pilot study confirmed that the emotional responses elicited by the pre-selected videos are salient and consistent: each video elicits a single dominant emotional response that is described similarly by the majority of participants. The respondents (250 men and women aged 25 to 55) then viewed the selected videos. A total of 16 videos (two per category) lasting from 10 to 120 seconds were shown. After each video, participants named the emotion it evoked and completed a brief questionnaire describing their emotional state in terms of valence, engagement, and the presence of each studied emotional reaction on a 10-point Likert scale. EEG data were recorded simultaneously throughout the study. The collected data were then used to train an emotion recognition algorithm with machine learning.
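The self-report step above implies turning per-emotion Likert ratings into binary labels for modeling. A minimal sketch of one plausible way to do this is shown below; the threshold value, function names, and data are illustrative assumptions, not the study’s actual procedure:

```python
import numpy as np

EMOTIONS = ["surprise", "tenderness", "joy", "laughter",
            "boredom", "anger", "sadness", "fear"]

def likert_to_labels(ratings, threshold=6):
    """Convert per-emotion Likert ratings (1-10) into binary labels.

    ratings: array-like of shape (n_trials, 8), one rating per studied
    emotion. Returns a 0/1 array of the same shape, with 1 wherever the
    rating meets the (assumed) threshold, i.e. the emotion is treated as
    present in the response.
    """
    ratings = np.asarray(ratings)
    return (ratings >= threshold).astype(int)

# Example: one trial where joy (9) and laughter (7) dominate.
trial = [[2, 3, 9, 7, 1, 1, 2, 1]]
labels = likert_to_labels(trial)
```

In practice, a threshold could be replaced by any rule that matches the pilot-study finding of a single dominant response, such as taking only the highest-rated emotion per trial.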
The obtained data were used to determine the relationship between self-assessed emotional states and EEG activity. EEG parameters such as spectral power, Shannon entropy, and coherence between each pair of the 20 electrodes in the theta, alpha, beta 1, beta 2, and gamma bands were used as independent variables. The categories of emotional responses were treated as eight binary dependent variables, and a predictive model was built for each emotional response. Each model estimates the probability that its target emotion is present in the reaction to a stimulus. The eight models (one per emotion) were trained on a learning sample and identified the target emotion with an accuracy of 56% to 83%. Together with the collected data, the models’ outputs form a unique emotional pattern reflecting the complex perception of a stimulus: the predicted presence of all eight emotions in each tested video was highly correlated with the self-assessed emotional profiles.
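The one-binary-model-per-emotion setup described above can be sketched as follows. The abstract does not name the learner, so a plain logistic regression trained by gradient descent stands in here as an assumption, and the toy one-feature data stands in for the real EEG-derived feature matrix:

```python
import numpy as np

def train_logistic(X, y, lr=0.1, epochs=500):
    """Fit a simple logistic regression by gradient descent.

    X: (n_trials, n_features) matrix of EEG-derived features
       (e.g. band powers, entropies, coherences).
    y: (n_trials,) binary labels (target emotion present / absent).
    Returns (weights, bias).
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
        w -= lr * (X.T @ (p - y) / n)           # log-loss gradient step
        b -= lr * np.mean(p - y)
    return w, b

def predict_proba(X, w, b):
    """Probability that the target emotion is present for each trial."""
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

# Toy demo: a separable one-feature problem stands in for EEG features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (50, 1)), rng.normal(1, 0.3, (50, 1))])
y = np.array([0] * 50 + [1] * 50)
w, b = train_logistic(X, y)
acc = np.mean((predict_proba(X, w, b) >= 0.5) == y)
```

In the study’s setting, eight such models would be trained independently, one per emotion category, and their eight predicted probabilities for a video would form its emotional pattern.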
The findings suggest that the resulting models can serve as EEG-based metrics of emotion recognition, reflecting the complex emotional state evoked by perceiving emotional stimuli. These metrics might reveal the subtle details that distinguish emotional, memorable commercials from the rest, helping marketers create more targeted, higher-performing advertising with a stronger “selling function”. We believe our emotion recognition algorithm is one of the first steps in a long-term effort to better understand the complicated emotional processes in the human brain, and the initial progress seems encouraging. More data, more varied stimuli, and more dedicated researchers are needed to create better tools for understanding what people feel. But the prize is invaluable: it would take neuromarketing applications in the commercial world to a new level and contribute not only to applied neuromarketing and business production but to the scientific community as well.
Igor Zimin (Neuro Trend)