
The role of leadership level in college students’ facial emotion recognition: evidence from event-related potential analysis

A Correction to this article was published on 16 February 2024


Abstract

While the role of emotion in leadership practice is well acknowledged, it remains unclear how individuals with different levels of leadership differ behaviorally and which neurocognitive mechanisms underlie those differences. This study combined a facial emotion recognition task with electroencephalography to explore the temporal dynamics of facial emotion recognition in college students with high and low levels of leadership. The results showed no significant difference between the two groups in P1 amplitude during the early stage of facial emotion recognition. In the middle stage, the main effect of group on the N170 component was significant, with high-leadership students showing larger N170 amplitudes than low-leadership students. In the late stage, low-leadership students showed larger LPP amplitudes over temporal-parietal sites than high-leadership students when recognizing happy facial expressions. In addition, time–frequency analysis revealed a group difference in the alpha band, with high-leadership students exhibiting lower alpha power than low-leadership students. These results suggest that students with different leadership levels differ in the temporal course of facial emotion recognition, mainly in the middle stage of structural encoding and the late stage of elaborated emotional processing.

Introduction

Leadership has long been a central topic in behavioral science, and in recent decades the study of youth leadership has gained widespread attention. As future elites and potential leaders, college students need to develop strong leadership skills. The concept of leadership has also undergone a transformation: traditional theories, such as trait theory, behavioral theory, and contingency theory, have been shown to have limitations, while newer theories, such as charismatic leadership theory, transformational leadership theory, and value-driven leadership theory, have emerged and gained prominence in the field of management.

Leadership and emotion

Weber (1963) proposed charismatic leadership theory, which emphasizes leaders’ unique personal charm and influence over followers and acknowledges the malleability of leadership skills. Burns (1978) introduced transformational leadership theory, which focuses on leader–follower interactions and on inspiring followers through non-material means such as appealing to emotions and developmental needs. Since the turn of the twenty-first century, value-driven leadership theory has emphasized activating followers’ cognitive schemas to strengthen emotional connections and organizational identification (Bass & Riggio, 2006). Overall, emotions play an increasingly important role in leadership research, where they are considered a crucial mechanism of leader influence and a conduit for information transfer during interactions.

Traditionally, cognitive factors were seen as the main predictors of leadership effectiveness; more recently, emotions have gained prominence as an additional factor. Studies have found that emotional intelligence correlates positively with leadership effectiveness. For instance, Kerr et al. (2006) reported that emotional intelligence predicted leadership effectiveness (r = 0.50) among corporate leaders, and Gómez-Leal et al.’s (2022) systematic review identified emotional intelligence as key to effective leadership. Clearly, effective emotional abilities are critical for leadership, warranting further research.

Mayer and Salovey (1993) first proposed the concept of emotional intelligence as the ability to identify, monitor, and regulate emotions in oneself and others, with four key dimensions: recognizing emotions, using emotions to facilitate thought, understanding emotions, and managing emotions. This model has received some empirical support. In terms of emotion recognition, Rubin et al. (2005) found that leaders’ use of nonverbal cues to recognize emotions accounted for 20% of the variance in leadership effectiveness. Regarding emotion expression, Avolio et al. (2004) found that leaders’ appropriate emotional expression was positively correlated with leadership effectiveness, and Trichas and Schyns (2012) found that when subordinates’ implicit leadership theories aligned with leaders’ facial emotions, subordinates formed more favorable impressions of the leaders. For understanding and analyzing emotions, Toegel et al. (2013) argued that understanding emotions is a process of perspective-taking: leaders high in empathy are better able to accurately understand and meet subordinates’ needs, forming emotional bonds. Regarding using emotions to facilitate thinking, Visser (2013) found that leaders’ emotions mediated the relationship between subordinates’ creativity and leadership effectiveness. For emotion regulation, Richards and Hackett (2012) found that when leaders and subordinates engaged in emotion regulation strategies, they reduced experiences of failure and enhanced leadership effectiveness, whereas excessive suppression of emotions decreased effectiveness.

Facial emotion recognition

Reviewing the concept of emotional intelligence shows that research in this area involves multiple dimensions. In social environments, human faces are among the most biologically significant visual stimuli. Faces contain a wealth of emotional cues, so the ability to recognize facial emotions is an important dimension of emotional intelligence. Whether one can accurately identify others’ emotional facial expressions, especially by gathering key social information from the eyes, mouth, and facial contours to infer inner states, emotions, and affective intentions and respond accordingly, reflects to some extent the development of one’s social adaptability and overall competence (Orejarena et al., 2019). This ability relates to social interaction, job performance, and group survival and development. Scholars agree that facial expressions are objective indicators of emotion. On a personal level, individuals with high emotional intelligence can better recognize, control, and regulate their own emotions; on an interpersonal level, they can more easily perceive others’ attitudes, emotions, and intentions. This has important implications for smooth social interaction and good interpersonal relationships. Recognizing others’ emotions has thus become a hot topic in psychology and cognitive neuroscience, and a classic paradigm for studying facial emotion processing is the facial emotion recognition task (Montagne et al., 2007).

With the increasing use of electrophysiological and neuroimaging techniques to study the neural pathways of facial expression processing, Luo et al. (2010) proposed a three-stage temporal model of emotional facial processing: early direct visual processing, mid-level structural encoding, and late detailed affective analysis. The early stage involves rapid, automatic processing of low-level physical attributes, reflecting involuntary attention to stimuli. The mid-stage focuses on configural face processing, independent of feature information. The late stage involves further assessment of emotion-relevant details, differentiating faces by emotional valence. This model posits that the visual processing, structural encoding, and expressive analysis stages are largely independent and occur in parallel. It has been widely applied in research on emotional facial recognition.

ERPs and facial emotion recognition

Event-related potentials (ERPs), also known as cognitive or evoked potentials, are changes in scalp-recorded electrical brain activity that occur when specific stimuli act on the sensory channels or on a particular part of the brain; these neurophysiological changes reflect the cognitive activity of the individual’s brain. Compared with technologies such as functional magnetic resonance imaging and near-infrared spectroscopy, ERPs offer millisecond-level temporal resolution for recording the course of cognitive processing, and features related to emotional processing can be extracted from the recorded EEG signals. Previous studies on emotional face recognition have divided face processing into early, middle, and late stages and have isolated the corresponding ERP components. In the early stage, there is a positive-going component P1 (mainly distributed over occipital-temporal regions, peaking between 80 and 130 ms after stimulus presentation) and a negative-going component N1 (widely distributed across brain regions, peaking over prefrontal regions between 60 and 140 ms after stimulus presentation). These early components are related to attention to low-level physical features of the stimuli, such as brightness and spatial resolution. In emotional face recognition, P1 reflects the amygdala’s and occipital cortex’s crude but rapid monitoring of faces, while N1 reflects attention capture by visual stimuli (Gu et al., 2019). In the middle stage, face stimuli, compared with non-face stimuli, evoke a negative-going component N170 over posterior areas of the brain (mainly distributed over lateral and inferior temporal regions, peaking around 170 ms, with latencies mainly between 130 and 240 ms after stimulus presentation). Most studies consider N170 a face-specific component that is independent of attentional resources, not limited to particular emotion types, and automatic in nature, reflecting the structural encoding stage of face recognition (Hinojosa et al., 2015; Zhang et al., 2013). In the late stage, the P300 appearing near the central vertex and the subsequent positive slow wave, collectively referred to as the late positive potential (LPP), reflect higher-level processing such as conscious evaluation of emotional stimuli, working memory representation, decision making, and coping responses. These components are influenced not only by attentional control but also by the processing of emotional valence: emotional expressions elicit larger LPP amplitudes than neutral expressions, and negative expressions elicit larger LPP amplitudes than positive ones (Luo et al., 2010).

Time–frequency analysis is a necessary complement to time-domain analysis, revealing components that are difficult to observe in the time-domain signal. Beyond the ERP waveform at a given scalp location, one can examine increases and decreases in power across frequency bands, combining time-domain and frequency-domain information to study the non-phase-locked neural oscillations related to emotional processing. Previous studies mainly analyze neural oscillations in five frequency bands: Delta (1–4 Hz), Theta (4–8 Hz), Alpha (8–13 Hz), Beta (14–30 Hz), and Gamma (31–45 Hz) (Zhao, 2012). Delta waves are associated with deep, subconscious, and sleep states. The Alpha band is generally defined as 8–13 Hz, and different cortical regions have their own characteristic Alpha rhythms (e.g., over motor cortex and visual cortex). Theta waves are related to intuition, creativity, recollection, imagination, depression, fatigue, and intoxication, and specific cognitive tasks can evoke pronounced Theta oscillations; Theta activity is closely related to working memory and increases with memory load (Gu et al., 2020). Beta rhythms are modulated by different motor tasks and are associated with alertness, excitement, and highly focused task states. Gamma oscillations, generally above 30 Hz, have been studied mainly in relation to pain processing.

Taken together, although previous studies of leadership and emotion have steadily improved the measurement of these constructs, moving from single- to multi-dimensional structures and from ability models to mixed models, traditional questionnaire-based measurements are limited in their ability to probe the underlying neural mechanisms linking leadership and emotion. In a preliminary search of major databases, we found no studies specifically investigating the temporal course of leadership and emotion processing. Therefore, this study first used a questionnaire survey to assess leadership among college students in the surrounding area. Based on the questionnaire results, participants with high and low leadership levels were selected for a subsequent facial emotion recognition task. The accuracy (ACC) and reaction time (RT) of participants recognizing faces displaying three different emotional expressions were recorded, while their EEG signals were simultaneously collected with a 32-channel electroencephalography system. The study aimed to analyze differences between the two groups in temporal and time–frequency features during the facial emotion recognition task. Two hypotheses were proposed: (1) college students with high and low leadership levels would differ behaviorally in the facial emotion recognition task, with high-leadership students showing higher accuracy and shorter reaction times than low-leadership students; and (2) the two groups would show stage-specific differences in EEG indicators during facial emotion recognition.

Method

Tools

Participants completed the self-report version of the Student Leadership Practices Inventory (SLPI; Kouzes & Posner, 2012), which was specifically designed for students. They rated how frequently they engaged in each of the 30 inventory items on a five-point Likert scale ranging from 1 (rarely) to 5 (frequently). The items correspond to five practices of exemplary leadership: (1) challenging the process; (2) inspiring a shared vision; (3) enabling others to act; (4) modeling the way; and (5) encouraging the heart. The score for each practice, obtained by summing the responses to its six behavioral statements, could range from 6 to 30, and the five practice scores were summed to form a composite leadership score. In the current study, the Cronbach’s alpha coefficient of internal reliability for the 30-item composite was 0.94. The SLPI has demonstrated reasonably robust validity across multiple student populations (Posner, 2009).
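To make the scoring concrete, the sketch below computes the five practice subscores, the composite score, and Cronbach’s alpha for a 30-column response matrix. The item-to-practice mapping used here is hypothetical (the actual SLPI scoring key assigns specific items to each practice); the alpha formula itself is standard.

```python
import pandas as pd

# Hypothetical mapping: items assumed ordered so every 5th item loads on the
# same practice (the real SLPI key may assign items differently).
PRACTICES = ["challenge", "vision", "enable", "model", "encourage"]

def score_slpi(responses: pd.DataFrame) -> pd.DataFrame:
    """responses: one row per participant, 30 columns of 1-5 Likert ratings."""
    scores = pd.DataFrame(index=responses.index)
    for i, practice in enumerate(PRACTICES):
        items = responses.iloc[:, i::5]              # 6 items per practice
        scores[practice] = items.sum(axis=1)          # subscale range: 6-30
    scores["total"] = scores[PRACTICES].sum(axis=1)   # composite leadership score
    return scores

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal-consistency reliability of the 30-item composite."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()       # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)         # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)
```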

Participants

G*Power software was used to estimate the required sample size. With alpha = 0.05 in a two-tailed test, a sample of at least 530 was estimated to achieve high statistical power (1 − β = 0.95). Paper questionnaires were distributed to 760 university students at a university in Henan Province and surrounding universities, and 750 were returned. After excluding 19 invalid questionnaires (incomplete, or all options identical), 731 valid questionnaires remained, a valid return rate of 96.2%. Respondents were 17–25 years old, with a mean age of 19.5 ± 1.14 years. Total SLPI (Posner, 2004) scores were ranked, and respondents in the top and bottom 5% were screened for the high- and low-leadership groups. Thirty-five respondents were selected and contacted for each group, but 7 declined to participate in the experiment. Finally, 31 participants were recruited for the high-leadership group and 32 for the low-leadership group, for a total of 63 (30 males, 33 females; age range 17–23 years). Participants then completed a computerized facial emotion recognition task while their EEG was recorded. All participants provided written informed consent before the experiment, and the assessment procedures were fully explained to them.
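As an illustration of the extreme-groups screening, the following sketch selects the top and bottom 5% of composite scores from a hypothetical `survey` DataFrame; the tie-handling and exact cutoffs of the original screening are not reported, so this is only an approximation.

```python
import pandas as pd

def split_extreme_groups(survey: pd.DataFrame, prop: float = 0.05):
    """Return high- and low-leadership candidate pools from SLPI totals."""
    hi_cut = survey["total"].quantile(1 - prop)
    lo_cut = survey["total"].quantile(prop)
    high = survey[survey["total"] >= hi_cut]   # high-leadership candidates
    low = survey[survey["total"] <= lo_cut]    # low-leadership candidates
    return high, low

# With 731 valid questionnaires, each 5% tail holds roughly 36 respondents,
# close to the 35 per group contacted in the study.
```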

Stimuli and procedures

The paradigm was programmed in E-prime 2.0. We used a facial emotion recognition task with stimulus images selected from the native Chinese Facial Affective Picture System (CFAPS) (Lu et al., 2005). The stimuli consisted of 40 fearful, 40 happy, and 40 neutral emotion pictures, with an equal number of male and female faces in each category. Participants were seated in a quiet room with their eyes approximately 100 cm from a 17-inch computer screen. They were instructed to distinguish happy, fearful, and neutral emotions by pressing corresponding buttons (4 with the right index finger, 5 with the right middle finger, and 6 with the right ring finger). The task consisted of 240 trials, each beginning with a fixation “+” presented for 500 ms. After the fixation, a stimulus was presented in the center of the screen (4.0° × 4.6° of visual angle) for 2500 ms, followed by a blank screen lasting randomly between 1000 and 2000 ms before the next trial began. Before the actual task, a practice session of 15 trials with randomly selected happy, fearful, and neutral faces was conducted to ensure that participants fully understood the task (see Fig. 1).

Fig. 1

Procedures of the experimental tasks. Stimuli were composed of three emotions (happy, fear, and neutral)
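The original task was implemented in E-prime 2.0, which is proprietary; the sketch below reproduces the same trial structure in PsychoPy (Python) under stated assumptions: the window settings and the key-to-emotion mapping are hypothetical, and timing is simplified.

```python
import random
from psychopy import core, event, visual  # stand-in for E-prime 2.0

win = visual.Window(fullscr=True, color="grey", units="deg")
fixation = visual.TextStim(win, text="+", height=1.0)
clock = core.Clock()

def run_trial(image_path):
    """One trial: 500 ms fixation, 2500 ms face, then a 1000-2000 ms blank ITI."""
    fixation.draw(); win.flip(); core.wait(0.5)
    face = visual.ImageStim(win, image=image_path, size=(4.0, 4.6))  # degrees of visual angle
    face.draw(); win.flip(); clock.reset()
    # Keys 4/5/6 correspond to the three fingers; which key codes which
    # emotion is an assumption, as the paper does not report the mapping.
    keys = event.waitKeys(maxWait=2.5, keyList=["4", "5", "6"], timeStamped=clock)
    core.wait(max(0.0, 2.5 - clock.getTime()))   # face stays up for the full 2500 ms
    win.flip()                                   # blank screen
    core.wait(random.uniform(1.0, 2.0))          # jittered inter-trial interval
    return keys[0] if keys else (None, None)     # (key, RT) or (None, None)
```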

Statistical analysis

To analyze the behavioral data, we conducted a repeated-measures analysis of variance (ANOVA) with ACC (the proportion of correct responses) and RT (the interval between stimulus onset and response) as dependent variables. Only correct responses were used for the RT analysis. Emotion (happy, fear, neutral) was included as a within-subjects factor, and group (high vs. low leadership) as a between-subjects factor. For effects with two or more degrees of freedom, we adjusted for violations of sphericity using the Greenhouse–Geisser correction.
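For readers replicating this analysis outside SPSS, a minimal equivalent using the pingouin package (an assumption; the authors used SPSS) might look as follows, given a long-format table with one row per participant × emotion cell; the file name and column names are hypothetical.

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format file: columns subject, group, emotion, acc, rt
behav = pd.read_csv("behavior_long.csv")

# Mixed-design ANOVA: emotion (within) x group (between); correction=True
# reports Greenhouse-Geisser-corrected p-values alongside uncorrected ones.
aov_rt = pg.mixed_anova(data=behav, dv="rt", within="emotion",
                        between="group", subject="subject", correction=True)
aov_acc = pg.mixed_anova(data=behav, dv="acc", within="emotion",
                         between="group", subject="subject", correction=True)
print(aov_rt)
print(aov_acc)
```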

EEG recording

The EEG was recorded from a 32-channel scalp cap positioned according to the 10/20 system (Brain Products, Munich, Germany). We monitored the horizontal electrooculogram (HEOG) with an electrode placed 1 cm from the outer canthus of the right eye and the vertical electrooculogram (VEOG) with electrodes placed above and below the left eye. All recordings were referenced online to FCz, and inter-electrode impedances were kept below 5 kΩ. The EEG and EOG signals were amplified with a 0.01–100 Hz band-pass filter and continuously sampled at 500 Hz per channel for offline analysis.

After acquisition, EEG data were imported into the open-source MATLAB toolboxes EEGLAB and Letswave for neurophysiological data analysis. Recordings were re-referenced offline to the average of the two mastoids and band-pass filtered between 0.1 and 30 Hz. Independent component analysis (ICA) was used to isolate ocular (blink and saccade) and other remaining artifacts. Epochs were extracted from 200 ms before stimulus onset to 1000 ms after, and baseline correction used the mean voltage in the 200 ms preceding stimulus onset. ERPs were computed offline by averaging according to the experimental design. Data analyses were conducted in MATLAB R2013b (MathWorks, Natick, USA) and SPSS Statistics 20.0 (IBM, Somers, USA).
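The same pipeline can be sketched in MNE-Python (the authors used EEGLAB/Letswave in MATLAB, so this is a stand-in, not their code); the file name, mastoid channel labels, and excluded ICA components are assumptions.

```python
import mne

# Hypothetical BrainVision recording from the 32-channel Brain Products cap
raw = mne.io.read_raw_brainvision("sub-01.vhdr", preload=True)
raw.filter(l_freq=0.1, h_freq=30.0)               # 0.1-30 Hz band-pass
raw.set_eeg_reference(["TP9", "TP10"])            # average of the two mastoids (assumed labels)

# ICA to isolate ocular and other artifacts; excluded components would be
# chosen by visual inspection, the indices below are placeholders.
ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw)
ica.exclude = [0, 1]
ica.apply(raw)

events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=1.0, baseline=(-0.2, 0.0), preload=True)
evoked = {cond: epochs[cond].average() for cond in event_id}  # per-condition ERPs
```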

ERP analysis

The present study focused on three ERP components: P1, N170, and LPP. Electrodes and time windows were selected on the basis of previous studies and visual inspection of the topographies in our data (Ji et al., 2021; Gu et al., 2019; Shim et al., 2016). Specifically, P1 amplitudes (130–200 ms) were analyzed at P3, P4, and Pz; N170 amplitudes (130–240 ms) at P7 and P8; and LPP amplitudes (400–800 ms) at C3, C4, Cz, P3, P4, and Pz. Consistent with previous studies, P1 and N170 were measured as baseline-to-peak amplitudes, while LPP was measured as the mean amplitude within the window.
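A sketch of the two amplitude measures, assuming the per-condition MNE Evoked objects from the preprocessing sketch above (times in seconds; the pick/crop calls are MNE-Python, not the MATLAB tools actually used):

```python
def peak_amplitude(evk, picks, tmin, tmax, mode):
    """Baseline-to-peak amplitude: max (P1) or min (N170) within the window,
    averaged across the listed electrodes for brevity (the study entered
    each electrode as a separate ANOVA factor)."""
    data = evk.copy().pick(picks).crop(tmin, tmax).data.mean(axis=0)
    return data.max() if mode == "pos" else data.min()

def mean_amplitude(evk, picks, tmin, tmax):
    """Mean amplitude within the window (used for the LPP)."""
    return evk.copy().pick(picks).crop(tmin, tmax).data.mean()

evk = evoked["happy"]  # one condition; "happy" is an assumed event label
p1   = peak_amplitude(evk, ["P3", "P4", "Pz"], 0.130, 0.200, mode="pos")
n170 = peak_amplitude(evk, ["P7", "P8"],       0.130, 0.240, mode="neg")
lpp  = mean_amplitude(evk, ["C3", "C4", "Cz", "P3", "P4", "Pz"], 0.400, 0.800)
```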

Time–frequency analysis

The present study analyzed oscillatory power in the alpha band (8–14 Hz) at electrodes O1 and O2, applying the continuous wavelet transform (CWT) to single-trial EEG epochs. Time–frequency representations were computed between 1 and 30 Hz in steps of 0.29 Hz, with epochs extracted from 200 ms pre-stimulus to 800 ms post-stimulus. To avoid edge effects in the CWT, the pre-stimulus interval (−400 to −200 ms) was used as the baseline. Based on average condition contrast maps and previous studies, a single cluster was tested: 8–14 Hz at 200–800 ms for alpha. Oscillatory power was quantified as the mean amplitude within this time window for each participant.
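A minimal MNE-Python stand-in for the wavelet analysis (the paper’s exact CWT implementation in Letswave may differ; the `n_cycles` choice and baseline mode are assumptions, and the epochs are re-extracted from −400 ms so the stated baseline interval exists):

```python
import numpy as np
import mne
from mne.time_frequency import tfr_morlet

# Longer epochs than the ERP analysis so the -400 to -200 ms baseline exists
tf_epochs = mne.Epochs(raw, events, event_id=event_id,
                       tmin=-0.4, tmax=0.8, baseline=None, preload=True)

freqs = np.arange(1.0, 30.0, 0.29)                  # 1-30 Hz in 0.29 Hz steps
power = tfr_morlet(tf_epochs, freqs=freqs, n_cycles=freqs / 2.0,  # n_cycles assumed
                   use_fft=True, return_itc=False, average=True)
power.apply_baseline(baseline=(-0.4, -0.2), mode="percent")

# Alpha cluster from the text: 8-14 Hz, 200-800 ms, electrodes O1/O2
alpha = power.copy().pick(["O1", "O2"]).crop(tmin=0.2, tmax=0.8, fmin=8.0, fmax=14.0)
alpha_mean = alpha.data.mean()                      # one value per participant
```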

Results

Behavioral data

A repeated-measures ANOVA was performed on the behavioral data with group (high vs. low leadership) as a between-subjects factor and emotion (happy, fear, and neutral) as a within-subjects factor. Results showed a marginally significant main effect of emotion on accuracy [F(2, 122) = 3.426, p = 0.054, η2p = 0.052], with happy emotion (M = 0.978, SD = 0.005) recognized more accurately than fear emotion (M = 0.954, SD = 0.013); no difference was found between fear and neutral emotion (M = 0.955, SD = 0.008). The same analysis of reaction time showed a significant main effect of emotion [F(2, 122) = 15.529, p < 0.001, η2p = 0.203]: responses to neutral emotion (M = 761.691, SD = 18.055) were slower than to fear emotion (M = 738.619, SD = 17.615), which in turn were significantly slower than to happy emotion (M = 695.989, SD = 16.312) (see Table 1 and Fig. 2).

Table 1 Mean accuracy and reaction time (Mean ± SD) of high-leadership vs. low-leadership undergraduates
Fig. 2

ACC and RT for three types of emotion between low-leadership and high-leadership groups

ERP amplitude analysis

For reasons of space, only significant results are reported in this section. Figure 3 shows the grand-averaged waveforms of P1, N170, and LPP for the two groups across the three emotions.

Fig. 3

The amplitude of P1 elicited by faces with different emotions

P1

We conducted a repeated-measures ANOVA on the maximum amplitude within the 130–200 ms window following stimulus presentation, with emotion (happy, fearful, and neutral) and electrode (P3, P4, and Pz) as within-subjects factors and group (high vs. low leadership) as a between-subjects factor. Results revealed a significant main effect of emotion [F(2, 122) = 4.373, p = 0.017, η2p = 0.067]: fearful emotion (M = 6.658, SD = 0.574) elicited larger P1 amplitudes than happy (M = 5.551, SD = 0.547) and neutral emotion (M = 5.930, SD = 0.573). There was also a significant main effect of electrode [F(2, 122) = 20.585, p < 0.001, η2p = 0.252], with Pz (M = 7.326, SD = 0.600) showing larger P1 amplitudes than P3 (M = 5.855, SD = 0.567) and P4 (M = 4.958, SD = 0.519). The main effect of group was not significant [F(1, 61) = 2.148, p = 0.148, η2p = 0.034], nor were the emotion × group [F(2, 122) = 0.903, p = 0.401, η2p = 0.015], electrode × group [F(2, 122) = 1.267, p = 0.283, η2p = 0.020], emotion × electrode [F(4, 244) = 0.643, p = 0.592, η2p = 0.010], or emotion × electrode × group [F(4, 244) = 1.472, p = 0.223, η2p = 0.024] interactions.

N170

A repeated-measures ANOVA was applied to this component with the minimum amplitude between 130 and 240 ms as the dependent variable, emotion (happy, fear, and neutral) and electrode (P7 and P8) as within-subjects factors, and group (high vs. low leadership) as a between-subjects factor. Results showed a significant main effect of group [F(1, 61) = 3.817, p = 0.05, η2p = 0.059], with the high-leadership group (M = −1.940, SD = 0.569) eliciting larger N170 amplitudes than the low-leadership group (M = −0.355, SD = 0.578). The main effect of emotion was not significant [F(2, 122) = 1.111, p = 0.332, η2p = 0.018], nor was the main effect of electrode [F(1, 61) = 2.576, p = 0.114, η2p = 0.041]. The emotion × group [F(2, 122) = 0.783, p = 0.449, η2p = 0.023], electrode × group [F(1, 61) = 2.836, p = 0.097, η2p = 0.044], electrode × emotion [F(2, 122) = 0.009, p = 0.987, η2p = 0.001], and electrode × emotion × group [F(2, 122) = 0.487, p = 0.596, η2p = 0.008] interactions were also not significant.

LPP

A repeated-measures ANOVA was applied to this component with the mean amplitude between 400 and 800 ms as the dependent variable, emotion (happy, fear, and neutral) and electrode (C3, Cz, C4, P3, Pz, and P4) as within-subjects factors, and group (high vs. low leadership) as a between-subjects factor. The results showed a marginally significant main effect of emotion on LPP amplitude [F(2, 122) = 2.806, p = 0.071, η2p = 0.044], with fearful emotion (M = 8.884, SD = 0.805) eliciting greater LPP amplitudes than happy emotion (M = 7.177, SD = 0.953). The main effect of electrode was also significant [F(1, 61) = 11.061, p < 0.001, η2p = 0.153], with central sites (C3, C4, Cz) (M = 8.738, SD = 0.815) showing larger LPP amplitudes than parietal sites (P3, P4, Pz) (M = 7.590, SD = 0.720), and midline sites (Cz, Pz) (M = 8.945, SD = 0.775) showing larger amplitudes than left-hemisphere (C3, P3) (M = 7.773, SD = 0.767) and right-hemisphere sites (C4, P4) (M = 7.775, SD = 0.732). The emotion × group × electrode interaction was also significant [F(2, 122) = 3.362, p = 0.046, η2p = 0.052]: when recognizing happy faces, low-leadership college students (M = 10.282, SD = 1.274) showed greater LPP amplitudes over parietal sites than high-leadership students (M = 8.423, SD = 1.294) (see Figs. 3, 4 and 5).

Fig. 4

The N170 amplitudes for college students with different levels of leadership

Fig. 5

The LPP amplitude at the Pz electrode site under the interaction of group and emotion

Time–frequency analysis

Time–frequency measures in the alpha band showed a marginally significant main effect of emotion [F(2, 122) = 2.935, p = 0.061, η2p = 0.046], with fearful emotion (M = 1.753, SD = 0.160) eliciting greater alpha desynchronization than happy (M = 2.006, SD = 0.201) and neutral emotion (M = 2.016, SD = 0.200). There was a significant main effect of group [F(1, 61) = 4.745, p = 0.033, η2p = 0.036], reflecting reduced alpha desynchronization in the low-leadership group (M = 2.305, SD = 0.236) compared with the high-leadership group (M = 1.546, SD = 0.256). The main effect of electrode was not significant [F(1, 61) = 0.182, p = 0.671, η2p = 0.003], nor were the emotion × group [F(2, 122) = 0.711, p = 0.483, η2p = 0.012], electrode × group [F(1, 61) = 0.478, p = 0.492, η2p = 0.008], or emotion × electrode × group [F(2, 122) = 1.193, p = 0.306, η2p = 0.019] interactions (see Fig. 6).

Fig. 6

Group-averaged alpha band time–frequency spectrogram during facial emotion recognition task performance. Time (in ms) is denoted on the x-axis, with 0 ms defined as the onset of the stimuli. Frequency (in Hz) is shown on the y-axis

Discussion

The present study aimed to investigate the neural underpinnings of emotional face processing in high- and low-leadership college students. The results indicated no significant group differences in accuracy or reaction time; however, several significant findings emerged from the ERP analyses.

Early stage of emotion face recognition

In the early stage of emotion face recognition, this study focused on the early P1 component elicited over parietal sites in the two groups. The results showed a significant main effect of emotional face on P1 amplitude, with fearful faces eliciting greater P1 amplitudes than happy and neutral faces. This indicates that visual processing resources are automatically captured by negative emotional faces during pre-attentive processing in college students, which may imply that individuals need more attentional resources to process the threatening information conveyed by negative emotional faces (Jiang et al., 2014). The face processing model proposed by Adolphs (2002) suggests a fast perceptual stage in early face processing, executed jointly by brain regions including the striate cortex, subcortical structures, and the amygdala. The P1 finding is consistent with this model: in the emotion face recognition task, P1 reflects the coarse but rapid categorization of facial information carried out by this early detection system (Schindler et al., 2021). Fearful faces convey threatening or hostile information that is vital to individuals’ adaptability, happy faces convey friendly information that is common in interpersonal interactions, and neutral faces convey uncertain information. Our finding of greater brain responses to negative emotional faces is consistent with Luo et al.’s (2010) finding that fearful faces enhanced P1 amplitudes in the early stage of emotion face recognition. In addition, the main effect of group on P1 amplitude was not significant. Previous studies have used the P1 component to index early attention to salient physical features of space and objects (Smith et al., 2003), and in emotion face recognition P1 also reflects the rapid extraction and unconscious processing of emotional information (Vuilleumier & Pourtois, 2007). The lack of group differences in P1 amplitude therefore indicates no obvious difference between high- and low-leadership college students in the early stage of emotion processing.

Middle stage of emotion face recognition

In the middle stage of emotion face recognition, one of our main findings is that the main effect of group on N170 amplitude was significant, with high-leadership college students eliciting greater N170 amplitudes than low-leadership students. Magnetoencephalographic and electroencephalographic studies have shown that a negative-going component occurring around 170 ms after stimulus onset can be observed near the temporal-parietal region, reflecting the encoding of facial configuration in line with the cognitive economy principle. N170 also reflects visual experience (familiarity): some studies have found that once individuals learn something new and reach a certain level of skill, larger N170 responses are elicited. For example, Rossion et al. (2002) found that expertise training with novel objects led to face-like, left-lateralized N170 responses, and Herrmann et al. (2007) found that same-race faces elicited larger N170 than other-race faces. A larger N170 indicates greater neural activity in encoding facial configuration, suggesting greater flexibility in processing facial structural information and better facial configuration encoding ability (Yang et al., 2020). Findings from special populations are also indirectly consistent with our results: patients with severe depression, ADHD, and schizophrenia elicit smaller N170 amplitudes than healthy controls (Aydin et al., 2023; Chen et al., 2022; Salisbury et al., 2019).

In addition, we found that neither the main effect of emotion nor any interaction significantly affected N170 amplitude. This indicates that N170 is a face-configuration-specific component, unaffected by featural information such as emotional valence. According to Bruce and Young’s stage model of face processing, facial configuration encoding and expression processing proceed independently, with configuration encoding unaffected by emotional information; thus, no interaction between N170 and emotional valence is expected (Bruce & Young, 1986). Our failure to find an effect of emotional valence on N170 amplitude supports this model. Krumhuber et al. (2019) manipulated the orientation and valence of emotional faces and found N170 unaffected by either, verifying Bruce and Young’s model, and subsequent studies using the six basic emotions reached consistent conclusions.

Late stage of emotion face recognition

In the late stage of emotion processing, we first found a marginally significant main effect of emotional face on LPP amplitude, with fearful faces eliciting larger LPP than happy and neutral faces. The LPP is a late positive component elicited over central-parietal regions from about 300 ms after face presentation. During emotion processing, it reflects conscious cortical processing of emotional stimuli: the brain evaluates stimulus attributes, represents memories, and makes decisions and responses according to task demands (Schupp et al., 2006). Frühholz et al. (2009) examined ERP components when background and face colors were congruent or incongruent and found larger LPP amplitudes in the incongruent condition, with negative emotional faces eliciting larger LPP than other valences. Hietanen and Astikainen (2013) compared ERP responses with emotional faces and pictures and found larger LPP for faces, especially negative ones. Using Chinese emotional faces and Chinese participants, Luo et al. (2010) proposed a three-stage model of emotion processing in which negative faces elicit larger LPP than positive or neutral faces in the late stage. Second, we found that low-leadership college students showed larger LPP amplitudes than high-leadership students for happy faces, suggesting slower attentional responses to positive stimuli and some insensitivity to positive emotional stimuli in low-leadership individuals. Previous studies on negative emotion processing biases in anxiety disorders, post-traumatic stress disorder (PTSD), and various phobias have likewise found larger LPP amplitudes in patients than in healthy controls.

Neural oscillation characteristics

In addition to the ERP analyses, this study used the continuous wavelet transform to examine dynamic neural oscillations during emotion face recognition in the two groups. Consistent with the ERP results, the main effect of group was significant, with lower alpha power in the high-leadership group. Alpha power is an important indicator of attention, memory, and mental agility. In the awake state, alpha magnitude is negatively correlated with cortical activity in the corresponding brain regions: concentrating on a stimulus decreases alpha power, whereas disengagement increases it (Keitel et al., 2019). Keune et al. (2017), studying resting-state neural oscillations and cognitive function in multiple sclerosis patients, found that increases in alpha power correlated negatively with processing speed, and a magnetoencephalography (MEG) study revealed a positive predictive relationship between alpha power and cognitive decline (Schoonheim et al., 2013). Our results thus show reduced alpha desynchronization, and by implication reduced task-related cortical activity, in the low-leadership group, suggesting differences in attention, memory, and other cognitive processes during emotion face recognition relative to the high-leadership group.

However, our study has some limitations that need to be addressed. Firstly, our sample only included college students. It is unclear whether these findings can be generalized to managers in business and public-sector organizations, which operate in significantly different environments. Future studies should aim to replicate our findings in a more diverse sample. Secondly, the measurement of leadership in our study was self-reported by the participants, which may have been influenced by social desirability bias. Future studies could incorporate additional measures of leadership, such as evaluations from teachers, classmates, or other objective sources. This would allow for a more comprehensive assessment of the participants’ leadership abilities.

Conclusion

In summary, using the P1, N170, and LPP components, this study found that high-leadership college students exhibited larger N170 amplitudes, while low-leadership students exhibited larger LPP amplitudes when recognizing happy faces (an emotion × group × electrode interaction). Additionally, our results showed attenuated alpha desynchronization in low-leadership college students. These findings suggest that differences in facial emotion processing between leadership groups arise mainly in facial structure encoding, attention, and elaborated emotional processing rather than in early visual processing.

Availability of data and materials

The data and materials supporting the findings of this study are available from the corresponding author upon reasonable request.


Abbreviations

EEG: Electroencephalography
ERPs: Event-related potentials
LPP: Late positive potential
ACC: Accuracy
RT: Reaction time
SLPI: Student Leadership Practices Inventory
CFAPS: Chinese Facial Affective Picture System
ANOVA: Analysis of variance
VEOG: Vertical electrooculogram
HEOG: Horizontal electrooculogram
ICA: Independent component analysis
CWT: Continuous wavelet transform
PTSD: Post-traumatic stress disorder
MEG: Magnetoencephalography

References

  • Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral and Cognitive Neuroscience Reviews, 1(1), 21–62.

  • Avolio, B. J., Gardner, W. L., Walumbwa, F. O., Luthans, F., & May, D. R. (2004). Unlocking the mask: A look at the process by which authentic leaders impact follower attitudes and behaviors. The Leadership Quarterly, 15(6), 801–823.

  • Aydin, Ü., Cañigueral, R., Tye, C., & McLoughlin, G. (2023). Face processing in young adults with autism and ADHD: An event-related potentials study. Frontiers in Psychiatry, 14, 1080681.

  • Bass, B. M., & Riggio, R. E. (2006). Transformational leadership (2nd ed.). Psychology Press. https://doi.org/10.4324/9781410617095

  • Bruce, V., & Young, A. W. (1986). Understanding face recognition. British Journal of Psychology, 77(3), 305–327.

  • Burns, J. M. (1978). Leadership. Harper & Row.

  • Chen, Y., Tong, F., Zhao, L., & Sun, G. (2022). Intact face detection in young patients with major depressive disorder revealed by the face-specific N170 component. Scientific Reports, 12(1), 14099.

  • Frühholz, S., Fehr, T., & Herrmann, M. (2009). Early and late temporo-spatial effects of contextual interference during perception of facial affect. International Journal of Psychophysiology, 74(1), 1–13.

  • Gómez-Leal, R., Holzer, A. A., Bradley, C., Fernández-Berrocal, P., & Patti, J. (2022). The relationship between emotional intelligence and leadership in school leaders: A systematic review. Cambridge Journal of Education, 52(1), 1–21.

  • Gu, H., Chen, Q., Xing, X., Zhao, J., & Li, X. (2019). Facial emotion recognition in deaf children: Evidence from event-related potentials and event-related spectral perturbation analysis. Neuroscience Letters, 703, 198–204.

  • Herrmann, M. J., Schreppel, T., Jäger, D., Koehler, S., Ehlis, A. C., & Fallgatter, A. J. (2007). The other-race effect for face perception: An event-related potential study. Journal of Neural Transmission, 114(7), 951.

  • Hietanen, J. K., & Astikainen, P. (2013). N170 response to facial expressions is modulated by the affective congruency between the emotional expression and preceding affective picture. Biological Psychology, 92(2), 114–124.

  • Hinojosa, J. A., Mercado, F., & Carretié, L. (2015). N170 sensitivity to facial expression: A meta-analysis. Neuroscience and Biobehavioral Reviews, 55, 498–509.

  • Ji, L., Zhao, Q., Gu, H., Chen, Y., Zhao, J., Jiang, X., & Wu, L. (2021). Effect of executive function on event-based prospective memory for different forms of learning disabilities. Frontiers in Psychology, 12, 528883.

  • Jiang, Z., Li, W., Recio, G., Liu, Y., Luo, W., Zhang, D., & Sun, D. (2014). Time pressure inhibits dynamic advantage in the classification of facial expressions of emotion. PLoS ONE, 9(6), e100162.

  • Keitel, C., Keitel, A., Benwell, C. S., Daube, C., Thut, G., & Gross, J. (2019). Stimulus-driven brain rhythms within the alpha band: The attentional-modulation conundrum. Journal of Neuroscience, 39(16), 3119–3129.

  • Kerr, R., Garvin, J., Heaton, N., & Boyle, E. (2006). Emotional intelligence and leadership effectiveness. Leadership & Organization Development Journal, 27(4), 265–279.

  • Keune, P. M., Hansen, S., Weber, E., Zapf, F., Habich, J., Muenssinger, J., & Oschmann, P. (2017). Exploring resting-state EEG brain oscillatory activity in relation to cognitive functioning in multiple sclerosis. Clinical Neurophysiology, 128(9), 1746–1754.

  • Kouzes, J. M., & Posner, B. Z. (2012). LPI: Leadership Practices Inventory: Development planner (Vol. 270). John Wiley & Sons.

  • Krumhuber, E. G., Lai, Y. K., Rosin, P. L., & Hugenberg, K. (2019). When facial expressions do and do not signal minds: The role of face inversion, expression dynamism, and emotion type. Emotion, 19(4), 746.

  • Lu, B., Hui, M. A., & Yu-Xia, H. (2005). The development of the native Chinese affective picture system: A pretest in 46 college students. Chinese Mental Health Journal.

  • Luo, W., Feng, W., He, W., Wang, N. Y., & Luo, Y. J. (2010). Three stages of facial expression processing: ERP study with rapid serial visual presentation. NeuroImage, 49(2), 1857–1867.

  • Mayer, J. D., & Salovey, P. (1993). The intelligence of emotional intelligence. Intelligence, 17(4), 433–442.

  • Montagne, B., Kessels, R. P., De Haan, E. H., & Perrett, D. I. (2007). The emotion recognition task: A paradigm to measure the perception of facial emotional expressions at different intensities. Perceptual and Motor Skills, 104(2), 589–598.

  • Orejarena, H., Zambrano, O., & Carvajal, M. (2019). Emotional intelligence and its influence on organizational leadership in the VUCA world. In 4th International Conference on Social, Business, and Academic Leadership (ICSBAL 2019).

  • Posner, B. Z. (2004). A leadership development instrument for students: Updated. Journal of College Student Development, 45(4), 443–456.

  • Posner, B. Z. (2009). A longitudinal study examining changes in students’ leadership behavior. Journal of College Student Development, 50(5), 551–563.

  • Richards, D. A., & Hackett, R. D. (2012). Attachment and emotion regulation: Compensatory interactions and leader–member exchange. The Leadership Quarterly, 23(4), 686–701.

  • Rossion, B., Gauthier, I., Goffaux, V., Tarr, M. J., & Crommelinck, M. (2002). Expertise training with novel objects leads to left-lateralized facelike electrophysiological responses. Psychological Science, 13(3), 250–257.

  • Rubin, R. S., Munz, D. C., & Bommer, W. H. (2005). Leading from within: The effects of emotion recognition and personality on transformational leadership behavior. Academy of Management Journal, 48(5), 845–858.

  • Salisbury, D. F., Krompinger, J. W., Lynn, S. K., Onitsuka, T., & McCarley, R. W. (2019). Neutral face and complex object neurophysiological processing deficits in long-term schizophrenia and in first hospitalized schizophrenia-spectrum individuals. International Journal of Psychophysiology, 145, 57–64.

  • Schindler, S., Bruchmann, M., Gathmann, B., Moeck, R., & Straube, T. (2021). Effects of low-level visual information and perceptual load on P1 and N170 responses to emotional expressions. Cortex, 136, 14–27.

  • Schoonheim, M. M., Geurts, J. J., Landi, D., Douw, L., van der Meer, M. L., Vrenken, H., & Stam, C. J. (2013). Functional connectivity changes in multiple sclerosis patients: A graph analytical study of MEG resting state data. Human Brain Mapping, 34(1), 52–61.

  • Schupp, H. T., Flaisch, T., Stockburger, J., & Junghöfer, M. (2006). Emotion and attention: Event-related brain potential studies. Progress in Brain Research, 156, 31–51.

  • Smith, N. K., Cacioppo, J. T., Larsen, J. T., & Chartrand, T. L. (2003). May I have your attention, please: Electrocortical responses to positive and negative stimuli. Neuropsychologia, 41(2), 171–183.

  • Toegel, G., Kilduff, M., & Anand, N. (2013). Emotion helping by managers: An emergent understanding of discrepant role expectations and outcomes. Academy of Management Journal, 56(2), 334–357.

  • Trichas, S., & Schyns, B. (2012). The face of leadership: Perceiving leaders from facial expression. The Leadership Quarterly, 23(3), 545–566.

  • Visser, V. (2013). Leader affect and leadership effectiveness: How leader affective displays influence follower outcomes (No. EPS-2013-286-ORG).

  • Vuilleumier, P., & Pourtois, G. (2007). Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging. Neuropsychologia, 45(1), 174–194.

  • Weber, M. (1963). The sociology of religion. Beacon Press.

  • Yang, Y. F., Brunet-Gouet, E., Burca, M., Kalunga, E. K., & Amorim, M. A. (2020). Brain processes while struggling with evidence accumulation during facial emotion recognition: An ERP study. Frontiers in Human Neuroscience, 14, 340.

  • Zhang, D., Luo, W., & Luo, Y. (2013). Single-trial ERP evidence for the three-stage scheme of facial expression processing. Science China Life Sciences, 56(9), 835–847.


Acknowledgements

None.

Funding

This study was supported by the Philosophy and Social Science Planning Project of Henan Province, China [grant number 2021BTY002], and by a Ministry of Education Humanities and Social Science Project, “Intervention Effects of Transcranial Magnetic Stimulation Combined with Aerobic Exercise on Emotional Development in Hearing-Impaired Children” [number 23C10475055].

Author information


Contributions

HG, SD, PJ, and MZ conceived and designed the study. PJ and SD collected and analyzed the data. HG and SD wrote the manuscript. HH and CW provided critical feedback and revised the manuscript. All authors have read and approved the final version of the manuscript.

Corresponding author

Correspondence to Mingnan Zhao.

Ethics declarations

Ethics approval and consent to participate

All participants provided written informed consent prior to the experiment, and assessment procedures were fully explained.

Consent for publication

This paper is original; neither the entire paper nor any part of its content has been published or accepted elsewhere, and it is not under consideration by any other journal.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Gu, H., Du, S., Jin, P. et al. The role of leadership level in college students’ facial emotion recognition: evidence from event-related potential analysis. Cogn. Research 8, 73 (2023). https://doi.org/10.1186/s41235-023-00523-9
