Psychological states are closely linked to measurable physiological processes. Key data sources include brain activity (EEG, fNIRS), autonomic nervous system indicators such as heart rate, skin conductance, and respiration, as well as ocular metrics like eye tracking and pupil dilation. Reliable data collection requires suitable technologies, attention to ecological validity and participant comfort, and adherence to ethical standards. Processing steps such as artifact removal, normalization, and feature extraction transform raw signals into usable information. Analytical and visualization strategies then make it possible to observe patterns of attention, stress, engagement, and other cognitive or emotional responses.
The concept of mind-body interaction represents a profound scientific frontier that challenges traditional boundaries between psychological and physiological domains. At its core, this principle argues that mental processes are not isolated abstract phenomena, but dynamic, measurable biological events with tangible physiological correlates.
Historical perspectives on mind-body interactions have evolved dramatically. Early philosophical approaches, such as Cartesian dualism, viewed mind and body as separate entities. Contemporary neuroscience, however, demonstrates their intricate, inseparable relationship. Every cognitive event—whether it's a fleeting emotion, a complex decision, or a subtle perception—generates immediate and measurable changes in bodily systems.
For instance, when a person experiences stress, a well-documented physiological cascade occurs: the sympathetic nervous system is activated, heart rate and respiration accelerate, sweat glands raise the skin's electrical conductance, and the pupils dilate.
These interconnected responses illustrate how mental states are not merely metaphorical but represent complex, quantifiable neurophysiological processes.
Physiological data sources can be categorized into three primary domains, each offering unique insights into human cognitive and emotional experiences: neurological signals, autonomic nervous system indicators, and ocular metrics.
Understanding the functioning of the brain is at the heart of neurometrics. While the autonomic nervous system (ANS) reflects how the body reacts to internal and external stimuli, neurological signals provide direct insights into what the brain is doing—that is, how it processes, perceives, and evaluates a stimulus in real time. In this context, technologies such as Electroencephalography (EEG) and functional Near-Infrared Spectroscopy (fNIRS) have become essential tools in assessing neural dynamics in applied contexts such as film production, advertising, and user experience design.
Electroencephalography represents one of the most fascinating windows into human brain activity, offering a non-invasive method to observe the brain's electrical symphony. The journey of EEG began in 1924 when Hans Berger, a German psychiatrist, first recorded electrical signals from the human brain, marking a revolutionary moment in neuroscience. At its core, EEG measures the electrical activity generated by neural populations in the brain. Imagine the brain as an incredibly complex network of billions of neurons, each acting like a tiny electrical generator. When these neurons communicate, they produce minute electrical signals that can be detected by sensitive electrodes placed on the scalp.
Neurons communicate through electrical and chemical signals. When a neuron fires, it creates a small electrical potential. Thousands of neurons firing simultaneously produce measurable electrical fields that can be detected on the scalp.
EEG electrodes are typically arranged according to the international 10-20 system, a standardized method of electrode placement that ensures consistent and reproducible measurements. Sites are labeled by cortical region (F = frontal, C = central, P = parietal, O = occipital, T = temporal) and numbered, with odd numbers over the left hemisphere, even numbers over the right, and "z" marking midline positions.
EEG captures different types of brain waves, each associated with specific mental states: delta (0.5–4 Hz), dominant in deep sleep; theta (4–8 Hz), linked to drowsiness, memory, and mental workload; alpha (8–13 Hz), typical of relaxed wakefulness; beta (13–30 Hz), associated with active thinking and focused attention; and gamma (above 30 Hz), related to high-level information integration.
One of EEG’s key advantages is its ability to track rapid changes in cognitive and emotional states, such as shifts in attention, mental effort, or engagement, with a latency of milliseconds. This makes EEG especially suitable for studying dynamic and temporally rich stimuli like films.
Examples of EEG-based neurometrics include: mental workload, assessed via increased theta power over the frontal cortex and suppressed alpha over parietal areas (Borghini et al., 2013); attention/vigilance, monitored through frontal theta/beta ratios or blink-rate synchrony with EEG (Sciaraffa et al., 2021); and approach-withdrawal, measured via frontal alpha asymmetry, with greater left-side activity indicating positive, approach-related motivation (Davidson et al., 1990).
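As an illustration, a workload-style index can be approximated from band-power estimates. The sketch below is a simplified example on synthetic signals, with assumed band boundaries and hypothetical "frontal" and "parietal" channels; it uses Welch's method to compare frontal theta power against parietal alpha power:

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Mean power spectral density of `signal` within `band` (Hz), via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

fs = 256  # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
# Hypothetical channels: a frontal channel rich in 6 Hz theta activity and a
# parietal channel rich in 10 Hz alpha activity, plus broadband noise.
frontal = np.sin(2 * np.pi * 6 * t) + 0.3 * rng.standard_normal(t.size)
parietal = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)

theta = band_power(frontal, fs, (4, 8))
alpha = band_power(parietal, fs, (8, 13))
# Higher frontal theta relative to parietal alpha -> higher estimated workload.
workload_index = theta / alpha
```

Real analyses average over many electrodes and epochs and normalize against a baseline; this only shows the core band-power computation.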
However, EEG is sensitive to artifacts—electrical noise from eye movements, muscle contractions, or external devices. For this reason, careful signal processing and noise reduction techniques (e.g., Independent Component Analysis) are essential for accurate interpretation (Uriguen et al., 2015).
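Full artifact rejection (e.g., ICA) requires multichannel data and dedicated tooling, but the first preprocessing steps are usually simple filters. A minimal sketch, assuming a 50 Hz mains frequency and a 1–40 Hz band of interest:

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def clean_eeg(raw, fs, line_freq=50.0, band=(1.0, 40.0)):
    """Remove power-line interference and out-of-band noise from one EEG channel."""
    # Notch filter at the mains frequency (50 Hz in Europe, 60 Hz in the US).
    b_notch, a_notch = iirnotch(w0=line_freq, Q=30.0, fs=fs)
    x = filtfilt(b_notch, a_notch, raw)
    # Zero-phase band-pass keeps the 1-40 Hz range where most EEG rhythms live.
    b_bp, a_bp = butter(4, band, btype="bandpass", fs=fs)
    return filtfilt(b_bp, a_bp, x)

fs = 256
t = np.arange(0, 4, 1 / fs)
brain = np.sin(2 * np.pi * 10 * t)        # 10 Hz alpha-like component
mains = 2.0 * np.sin(2 * np.pi * 50 * t)  # strong 50 Hz line-noise artifact
cleaned = clean_eeg(brain + mains, fs)
```

The 10 Hz component survives while the 50 Hz contamination is strongly attenuated; eye-blink and muscle artifacts, by contrast, overlap the EEG band and are why techniques like ICA remain necessary.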
Functional Near-Infrared Spectroscopy (fNIRS) emerged as a groundbreaking neuroimaging technique in the late 1990s, offering a unique window into brain functioning that complements and differs from traditional electroencephalography (EEG). While EEG measures electrical activity, fNIRS explores the brain's hemodynamic responses—the changes in blood flow and oxygenation that occur during neural activity. Unlike EEG's electrical signal detection, fNIRS operates on a fundamentally different principle: light-based measurement of brain activity. The technique uses near-infrared light to penetrate the skull and measure changes in hemoglobin concentration in the brain's cortical regions.
As with EEG, fNIRS data are usually processed to derive neurometrics of specific mental states, such as: mental workload and cognitive control, where increases in oxygenated hemoglobin (HbO) in the dorsolateral prefrontal cortex correlate with executive-function demands (Ayaz et al., 2012); emotional processing, where changes in prefrontal activation can indicate emotional valence and regulation strategies (Herrmann et al., 2008); and user-interaction studies, where fNIRS's portability and tolerance to movement make it increasingly popular in real-world applications, including wearable systems for neuroergonomics.
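The hemodynamic quantities above (HbO and its deoxygenated counterpart, HbR) are recovered from raw light measurements via the modified Beer-Lambert law: the measured change in optical density at each wavelength is a weighted sum of the two hemoglobin changes, so two wavelengths give a solvable 2x2 system. A toy sketch with illustrative (not tabulated) extinction coefficients and assumed geometry:

```python
import numpy as np

# Illustrative extinction coefficients [HbO, HbR] at two wavelengths (rows).
# Real analyses use tabulated values for the exact wavelengths of the device.
E = np.array([[1.5, 3.8],   # ~760 nm: HbR absorbs more strongly
              [2.8, 1.8]])  # ~850 nm: HbO absorbs more strongly
path = 3.0 * 6.0  # source-detector distance (cm) x differential pathlength factor

def hemoglobin_changes(delta_od):
    """Invert the modified Beer-Lambert law: delta_OD = E @ [dHbO, dHbR] * path."""
    return np.linalg.solve(E * path, delta_od)

# A synthetic "activation": optical density rises more at ~850 nm than at
# ~760 nm, the signature of increasing HbO and decreasing HbR.
d_hbo, d_hbr = hemoglobin_changes(np.array([0.01, 0.05]))
```

The pattern of rising HbO with falling HbR is the canonical hemodynamic response to neural activation that fNIRS metrics build on.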
Although fNIRS has lower temporal resolution than EEG, it is less sensitive to movement artifacts, making it advantageous in some ecological environments such as on-set filming or immersive virtual reality experiences.
Generally speaking, while EEG captures the brain's electrical symphony, fNIRS observes its metabolic choreography. Together, they provide a more comprehensive understanding of neural functioning, demonstrating that consciousness is a complex, multilayered phenomenon. The most advanced neuroimaging approaches increasingly combine multiple techniques, including simultaneous EEG and fNIRS recordings.
The Autonomic Nervous System (ANS) plays a central role in regulating unconscious physiological functions such as heart rate, respiration, perspiration, and pupil dilation. These processes are tightly linked to emotional and cognitive states, making the ANS an essential target in neurometric research. By analyzing ANS-related signals, researchers can non-invasively infer levels of stress, arousal, engagement, and emotional valence—states that are highly relevant in the context of audience response to cinematic content.
The ANS consists of two main branches: the sympathetic nervous system (SNS), activated in response to stress, danger, or high-arousal situations (the "fight or flight" response), and the parasympathetic nervous system (PNS), which promotes relaxation and recovery (the "rest and digest" response).
Also referred to as electrodermal activity (EDA), GSR is one of the most widely used ANS indicators in neurometrics. It measures changes in the skin's electrical conductance due to sweat gland activity, which is directly modulated by sympathetic arousal. Increased GSR is associated with heightened emotional arousal, stress, or attentional engagement. This makes it particularly useful for assessing moment-by-moment reactions to film scenes, trailers, or user experiences. For example, emotionally intense or suspenseful scenes typically elicit higher GSR amplitudes, indicating stronger engagement (Sequeira et al., 2009; Bradley & Lang, 2000).
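In practice, GSR analysis often reduces to detecting skin conductance responses (SCRs) as peaks in the phasic component of the signal. A minimal sketch on a synthetic trace, with an assumed prominence threshold and an idealized (known) tonic level:

```python
import numpy as np
from scipy.signal import find_peaks

fs = 32  # a typical GSR sampling rate, in Hz
t = np.arange(0, 60, 1 / fs)
# Synthetic EDA trace: a slowly drifting tonic level plus three phasic
# responses (SCRs) triggered at 10 s, 25 s, and 40 s.
tonic = 5.0 + 0.01 * t

def scr(onset):
    """Idealized SCR shape: a step up followed by an exponential decay."""
    return np.where(t > onset, 0.5 * np.exp(-(t - onset) / 3.0), 0.0)

eda = tonic + scr(10) + scr(25) + scr(40)

# In real data the tonic level must be estimated (e.g., by low-pass filtering);
# here it is known by construction. The prominence threshold (0.1 uS) is an
# assumed value separating true responses from noise.
phasic = eda - tonic
peaks, _ = find_peaks(phasic, prominence=0.1)
scr_times = t[peaks]  # onsets of detected SCRs, in seconds
```

Aligning `scr_times` with scene timestamps is what lets researchers say which moments of a film elicited arousal responses.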
Heart Rate (HR) is a robust physiological marker of arousal, influenced by both sympathetic and parasympathetic input. When an individual is emotionally engaged or stressed, HR typically increases due to sympathetic activation. Conversely, during calm or pleasant experiences, the parasympathetic system dominates, reducing HR. Heart Rate Variability (HRV) refers to the variation in time intervals between heartbeats. It reflects the dynamic balance between the SNS and PNS. High HRV is generally associated with emotional regulation, resilience, and relaxed attentional states (Thayer et al., 2012). In neurometric applications, HRV can help differentiate emotional valence: for instance, a high HRV may indicate a relaxed enjoyment of a scene, whereas a sharp HR increase with low variability may reflect tension or stress.
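The standard time-domain HRV metrics are straightforward to compute from a series of inter-beat (RR) intervals. A sketch with hypothetical RR data contrasting a relaxed viewer with a tense one:

```python
import numpy as np

def hrv_metrics(rr_ms):
    """Mean HR (bpm), SDNN and RMSSD (both ms) from inter-beat (RR) intervals."""
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                       # overall variability
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # short-term, vagally mediated variability
    mean_hr = 60_000.0 / rr.mean()              # beats per minute
    return mean_hr, sdnn, rmssd

# Hypothetical RR series: a relaxed viewer (slow, variable heartbeat) versus
# a tense viewer (fast, nearly constant heartbeat).
relaxed = [820, 790, 860, 810, 880, 800, 840]
tense = [610, 605, 615, 608, 612, 606, 611]

hr_r, sdnn_r, rmssd_r = hrv_metrics(relaxed)
hr_t, sdnn_t, rmssd_t = hrv_metrics(tense)
```

The pattern described in the text falls out directly: the tense series shows a higher heart rate with lower variability, the relaxed series the opposite.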
Respiration is another ANS-controlled process. Emotional states affect not only how fast we breathe but also how deeply and rhythmically. Stress, fear, or excitement may lead to faster, shallower breaths, whereas slow, deep breathing is typical of relaxed states (Philippot et al., 2002). Although less commonly used in film-related neurometrics, respiration patterns can enhance multimodal assessments by providing additional context to HR and GSR signals.
Pupillometry, the study of pupil size, is a sensitive marker of mental effort and emotional arousal. The pupils dilate under cognitive load or emotional stimulation—even in constant lighting—due to sympathetic activation. This phenomenon has been leveraged to assess attention and interest during film watching or advertising exposure (Bradley et al., 2008). Pupil dilation is often recorded via eye-tracking systems and complements visual attention metrics such as fixation and gaze patterns. For example, larger pupils while watching a specific scene may suggest cognitive or emotional intensity.
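Pupil responses are conventionally expressed relative to a pre-stimulus baseline so that trials and participants are comparable. A minimal sketch with a synthetic trial and an assumed 1 s baseline window:

```python
import numpy as np

def pupil_dilation(trace_mm, fs, baseline_s=1.0):
    """Baseline-corrected pupil response: subtract the mean diameter of the
    pre-stimulus window, so positive values mean dilation."""
    trace = np.asarray(trace_mm, dtype=float)
    baseline = trace[: int(baseline_s * fs)].mean()
    return trace - baseline

fs = 60  # eye trackers commonly sample pupil size at 60 Hz or more
t = np.arange(0, 5, 1 / fs)
# Synthetic trial: 1 s of steady baseline, then a slow dilation after stimulus
# onset that approaches its peak over the following seconds.
trace = 3.0 + np.where(t > 1.0, 0.4 * (1 - np.exp(-(t - 1.0) / 0.8)), 0.0)

response = pupil_dilation(trace, fs)
peak_dilation = response.max()  # in mm, relative to baseline
```

In a real study, `peak_dilation` (or the mean response in a post-onset window) would be compared across scenes or conditions, ideally under constant lighting.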
Though less commonly used than GSR and HR, skin temperature and blood volume pulse (BVP) also reflect ANS activity. Stress and fear often result in peripheral vasoconstriction, reducing skin temperature, particularly in the extremities. These measures can be captured via thermistors or photoplethysmography (PPG), adding another layer of emotional insight.
While neurological and autonomic signals offer insights into internal cognitive and emotional states, ocular metrics provide a direct window into visual attention and mental processing. The eyes are not only the organ of vision, but also an external indicator of what the brain is focusing on, how it processes information, and how engaged it is. In neurometrics, eye tracking has become a foundational tool for evaluating user experience, attention, and affective responses to visual stimuli—especially in film, advertising, gaming, and immersive media.
Ocular metrics offer several advantages: they are non-invasive, highly precise, and often intuitive to interpret. In the context of film production, they allow creators to understand what viewers look at, for how long, and in what sequence—providing critical feedback on visual saliency, narrative design, and emotional pacing.
Time to First Fixation (TTFF) is the time (in seconds) that elapses between the onset of a visual stimulus and the moment a viewer first fixates on a predefined Area of Interest (AOI).
A shorter TTFF typically indicates that the AOI is visually salient or cognitively prioritized. For instance, in a film scene, if viewers consistently fixate first on a character’s face rather than on the background, this suggests effective visual hierarchy and narrative focus (Mancini et al., 2022a).
TTFF is especially relevant in evaluating editing choices, scene composition, and graphic design in trailers, commercials, and title sequences.
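Computing TTFF from fixation data is simple once fixations are labeled with onsets and coordinates. A sketch assuming a rectangular AOI and a hypothetical fixation list (names and values are illustrative):

```python
def time_to_first_fixation(fixations, aoi):
    """TTFF in seconds: onset of the first fixation that lands inside the AOI.

    `fixations`: list of (onset_s, x, y, duration_s) tuples in temporal order.
    `aoi`: (x_min, y_min, x_max, y_max) rectangle in screen coordinates.
    Returns None if the AOI was never fixated.
    """
    x_min, y_min, x_max, y_max = aoi
    for onset, x, y, _duration in fixations:
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return onset
    return None

# Hypothetical trial: the viewer fixates the background twice before landing
# on the actor's face (the AOI) 1.2 s after scene onset.
face_aoi = (600, 200, 800, 400)
fixations = [
    (0.0, 100, 500, 0.3),   # background
    (0.5, 300, 450, 0.4),   # background
    (1.2, 700, 300, 0.6),   # face -> first AOI fixation
    (2.0, 710, 310, 0.5),   # face again
]
ttff = time_to_first_fixation(fixations, face_aoi)  # -> 1.2
```

Averaging TTFF over many viewers is what turns this per-trial number into evidence about visual hierarchy.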
The Eye Ball Metric (EBM)—also referred to as fixation count or percentage of viewers fixating on an AOI—quantifies how many users visually attended to a specific element. For example, if 85% of viewers looked at a logo within the first 5 seconds of a commercial, it indicates strong visual capture.
This metric is key in branding, interface testing, and set design evaluation. High EBM values suggest successful visual targeting—crucial for both artistic and marketing goals in audiovisual content (Mancini et al., 2022b).
Dwell time, also known as gaze duration, refers to the total time a participant spends looking at an AOI during a trial or scene. It is one of the most commonly used metrics in attention research.
Longer dwell time usually implies higher interest, cognitive engagement, or emotional relevance. However, it may also reflect confusion or complexity—hence the importance of interpreting this metric in context, ideally in combination with EEG or GSR data (Cherubino et al., 2019).
In filmmaking, dwell time can reveal the narrative pull of a character or the visual strength of a key prop or logo.
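Dwell time can be computed from the same kind of fixation records used for TTFF, by summing the durations of all fixations that land inside the AOI. A sketch with hypothetical data:

```python
def dwell_time(fixations, aoi):
    """Total gaze duration (s) on an AOI: sum of the durations of all
    fixations whose coordinates fall inside it.

    `fixations`: list of (onset_s, x, y, duration_s) tuples.
    `aoi`: (x_min, y_min, x_max, y_max) rectangle in screen coordinates.
    """
    x_min, y_min, x_max, y_max = aoi
    return sum(d for _onset, x, y, d in fixations
               if x_min <= x <= x_max and y_min <= y <= y_max)

# Hypothetical scene: the viewer returns to a product logo (the AOI)
# three times, for 1.5 s of total dwell time.
logo_aoi = (50, 50, 200, 120)
fixations = [
    (0.2, 100, 80, 0.6),    # logo
    (1.0, 640, 360, 0.8),   # elsewhere
    (2.1, 120, 90, 0.5),    # logo
    (3.0, 150, 100, 0.4),   # logo
]
total = dwell_time(fixations, logo_aoi)  # -> 1.5
```

Whether 1.5 s reflects interest or confusion is exactly the ambiguity the text warns about, which is why dwell time is best read alongside EEG or GSR data.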
Scanpaths represent the sequence of fixations and saccades (rapid eye movements) during viewing. They provide a dynamic map of how visual exploration unfolds over time.
By analyzing scanpaths, one can assess whether viewers follow the intended visual flow of a scene or are distracted by peripheral elements. In narrative filmmaking, coherent scanpaths across participants may indicate successful visual storytelling, while scattered patterns may suggest overly complex or unclear design (Noton & Stark, 1971; Duchowski, 2007).
As discussed in the chapter on ANS indicators, pupil dilation is not only a physiological reaction but also a cognitive and emotional marker. Larger pupils are associated with mental effort, surprise, and emotional arousal (Bradley et al., 2008).
Pupil size changes can be tracked simultaneously with gaze position, offering a nuanced view of both where and how users respond to a stimulus. For instance, a dramatic plot twist or jump scare might trigger simultaneous dilation and fixation clustering, pointing to both attention and affective intensity.
Eye blinks—especially their frequency and duration—are influenced by cognitive load and attentional states. Lower blink rates are typically observed during periods of focused attention, while higher rates may signal mental fatigue, boredom, or cognitive overload (Stern et al., 1994).
In long-format content such as films or documentaries, changes in blink rate over time can help map viewer engagement curves and identify drop-off moments.
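One simple way to build such an engagement curve is to count blinks in consecutive time windows. A sketch with hypothetical blink timestamps and an assumed 60 s window:

```python
import numpy as np

def blink_rate_curve(blink_times_s, total_s, window_s=60.0):
    """Blinks per minute in consecutive windows, as a coarse engagement curve."""
    edges = np.arange(0.0, total_s + window_s, window_s)
    counts, _ = np.histogram(blink_times_s, bins=edges)
    return counts * (60.0 / window_s)  # normalize to blinks per minute

# Hypothetical session: sparse blinks during a gripping first minute,
# frequent blinks during a slower second minute.
blinks = [5, 40, 61, 70, 78, 85, 95, 105, 112, 118]
curve = blink_rate_curve(blinks, total_s=120)  # blinks/min per window
```

A rising curve segment would flag a candidate drop-off moment, to be cross-checked against the other signals discussed above.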
The data sources listed here should not be regarded as alternatives; on the contrary, their insightfulness is greatly enhanced when they are combined, in what is known as a multimodal approach.
For instance, with respect to attention, it's essential to distinguish between visual attention (what the eyes look at) and cognitive attention (what the brain processes deeply). Eye-tracking provides observable evidence of attention, while EEG or fNIRS measure its internal processing counterpart. For example: a viewer may look at an actor’s face (eye tracking), but EEG may show low engagement—suggesting automatic viewing without emotional or cognitive investment. Conversely, sustained dwell time and pupil dilation combined with theta EEG activity may indicate deep narrative immersion. This highlights the benefit of combining ocular metrics with neural or autonomic signals for a comprehensive understanding of audience experience.
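A minimal way to combine modalities is to normalize each per-scene measure to z-scores and average them, so that no single unit (seconds, microsiemens, index values) dominates. A sketch with hypothetical per-scene values; the measure names and numbers are illustrative only:

```python
import numpy as np

def zscore(x):
    """Standardize a measure so different modalities share a common scale."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

# Hypothetical per-scene measurements from three modalities: dwell time on
# the protagonist (s), an EEG engagement index, and GSR peak amplitude (uS).
dwell = [1.2, 0.9, 3.4, 1.0]
eeg_engagement = [0.4, 0.3, 0.9, 0.5]
gsr_amplitude = [0.05, 0.02, 0.30, 0.04]

# Simple fusion: z-score each modality, then average across modalities.
combined = np.mean(
    [zscore(dwell), zscore(eeg_engagement), zscore(gsr_amplitude)], axis=0
)
most_engaging_scene = int(np.argmax(combined))  # -> 2 (third scene, zero-indexed)
```

Real multimodal pipelines use more principled fusion (temporal alignment, weighting, statistical modeling), but the core idea of bringing signals onto a common scale before combining them is the same.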
Next Lesson
Collecting and Processing Physiological Data
Learn how collecting neurometric data goes far beyond hooking up sensors. It requires reliable setups, careful experimental design, and attention to participant comfort and ethics so that raw signals truly reflect authentic audience experience.