EEG Face Emotion

The EEG Face Emotion task aims to understand how the developing brain responds differentially to viewing faces with different facial expressions (happy and fearful). In the task, young children (from 10 months onwards) passively watch pictures of happy or fearful faces while their electroencephalogram (EEG) is recorded. Note that the same faces, but with neutral expressions, are presented in the EEG Face House experiment. The task lasts about 4 minutes.

Continuous EEG was recorded using a 32-channel BioSemi ActiveTwo system (BioSemi, Amsterdam, the Netherlands), configured according to the standard international 10-20 system (28 lateral channels: Fp1/2, F7/8, F3/4, AF3/4, FC1/2, FC5/6, C3/4, T7/8, CP1/2, CP5/6, P3/4, P7/8, O1/2, PO3/4; plus 4 midline channels: Fz, Cz, Pz, Oz). For some children, an additional eye electrode was placed behind the child’s left eye (Ex3), and/or additional loose electrodes were positioned at the mastoids (Ex1-2). Electrode offsets were kept below 20 μV. The EEG data were recorded relative to common mode sense and driven right leg (CMS/DRL) electrodes placed near Cz. Continuous EEG was acquired at a 2048 Hz sample rate using ActiView (version 7.05) on a Dell Latitude E5540 laptop running Windows 10 Professional (lab 0.41: Intel Core i5-4310U CPU @ 2.00 GHz, 8 GB RAM; lab 0.42: Intel Core i3-4010U CPU @ 1.70 GHz, 4 GB RAM). Tasks were programmed in MATLAB using Psychtoolbox-3 (Brainard, 1997) and presented from a second laptop (13-inch MacBook Pro 11,1 with Retina display, OS X 10.9.5, Intel Core i7 2.8 GHz, 16 GB 1600 MHz DDR3).
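
For researchers working with these recordings, the sketch below shows one way to load such a BioSemi recording with MNE-Python, assuming the data are available as a BDF export; the file name is hypothetical and not part of this dataset's documentation.

import mne

# A minimal sketch, assuming the recording is a BioSemi .bdf file;
# the file name below is hypothetical.
raw = mne.io.read_raw_bdf("sub-01_task-faceemotion_eeg.bdf", preload=True)

# Apply the standard BioSemi 32-channel montage, which matches the 10-20
# layout described above; any extra external channels (e.g. Ex1-Ex3) that
# are not in the montage are simply left without positions.
montage = mne.channels.make_standard_montage("biosemi32")
raw.set_montage(montage, on_missing="ignore")

print(raw.info["sfreq"])  # expected: 2048.0 Hz, per the acquisition settings above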

Waves
  • Baby and Child cohort
    • 10 months (C): 9 months – 16 months; data collected from 2016-08-22 to 2023-11-01
    • 3 years (C): 24 months – 5.0 years; data collected from 2018-06-22 to 2023-11-01
    • 6 years (C): 5.0 – 7.0 years; data collected from 2022-07-01 to 2023-11-01

    Mode of collection: MeasurementsAndTests (EEG)
    Analysis unit: Individual
    Instrument name: EEG Face Emotion
    Measure name: EEG Face Emotion
    You can also access this dataset using the API (see API Docs).