Gaze Cueing experiment

The Gaze Cueing experiment is an eye-tracking task that measures a participant's sensitivity to another person's gaze direction as a cue for predicting the location of an upcoming event. Sensitivity to gaze direction is taken as a marker of social competence. In the Social Gaze Task, children see a face with direct gaze, followed by a gaze shift to one side, followed by a small object (the target) that appears either on the cued side or on the opposite side. Eye movements are recorded with a Tobii TX300 eye tracker at a sampling rate of 300 Hz. The dependent variable is the latency with which the child detects the target. Generally, people detect targets on the cued side faster than targets on the opposite side, and the reaction time difference between cued and opposite-side targets has been taken to reflect better social skill.
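The cueing effect is typically quantified as the difference in mean reaction time between opposite-side and cued trials. A minimal sketch of that computation, using made-up example values rather than YOUth data:

```python
import statistics

# Hypothetical trial records: (condition, reaction time in ms).
# Conditions and values are illustrative, not taken from the dataset.
trials = [
    ("cued", 310), ("cued", 295), ("cued", 330),
    ("opposite", 360), ("opposite", 345), ("opposite", 380),
]

def cueing_effect(trials):
    """Mean RT on opposite-side trials minus mean RT on cued trials.

    A positive value means cued targets were detected faster.
    """
    cued = [rt for cond, rt in trials if cond == "cued"]
    opposite = [rt for cond, rt in trials if cond == "opposite"]
    return statistics.mean(opposite) - statistics.mean(cued)

print(cueing_effect(trials))  # positive here: cued targets detected faster
```

A per-participant cueing effect computed this way is what analyses of individual differences in social attention would typically correlate with other measures.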

In YOUth, the task is available in both the Baby and Child cohort (Infant Social Gaze) and the Child and Adolescent cohort (Child Social Gaze). In the Baby and Child cohort, infants received no instruction, whereas in the Child and Adolescent cohort, children were instructed to first look at the face and then at the cued object as quickly as possible. Data collection in the Child and Adolescent cohort stopped in May 2022.

Waves

Baby and Child
  • 5 months (age 4 months - 7 months; collected 2016-05-27 to 2023-11-01)
  • 10 months (age 9 months - 16 months; collected 2016-08-22 to 2023-11-01)
  • 3 years (age 24 months - 5.0 years; collected 2018-06-22 to 2023-11-01)
  • 6 years (age 5.0 - 7.0 years; collected 2022-07-01 to 2023-11-01)

Child and Adolescent
  • 9 years (age 8.0 - 10.0 years; collected 2016-03-14 to 2020-04-14)
  • 12 years (age 11.0 - 16.0 years; collected 2019-07-04 to 2022-12-01)

Mode of collection: Measurements and tests (eye tracking)
Analysis unit: Individual
Instrument name: Social Gaze Task
Alternate names: Child social gaze, Infant social gaze
Measure name: Gaze Cueing experiment
    You can also access this dataset using the API (see API Docs).