Dataset Card for Gaze-Speech Analysis in Referential Communication with ARIA Headset

Dataset Description

This dataset investigates the synchronization of eye gaze and speech, recorded with Aria smart glasses, to determine whether individuals exhibit visual and verbal synchronization when identifying an object. Participants were tasked with identifying food items from a recipe while wearing Aria glasses, which recorded their eye movements and speech in real time. The dataset provides insight into gaze-speech synchronization patterns in referential communication.

  • Curated by: KTH Royal Institute of Technology
  • Language(s) (NLP): English
  • License: CC BY-NC-ND 4.0 (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Dataset Details

  • Total duration: 2.259 h
  • Number of takes: 96
  • Average take duration: 84.7 s

Example from dataset.

Dataset Sources

TBA

Direct Use

This dataset is suitable for research in:

  • Referential communication analysis
  • Gaze and speech synchronization
  • Human-robot interaction and multimodal dialogue systems
  • Eye-tracking studies in task-based environments

Out-of-Scope Use

  • The dataset is not intended for commercial applications without proper ethical considerations.
  • The dataset must not be used in contexts where privacy-sensitive information could be inferred or manipulated.

Dataset Structure

  • Participants: 20 individuals (2 men, 18 women).
  • Data Collection Setup: Participants memorized the ingredients and steps of five recipes and verbally instructed the steps while wearing Aria glasses.
  • Recorded Data: Eye movements (gaze tracking) and speech (audio recordings), alongside a third-person-view camera.
  • Analysis Methods: Python-based temporal correlation detection, with helper functions to plot gaze fixations and track objects (see the fixation-detection sketch after this list).
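
As one illustration of the plotting helpers mentioned above, here is a minimal sketch of dispersion-based (I-DT) fixation detection over raw gaze samples. The function name, sample layout, and thresholds are assumptions for illustration; the provided script may use a different method.

```python
import numpy as np

def detect_fixations(t, x, y, max_dispersion=0.05, min_duration=0.10):
    """Group gaze samples into fixations with a dispersion threshold (I-DT).

    t: timestamps in seconds; x, y: normalized gaze coordinates.
    Returns (start_s, end_s, center_x, center_y) per fixation.
    """
    fixations = []
    i, n = 0, len(t)
    while i < n:
        j = i
        # Grow the window while its spatial dispersion stays small.
        while j + 1 < n:
            wx, wy = x[i:j + 2], y[i:j + 2]
            if (wx.max() - wx.min()) + (wy.max() - wy.min()) > max_dispersion:
                break
            j += 1
        if t[j] - t[i] >= min_duration:
            fixations.append((t[i], t[j], x[i:j + 1].mean(), y[i:j + 1].mean()))
            i = j + 1  # continue after this fixation
        else:
            i += 1     # no fixation starting here; slide forward
    return fixations

# Synthetic 100 Hz gaze trace hovering near the image center.
rng = np.random.default_rng(0)
t = np.arange(0.0, 2.0, 0.01)
x = 0.5 + 0.002 * rng.standard_normal(t.size)
y = 0.5 + 0.002 * rng.standard_normal(t.size)
for start, end, cx, cy in detect_fixations(t, x, y):
    print(f"fixation {start:.2f}-{end:.2f} s at ({cx:.3f}, {cy:.3f})")
```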

Dataset Creation

Curation Rationale

The dataset was created to explore how gaze and speech synchronize in referential communication and whether object location influences this synchronization.

Source Data

Data Collection and Processing

  • Hardware: Aria smart glasses, GoPro camera
  • Collection Method: Participants wore Aria glasses while describing recipe ingredients and steps, allowing real-time capture of gaze and verbal utterances.

Who are the source data producers?

  • KTH students involved in the project: Gong, Yanliang; Hafsteinsdóttir, Kristín; He, Yiyan; Lin, Wei-Jun; Lindh, Matilda; Liu, Tianyun; Lu, Yu; Yan, Jingyi; Zhang, Ruopeng; Zhang, Yulu

Dataset Files

  • Audio (.wav)
  • Utterances (.txt)
  • First-person video feed (.mp4)
  • Gaze fixations, generated by running the provided Python script (a loading sketch for these files follows this list).
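
The sketch below shows one way to load a single take's audio and utterance files with the Python standard library. The directory layout and file names are assumptions, not the actual release structure.

```python
from pathlib import Path
import wave

take_dir = Path("take_01")  # hypothetical per-take folder

# Audio (.wav): read duration with the standard-library wave module.
with wave.open(str(take_dir / "audio.wav"), "rb") as wav:
    duration_s = wav.getnframes() / wav.getframerate()

# Utterances (.txt): assumed one transcribed utterance per line.
utterances = (take_dir / "utterances.txt").read_text(encoding="utf-8").splitlines()

print(f"{duration_s:.1f} s of audio, {len(utterances)} utterances")
```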

Annotation process

  • Temporal correlation between gaze fixations and speech was detected using Python scripts (a hedged sketch of one such measure follows).
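
A minimal sketch of one such gaze-speech measure: for each spoken reference, find the most recent fixation on the named object and record the onset lag. The tuple formats and labels are illustrative assumptions, not the dataset's actual annotation format.

```python
def gaze_speech_lags(fixations, words):
    """fixations: (onset_s, offset_s, object_label) tuples;
    words: (onset_s, object_label) referring expressions.
    Returns word onset minus fixation onset per word, or None if no
    fixation on the object precedes the word."""
    lags = []
    for w_onset, label in words:
        prior = [f for f in fixations if f[2] == label and f[0] <= w_onset]
        if prior:
            latest = max(prior, key=lambda f: f[0])  # most recent fixation
            lags.append(w_onset - latest[0])
        else:
            lags.append(None)
    return lags

# Example: gaze lands on "tomato" about 0.6 s before the word is spoken.
fixs = [(2.0, 2.4, "tomato"), (3.1, 3.5, "onion")]
words = [(2.6, "tomato"), (3.9, "onion")]
print(gaze_speech_lags(fixs, words))  # approximately [0.6, 0.8]
```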