annadeichler committed · verified · Commit c30741c · Parent(s): 90b26c6

Create README.md

Files changed (1): README.md (+64 −0)
# Dataset Card for Gaze-Speech Analysis in Referential Communication with ARIA Headset

## Dataset Details

### Dataset Description

This dataset investigates the temporal synchronization of gaze and speech, recorded with Aria smart glasses, to determine whether individuals align their visual attention and verbal references when identifying an object. Participants were tasked with identifying food items from a recipe while wearing Aria glasses, which recorded their eye movements and speech in real time. The dataset provides insight into gaze-speech synchronization patterns in referential communication.

- **Curated by:** KTH Royal Institute of Technology
- **Language(s) (NLP):** English
- **License:** CC BY-NC-ND 4.0 ([Link](https://creativecommons.org/licenses/by-nc-nd/4.0/))

### Dataset Sources

TBA

### Direct Use

This dataset is suitable for research in:
- Referential communication analysis
- Gaze and speech synchronization
- Human-robot interaction and multimodal dialogue systems
- Eye-tracking studies in task-based environments

### Out-of-Scope Use

- The dataset is not intended for commercial applications without proper ethical considerations.
- Misuse in contexts where privacy-sensitive information might be inferred or manipulated should be avoided.

## Dataset Structure

- **Participants:** 20 individuals (2 men, 18 women).
- **Data Collection Setup:** Participants memorized the ingredients and steps of five recipes and verbally described the steps while wearing ARIA glasses.
- **Recorded Data:** Eye movements (gaze tracking) and speech (audio recordings), alongside third-person view video from a GoPro camera.
- **Analysis Methods:** Python-based temporal correlation detection, with helper functions to plot gaze fixations and track objects.
37
+ ## Dataset Creation
38
+
39
+ ### Curation Rationale
40
+
41
+ The dataset was created to explore how gaze and speech synchronize in referential communication and whether object location influences this synchronization.
42
+
43
+ ### Source Data
44
+
45
+ #### Data Collection and Processing
46
+
47
+ - **Hardware:** ARIA smart glasses, GoPro camera
48
+ - **Collection Method:** Participants wore ARIA glasses while describing recipe ingredients and steps, allowing real-time capture of gaze and verbal utterances.

#### Who are the source data producers?

- KTH students involved in the project: Gong, Yanliang; Hafsteinsdóttir, Kristín; He, Yiyan; Lin, Wei-Jun; Lindh, Matilda; Liu, Tianyun; Lu, Yu; Yan, Jingyi; Zhang, Ruopeng; Zhang, Yulu

### Annotations

#### Annotation process

- Temporal correlation between gaze and speech was detected using Python scripts.