Update README.md
README.md CHANGED
@@ -24,6 +24,7 @@ configs:
 
 <!-- Provide a longer summary of what this dataset is. -->
 We introduce ViewSpatial-Bench, a comprehensive benchmark with over 5,700 question-answer pairs across 1,000+ 3D scenes from ScanNet and MS-COCO validation sets. This benchmark evaluates VLMs' spatial localization capabilities from multiple perspectives, specifically testing both egocentric (camera) and allocentric (human subject) viewpoints across five distinct task types.
+
 ViewSpatial-Bench addresses a critical gap: while VLMs excel at spatial reasoning from their own perspective, they struggle with perspective-taking—adopting another entity's spatial frame of reference—which is essential for embodied interaction and multi-agent collaboration.
 - **Language(s) (NLP):** en
 - **License:** apache-2.0
@@ -78,7 +79,7 @@ We provide benchmark results for various open-source models as well as **GPT-4o
     <td>34.98</td>
   </tr>
   <tr>
-    <td>Qwen2.5-VL (3B)
+    <td>Qwen2.5-VL (3B)</td>
     <td>43.43</td><td>33.33</td><td>39.80</td>
     <td>39.16</td><td>28.62</td><td>28.51</td><td>32.14</td>
     <td>35.85</td>
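Beyond the diff itself, a minimal sketch of how one might load the benchmark described above with the Hugging Face `datasets` library. The repo ID `ViewSpatial-Bench/ViewSpatial-Bench` and the `test` split name are placeholders for illustration and are not confirmed by this commit.

```python
from datasets import load_dataset

# Placeholder repo ID and split; substitute the actual Hub ID for ViewSpatial-Bench.
ds = load_dataset("ViewSpatial-Bench/ViewSpatial-Bench", split="test")

print(ds.column_names)  # inspect the available fields
print(ds[0])            # look at one question-answer pair
```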