FranciscoLozDataScience committed
Commit e415837 · verified · 1 Parent(s): 87b2d66

update dataset card

Files changed (1):
  1. README.md (+40, -0)
README.md CHANGED
@@ -54,3 +54,43 @@ configs:
  - split: test
    path: data/test-*
---
# INQUIRE-Benchmark-small

<!-- **INQUIRE: A Natural World Text-to-Image Retrieval Benchmark** -->

INQUIRE is a text-to-image retrieval benchmark designed to challenge multimodal models with expert-level queries about the natural world. The dataset aims to emulate the real-world image retrieval and analysis problems faced by scientists working with large-scale image collections. We use this benchmark to evaluate and improve our text-to-image retrieval systems.
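As a rough illustration of the task this benchmark targets (a minimal sketch, not INQUIRE's own evaluation code), a baseline retrieval system embeds the query and the candidate images with a joint vision-language model such as CLIP and ranks images by similarity. The model checkpoint, image paths, and query below are placeholder assumptions:

```python
# Minimal text-to-image retrieval sketch with CLIP (assumed baseline,
# not the official INQUIRE evaluation pipeline).
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

query = "a hermit crab using plastic trash as its shell"  # example expert-style query
images = [Image.open(p) for p in ["img0.jpg", "img1.jpg"]]  # placeholder image paths

inputs = processor(text=[query], images=images, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_text[0][i] is the query's similarity to image i; sort high-to-low.
ranking = outputs.logits_per_text[0].argsort(descending=True)
print(ranking.tolist())
```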
**Dataset Details**

This dataset was built from [**INQUIRE-Rerank**](https://huggingface.co/datasets/evendrow/INQUIRE-Rerank) with additional modifications. Please refer to [modify_inquire_rerank.ipynb](https://huggingface.co/datasets/sagecontinuum/INQUIRE-Benchmark-small/blob/main/modify_inquire_rerank.ipynb) to see the modifications we made.
**Loading the Dataset**

To load the dataset with the Hugging Face `datasets` library, first `pip install datasets`, then run the following code:

```python
from datasets import load_dataset

inquire = load_dataset("sagecontinuum/INQUIRE-Benchmark-small", split="validation")  # or "test"
```
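A quick way to check what was loaded, using only the generic `datasets` API (the column names printed will depend on this dataset's schema):

```python
# Inspect the loaded split: size, schema, and a sample row.
print(inquire)               # row count and feature names
print(inquire.column_names)  # available columns
print(inquire[0])            # first example as a plain dict
```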
**Additional Details**

For additional details, see the original INQUIRE paper and project resources:

- [🌐 Website](https://inquire-benchmark.github.io/)
- [📖 Paper](https://arxiv.org/abs/2411.02537)
- [GitHub](https://github.com/inquire-benchmark/INQUIRE)
**Citations**

```bibtex
@article{vendrow2024inquire,
  title={INQUIRE: A Natural World Text-to-Image Retrieval Benchmark},
  author={Vendrow, Edward and Pantazis, Omiros and Shepard, Alexander and Brostow, Gabriel and Jones, Kate E and Mac Aodha, Oisin and Beery, Sara and Van Horn, Grant},
  journal={NeurIPS},
  year={2024}
}
```