Commit 440ead8 by zeroshot (parent: a741025)

Update README.md

Files changed (1): README.md (+1058 −0)
@@ -48,6 +48,110 @@ model-index:
       value: 40.988
     - type: f1
       value: 40.776679545648506
+  - task:
+      type: Retrieval
+    dataset:
+      type: arguana
+      name: MTEB ArguAna
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 26.101999999999997
+    - type: map_at_10
+      value: 40.754000000000005
+    - type: map_at_100
+      value: 41.83
+    - type: map_at_1000
+      value: 41.845
+    - type: map_at_3
+      value: 36.178
+    - type: map_at_5
+      value: 38.646
+    - type: mrr_at_1
+      value: 26.6
+    - type: mrr_at_10
+      value: 40.934
+    - type: mrr_at_100
+      value: 42.015
+    - type: mrr_at_1000
+      value: 42.03
+    - type: mrr_at_3
+      value: 36.344
+    - type: mrr_at_5
+      value: 38.848
+    - type: ndcg_at_1
+      value: 26.101999999999997
+    - type: ndcg_at_10
+      value: 49.126999999999995
+    - type: ndcg_at_100
+      value: 53.815999999999995
+    - type: ndcg_at_1000
+      value: 54.178000000000004
+    - type: ndcg_at_3
+      value: 39.607
+    - type: ndcg_at_5
+      value: 44.086999999999996
+    - type: precision_at_1
+      value: 26.101999999999997
+    - type: precision_at_10
+      value: 7.596
+    - type: precision_at_100
+      value: 0.967
+    - type: precision_at_1000
+      value: 0.099
+    - type: precision_at_3
+      value: 16.524
+    - type: precision_at_5
+      value: 12.105
+    - type: recall_at_1
+      value: 26.101999999999997
+    - type: recall_at_10
+      value: 75.96000000000001
+    - type: recall_at_100
+      value: 96.65700000000001
+    - type: recall_at_1000
+      value: 99.431
+    - type: recall_at_3
+      value: 49.573
+    - type: recall_at_5
+      value: 60.526
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/arxiv-clustering-p2p
+      name: MTEB ArxivClusteringP2P
+      config: default
+      split: test
+      revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
+    metrics:
+    - type: v_measure
+      value: 43.10651535441929
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/arxiv-clustering-s2s
+      name: MTEB ArxivClusteringS2S
+      config: default
+      split: test
+      revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
+    metrics:
+    - type: v_measure
+      value: 34.41095293826606
+  - task:
+      type: Reranking
+    dataset:
+      type: mteb/askubuntudupquestions-reranking
+      name: MTEB AskUbuntuDupQuestions
+      config: default
+      split: test
+      revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
+    metrics:
+    - type: map
+      value: 56.96575970919239
+    - type: mrr
+      value: 69.92503187794047
   - task:
       type: STS
     dataset:
@@ -82,6 +186,856 @@ model-index:
       value: 81.25
     - type: f1
       value: 81.20841448916138
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/biorxiv-clustering-p2p
+      name: MTEB BiorxivClusteringP2P
+      config: default
+      split: test
+      revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
+    metrics:
+    - type: v_measure
+      value: 34.69545244587236
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/biorxiv-clustering-s2s
+      name: MTEB BiorxivClusteringS2S
+      config: default
+      split: test
+      revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
+    metrics:
+    - type: v_measure
+      value: 28.84301739171936
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackAndroidRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 23.401
+    - type: map_at_10
+      value: 32.451
+    - type: map_at_100
+      value: 33.891
+    - type: map_at_1000
+      value: 34.01
+    - type: map_at_3
+      value: 29.365999999999996
+    - type: map_at_5
+      value: 31.240000000000002
+    - type: mrr_at_1
+      value: 29.9
+    - type: mrr_at_10
+      value: 38.590999999999994
+    - type: mrr_at_100
+      value: 39.587
+    - type: mrr_at_1000
+      value: 39.637
+    - type: mrr_at_3
+      value: 36.028
+    - type: mrr_at_5
+      value: 37.673
+    - type: ndcg_at_1
+      value: 29.9
+    - type: ndcg_at_10
+      value: 38.251000000000005
+    - type: ndcg_at_100
+      value: 44.354
+    - type: ndcg_at_1000
+      value: 46.642
+    - type: ndcg_at_3
+      value: 33.581
+    - type: ndcg_at_5
+      value: 35.96
+    - type: precision_at_1
+      value: 29.9
+    - type: precision_at_10
+      value: 7.439
+    - type: precision_at_100
+      value: 1.28
+    - type: precision_at_1000
+      value: 0.17700000000000002
+    - type: precision_at_3
+      value: 16.404
+    - type: precision_at_5
+      value: 12.046
+    - type: recall_at_1
+      value: 23.401
+    - type: recall_at_10
+      value: 49.305
+    - type: recall_at_100
+      value: 75.885
+    - type: recall_at_1000
+      value: 90.885
+    - type: recall_at_3
+      value: 35.341
+    - type: recall_at_5
+      value: 42.275
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackEnglishRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 22.103
+    - type: map_at_10
+      value: 29.271
+    - type: map_at_100
+      value: 30.151
+    - type: map_at_1000
+      value: 30.276999999999997
+    - type: map_at_3
+      value: 27.289
+    - type: map_at_5
+      value: 28.236
+    - type: mrr_at_1
+      value: 26.943
+    - type: mrr_at_10
+      value: 33.782000000000004
+    - type: mrr_at_100
+      value: 34.459
+    - type: mrr_at_1000
+      value: 34.525
+    - type: mrr_at_3
+      value: 31.985000000000003
+    - type: mrr_at_5
+      value: 32.909
+    - type: ndcg_at_1
+      value: 26.943
+    - type: ndcg_at_10
+      value: 33.616
+    - type: ndcg_at_100
+      value: 37.669000000000004
+    - type: ndcg_at_1000
+      value: 40.247
+    - type: ndcg_at_3
+      value: 30.482
+    - type: ndcg_at_5
+      value: 31.615
+    - type: precision_at_1
+      value: 26.943
+    - type: precision_at_10
+      value: 6.146
+    - type: precision_at_100
+      value: 1.038
+    - type: precision_at_1000
+      value: 0.151
+    - type: precision_at_3
+      value: 14.521999999999998
+    - type: precision_at_5
+      value: 10.038
+    - type: recall_at_1
+      value: 22.103
+    - type: recall_at_10
+      value: 41.754999999999995
+    - type: recall_at_100
+      value: 59.636
+    - type: recall_at_1000
+      value: 76.801
+    - type: recall_at_3
+      value: 32.285000000000004
+    - type: recall_at_5
+      value: 35.684
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackGamingRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 32.565
+    - type: map_at_10
+      value: 43.07
+    - type: map_at_100
+      value: 44.102999999999994
+    - type: map_at_1000
+      value: 44.175
+    - type: map_at_3
+      value: 40.245
+    - type: map_at_5
+      value: 41.71
+    - type: mrr_at_1
+      value: 37.429
+    - type: mrr_at_10
+      value: 46.358
+    - type: mrr_at_100
+      value: 47.146
+    - type: mrr_at_1000
+      value: 47.187
+    - type: mrr_at_3
+      value: 44.086
+    - type: mrr_at_5
+      value: 45.318000000000005
+    - type: ndcg_at_1
+      value: 37.429
+    - type: ndcg_at_10
+      value: 48.398
+    - type: ndcg_at_100
+      value: 52.90899999999999
+    - type: ndcg_at_1000
+      value: 54.478
+    - type: ndcg_at_3
+      value: 43.418
+    - type: ndcg_at_5
+      value: 45.578
+    - type: precision_at_1
+      value: 37.429
+    - type: precision_at_10
+      value: 7.856000000000001
+    - type: precision_at_100
+      value: 1.093
+    - type: precision_at_1000
+      value: 0.129
+    - type: precision_at_3
+      value: 19.331
+    - type: precision_at_5
+      value: 13.191
+    - type: recall_at_1
+      value: 32.565
+    - type: recall_at_10
+      value: 61.021
+    - type: recall_at_100
+      value: 81.105
+    - type: recall_at_1000
+      value: 92.251
+    - type: recall_at_3
+      value: 47.637
+    - type: recall_at_5
+      value: 52.871
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackGisRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 18.108
+    - type: map_at_10
+      value: 24.613
+    - type: map_at_100
+      value: 25.624000000000002
+    - type: map_at_1000
+      value: 25.721
+    - type: map_at_3
+      value: 22.271
+    - type: map_at_5
+      value: 23.681
+    - type: mrr_at_1
+      value: 19.435
+    - type: mrr_at_10
+      value: 26.124000000000002
+    - type: mrr_at_100
+      value: 27.07
+    - type: mrr_at_1000
+      value: 27.145999999999997
+    - type: mrr_at_3
+      value: 23.748
+    - type: mrr_at_5
+      value: 25.239
+    - type: ndcg_at_1
+      value: 19.435
+    - type: ndcg_at_10
+      value: 28.632
+    - type: ndcg_at_100
+      value: 33.988
+    - type: ndcg_at_1000
+      value: 36.551
+    - type: ndcg_at_3
+      value: 24.035999999999998
+    - type: ndcg_at_5
+      value: 26.525
+    - type: precision_at_1
+      value: 19.435
+    - type: precision_at_10
+      value: 4.565
+    - type: precision_at_100
+      value: 0.771
+    - type: precision_at_1000
+      value: 0.10200000000000001
+    - type: precision_at_3
+      value: 10.169
+    - type: precision_at_5
+      value: 7.571
+    - type: recall_at_1
+      value: 18.108
+    - type: recall_at_10
+      value: 39.533
+    - type: recall_at_100
+      value: 64.854
+    - type: recall_at_1000
+      value: 84.421
+    - type: recall_at_3
+      value: 27.500000000000004
+    - type: recall_at_5
+      value: 33.314
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackMathematicaRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 11.087
+    - type: map_at_10
+      value: 17.323
+    - type: map_at_100
+      value: 18.569
+    - type: map_at_1000
+      value: 18.694
+    - type: map_at_3
+      value: 15.370000000000001
+    - type: map_at_5
+      value: 16.538
+    - type: mrr_at_1
+      value: 13.557
+    - type: mrr_at_10
+      value: 21.041
+    - type: mrr_at_100
+      value: 22.134
+    - type: mrr_at_1000
+      value: 22.207
+    - type: mrr_at_3
+      value: 18.843
+    - type: mrr_at_5
+      value: 20.236
+    - type: ndcg_at_1
+      value: 13.557
+    - type: ndcg_at_10
+      value: 21.571
+    - type: ndcg_at_100
+      value: 27.678000000000004
+    - type: ndcg_at_1000
+      value: 30.8
+    - type: ndcg_at_3
+      value: 17.922
+    - type: ndcg_at_5
+      value: 19.826
+    - type: precision_at_1
+      value: 13.557
+    - type: precision_at_10
+      value: 4.1290000000000004
+    - type: precision_at_100
+      value: 0.8370000000000001
+    - type: precision_at_1000
+      value: 0.125
+    - type: precision_at_3
+      value: 8.914
+    - type: precision_at_5
+      value: 6.691999999999999
+    - type: recall_at_1
+      value: 11.087
+    - type: recall_at_10
+      value: 30.94
+    - type: recall_at_100
+      value: 57.833999999999996
+    - type: recall_at_1000
+      value: 80.365
+    - type: recall_at_3
+      value: 20.854
+    - type: recall_at_5
+      value: 25.695
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackPhysicsRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 21.708
+    - type: map_at_10
+      value: 30.422
+    - type: map_at_100
+      value: 31.713
+    - type: map_at_1000
+      value: 31.842
+    - type: map_at_3
+      value: 27.424
+    - type: map_at_5
+      value: 29.17
+    - type: mrr_at_1
+      value: 26.756
+    - type: mrr_at_10
+      value: 35.304
+    - type: mrr_at_100
+      value: 36.296
+    - type: mrr_at_1000
+      value: 36.359
+    - type: mrr_at_3
+      value: 32.692
+    - type: mrr_at_5
+      value: 34.288999999999994
+    - type: ndcg_at_1
+      value: 26.756
+    - type: ndcg_at_10
+      value: 35.876000000000005
+    - type: ndcg_at_100
+      value: 41.708
+    - type: ndcg_at_1000
+      value: 44.359
+    - type: ndcg_at_3
+      value: 30.946
+    - type: ndcg_at_5
+      value: 33.404
+    - type: precision_at_1
+      value: 26.756
+    - type: precision_at_10
+      value: 6.795
+    - type: precision_at_100
+      value: 1.138
+    - type: precision_at_1000
+      value: 0.155
+    - type: precision_at_3
+      value: 15.046999999999999
+    - type: precision_at_5
+      value: 10.972
+    - type: recall_at_1
+      value: 21.708
+    - type: recall_at_10
+      value: 47.315000000000005
+    - type: recall_at_100
+      value: 72.313
+    - type: recall_at_1000
+      value: 90.199
+    - type: recall_at_3
+      value: 33.528999999999996
+    - type: recall_at_5
+      value: 39.985
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackProgrammersRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 18.902
+    - type: map_at_10
+      value: 26.166
+    - type: map_at_100
+      value: 27.368
+    - type: map_at_1000
+      value: 27.493000000000002
+    - type: map_at_3
+      value: 23.505000000000003
+    - type: map_at_5
+      value: 25.019000000000002
+    - type: mrr_at_1
+      value: 23.402
+    - type: mrr_at_10
+      value: 30.787
+    - type: mrr_at_100
+      value: 31.735000000000003
+    - type: mrr_at_1000
+      value: 31.806
+    - type: mrr_at_3
+      value: 28.33
+    - type: mrr_at_5
+      value: 29.711
+    - type: ndcg_at_1
+      value: 23.402
+    - type: ndcg_at_10
+      value: 30.971
+    - type: ndcg_at_100
+      value: 36.61
+    - type: ndcg_at_1000
+      value: 39.507999999999996
+    - type: ndcg_at_3
+      value: 26.352999999999998
+    - type: ndcg_at_5
+      value: 28.488000000000003
+    - type: precision_at_1
+      value: 23.402
+    - type: precision_at_10
+      value: 5.799
+    - type: precision_at_100
+      value: 1
+    - type: precision_at_1000
+      value: 0.14100000000000001
+    - type: precision_at_3
+      value: 12.633
+    - type: precision_at_5
+      value: 9.269
+    - type: recall_at_1
+      value: 18.902
+    - type: recall_at_10
+      value: 40.929
+    - type: recall_at_100
+      value: 65.594
+    - type: recall_at_1000
+      value: 85.961
+    - type: recall_at_3
+      value: 28.121000000000002
+    - type: recall_at_5
+      value: 33.638
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackStatsRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 19.168
+    - type: map_at_10
+      value: 25.142999999999997
+    - type: map_at_100
+      value: 25.993
+    - type: map_at_1000
+      value: 26.076
+    - type: map_at_3
+      value: 23.179
+    - type: map_at_5
+      value: 24.322
+    - type: mrr_at_1
+      value: 21.933
+    - type: mrr_at_10
+      value: 27.72
+    - type: mrr_at_100
+      value: 28.518
+    - type: mrr_at_1000
+      value: 28.582
+    - type: mrr_at_3
+      value: 25.791999999999998
+    - type: mrr_at_5
+      value: 26.958
+    - type: ndcg_at_1
+      value: 21.933
+    - type: ndcg_at_10
+      value: 28.866999999999997
+    - type: ndcg_at_100
+      value: 33.285
+    - type: ndcg_at_1000
+      value: 35.591
+    - type: ndcg_at_3
+      value: 25.202999999999996
+    - type: ndcg_at_5
+      value: 27.045
+    - type: precision_at_1
+      value: 21.933
+    - type: precision_at_10
+      value: 4.632
+    - type: precision_at_100
+      value: 0.733
+    - type: precision_at_1000
+      value: 0.101
+    - type: precision_at_3
+      value: 10.992
+    - type: precision_at_5
+      value: 7.853000000000001
+    - type: recall_at_1
+      value: 19.168
+    - type: recall_at_10
+      value: 37.899
+    - type: recall_at_100
+      value: 58.54899999999999
+    - type: recall_at_1000
+      value: 75.666
+    - type: recall_at_3
+      value: 27.831
+    - type: recall_at_5
+      value: 32.336
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackTexRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 12.764000000000001
+    - type: map_at_10
+      value: 17.757
+    - type: map_at_100
+      value: 18.677
+    - type: map_at_1000
+      value: 18.813
+    - type: map_at_3
+      value: 16.151
+    - type: map_at_5
+      value: 16.946
+    - type: mrr_at_1
+      value: 15.726
+    - type: mrr_at_10
+      value: 21.019
+    - type: mrr_at_100
+      value: 21.856
+    - type: mrr_at_1000
+      value: 21.954
+    - type: mrr_at_3
+      value: 19.282
+    - type: mrr_at_5
+      value: 20.189
+    - type: ndcg_at_1
+      value: 15.726
+    - type: ndcg_at_10
+      value: 21.259
+    - type: ndcg_at_100
+      value: 25.868999999999996
+    - type: ndcg_at_1000
+      value: 29.425
+    - type: ndcg_at_3
+      value: 18.204
+    - type: ndcg_at_5
+      value: 19.434
+    - type: precision_at_1
+      value: 15.726
+    - type: precision_at_10
+      value: 3.8920000000000003
+    - type: precision_at_100
+      value: 0.741
+    - type: precision_at_1000
+      value: 0.121
+    - type: precision_at_3
+      value: 8.58
+    - type: precision_at_5
+      value: 6.132
+    - type: recall_at_1
+      value: 12.764000000000001
+    - type: recall_at_10
+      value: 28.639
+    - type: recall_at_100
+      value: 49.639
+    - type: recall_at_1000
+      value: 75.725
+    - type: recall_at_3
+      value: 19.883
+    - type: recall_at_5
+      value: 23.141000000000002
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackUnixRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 18.98
+    - type: map_at_10
+      value: 25.2
+    - type: map_at_100
+      value: 26.279000000000003
+    - type: map_at_1000
+      value: 26.399
+    - type: map_at_3
+      value: 23.399
+    - type: map_at_5
+      value: 24.284
+    - type: mrr_at_1
+      value: 22.015
+    - type: mrr_at_10
+      value: 28.555000000000003
+    - type: mrr_at_100
+      value: 29.497
+    - type: mrr_at_1000
+      value: 29.574
+    - type: mrr_at_3
+      value: 26.788
+    - type: mrr_at_5
+      value: 27.576
+    - type: ndcg_at_1
+      value: 22.015
+    - type: ndcg_at_10
+      value: 29.266
+    - type: ndcg_at_100
+      value: 34.721000000000004
+    - type: ndcg_at_1000
+      value: 37.659
+    - type: ndcg_at_3
+      value: 25.741000000000003
+    - type: ndcg_at_5
+      value: 27.044
+    - type: precision_at_1
+      value: 22.015
+    - type: precision_at_10
+      value: 4.897
+    - type: precision_at_100
+      value: 0.8540000000000001
+    - type: precision_at_1000
+      value: 0.122
+    - type: precision_at_3
+      value: 11.567
+    - type: precision_at_5
+      value: 7.9479999999999995
+    - type: recall_at_1
+      value: 18.98
+    - type: recall_at_10
+      value: 38.411
+    - type: recall_at_100
+      value: 63.164
+    - type: recall_at_1000
+      value: 84.292
+    - type: recall_at_3
+      value: 28.576
+    - type: recall_at_5
+      value: 31.789
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackWebmastersRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 20.372
+    - type: map_at_10
+      value: 27.161
+    - type: map_at_100
+      value: 28.364
+    - type: map_at_1000
+      value: 28.554000000000002
+    - type: map_at_3
+      value: 25.135
+    - type: map_at_5
+      value: 26.200000000000003
+    - type: mrr_at_1
+      value: 24.704
+    - type: mrr_at_10
+      value: 31.219
+    - type: mrr_at_100
+      value: 32.092
+    - type: mrr_at_1000
+      value: 32.181
+    - type: mrr_at_3
+      value: 29.282000000000004
+    - type: mrr_at_5
+      value: 30.359
+    - type: ndcg_at_1
+      value: 24.704
+    - type: ndcg_at_10
+      value: 31.622
+    - type: ndcg_at_100
+      value: 36.917
+    - type: ndcg_at_1000
+      value: 40.357
+    - type: ndcg_at_3
+      value: 28.398
+    - type: ndcg_at_5
+      value: 29.764000000000003
+    - type: precision_at_1
+      value: 24.704
+    - type: precision_at_10
+      value: 5.81
+    - type: precision_at_100
+      value: 1.208
+    - type: precision_at_1000
+      value: 0.209
+    - type: precision_at_3
+      value: 13.241
+    - type: precision_at_5
+      value: 9.407
+    - type: recall_at_1
+      value: 20.372
+    - type: recall_at_10
+      value: 40.053
+    - type: recall_at_100
+      value: 64.71000000000001
+    - type: recall_at_1000
+      value: 87.607
+    - type: recall_at_3
+      value: 29.961
+    - type: recall_at_5
+      value: 34.058
+  - task:
+      type: Retrieval
+    dataset:
+      type: BeIR/cqadupstack
+      name: MTEB CQADupstackWordpressRetrieval
+      config: default
+      split: test
+      revision: None
+    metrics:
+    - type: map_at_1
+      value: 14.424000000000001
+    - type: map_at_10
+      value: 20.541999999999998
+    - type: map_at_100
+      value: 21.495
+    - type: map_at_1000
+      value: 21.604
+    - type: map_at_3
+      value: 18.608
+    - type: map_at_5
+      value: 19.783
+    - type: mrr_at_1
+      value: 15.895999999999999
+    - type: mrr_at_10
+      value: 22.484
+    - type: mrr_at_100
+      value: 23.376
+    - type: mrr_at_1000
+      value: 23.467
+    - type: mrr_at_3
+      value: 20.548
+    - type: mrr_at_5
+      value: 21.731
+    - type: ndcg_at_1
+      value: 15.895999999999999
+    - type: ndcg_at_10
+      value: 24.343
+    - type: ndcg_at_100
+      value: 29.181
+    - type: ndcg_at_1000
+      value: 32.330999999999996
+    - type: ndcg_at_3
+      value: 20.518
+    - type: ndcg_at_5
+      value: 22.561999999999998
+    - type: precision_at_1
+      value: 15.895999999999999
+    - type: precision_at_10
+      value: 3.9739999999999998
+    - type: precision_at_100
+      value: 0.6799999999999999
+    - type: precision_at_1000
+      value: 0.105
+    - type: precision_at_3
+      value: 9.057
+    - type: precision_at_5
+      value: 6.654
+    - type: recall_at_1
+      value: 14.424000000000001
+    - type: recall_at_10
+      value: 34.079
+    - type: recall_at_100
+      value: 56.728
+    - type: recall_at_1000
+      value: 80.765
+    - type: recall_at_3
+      value: 23.993000000000002
+    - type: recall_at_5
+      value: 28.838
   - task:
       type: Classification
     dataset:
@@ -162,6 +1116,50 @@ model-index:
       value: 74.54270342972428
     - type: f1
       value: 74.02802500235784
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/medrxiv-clustering-p2p
+      name: MTEB MedrxivClusteringP2P
+      config: default
+      split: test
+      revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
+    metrics:
+    - type: v_measure
+      value: 30.488580544269002
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/medrxiv-clustering-s2s
+      name: MTEB MedrxivClusteringS2S
+      config: default
+      split: test
+      revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
+    metrics:
+    - type: v_measure
+      value: 28.80426879476371
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/reddit-clustering
+      name: MTEB RedditClustering
+      config: default
+      split: test
+      revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
+    metrics:
+    - type: v_measure
+      value: 42.862710845031565
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/reddit-clustering-p2p
+      name: MTEB RedditClusteringP2P
+      config: default
+      split: test
+      revision: 282350215ef01743dc01b456c7f5241fa8937f16
+    metrics:
+    - type: v_measure
+      value: 54.270000736385626
   - task:
       type: STS
     dataset:
@@ -351,6 +1349,19 @@ model-index:
       value: 82.9630328314908
     - type: manhattan_spearman
       value: 82.13726553603003
+  - task:
+      type: Reranking
+    dataset:
+      type: mteb/scidocs-reranking
+      name: MTEB SciDocsRR
+      config: default
+      split: test
+      revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
+    metrics:
+    - type: map
+      value: 79.45753132898741
+    - type: mrr
+      value: 93.84029822755313
   - task:
       type: PairClassification
     dataset:
@@ -406,6 +1417,41 @@ model-index:
       value: 94.58629018512772
     - type: max_f1
       value: 90.02531645569621
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/stackexchange-clustering
+      name: MTEB StackExchangeClustering
+      config: default
+      split: test
+      revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
+    metrics:
+    - type: v_measure
+      value: 53.088941385715735
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/stackexchange-clustering-p2p
+      name: MTEB StackExchangeClusteringP2P
+      config: default
+      split: test
+      revision: 815ca46b2622cec33ccafc3735d572c266efdb44
+    metrics:
+    - type: v_measure
+      value: 33.146129414825744
+  - task:
+      type: Reranking
+    dataset:
+      type: mteb/stackoverflowdupquestions-reranking
+      name: MTEB StackOverflowDupQuestions
+      config: default
+      split: test
+      revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
+    metrics:
+    - type: map
+      value: 48.7511362739003
+    - type: mrr
+      value: 49.61682210763093
   - task:
       type: Classification
     dataset:
@@ -434,6 +1480,17 @@ model-index:
       value: 57.475947934352
     - type: f1
       value: 57.77676730676238
+  - task:
+      type: Clustering
+    dataset:
+      type: mteb/twentynewsgroups-clustering
+      name: MTEB TwentyNewsgroupsClustering
+      config: default
+      split: test
+      revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
+    metrics:
+    - type: v_measure
+      value: 38.3463456299738
   - task:
       type: PairClassification
     dataset:
@@ -548,6 +1605,7 @@ license: mit
 language:
 - en
 ---
+
 This is the sparse ONNX variant of the [bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5) embeddings model, created with [DeepSparse Optimum](https://github.com/neuralmagic/optimum-deepsparse) for ONNX export and Neural Magic's [Sparsify](https://github.com/neuralmagic/sparsify) for One-Shot quantization and unstructured pruning (50%).
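Once exported, the model turns sentences into fixed-size embedding vectors (384 dimensions for bge-small), and comparing two texts reduces to cosine similarity; for bge-style embeddings, which are typically L2-normalized, this is just a dot product. A minimal pure-Python sketch of that scoring step, using toy 4-dimensional vectors in place of real model outputs:

```python
import math

def cosine_similarity(a, b):
    # General form; for L2-normalized embeddings this reduces to the dot product.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real 384-dimensional embeddings.
query = [0.1, 0.3, -0.2, 0.9]
passage_close = [0.12, 0.28, -0.18, 0.91]   # near-duplicate of the query
passage_far = [0.9, -0.2, 0.4, 0.1]         # unrelated direction

scores = {
    "close": cosine_similarity(query, passage_close),
    "far": cosine_similarity(query, passage_far),
}
print(scores["close"] > scores["far"])  # the closer passage ranks higher: True
```

In a retrieval setting, the same comparison is run between one query embedding and every document embedding, and documents are ranked by score, which is what the `*_at_k` metrics above measure.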
 
 Current up-to-date list of sparse and quantized bge ONNX models: