---
title: Top5 Error Rate
emoji: πŸ‘
colorFrom: yellow
colorTo: blue
sdk: gradio
sdk_version: 5.24.0
app_file: app.py
pinned: false
tags:
- evaluate
- metric
---

# Metric Card for Top-5 Error Rate

## Metric Description

The "top-5 error" is the percentage of times that the target label does not appear among the 5 highest-probability predictions. It can be computed with:
Top-5 Error Rate = 1 - Top-5 Accuracy
or equivalently:
Top-5 Error Rate = (Number of incorrect top-5 predictions) / (Total number of cases processed)
 Where:
- Top-5 Accuracy: The proportion of cases where the true label is among the model's top 5 predicted classes.
- Incorrect top-5 prediction: The true label is not in the top 5 predicted classes (ranked by probability).
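
For intuition, the rate can also be computed directly from a matrix of class probabilities. The sketch below is illustrative only and independent of this metric's API; the array names, shapes, and the helper `top5_error_rate` are assumptions for the example:

```python
import numpy as np

def top5_error_rate(probs: np.ndarray, labels: np.ndarray) -> float:
    """Fraction of samples whose true label is not among the 5 highest-probability classes.

    probs:  (n_samples, n_classes) array of predicted probabilities or logits
    labels: (n_samples,) array of true class indices
    """
    # Indices of the 5 highest-scoring classes for each sample
    top5 = np.argsort(probs, axis=1)[:, -5:]
    # A sample counts as correct if its true label appears in its top-5 set
    hits = (top5 == labels[:, None]).any(axis=1)
    return 1.0 - hits.mean()

# Example: 3 samples, 10 classes
rng = np.random.default_rng(0)
probs = rng.random((3, 10))
labels = np.array([2, 7, 5])
print(top5_error_rate(probs, labels))
```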

## How to Use

At minimum, this metric requires predictions and references as inputs.

```python
import evaluate

top5_error_metric = evaluate.load("Aye10032/top5_error_rate")
results = top5_error_metric.compute(references=[[0, 1, 2, 3, 4]], predictions=[0])
print(results)
```
The output is:

```
{'top5_error_rate': 0.0}
```
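
Assuming the metric keeps the same argument convention for batches (one list of top-5 candidate classes per entry in `references`, matched against the true labels in `predictions`), a multi-sample call might look like the following sketch:

```python
# Assumed batched call, following the argument convention of the example above.
# The second sample's label (3) is not in its candidate list [5, 6, 7, 8, 9],
# so by the definition in the Metric Description the rate would be 0.5.
results = top5_error_metric.compute(
    references=[[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]],
    predictions=[0, 3],
)
print(results)
```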