---
app_file: app.py
colorFrom: gray
colorTo: green
description: 'User-friendly multi-object tracking metrics (MOTA, MOTP, IDF1, and more) built on py-motmetrics'
emoji: "\U0001F4DA"
pinned: false
runme:
  id: 01HPS3ASFJXVQR88985QNSXVN1
  version: v3
sdk: gradio
sdk_version: 4.36.0
tags:
- evaluate
- metric
title: mot-metrics
---
# How to Use
```python {"id":"01HPS3ASFHPCECERTYN7Z4Z7MN"}
>>> import evaluate
>>> from seametrics.fo_utils.utils import fo_to_payload
>>> b = fo_to_payload(
...     dataset="SENTRY_VIDEOS_DATASET_QA",
...     gt_field="ground_truth_det",
...     models=['volcanic-sweep-3_02_2023_N_LN1_ep288_TRACKER'],
...     sequence_list=["Sentry_2022_11_PROACT_CELADON_7.5M_MOB_2022_11_25_12_12_39"],
...     tracking_mode=True
... )
>>> module = evaluate.load("SEA-AI/user-friendly-metrics")
>>> res = module._calculate(b, max_iou=0.99, recognition_thresholds=[0.3, 0.5, 0.8])
>>> print(res)
{'Sentry_2022_11_PROACT_CELADON_7.5M_MOB_2022_11_25_12_12_39': {'volcanic-sweep-3_02_2023_N_LN1_ep288_TRACKER': {'idf1': 0.9543031226199543,
'idp': 0.9804381846635368,
'idr': 0.9295252225519288,
'recall': 0.9436201780415431,
'precision': 0.9953051643192489,
'num_unique_objects': 2,
'mostly_tracked': 1,
'partially_tracked': 0,
'mostly_lost': 1,
'num_false_positives': 6,
'num_misses': 76,
'num_switches': 1,
'num_fragmentations': 4,
'mota': 0.9384272997032641,
'motp': 0.5235835810268012,
'num_transfer': 0,
'num_ascend': 1,
'num_migrate': 0,
'recognition_0.3': 0.9230769230769231,
'recognition_0.5': 0.8461538461538461,
'recognition_0.8': 0.46153846153846156}}}
```
## Metric Settings
The `max_iou` parameter controls which ground-truth/prediction pairs are candidates for association. As in py-motmetrics, it is an IoU distance threshold (distance = 1 − IoU): a predicted bounding box is only considered for association with a ground-truth box if their IoU distance is at most `max_iou`. The default value is 0.5, i.e. pairs must overlap with an IoU of at least 0.5. So, the higher the `max_iou` value, the more predicted bounding boxes are considered for association. A minimal sketch of this gating is shown below.
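The following sketch illustrates how such a threshold gates candidate matches, using py-motmetrics' `iou_matrix` helper; this is an assumption about the underlying matching, and the exact internals of `_calculate` may differ. Boxes are in `(x, y, w, h)` format.
```python
import numpy as np
import motmetrics as mm

gt  = np.array([[0, 0, 10, 10]])            # one ground-truth box (x, y, w, h)
hyp = np.array([[1, 1, 10, 10],             # strong overlap with the ground truth
                [8, 8, 10, 10]])            # weak overlap with the ground truth

# Entries are IoU distances (1 - IoU); pairs whose distance exceeds max_iou
# become NaN and are never considered for association.
print(mm.distances.iou_matrix(gt, hyp, max_iou=0.5))   # weak pair -> NaN
print(mm.distances.iou_matrix(gt, hyp, max_iou=0.99))  # both pairs remain candidates
```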
## Output
The output is a nested dictionary of the form `{sequence_name: {model_name: {metric: value}}}` containing the following metrics:
| Name | Description |
| :------------------- | :--------------------------------------------------------------------------------- |
| idf1 | ID measures: global min-cost F1 score. |
| idp | ID measures: global min-cost precision. |
| idr | ID measures: global min-cost recall. |
| recall | Number of detections over number of objects. |
| precision | Number of detected objects over sum of detected and false positives. |
| num_unique_objects | Total number of unique object ids encountered. |
| mostly_tracked | Number of objects tracked for at least 80 percent of lifespan. |
| partially_tracked | Number of objects tracked between 20 and 80 percent of lifespan. |
| mostly_lost | Number of objects tracked less than 20 percent of lifespan. |
| num_false_positives | Total number of false positives (false-alarms). |
| num_misses | Total number of misses. |
| num_switches | Total number of track switches. |
| num_fragmentations | Total number of switches from tracked to not tracked. |
| mota | Multiple object tracker accuracy. |
| motp | Multiple object tracker precision. |
| num_transfer        | Total number of track transfers.                                                    |
| num_ascend          | Total number of track ascends.                                                      |
| num_migrate         | Total number of track migrates.                                                     |
| recognition_0.3     | Recognition rate for a threshold of 0.3 (one entry per value in `recognition_thresholds`). |
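Since the result is nested by sequence and model, individual values can be read out directly. For example, using the keys from the example output above:
```python
seq   = "Sentry_2022_11_PROACT_CELADON_7.5M_MOB_2022_11_25_12_12_39"
model = "volcanic-sweep-3_02_2023_N_LN1_ep288_TRACKER"

# res is the dictionary returned by module._calculate(...) in the example above
print(res[seq][model]["mota"], res[seq][model]["idf1"])
```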
## Citations
```bibtex {"id":"01HPS3ASFJXVQR88985KRT478N"}
@article{milan2016mot16,
  title={MOT16: A benchmark for multi-object tracking},
  author={Milan, Anton and Leal-Taix{\'e}, Laura and Reid, Ian and Roth, Stefan and Schindler, Konrad},
  journal={arXiv preprint arXiv:1603.00831},
  year={2016}
}
```
## Further References
- [Github Repository - py-motmetrics](https://github.com/cheind/py-motmetrics/tree/develop)