---
title: Grounding DINO Demo
emoji: 💻
colorFrom: purple
colorTo: yellow
sdk: gradio
sdk_version: 3.23.0
app_file: app.py
pinned: false
license: apache-2.0
short_description: Cutting-edge open-vocabulary object detection app
---

# Grounding DINO

[📃Paper](https://arxiv.org/abs/2303.05499) | [📽️Video](https://www.youtube.com/watch?v=wxWDt5UiwY8) | [🗯️ Github](https://github.com/IDEA-Research/GroundingDINO) | [📯Demo on Colab](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/zero-shot-object-detection-with-grounding-dino.ipynb) | [🤗Demo on HF (Coming soon)]()

Benchmarks: [Zero-Shot Object Detection on MSCOCO](https://paperswithcode.com/sota/zero-shot-object-detection-on-mscoco?p=grounding-dino-marrying-dino-with-grounded) | [Zero-Shot Object Detection on ODinW](https://paperswithcode.com/sota/zero-shot-object-detection-on-odinw?p=grounding-dino-marrying-dino-with-grounded) | [Object Detection on COCO minival](https://paperswithcode.com/sota/object-detection-on-coco-minival?p=grounding-dino-marrying-dino-with-grounded) | [Object Detection on COCO](https://paperswithcode.com/sota/object-detection-on-coco?p=grounding-dino-marrying-dino-with-grounded)

Official PyTorch implementation of [Grounding DINO](https://arxiv.org/abs/2303.05499), a stronger open-set object detector. Code is available now!

## Highlights

- **Open-Set Detection.** Detect **everything** with language!
- **High Performance.** COCO zero-shot **52.5 AP** (trained without any COCO data!); COCO fine-tuned **63.0 AP**.
- **Flexible.** Works with Stable Diffusion for image editing.

## News

[2023/03/27] Support CPU-only mode. The model can now run on machines without GPUs. \
[2023/03/25] A [demo](https://colab.research.google.com/github/roboflow-ai/notebooks/blob/main/notebooks/zero-shot-object-detection-with-grounding-dino.ipynb) for Grounding DINO is available on Colab. Thanks to @Piotr! \
[2023/03/22] Code is available now!

## TODO

- [x] Release inference code and demo.
- [x] Release checkpoints.
- [ ] Grounding DINO with Stable Diffusion and GLIGEN demos.
- [ ] Release training code.

## Install

If you have a CUDA environment, make sure the environment variable `CUDA_HOME` is set; otherwise the package is compiled in CPU-only mode.

```bash
pip install -e .
```

## Demo

```bash
CUDA_VISIBLE_DEVICES=6 python demo/inference_on_a_image.py \
  -c /path/to/config \
  -p /path/to/checkpoint \
  -i .asset/cats.png \
  -o "outputs/0" \
  -t "cat ear." \
  [--cpu-only] # add this flag for CPU-only mode
```

See `demo/inference_on_a_image.py` for more details.
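The same inference flow can also be driven from Python. Below is a minimal sketch, assuming the helper functions exposed in `groundingdino.util.inference` (`load_model`, `load_image`, `predict`, `annotate`); the config/checkpoint paths and thresholds are placeholders to replace with the files linked in the Checkpoints table below.

```python
# Minimal Python inference sketch; config/checkpoint paths are placeholders.
import os
import cv2
from groundingdino.util.inference import load_model, load_image, predict, annotate

model = load_model(
    "/path/to/config",      # e.g. the GroundingDINO-T config from the table below
    "/path/to/checkpoint",  # matching .pth checkpoint
)

image_source, image = load_image(".asset/cats.png")

# Text prompt; separate phrases with " . " as in the CLI demo above.
boxes, logits, phrases = predict(
    model=model,
    image=image,
    caption="cat ear .",
    box_threshold=0.35,   # assumed example thresholds; tune per use case
    text_threshold=0.25,
)

# Draw boxes and phrases on the source image and save it.
annotated = annotate(image_source=image_source, boxes=boxes, logits=logits, phrases=phrases)
os.makedirs("outputs", exist_ok=True)
cv2.imwrite("outputs/annotated_cats.jpg", annotated)
```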
## Checkpoints

|   | name            | backbone | Data               | box AP on COCO                      | Checkpoint | Config |
|---|-----------------|----------|--------------------|-------------------------------------|------------|--------|
| 1 | GroundingDINO-T | Swin-T   | O365, GoldG, Cap4M | 48.4 (zero-shot) / 57.2 (fine-tune) | link       | link   |
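A note on outputs: a small sketch for mapping detections to pixel coordinates, assuming `predict` in the sketch above returns DETR-style normalized `(cx, cy, w, h)` boxes and that the `box_cxcywh_to_xyxy` helper in `groundingdino.util.box_ops` is available.

```python
# Convert normalized (cx, cy, w, h) boxes from predict() to pixel (x1, y1, x2, y2).
import torch
from groundingdino.util.box_ops import box_cxcywh_to_xyxy  # assumed DETR-style helper

h, w, _ = image_source.shape  # image_source comes from load_image()
boxes_xyxy = box_cxcywh_to_xyxy(boxes) * torch.tensor([w, h, w, h])
for phrase, box in zip(phrases, boxes_xyxy.tolist()):
    print(phrase, [round(v, 1) for v in box])
```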