---
license: apache-2.0
language:
- en
pipeline_tag: depth-estimation
library_name: coreml
tags:
- depth
- relative depth
base_model:
- depth-anything/Depth-Anything-V2-Small
---
# Depth Anything V2 Small (mlpackage)
In this repo you can find:
* The notebook which was used to convert [depth-anything/Depth-Anything-V2-Small](https://huggingface.co/depth-anything/Depth-Anything-V2-Small) into a CoreML package.
* Both `.mlpackage` files, which can be opened in Xcode, used for Preview (of both images and videos), and integrated into macOS and iOS apps (a minimal Swift usage sketch follows below).
* A performance and compute-unit mapping report for these models, as measured on an iPhone 16 Pro Max.
* One model uses an internal resolution of 518x518 ("Box") and the other 518x392 ("Landscape").
* The "Landscape" model is much faster than "Box" but also produces more jagged edges, due to the patch I applied to avoid bicubic upsampling (the .diff file is also included in this repo).

As a derivative work of Depth-Anything-V2-Small, this port is also released under Apache-2.0.
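For app development, here is a minimal Swift sketch of how one might run the model through Vision. It is not a drop-in implementation: the generated class name and the assumption that the model's output is exposed as an image-typed feature (rather than a multi-array) depend on how the package was converted, so treat both as illustrative.

```swift
import CoreML
import Vision

// Minimal sketch: run the depth model on a CGImage via Vision.
// Assumes the .mlpackage was added to an Xcode target, so Xcode generated a
// `DepthAnything_v2_Small_518x392_Landscape` class (name is illustrative), and that the
// model's output is an image-typed feature rather than a multi-array.
func estimateDepth(for image: CGImage) throws -> CVPixelBuffer? {
    let config = MLModelConfiguration()
    config.computeUnits = .all   // let Core ML choose CPU, GPU, or Neural Engine

    let model = try DepthAnything_v2_Small_518x392_Landscape(configuration: config).model
    let request = VNCoreMLRequest(model: try VNCoreMLModel(for: model))
    request.imageCropAndScaleOption = .scaleFill   // the model has a fixed input resolution

    try VNImageRequestHandler(cgImage: image).perform([request])

    // An image-typed output arrives as VNPixelBufferObservation; a multi-array output
    // would arrive as VNCoreMLFeatureValueObservation instead.
    return (request.results?.first as? VNPixelBufferObservation)?.pixelBuffer
}
```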

## Download
Install `huggingface-cli` (for example, via Homebrew):
```bash
brew install huggingface-cli
```
To download one of the `.mlpackage` folders to the `models` directory:
```bash
huggingface-cli download \
  --local-dir models --local-dir-use-symlinks False \
  LloydAI/DepthAnything_v2-Small-CoreML \
  --include "DepthAnything_v2_Small_518x392_Landscape.mlpackage/*"
```
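If you load the downloaded package at runtime rather than bundling it through Xcode, Core ML expects a compiled `.mlmodelc`. Below is a minimal sketch of that step; the path is illustrative and assumes the download command above was run from your working directory.

```swift
import CoreML
import Foundation

// Minimal sketch: a .mlpackage downloaded at runtime (rather than bundled through Xcode)
// must be compiled to a .mlmodelc before it can be loaded. Path is illustrative.
func loadDownloadedModel() async throws -> MLModel {
    let packageURL = URL(fileURLWithPath: "models/DepthAnything_v2_Small_518x392_Landscape.mlpackage")
    let compiledURL = try await MLModel.compileModel(at: packageURL)   // async variant: iOS 16 / macOS 13+
    return try MLModel(contentsOf: compiledURL)
}
```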
## Citation of original work
If you find this project useful, please consider citing:
```bibtex
@article{depth_anything_v2,
  title={Depth Anything V2},
  author={Yang, Lihe and Kang, Bingyi and Huang, Zilong and Zhao, Zhen and Xu, Xiaogang and Feng, Jiashi and Zhao, Hengshuang},
  journal={arXiv:2406.09414},
  year={2024}
}

@inproceedings{depth_anything_v1,
  title={Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data},
  author={Yang, Lihe and Kang, Bingyi and Huang, Zilong and Xu, Xiaogang and Feng, Jiashi and Zhao, Hengshuang},
  booktitle={CVPR},
  year={2024}
}
```