---
license: cc-by-4.0
tags:
- automatic-rigging
size_categories:
- 10K<n<100K
---
<div align="center">
<h1>MagicArticulate: Make Your 3D Models Articulation-Ready</h1>
<p>
<a href="https://chaoyuesong.github.io"><strong>Chaoyue Song</strong></a><sup>1,2</sup>,
<a href="http://jeff95.me/"><strong>Jianfeng Zhang</strong></a><sup>2*</sup>,
<a href="https://lixiulive.com/"><strong>Xiu Li</strong></a><sup>2</sup>,
<a href="https://scholar.google.com/citations?user=afDvaa8AAAAJ&hl"><strong>Fan Yang</strong></a><sup>1,2</sup>,
<a href="https://buaacyw.github.io/"><strong>Yiwen Chen</strong></a><sup>1</sup>,
<a href="https://zcxu-eric.github.io/"><strong>Zhongcong Xu</strong></a><sup>2</sup>,
<br>
<a href="https://liewjunhao.github.io/"><strong>Jun Hao Liew</strong></a><sup>2</sup>,
<strong>Xiaoyang Guo</strong><sup>2</sup>,
<a href="https://sites.google.com/site/fayaoliu"><strong>Fayao Liu</strong></a><sup>3</sup>,
<a href="https://scholar.google.com.sg/citations?user=Q8iay0gAAAAJ"><strong>Jiashi Feng</strong></a><sup>2</sup>,
<a href="https://guosheng.github.io/"><strong>Guosheng Lin</strong></a><sup>1*</sup>
<br>
*Corresponding authors
<br>
<sup>1 </sup>Nanyang Technological University
<sup>2 </sup>Bytedance Seed
<sup>3 </sup>A*STAR
</p>
<h3>CVPR 2025</h3>
<p>
<a href="https://chaoyuesong.github.io/MagicArticulate/"><strong>Project</strong></a> |
<a href="https://chaoyuesong.github.io/files/MagicArticulate_paper.pdf"><strong>Paper</strong></a> |
<a href="https://github.com/Seed3D/MagicArticulate"><strong>Code</strong></a> |
<a href="https://www.youtube.com/watch?v=eJP_VR4cVnk"><strong>Video</strong></a>
</p>
</div>
<br />
### Update
- 2025.3.22: Updated the preprocessed data to include vertex normals. Adding the normals increases the data size by 10GB; to reduce data loading time during training, we suggest removing fields you do not use (see the sketch after this list).
- 2025.3.20: Released the Articulation-XL2.0 preprocessed data (an NPZ file that includes vertices, faces, joints, bones, skinning weights, UUID, etc.).
- 2025.2.16: Released the metadata for Articulation-XL2.0!
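
If you want to trim unused fields from the NPZ files as suggested above, a minimal sketch (assuming NumPy; the file paths are placeholders, and the key names follow the list in the "Preprocessed data" section below) could look like this:

```python
import numpy as np

# Placeholder paths; adjust to wherever the preprocessed NPZ files live.
src = "articulation_xl2_preprocessed/example.npz"
dst = "articulation_xl2_slim/example.npz"

# Keys we assume are needed for a typical training run; drop anything else
# (e.g. 'normals' or 'pc_w_norm') if your pipeline does not use it.
keep = {
    "vertices", "faces", "joints", "bones", "root_index",
    "skinning_weights_value", "skinning_weights_row",
    "skinning_weights_col", "skinning_weights_shape",
}

# allow_pickle=True in case string fields ('uuid', 'joint_names') are stored
# as object arrays.
data = np.load(src, allow_pickle=True)
slim = {k: data[k] for k in data.files if k in keep}
np.savez_compressed(dst, **slim)
```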
### Overview
This repository introduces <b>Articulation-XL2.0</b>, a large-scale dataset featuring over <b>48K</b> 3D models with high-quality articulation annotations, filtered from Objaverse-XL. Compared to version 1.0, Articulation-XL2.0 includes 3D models with multiple components. For further details, please refer to the statistics below.
<p align="center">
<img width="60%" src="https://raw.githubusercontent.com/Seed3D/MagicArticulate/main/assets/data_statistics.png"/>
</p>
Note: the rigged data (over 150K models) has been deduplicated, and the quality of most data has been manually verified.
<p align="center">
<img width="80%" src="https://raw.githubusercontent.com/Seed3D/MagicArticulate/main/assets/articulation-xl2.0.png"/>
</p>
### Metadata
We provide the following information in the metadata of Articulation-XL2.0.
```
uuid,source,vertex_count,face_count,joint_count,bone_count,category_label,fileType,fileIdentifier
```
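
As a quick way to explore the metadata, it can be loaded with pandas. This is a sketch only: we assume the metadata is distributed as a CSV with the columns listed above, and the file name below is a placeholder.

```python
import pandas as pd

# Placeholder file name; assumed to be a CSV with the columns listed above.
meta = pd.read_csv("articulation_xl2.0_metadata.csv")

# Example: filter models by size and inspect the category distribution.
small = meta[(meta["vertex_count"] < 10_000) & (meta["joint_count"] > 1)]
print(small["category_label"].value_counts().head())
```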
### Preprocessed data
We provide preprocessed data saved in NPZ files, which contain the following fields:
```
'vertices', 'faces', 'normals', 'joints', 'bones', 'root_index', 'uuid', 'pc_w_norm', 'joint_names', 'skinning_weights_value', 'skinning_weights_row', 'skinning_weights_col', 'skinning_weights_shape'
```
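
A minimal loading sketch, assuming NumPy/SciPy and a placeholder file name: the skinning weights appear to be stored in sparse COO form (values, row indices, column indices, shape), so they can be rebuilt into a dense vertices-by-joints matrix.

```python
import numpy as np
from scipy.sparse import coo_matrix

# Placeholder file name; allow_pickle=True in case string fields are object arrays.
data = np.load("example.npz", allow_pickle=True)

vertices = data["vertices"]   # assumed (V, 3) mesh vertices
faces = data["faces"]         # assumed (F, 3) triangle indices
joints = data["joints"]       # assumed (J, 3) joint positions
bones = data["bones"]         # assumed (B, 2) joint index pairs

# Rebuild the dense skinning-weight matrix from the sparse COO components.
weights = coo_matrix(
    (data["skinning_weights_value"],
     (data["skinning_weights_row"], data["skinning_weights_col"])),
    shape=tuple(data["skinning_weights_shape"]),
).toarray()

print(vertices.shape, joints.shape, weights.shape)
```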
### Citation
```
@article{song2025magicarticulate,
title={MagicArticulate: Make Your 3D Models Articulation-Ready},
author={Chaoyue Song and Jianfeng Zhang and Xiu Li and Fan Yang and Yiwen Chen and Zhongcong Xu and Jun Hao Liew and Xiaoyang Guo and Fayao Liu and Jiashi Feng and Guosheng Lin},
journal={arXiv preprint arXiv:2502.12135},
year={2025},
}
```