update
README.md CHANGED
@@ -21,12 +21,10 @@ tags:
 [\[🗨️ Chat Demo\]](http://eagle-vlm.xyz/) [\[🤗 HF Demo\]](TODO)
 ## Introduction
 
-We are thrilled to release our latest Eagle2 series Vision-Language Model. Open-source Vision-Language Models (VLMs) have made significant strides in narrowing the gap with proprietary models. However, critical details about data strategies and implementation are often missing, limiting reproducibility and innovation. In this project, we focus on VLM post-training from a data-centric perspective, sharing insights into building effective data strategies from scratch. By combining these strategies with robust training recipes and model design, we introduce
+We are thrilled to release our latest Eagle2 series Vision-Language Model. Open-source Vision-Language Models (VLMs) have made significant strides in narrowing the gap with proprietary models. However, critical details about data strategies and implementation are often missing, limiting reproducibility and innovation. In this project, we focus on VLM post-training from a data-centric perspective, sharing insights into building effective data strategies from scratch. By combining these strategies with robust training recipes and model design, we introduce Eagle2, a family of performant VLMs. Our work aims to empower the open-source community to develop competitive VLMs with transparent processes.
 
 
-
-In this repo, we are open-sourcing Eagle2-9B, which strikes the perfect balance between performance and inference speed.
-
+In this repo, we are open-sourcing Eagle2-1B, a compact and efficient model designed for scenarios requiring fast inference and minimal computational resources, without compromising essential performance.
 
 
 
@@ -79,7 +77,7 @@ We provide a [inference script](./demo.py) to help you quickly start using the m
 pip install transformers==4.37.2
 pip install flash-attn
 ```
-**Note**: Latest version of transformers
+**Note**: Latest version of transformers is not compatible with the model.
 
 ### 1. Prepare the Model worker
 
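Since the README pins `transformers==4.37.2` and the added note warns that the latest release is incompatible with the model, an inference script can guard against a mismatched environment before attempting to load weights. The sketch below is an illustration of that check, not part of the diff; only the package name and pinned version are taken from the install commands above.

```python
# Fail fast if the installed transformers release differs from the
# version the README pins (newer releases are noted as incompatible).
import importlib.metadata

PINNED = "4.37.2"

def transformers_matches_pin(pinned: str = PINNED) -> bool:
    """Return True only if the pinned transformers version is installed."""
    try:
        installed = importlib.metadata.version("transformers")
    except importlib.metadata.PackageNotFoundError:
        return False  # package not installed at all
    return installed == pinned

if __name__ == "__main__":
    if not transformers_matches_pin():
        raise SystemExit(
            f"transformers=={PINNED} is required; "
            f"run: pip install transformers=={PINNED}"
        )
```

Running this before the model worker starts turns a confusing load-time failure into a one-line, actionable error message.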