Update README.md
README.md CHANGED
@@ -1,3 +1,43 @@
- ---
- license:
- ---

---
license: mit
---

# Convert MUSE from TensorFlow to PyTorch and ONNX

Read more about the project: [GitHub](https://github.com/dayyass/muse_tf2pt/tree/main).

> [!IMPORTANT]
> **The PyTorch model can be used not only for inference, but also for additional training and fine-tuning**.

# Usage

The model is hosted on the [Hugging Face Hub](https://huggingface.co/dayyass/universal-sentence-encoder-multilingual-large-3-pytorch/tree/main) and can be used directly with `torch` (*currently, without native support from the `transformers` library*).
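
If the checkpoint is not stored locally, it can be fetched with `huggingface_hub`. A minimal sketch (it assumes the weights are published in this repo under the filename `model.pt`, matching the path used in the snippet below):

```python
from huggingface_hub import hf_hub_download

# Download the converted PyTorch checkpoint from the Hub
# (the filename "model.pt" is an assumption).
path_to_pt_model = hf_hub_download(
    repo_id="dayyass/universal-sentence-encoder-multilingual-large-3-pytorch",
    filename="model.pt",
)
```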

Model initialization and usage code:
```python
import torch
from functools import partial

from src.architecture import MUSE
from src.tokenizer import get_tokenizer, tokenize

PATH_TO_PT_MODEL = "model.pt"  # converted PyTorch weights
PATH_TO_TF_MODEL = "universal-sentence-encoder-multilingual-large-3"  # original TF Hub SavedModel (used for tokenization)

# Build the tokenizer from the original TF Hub checkpoint.
tokenizer = get_tokenizer(PATH_TO_TF_MODEL)
tokenize = partial(tokenize, tokenizer=tokenizer)

# Instantiate the PyTorch MUSE architecture and load the converted weights.
model_torch = MUSE(
    num_embeddings=128010,
    embedding_dim=512,
    d_model=512,
    num_heads=8,
)
model_torch.load_state_dict(
    torch.load(PATH_TO_PT_MODEL)
)

# Encode a sentence into its embedding.
sentence = "Hello, world!"
res = model_torch(tokenize(sentence))
```
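
Continuing the snippet above, the embeddings can be compared directly, e.g. for cross-lingual semantic similarity. A minimal sketch (it assumes the forward pass returns a 2-D tensor of shape `(1, embedding_dim)`, as suggested by the single-sentence call above):

```python
import torch.nn.functional as F

# Encode two sentences (reusing model_torch and tokenize from above)
# and compare them with cosine similarity.
emb_en = model_torch(tokenize("Hello, world!"))
emb_ru = model_torch(tokenize("Привет, мир!"))

similarity = F.cosine_similarity(emb_en, emb_ru).item()
print(similarity)
```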

> [!NOTE]
> Currently, the checkpoint of the original TF Hub model is used for tokenization, so it is loaded in the code above.
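
As the callout near the top of this README notes, the PyTorch module can be fine-tuned, not just used for inference. A minimal fine-tuning sketch, reusing `model_torch` and `tokenize` from the usage snippet (the cosine/MSE objective and the toy sentence pairs are illustrative assumptions, not part of the original project):

```python
import torch
import torch.nn.functional as F

# Toy "dataset": sentence pairs with a similarity target in [0, 1] (illustrative).
pairs = [
    ("Hello, world!", "Hi there!", 1.0),
    ("Hello, world!", "It is raining today.", 0.0),
]

optimizer = torch.optim.AdamW(model_torch.parameters(), lr=1e-5)
model_torch.train()

for left, right, target in pairs:
    emb_left = model_torch(tokenize(left))
    emb_right = model_torch(tokenize(right))
    cos = F.cosine_similarity(emb_left, emb_right)  # shape (1,)
    loss = F.mse_loss(cos, torch.tensor([target]))  # regress similarity to the target
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```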