  https://huggingface.co/facebook/mms-300m with ONNX weights to be compatible with Transformers.js.

## Usage (Transformers.js)

If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@xenova/transformers) using:
```bash
npm i @xenova/transformers
```

**Example:** Load and run a `Wav2Vec2Model` for feature extraction.

```js
import { AutoProcessor, AutoModel, read_audio } from '@xenova/transformers';

// Read and preprocess audio
const processor = await AutoProcessor.from_pretrained('Xenova/mms-300m');
const audio = await read_audio('https://huggingface.co/datasets/Narsil/asr_dummy/resolve/main/mlk.flac', 16000);
const inputs = await processor(audio);

// Run model with inputs
const model = await AutoModel.from_pretrained('Xenova/mms-300m');
const output = await model(inputs);
// {
//   last_hidden_state: Tensor {
//     dims: [ 1, 1144, 1024 ],
//     type: 'float32',
//     data: Float32Array(1171456) [ ... ],
//     size: 1171456
//   }
// }
```
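The `last_hidden_state` above has shape `[1, T, 1024]` (one 1024-dimensional vector per audio frame). A common next step is to mean-pool over the time dimension to get a single embedding for the clip. Below is a minimal plain-array sketch of that pooling, written against a raw `Float32Array` buffer so it is self-contained; it is not part of the Transformers.js API, but with the output above you could apply it to `output.last_hidden_state.data` using the tensor's `dims`.

```js
// Mean-pool a flat [1, seqLen, hiddenSize] hidden-state buffer
// into a single hiddenSize-dimensional embedding vector.
function meanPool(data, seqLen, hiddenSize) {
  const pooled = new Float32Array(hiddenSize);
  for (let t = 0; t < seqLen; t++) {
    for (let h = 0; h < hiddenSize; h++) {
      pooled[h] += data[t * hiddenSize + h];
    }
  }
  for (let h = 0; h < hiddenSize; h++) {
    pooled[h] /= seqLen;
  }
  return pooled;
}

// Toy example: 2 frames, hidden size 3
const toy = new Float32Array([1, 2, 3, 3, 4, 5]);
console.log(meanPool(toy, 2, 3)); // Float32Array(3) [ 2, 3, 4 ]
```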

---

Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
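For reference, a conversion along those lines can be sketched with Optimum's export CLI. This is an illustrative sketch, not the exact commands used to build this repo; it assumes Optimum is installed with its ONNX exporter extras.

```shell
# Install Optimum with ONNX export support (assumed extras name)
pip install "optimum[exporters]"

# Export the PyTorch checkpoint to ONNX, placing the weights
# in an onnx/ subfolder as this repo's layout expects
optimum-cli export onnx --model facebook/mms-300m onnx/
```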