Update README.md
README.md
CHANGED
@@ -16,11 +16,18 @@ Zamba2-7B-Instruct long-context has been extended from 4k to 16k context by adju
### Prerequisites

To use Zamba2-7B-instruct, install `transformers`:

`pip install transformers`

To install dependencies necessary to run Mamba2 kernels, install `mamba-ssm` from source (due to compatibility issues with PyTorch) as well as `causal-conv1d`:
1. `git clone https://github.com/state-spaces/mamba.git`
2. `cd mamba && git checkout v2.1.0 && pip install .`
3. `pip install causal-conv1d`
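Once the steps above finish, a quick sanity check (a minimal sketch, not part of the official instructions) confirms that both kernel packages are importable:

```python
import importlib.util

def kernels_available() -> bool:
    """Return True if both optional Mamba2 kernel packages can be imported."""
    return all(
        importlib.util.find_spec(pkg) is not None
        for pkg in ("mamba_ssm", "causal_conv1d")
    )

print("Mamba2 kernels available:", kernels_available())
```

If this prints `False`, the build likely failed (commonly a missing CUDA toolchain during the `mamba-ssm` compile), and inference falls back to the slower pure-PyTorch path.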
You can run the model without using the optimized Mamba2 kernels, but it is **not** recommended as it will result in significantly higher latency and memory usage.
### Inference