Update README.md
For starters, we find that the default temperature of 1.0 works great. But just l
As with any new model release, we want to emphasize that MilkDropLM-32b-v0.3 is
In terms of context length, we recommend the default maximum (32,768). We understand that this model is quite large, so to ease VRAM requirements you can lower the context length to match your available compute.
- **Min**: 8192 /// Minimum requirement; enough to output most presets once
- **Regular**: 16384 /// Allows for having up to 2 presets in ‘memory’
- **Max**: 32768 /// Allows for 3-4 presets in ‘memory’; recommended
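As a rough sketch of the arithmetic behind these tiers (the ~8,192-tokens-per-preset figure is our assumption inferred from the numbers above, not an official measurement):

```python
# Assumed average size of one generated preset, in tokens (~8k).
# This is an illustrative estimate, not an official figure.
TOKENS_PER_PRESET = 8192

def presets_in_memory(context_length: int) -> int:
    """Return roughly how many whole presets fit in `context_length` tokens."""
    return context_length // TOKENS_PER_PRESET

for ctx in (8192, 16384, 32768):
    print(f"{ctx}: ~{presets_in_memory(ctx)} preset(s) in memory")
```

Under that assumption, the Max setting leaves room for about four presets in the conversation before earlier ones fall out of context.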
Also make sure to max out your output length, to keep the model from stopping short and forcing you to manually ‘continue’ the response.
After you've generated a MilkDrop preset, copy and paste it into a text file and save it with the `.milk` file extension. Move the `.milk` file into the presets folder of your MilkDrop app. We recommend the freely available NestDrop Classic app. (Make sure to close and reopen NestDrop to see your newly added presets.)
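A minimal sketch of that save step in Python (the preset text, filename, and presets-folder path below are placeholders; point them at the model's actual output and your NestDrop install location):

```python
from pathlib import Path

# Placeholder preset text; in practice, paste the model's full output here.
preset_text = "[preset00]\nfRating=3.0\n"

# Placeholder path; adjust to your actual NestDrop presets folder.
presets_dir = Path("NestDrop/Presets")
presets_dir.mkdir(parents=True, exist_ok=True)

# Save with the .milk extension so MilkDrop/NestDrop can pick it up.
out_file = presets_dir / "my_generated_preset.milk"
out_file.write_text(preset_text, encoding="utf-8")
print(f"Saved {out_file}")
```

Remember to restart NestDrop afterwards so it rescans the folder.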
## Text Prompt Template
For text prompting with this model, the classic approaches, such as “Give me a Glowsticks milkdrop preset” or “Make a milkdrop preset with [x], [y], [z]”, still work very well. But feel free to experiment with brand-new ways of asking the 32b model for presets to take advantage of its new conversational capabilities; the results may surprise you! Below is the full list of preset categories/subcategories that this model was trained on.
```
```
## Acknowledgements
This project is the result of a collaboration between [ISOSCELES](https://www.instagram.com/isosceles.vj) and InferenceIllusionist. It was a unique meeting of minds: ISOSCELES brought his MilkDrop preset knowledge and his experience helping develop NestDrop for the VJ community, and InferenceIllusionist brought his vital experience in fine-tuning and quantizing LLMs. We stand on the shoulders of the many MilkDrop authors who have freely released their original presets for everyone to enjoy. Much respect!
We would like to express our deepest gratitude to our growing community of alpha testers and feedback providers for their invaluable insights and support throughout the development process. We truly appreciate your pioneering spirit and courage in embracing this new family of Large Language Models.
Shoutout to [Unsloth](https://unsloth.ai) as well for providing the tools used for this fine-tune.