FlashVenom (flashvenom)
AI & ML interests
Exploring LLM training, inference and serving
flashvenom's activity
Incredible work, would it be possible to make this work with FLUX?
1 · #1 opened 6 months ago by flashvenom

Code?
2 · #1 opened over 1 year ago by flashvenom

Really cool work
#1 opened over 1 year ago by flashvenom

Model details?
1 · #1 opened over 1 year ago by flashvenom

Thanks for all the hard work! Chance to see superhot-65b?
9 · #1 opened over 1 year ago by Panchovix

Released a 4bit GPTQ to make it easier for folks to try it out!
3 · #3 opened over 1 year ago by flashvenom

7B, 33B and 65B versions?
3 · #2 opened over 1 year ago by flashvenom

oobabooga Settings
2 · #2 opened over 1 year ago by siiNCeyy

Download corrupted/Not working
16 · #1 opened over 1 year ago by Sergei6000

Love the idea
5 · #1 opened over 1 year ago by flashvenom

Difference between this and 8k version?
10 · #1 opened over 1 year ago by flashvenom

Performance?
7 · #1 opened over 1 year ago by flashvenom

Is my understanding correct that the monkey patch will be needed to be added for inference only?
5 · #1 opened over 1 year ago by flashvenom

4 bit GPTQ
2 · #1 opened over 1 year ago by flashvenom

woo
3 · #1 opened over 1 year ago by flashvenom

What does the 4k stand for?
2 · #1 opened almost 2 years ago by flashvenom

Prompt style?
1 · #1 opened almost 2 years ago by flashvenom

Code to train?
#1 opened almost 2 years ago by flashvenom

Will there be other versions (ie. 7B/13B/65B)
2 · #2 opened almost 2 years ago by flashvenom

Context Length?
2 · #4 opened almost 2 years ago by flashvenom