Vocabulary?
What's the vocabulary count for this model? Some have the entry and some don't.
Hi, can you give a bit more detail on what you want to know? What do you mean by "vocabulary count" and "entry"? All the quants have the same vocabulary (it comes from the original model, which you should probably consult for such questions).
You can get a count of the tokens, if that's what you mean, by counting the entries in tokenizer.ggml.tokens, for example. Again, that is something that should be identical across all quants.
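To show what "counting the entries in tokenizer.ggml.tokens" means concretely, here is a minimal stdlib-only sketch of the GGUF metadata layout. It writes a synthetic tiny.gguf with a three-token vocabulary (a made-up file, just for illustration) and then counts the array's entries; it only handles the handful of value types the demo needs. For real files, the gguf Python package or llama.cpp's gguf_dump script is the practical tool.

```python
import struct

# GGUF metadata value type codes (from the GGUF spec); only the ones used here.
GGUF_TYPE_UINT32 = 4
GGUF_TYPE_STRING = 8
GGUF_TYPE_ARRAY = 9

def _write_string(f, s):
    # GGUF string: uint64 byte length, then UTF-8 bytes.
    b = s.encode("utf-8")
    f.write(struct.pack("<Q", len(b)) + b)

def _read_string(f):
    (n,) = struct.unpack("<Q", f.read(8))
    return f.read(n).decode("utf-8")

def count_vocab(path, key="tokenizer.ggml.tokens"):
    """Scan a GGUF file's metadata KVs and return the element count of `key`."""
    with open(path, "rb") as f:
        assert f.read(4) == b"GGUF"                   # magic
        (version,) = struct.unpack("<I", f.read(4))   # format version
        n_tensors, n_kv = struct.unpack("<QQ", f.read(16))
        for _ in range(n_kv):
            k = _read_string(f)
            (vtype,) = struct.unpack("<I", f.read(4))
            if vtype == GGUF_TYPE_ARRAY:
                etype, n = struct.unpack("<IQ", f.read(12))
                if k == key:
                    return n                           # vocab size
                for _ in range(n):                     # skip the array body
                    _read_string(f)                    # (string arrays only in this sketch)
            elif vtype == GGUF_TYPE_STRING:
                _read_string(f)
            elif vtype == GGUF_TYPE_UINT32:
                f.read(4)
            else:
                raise NotImplementedError(f"value type {vtype} not handled in this sketch")
    return None

# Build a synthetic file: 0 tensors, 1 metadata KV holding a 3-token vocabulary.
with open("tiny.gguf", "wb") as f:
    f.write(b"GGUF")
    f.write(struct.pack("<I", 3))        # version 3
    f.write(struct.pack("<QQ", 0, 1))    # tensor count, metadata KV count
    _write_string(f, "tokenizer.ggml.tokens")
    f.write(struct.pack("<I", GGUF_TYPE_ARRAY))
    f.write(struct.pack("<IQ", GGUF_TYPE_STRING, 3))  # array of 3 strings
    for tok in ["<s>", "</s>", "hello"]:
        _write_string(f, tok)

print(count_vocab("tiny.gguf"))  # → 3
```

Because the tokenizer metadata is copied unchanged when a model is quantized, this count comes out the same for every quant of the same model.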
I was looking for this:
llama.vocab_size
Then saw this:
token_embd.weight
I'm new, so I have some dumb questions. I'm also on my phone.
Since it's not a llama-architecture model, it wouldn't have that key (the prefix matches the model's architecture name), and token_embd.weight is not a metadata key at all, but a tensor name.
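That said, the token_embd.weight tensor's shape does reveal the vocabulary size, since the embedding matrix has one row per token. Below is a stdlib-only sketch of the GGUF tensor-info section: it writes a synthetic header containing one tensor with a hypothetical 4096 x 32000 shape (embedding width x vocab size; both numbers are made up) and reads the shape back. A real file would also require skipping the metadata KVs first, which this toy file simply omits.

```python
import struct

def _write_string(f, s):
    # GGUF string: uint64 byte length, then UTF-8 bytes.
    b = s.encode("utf-8")
    f.write(struct.pack("<Q", len(b)) + b)

def _read_string(f):
    (n,) = struct.unpack("<Q", f.read(8))
    return f.read(n).decode("utf-8")

# Write a synthetic GGUF header: 1 tensor, 0 metadata KVs.
with open("tiny2.gguf", "wb") as f:
    f.write(b"GGUF")
    f.write(struct.pack("<I", 3))           # version 3
    f.write(struct.pack("<QQ", 1, 0))       # tensor count, metadata KV count
    _write_string(f, "token_embd.weight")
    f.write(struct.pack("<I", 2))           # number of dimensions
    f.write(struct.pack("<QQ", 4096, 32000))  # hypothetical n_embd x n_vocab
    f.write(struct.pack("<I", 0))           # ggml tensor type (0 = F32)
    f.write(struct.pack("<Q", 0))           # offset of the tensor data

# Read back the tensor-info section and report each tensor's shape.
with open("tiny2.gguf", "rb") as f:
    assert f.read(4) == b"GGUF"             # magic
    f.read(4)                               # version
    n_tensors, n_kv = struct.unpack("<QQ", f.read(16))
    assert n_kv == 0                        # this toy file has no KVs to skip
    for _ in range(n_tensors):
        name = _read_string(f)
        (nd,) = struct.unpack("<I", f.read(4))
        dims = struct.unpack("<" + "Q" * nd, f.read(8 * nd))
        print(name, dims)
```

In a real model file one of those two dimensions is the vocabulary size, and it should agree with the length of tokenizer.ggml.tokens.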