please quant this model

#537
by kalle07 - opened

Queued! Should be done in an hour or so.

mradermacher changed discussion status to closed

thx, but it's a PhariaForCausalLM model; it doesn't work with the Hugging Face quant machine ;) maybe you'll have better luck

Here is the other one:
https://huggingface.co/Aleph-Alpha/Pharia-1-LLM-7B-control-aligned-hf


No luck, as PhariaForCausalLM is not yet supported by llama.cpp. Before any Pharia-based model can be converted into a GGUF, llama.cpp support for the architecture has to be implemented.
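
For context, a minimal sketch of the kind of check involved: llama.cpp's converter only handles architectures it knows about, so a model whose `config.json` declares an unknown architecture (here `PhariaForCausalLM`) cannot be converted. The `SUPPORTED` set below is a hypothetical placeholder, not the converter's real registry, and `check_convertible` is an illustrative helper, not part of llama.cpp or the quant pipeline.

```python
# Illustrative only: compare a model's declared architecture against a
# placeholder set of architectures assumed to be handled by a local
# llama.cpp convert script.
import json
from huggingface_hub import hf_hub_download

# Hypothetical placeholder; the real list lives in llama.cpp's converter.
SUPPORTED = {"LlamaForCausalLM", "MistralForCausalLM", "Qwen2ForCausalLM"}

def check_convertible(repo_id: str) -> bool:
    """Return True if every declared architecture is in SUPPORTED."""
    config_path = hf_hub_download(repo_id=repo_id, filename="config.json")
    with open(config_path) as f:
        archs = json.load(f).get("architectures", [])
    unsupported = [a for a in archs if a not in SUPPORTED]
    if unsupported:
        print(f"{repo_id}: no llama.cpp support for {unsupported}; GGUF conversion would fail.")
        return False
    return True

check_convertible("Aleph-Alpha/Pharia-1-LLM-7B-control-aligned-hf")
```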
