Quantized models
#3
by MaziyarPanahi
Thanks for sharing your model. I have quantized it in GGUF format in case anyone needs to run it on CPUs:
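For anyone new to GGUF, here is a minimal sketch of loading one of the quantized files on CPU with llama-cpp-python. The file name below is a placeholder; use the exact quantization variant you downloaded.

```python
# Minimal sketch: running a GGUF quantized model on CPU with llama-cpp-python.
# The model_path is a placeholder; substitute the GGUF file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="model.Q4_K_M.gguf",  # placeholder path to the downloaded GGUF file
    n_ctx=2048,                      # context window size
    n_threads=8,                     # number of CPU threads to use
)

# Simple text completion; returns an OpenAI-style response dict.
output = llm("Explain what GGUF quantization is in one sentence.", max_tokens=64)
print(output["choices"][0]["text"])
```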