phi-3-mini-llamafile-nonAVX
llamafile lets you distribute and run LLMs with a single file (see the llamafile announcement blog post).
Downloads
This repository was created using llamafile-builder.
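Because a llamafile is a self-contained executable, running the model comes down to downloading the file, marking it executable, and launching it. The Python sketch below illustrates that flow under assumptions: it uses `hf_hub_download` from `huggingface_hub`, and the filename `phi-3-mini.llamafile` and the `-p` prompt flag are illustrative placeholders rather than details confirmed by this card; check the repository's file listing and the llamafile documentation for the exact names.

```python
import os
import stat
import subprocess

from huggingface_hub import hf_hub_download

# Download the llamafile from this repository.
# NOTE: the filename below is a placeholder; use the actual file name
# shown in the repository's "Files" tab.
path = hf_hub_download(
    repo_id="blueprintninja/phi-3-mini-llamafile-nonAVX",
    filename="phi-3-mini.llamafile",
)

# llamafiles are single-file executables: mark the download as executable.
os.chmod(path, os.stat(path).st_mode | stat.S_IEXEC)

# Run it with a prompt. The "-p" flag follows llama.cpp conventions, which
# llamafile builds on; on some systems the file may need to be launched
# via "sh" instead of directly.
subprocess.run([path, "-p", "Hello, world!"], check=True)
```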
Base model: QuantFactory/Phi-3-mini-4k-instruct-GGUF