---
license: apache-2.0
base_model: Felladrin/TinyMistral-248M-Chat-v2
---

GGUF version of [Felladrin/TinyMistral-248M-Chat-v2](https://huggingface.co/Felladrin/TinyMistral-248M-Chat-v2).

## Usage with llama.cpp

Install llama.cpp (for example, via Homebrew):

```bash
brew install ggerganov/ggerganov/llama.cpp
```

Then run the model, passing the prompt in ChatML format:

```bash
llama-cli \
  --hf-repo Felladrin/gguf-TinyMistral-248M-Chat-v2 \
  --model TinyMistral-248M-Chat-v2.Q8_0.gguf \
  -p "<|im_start|>system\nYou are a helpful assistant who answers user's questions with details and curiosity.<|im_end|>\n<|im_start|>user\nWhat are some potential applications for quantum computing?<|im_end|>\n<|im_start|>assistant\n" \
  -e \
  --dynatemp-range "0.1-0.35" \
  --min-p 0.05 \
  --repeat-penalty 1.1 \
  -c 2048 \
  -n 250 \
  -r "<|im_end|>"
```
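
If you prefer to serve the model over HTTP instead of running it interactively, the same GGUF file can also be loaded with `llama-server`, which ships with llama.cpp. The sketch below is not part of the original card: it assumes `TinyMistral-248M-Chat-v2.Q8_0.gguf` is already present locally (for example, downloaded by the `llama-cli` invocation above) and that your build exposes the OpenAI-compatible `/v1/chat/completions` endpoint.

```bash
# Sketch only: serve the local GGUF file with the same context size as above.
llama-server \
  --model TinyMistral-248M-Chat-v2.Q8_0.gguf \
  -c 2048 \
  --port 8080

# Query the OpenAI-compatible chat endpoint (assumes the server build provides it).
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "messages": [
          {"role": "system", "content": "You are a helpful assistant who answers questions with details and curiosity."},
          {"role": "user", "content": "What are some potential applications for quantum computing?"}
        ],
        "max_tokens": 250
      }'
```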