---
datasets:
- IlyaGusev/ru_turbo_alpaca
- IlyaGusev/ru_turbo_saiga
- IlyaGusev/ru_sharegpt_cleaned
language:
- ru
inference: false
pipeline_tag: text2text-generation
---

A llama.cpp-compatible version of the original [30B model](https://huggingface.co/IlyaGusev/saiga_30b_lora).

How to run:

```
sudo apt-get install git-lfs
git lfs install
pip install llama-cpp-python fire

git clone https://huggingface.co/IlyaGusev/saiga_30b_lora_llamacpp
cd saiga_30b_lora_llamacpp
python3 interact.py ggml-model-q4_1.bin
```
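
If you prefer to call the model from your own script instead of `interact.py`, here is a minimal sketch using the `llama-cpp-python` API. The prompt template (`<s>role\n...</s>` turns) and the `SYSTEM_PROMPT` string are assumptions based on the Saiga model family; check `interact.py` in this repo for the exact format it uses.

```python
import os

MODEL_PATH = "ggml-model-q4_1.bin"  # quantized weights from this repo

# Assumed Saiga-style system prompt; verify against interact.py.
SYSTEM_PROMPT = (
    "Ты — Сайга, русскоязычный автоматический ассистент. "
    "Ты разговариваешь с людьми и помогаешь им."
)


def format_prompt(user_message: str, system_prompt: str = SYSTEM_PROMPT) -> str:
    """Wrap messages in the <s>role ... </s> turn format used by Saiga models."""
    return (
        f"<s>system\n{system_prompt}</s>\n"
        f"<s>user\n{user_message}</s>\n"
        f"<s>bot\n"
    )


# Only run inference if the weights are actually present locally.
if os.path.exists(MODEL_PATH):
    from llama_cpp import Llama

    llm = Llama(model_path=MODEL_PATH, n_ctx=2000)
    output = llm(
        format_prompt("Почему трава зелёная?"),
        max_tokens=256,
        stop=["</s>"],
    )
    print(output["choices"][0]["text"])
```

The `stop=["</s>"]` argument ends generation at the close of the bot turn, matching the turn delimiter in the template above.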