Q4_K_M version of https://huggingface.co/t-tech/T-lite-it-1.0
Example usage with llama-cpp-python:
```python
from llama_cpp import Llama
from huggingface_hub import snapshot_download

# download the GGUF weights from the Hub
snapshot_download(repo_id="ichrnkv/t_lite_1.0_gguf", local_dir="./")

# initialize the llama-cpp model from the downloaded file
model = Llama(
    model_path="./model.gguf",
    verbose=True,
    n_gpu_layers=-1,  # offload all layers to the GPU
    seed=42,
)
```
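Once the model is loaded, text can be generated through the standard llama-cpp-python chat API. The sketch below is illustrative and not part of the upstream card: the messages are placeholders, and it assumes the GGUF file carries the chat template of the base T-lite-it-1.0 model.

```python
# Minimal inference sketch (illustrative; prompt contents are placeholders).
# Assumes the GGUF embeds the base model's chat template.
output = model.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a short poem about spring."},
    ],
    max_tokens=256,
    temperature=0.7,
)
print(output["choices"][0]["message"]["content"])
```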