---
datasets:
- cerebras/SlimPajama-627B
- bigcode/starcoderdata
- HuggingFaceH4/ultrachat_200k
- HuggingFaceH4/ultrafeedback_binarized
language:
- en
license: apache-2.0
widget:
- example_title: Fibonacci (Python)
  messages:
  - role: system
    content: You are a chatbot who can help code!
  - role: user
    content: Write me a function to calculate the first 10 digits of the fibonacci
      sequence in Python and print it out to the CLI.
---
# TinyLlama-1.1B-Chat-v1.0-RK3588-1.1.4

This version of TinyLlama-1.1B-Chat-v1.0 has been converted to run on the RK3588 NPU using w8a8 quantization.

This model has been optimized with the following LoRA:

Compatible with RKLLM version: 1.1.4
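For readers unfamiliar with the term: "w8a8" means both the weights and the activations are stored as 8-bit integers. The sketch below is a generic illustration of symmetric per-tensor int8 quantization, not the actual RKLLM conversion code; the function names and the toy matmul are invented for the example.

```python
import numpy as np

np.random.seed(0)

def quantize_int8(x):
    """Symmetric per-tensor quantization: map floats onto [-127, 127]."""
    m = float(np.abs(x).max())
    scale = m / 127.0 if m > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximation of the original floats."""
    return q.astype(np.float32) * scale

# In a w8a8 matmul, both operands are int8; the hardware accumulates
# in a wider integer type (int32 here), then rescales to float using
# the product of the two scales.
w = np.random.randn(4, 4).astype(np.float32)   # "weights"
a = np.random.randn(4).astype(np.float32)      # "activations"
qw, sw = quantize_int8(w)
qa, sa = quantize_int8(a)
y = (qw.astype(np.int32) @ qa.astype(np.int32)) * (sw * sa)
ref = w @ a  # full-precision reference; y approximates this
```

The upshot is that both matmul operands shrink to a quarter of their fp32 size, at the cost of a small rounding error per element, which is the trade the NPU conversion makes.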
## Useful links:

- [Official RKLLM GitHub](https://github.com/airockchip/rknn-llm)
- [RockchipNPU Reddit](https://reddit.com/r/RockchipNPU)
- [EZRKNN-LLM](https://github.com/Pelochus/ezrknn-llm/)
- Pretty much anything by these folks: [marty1885](https://github.com/marty1885) and [happyme531](https://huggingface.co/happyme531)

Converted using [ez-er-rkllm-toolkit](https://github.com/c0zaut/ez-er-rkllm-toolkit).
# Original Model Card for base model, TinyLlama-1.1B-Chat-v1.0, below: