---
license: apache-2.0
datasets:
- cerebras/SlimPajama-627B
- bigcode/starcoderdata
- mc4
language:
- fi
- en
- da
- sv
- 'no'
- nn
- is
---
# Viking 7B
Viking 7B is a 7B-parameter decoder-only transformer pretrained on 2 trillion tokens of Finnish, English, Swedish, Danish, Norwegian, Icelandic, and code. Viking 7B is a fully open source model, made available under the Apache 2.0 license.
Please see [the upstream repository](https://huggingface.co/LumiOpen/Viking-7B) for more information.
This GGML quantization was produced with [akx/ggify](https://github.com/akx/ggify) using llama.cpp b2901, with a small modification to the conversion script to support the Viking tokenizer.