---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- pytorch
- mistral
- finetuned
---
# Mistral 7B - Holodeck exl2

Original model: Mistral-7B-Holodeck-1

Model creator: KoboldAI
## Quants

- 4bpw-h6 (main)
- 4.25bpw-h6
- 4.65bpw-h6
- 5bpw-h6
- 6bpw-h6
- 8bpw-h8
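As a rough guide for choosing among the quants above, the on-disk/VRAM weight size scales linearly with bits per weight. A minimal sketch, assuming an approximate Mistral 7B parameter count of 7.24B (not stated in this card) and ignoring context-cache and activation overhead:

```python
# Rough weight-size estimate for an exl2 quant: parameters * bits-per-weight / 8.
# 7.24e9 is an assumed approximate parameter count for Mistral 7B;
# KV cache and activation memory are not included.
def weights_gb(params: float, bpw: float) -> float:
    return params * bpw / 8 / 1e9

for bpw in (4.0, 4.25, 4.65, 5.0, 6.0, 8.0):
    print(f"{bpw}bpw: ~{weights_gb(7.24e9, bpw):.1f} GB")
```

Actual VRAM use will be higher once the context cache is allocated, so leave headroom beyond these figures.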
## Quantization notes

Made with exllamav2 0.0.15 using its default calibration dataset.
## How to run

This quantization format runs on GPU and requires the Exllamav2 loader, which can be found in the following applications:
## Original card
# Mistral 7B - Holodeck

## Model Description

Mistral 7B-Holodeck is a finetune created using Mistral's 7B model.
## Training data

The training data contains around 3000 ebooks in various genres.

Most parts of the dataset have been prepended using the following text: `[Genre: <genre1>, <genre2>]`
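Because the training data carries this genre prefix, prompts at inference time can mimic it to steer the output. A minimal sketch of building such a prompt; the helper name is hypothetical, but the `[Genre: ...]` prefix matches the format described above:

```python
# Build a prompt in the "[Genre: <genre1>, <genre2>]" format the card describes.
# make_prompt is an illustrative helper, not part of the model or any library.
def make_prompt(genres: list[str], story_start: str) -> str:
    tag = "[Genre: " + ", ".join(genres) + "]"
    return tag + "\n" + story_start

print(make_prompt(["horror", "mystery"], "The old house at the end of the lane"))
```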
## Limitations and Biases

Based on known problems with NLP technology, potential relevant factors include biases related to gender, profession, race, and religion.