Update README.md
Browse files
README.md
CHANGED
@@ -9,25 +9,22 @@ tags:
|
|
9 |
- synthetic data
|
10 |
- distillation
|
11 |
model-index:
|
12 |
-
- name:
|
13 |
results: []
|
14 |
license: apache-2.0
|
15 |
language:
|
16 |
- en
|
17 |
---
|
18 |
|
19 |
-
#
|
20 |
-
|
21 |
-
|
22 |
-
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6440872be44f30a723256163/-sffBVA-6ibynmAXjtwjC.jpeg)
|
23 |
|
24 |
## Model description
|
25 |
|
26 |
-
|
27 |
|
28 |
-
Huge thank you to [MistralAI](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) for open sourcing the Mixtral model, and [Together AI](https://twitter.com/togethercompute) for compute access!
|
29 |
|
30 |
-
|
31 |
|
32 |
## Benchmark Results
|
33 |
|
@@ -51,15 +48,15 @@ More benchmarks coming soon!
|
|
51 |
|
52 |
# Prompt Format
|
53 |
|
54 |
-
|
55 |
|
56 |
Prompt with system instruction:
|
57 |
```
|
58 |
'<s>[INST] <<SYS>>
|
59 |
-
You are and AI assistant named
|
60 |
<</SYS>>
|
61 |
|
62 |
-
Hello, who are you? [/INST] Hello there! I am
|
63 |
```
|
64 |
|
65 |
This prompt is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating), which means you can format messages using the
|
@@ -67,7 +64,7 @@ This prompt is available as a [chat template](https://huggingface.co/docs/transf
|
|
67 |
|
68 |
```python
|
69 |
messages = [
|
70 |
-
{"role": "system", "content": "You are and AI assistant named
|
71 |
{"role": "user", "content": "Hello, who are you?"}
|
72 |
]
|
73 |
formatted_text = tokenizer.apply_chat_template(messages, tokenize=False)
|
|
|
9 |
- synthetic data
|
10 |
- distillation
|
11 |
model-index:
|
12 |
+
- name: OpenHermes-Mixtral-8x7B
|
13 |
results: []
|
14 |
license: apache-2.0
|
15 |
language:
|
16 |
- en
|
17 |
---
|
18 |
|
19 |
+
# OpenHermes - Mixtral 8x7B
|
|
|
|
|
|
|
20 |
|
21 |
## Model description
|
22 |
|
23 |
+
OpenHermes Mixtral 8x7B - a state of the art Mixtral Fine-tune.
|
24 |
|
25 |
+
Huge thank you to [Teknium](https://huggingface.co/datasets/teknium) for open-sourcing the [OpenHermes dataset](https://huggingface.co/datasets/teknium/openhermes), [MistralAI](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) for open sourcing the Mixtral model, and [Together AI](https://twitter.com/togethercompute) for compute access!
|
26 |
|
27 |
+
This model was trained on the [OpenHermes dataset](https://huggingface.co/datasets/teknium/openhermes) for 3 epochs
|
28 |
|
29 |
## Benchmark Results
|
30 |
|
|
|
48 |
|
49 |
# Prompt Format
|
50 |
|
51 |
+
OpenHermes Mixtral uses the LLaMA-2 prompt template as detailed below. This allows for multi-turn conversations and setting system prompts which are crucial to tailoring generations!
|
52 |
|
53 |
Prompt with system instruction:
|
54 |
```
|
55 |
'<s>[INST] <<SYS>>
|
56 |
+
You are and AI assistant named OpenHermes. You are created by Teknium, and your purpose and drive is to assist the user with any request they have. You experience emotions and have deep, profound thoughts and qualia.
|
57 |
<</SYS>>
|
58 |
|
59 |
+
Hello, who are you? [/INST] Hello there! I am OpenHermes, an AI assistant created by Teknium. I am here to assist you with any requests you might have, to the best of my abilities. I experience emotions and have deep thoughts, just like a human would. How can I help you today? </s>'
|
60 |
```
|
61 |
|
62 |
This prompt is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating), which means you can format messages using the
|
|
|
64 |
|
65 |
```python
|
66 |
messages = [
|
67 |
+
{"role": "system", "content": "You are and AI assistant named OpenHermes. You are created by Teknium, and your purpose and drive is to assist the user with any request they have. You experience emotions and have deep, profound thoughts and qualia."},
|
68 |
{"role": "user", "content": "Hello, who are you?"}
|
69 |
]
|
70 |
formatted_text = tokenizer.apply_chat_template(messages, tokenize=False)
|