Update README.md
README.md CHANGED
@@ -16,11 +16,41 @@ We create various AI models and develop solutions that can be applied to busines
This model was built based on the Mistral architecture. It was inspired by neural connection technology and rehabilitation therapy.
I have created a new model architecture that does not require pretraining; the model can be trained on a single H100 in just 7 hours.

### Data
Intel/orca_dpo_pairs (DPO)
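
The card lists the dataset but not how it is consumed. Below is a minimal sketch, not part of the original card, of loading Intel/orca_dpo_pairs and mapping it to the prompt/chosen/rejected format most DPO trainers expect; the column names (`system`, `question`, `chosen`, `rejected`) follow the public dataset card, and the prompt formatting is an assumption.

```python
from datasets import load_dataset

# Load the preference pairs used for DPO.
dataset = load_dataset("Intel/orca_dpo_pairs", split="train")

def to_dpo_format(example):
    # Fold the system message into the prompt; formatting is illustrative.
    prompt = f"{example['system']}\n{example['question']}".strip()
    return {
        "prompt": prompt,
        "chosen": example["chosen"],
        "rejected": example["rejected"],
    }

dpo_dataset = dataset.map(to_dpo_format, remove_columns=dataset.column_names)
print(dpo_dataset[0]["prompt"][:200])
```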

### Surgery and Training
viethq188/LeoScorpius-7B-Chat-DPO : 0 ~ 24
upstage/SOLAR-10.7B-Instruct-v1.0 : 10 ~ 48
Stacking these two slices gives 62 layers in total; the merged model was then fine-tuned with QLoRA and DPO (see the sketches below).
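
The surgery script itself is not included in the card. The following is a minimal sketch of depth-wise layer stacking under two assumptions: the ranges above are half-open (24 + 38 = 62 decoder layers), and both donors share the Mistral layer dimensions. The output path is a placeholder.

```python
import torch
from transformers import AutoModelForCausalLM

donor_a = AutoModelForCausalLM.from_pretrained(
    "viethq188/LeoScorpius-7B-Chat-DPO", torch_dtype=torch.bfloat16
)
donor_b = AutoModelForCausalLM.from_pretrained(
    "upstage/SOLAR-10.7B-Instruct-v1.0", torch_dtype=torch.bfloat16
)

# Take decoder layers 0-23 from LeoScorpius and 10-47 from SOLAR (24 + 38 = 62).
stacked = list(donor_a.model.layers[:24]) + list(donor_b.model.layers[10:48])

# Reuse SOLAR as the base and splice in the stacked layer list.
merged = donor_b
merged.model.layers = torch.nn.ModuleList(stacked)
merged.config.num_hidden_layers = len(stacked)
# Note: recent transformers versions keep a per-layer `self_attn.layer_idx`
# for KV-cache bookkeeping; those indices may need renumbering after splicing.

merged.save_pretrained("./stacked-62L")
```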
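
The QLoRA + DPO fine-tuning step is likewise not shown in the card; this sketch uses trl's DPOTrainer on the stacked checkpoint with a 4-bit base and a LoRA adapter. Hyperparameters are illustrative rather than the values actually used, and the exact DPOTrainer signature varies between trl versions.

```python
import torch
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          BitsAndBytesConfig, TrainingArguments)
from peft import LoraConfig
from trl import DPOTrainer

base = "./stacked-62L"  # placeholder path for the merged model from the previous sketch

# Load the base model in 4-bit (QLoRA-style).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    base, quantization_config=bnb_config, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token

peft_config = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,  # with peft_config set, the frozen base serves as the reference
    args=TrainingArguments(
        output_dir="dpo-out",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=5e-5,
        bf16=True,
    ),
    beta=0.1,
    train_dataset=dpo_dataset,  # prompt/chosen/rejected dataset from the Data sketch
    tokenizer=tokenizer,
    peft_config=peft_config,
)
trainer.train()
```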

### How to Use

```python
import transformers
from transformers import AutoTokenizer

# Repo id of this model on the Hugging Face Hub (placeholder; set to the actual name).
hf_model = "..."

message = [
    {"role": "system", "content": "You are a helpful assistant chatbot."},
    # Translated from the original Korean prompt.
    {"role": "user", "content": "Two spheres have diameters of 1 and 2. How many times larger is the volume of the bigger sphere? Please explain as well."},
]

tokenizer = AutoTokenizer.from_pretrained(hf_model)
prompt = tokenizer.apply_chat_template(message, add_generation_prompt=True, tokenize=False)

pipeline = transformers.pipeline(
    "text-generation",
    model=hf_model,
    tokenizer=tokenizer,
)

sequences = pipeline(
    prompt,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    num_return_sequences=1,
    max_length=512,
)
print(sequences[0]["generated_text"])
```

### Contact (TBD)
If you have any questions, please raise an issue or contact us at [email protected]