  temperature: 0.7
tags:
- translation
---

# Model Card for gpt-sw3-6.7b-v2-translator

The `gpt-sw3-6.7b-v2-translator` is a version of `gpt-sw3-6.7b-v2-instruct` finetuned on a carefully selected dataset of translation pairs gathered by AI Sweden.

## How to use:
```python
import torch
from transformers import pipeline, StoppingCriteriaList, StoppingCriteria


# Stopping criterion: end generation as soon as the given stop token id is generated.
class StopOnTokenCriteria(StoppingCriteria):
    def __init__(self, stop_token_id):
        self.stop_token_id = stop_token_id

    def __call__(self, input_ids, scores, **kwargs):
        return input_ids[0, -1] == self.stop_token_id


device = "cuda" if torch.cuda.is_available() else "cpu"

stop_on_token_criteria = StopOnTokenCriteria(stop_token_id=2)
pipe = pipeline("text-generation", "AI-Sweden-Models/gpt-sw3-6.7b-v2-translator", device=device)

text_to_translate = "I like to eat ice cream in the summer."
# The prompt uses the model's chat format: "<|endoftext|><s>User: <instruction>\n<text><s>Bot:"
prompt = f"<|endoftext|><s>User: Översätt till Svenska från Engelska\n{text_to_translate}<s>Bot:"

response = pipe(prompt, max_length=768, stopping_criteria=StoppingCriteriaList([stop_on_token_criteria]))
# The generated text includes the prompt, so keep only the text after the final "<s>Bot: " tag.
print(response[0]["generated_text"].split("<s>Bot: ")[-1])

>>> "Jag tycker om att äta glass på sommaren."
```
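If you have several sentences to translate, the same pipeline can simply be called once per sentence. Below is a minimal sketch that wraps the call above in a helper; the `translate_to_swedish` name and the second example sentence are illustrative, not part of the model's API, and it assumes the `pipe` and `stop_on_token_criteria` objects from the snippet above are already defined.

```python
# Illustrative helper (hypothetical name), reusing `pipe` and `stop_on_token_criteria`
# from the snippet above.
def translate_to_swedish(text: str) -> str:
    prompt = f"<|endoftext|><s>User: Översätt till Svenska från Engelska\n{text}<s>Bot:"
    response = pipe(
        prompt,
        max_length=768,
        stopping_criteria=StoppingCriteriaList([stop_on_token_criteria]),
    )
    # Keep only the text after the final "<s>Bot: " tag.
    return response[0]["generated_text"].split("<s>Bot: ")[-1]


for sentence in ["I like to eat ice cream in the summer.", "Winters in Sweden are long."]:
    print(translate_to_swedish(sentence))
```

Calling the pipeline one sentence at a time also keeps the stopping criterion, which only inspects the first sequence in a batch, well defined.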
## Dataset: