Update README.md
README.md CHANGED
@@ -3,14 +3,18 @@ license: apache-2.0
 tags:
 - moe
 - merge
+language:
+- en
+- tr
 ---
 
 # Megatron_v3_2x7B
 
-Megatron_v3_2x7B is a Mixure of Experts (MoE).
+Megatron_v3_2x7B is a bilingual Mixture of Experts (MoE) that can understand and respond in both English and Turkish.
+Megatron, MoE mimarisine sahip Türkçe ve İngilizce talimatları anlayan ve cevap veren bir modeldir.
 
 
-## 💻 Usage
+## 💻 Usage/Kullanımı
 
 ```python
 !pip install -qU transformers bitsandbytes accelerate