Update README.md
README.md
CHANGED
```diff
@@ -1,3 +1,16 @@
+---
+license: apache-2.0
+language:
+- en
+base_model:
+- meta-llama/Llama-3.3-70B-Instruct
+pipeline_tag: text-generation
+tags:
+- lora
+- adapter
+- writing
+- CoT
+---
 # Merged-Llama-Adapters-317-320
 
 A merged LoRA adapter combining four fine-tuned adapters (317-320) for the Llama-3.1-8B language model.
@@ -141,5 +154,4 @@ This merged adapter is part of independent individual research work. While the c
 - This work is provided "as is" without warranties or conditions of any kind
 - This is an independent research project and not affiliated with any organization
 - Attribution is appreciated but not required
-
 - For full license details, see: https://www.apache.org/licenses/LICENSE-2.0
```
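The front matter this commit adds is machine-readable model-card metadata, so it can be picked up programmatically. Below is a minimal sketch, assuming only plain string handling (not a real YAML parser) and a trimmed stand-in copy of the README text; treat it as illustrative, not a definitive way to consume model cards.

```python
# Trimmed stand-in for the README contents after this commit (not the full file).
readme = """---
license: apache-2.0
language:
- en
base_model:
- meta-llama/Llama-3.3-70B-Instruct
pipeline_tag: text-generation
tags:
- lora
- adapter
- writing
- CoT
---
# Merged-Llama-Adapters-317-320
"""

def front_matter(text: str) -> dict:
    """Split off the leading ----delimited block and fold 'key:' and '- item'
    lines into a dict of scalars and lists. Enough for this metadata shape;
    deliberately not a general YAML parser."""
    _, block, _ = text.split("---\n", 2)
    meta, key = {}, None
    for line in block.splitlines():
        if line.startswith("- ") and key is not None:
            meta[key].append(line[2:].strip())   # list item under the last key
        else:
            key, _, value = line.partition(":")
            key = key.strip()
            meta[key] = value.strip() or []       # scalar, or empty list to fill

    return meta

meta = front_matter(readme)
print(meta["license"])  # → apache-2.0
print(meta["tags"])     # → ['lora', 'adapter', 'writing', 'CoT']
```

In practice a dedicated YAML library would be the safer choice; this sketch only shows that the keys added in the diff (license, base_model, pipeline_tag, tags) round-trip cleanly.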