Update README.md
README.md CHANGED
@@ -8,12 +8,12 @@ tags:
 - large language model
 - h2o-llmstudio
 inference: false
-thumbnail: https://
+thumbnail: https://static.wixstatic.com/media/bdee4e_d0af74523fa64a998d4cfb894e8cd3bb~mv2.png/v1/crop/x_40,y_663,w_1954,h_663/fill/w_342,h_116,al_c,q_85,usm_0.66_1.00_0.01,enc_auto/PAIX%20Logo%20(2).png
 ---
 # Model Card
 ## Summary
 
-This model was trained using
+This model was trained using By PAIX.Cloud
 - Base model: [tiiuae/falcon-7b](https://huggingface.co/tiiuae/falcon-7b)
 
 
@@ -62,7 +62,6 @@ print(generate_text.preprocess("Why is drinking water so healthy?")["prompt_text
 <|prompt|>Why is drinking water so healthy?<|endoftext|><|answer|>
 ```
 
-Alternatively, you can download [h2oai_pipeline.py](h2oai_pipeline.py), store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer. If the model and the tokenizer are fully supported in the `transformers` package, this will allow you to set `trust_remote_code=False`.
 
 ```python
 import torch
@@ -168,7 +167,6 @@ RWForCausalLM(
 
 ## Model Configuration
 
-This model was trained using H2O LLM Studio and with the configuration in [cfg.yaml](cfg.yaml). Visit [H2O LLM Studio](https://github.com/h2oai/h2o-llmstudio) to learn how to train your own large language models.
 
 
 ## Model Validation
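The line removed in the second hunk described building the pipeline yourself from a locally loaded model and tokenizer instead of using the hosted h2oai_pipeline.py, which allows `trust_remote_code=False` once the architecture is natively supported by `transformers`. A minimal sketch of that approach is below; the repo id (the base model named in the card) and the generation parameters are illustrative assumptions, not values taken from this repository, and the `<|prompt|>...<|endoftext|><|answer|>` template is the one shown in the diff.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Assumption for illustration: the card names tiiuae/falcon-7b as the base model;
# substitute the fine-tuned model's repo id in practice.
model_name = "tiiuae/falcon-7b"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
    # False only works if the architecture is fully supported by transformers;
    # otherwise trust_remote_code=True is still required.
    trust_remote_code=False,
)

# Construct the text-generation pipeline from the already-loaded model and tokenizer.
generate_text = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Prompt format shown in the diff: <|prompt|>...<|endoftext|><|answer|>
prompt = "<|prompt|>Why is drinking water so healthy?<|endoftext|><|answer|>"
res = generate_text(
    prompt,
    max_new_tokens=256,      # illustrative generation settings, not from cfg.yaml
    do_sample=False,
    return_full_text=False,  # return only the generated answer
)
print(res[0]["generated_text"])
```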