|
---
license: apache-2.0
base_model: h2oai/h2o-danube2-1.8b-base
datasets:
- ajibawa-2023/Code-290k-ShareGPT
language:
- en
library_name: transformers
tags:
- llama-factory
- unsloth
---
|
# h2o-danube2 with ChatML template |
|
|
|
This model was first fine-tuned with [BAdam](https://arxiv.org/abs/2404.02827 "BAdam: A Memory Efficient Full Parameter Optimization Method for Large Language Models") on [ajibawa-2023/Code-290k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Code-290k-ShareGPT) using LLaMA-Factory.
|
|
|
## Template |
|
|
|
```jinja
<|im_start|>system
You are a helpful coding assistant.<|im_end|>
<|im_start|>user
{{instruction}}<|im_end|>
<|im_start|>assistant
{{response}}<|im_end|>
```
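
For inference, the prompt can be assembled with `tokenizer.apply_chat_template` when the ChatML template is stored in the tokenizer config. A minimal sketch follows; the repository id is a placeholder, and if the tokenizer does not ship a chat template the `<|im_start|>`/`<|im_end|>` prompt has to be built by hand in the format above.

```python
# Minimal inference sketch with transformers.
# "your-username/danube2-code-290k-chatml-badam" is a placeholder repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/danube2-code-290k-chatml-badam"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
# Assumes the ChatML template above is saved in tokenizer_config.json;
# otherwise construct the prompt string manually.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```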
|
|
|
### BAdam config |
|
|
|
```yaml
### model
model_name_or_path: danube2-base-chatml

### method
stage: sft
do_train: true
finetuning_type: full
use_badam: true
badam_switch_mode: ascending
badam_switch_interval: 50
badam_verbose: 1
badam_start_block: 8
seed: 8

### dataset
dataset: code_290k
template: hermes_chatml
cutoff_len: 8192
overwrite_cache: false
preprocessing_num_workers: 12

### output
output_dir: code-290k-chatml-badam
logging_steps: 5
save_steps: 1
save_strategy: epoch
plot_loss: true
overwrite_output_dir: false

### train
per_device_train_batch_size: 2
gradient_accumulation_steps: 8
learning_rate: 0.00001
num_train_epochs: 1
lr_scheduler_type: constant_with_warmup
warmup_ratio: 0.01
bf16: true
flash_attn: fa2

### eval
val_size: 0.01
per_device_eval_batch_size: 1
eval_strategy: steps
eval_steps: 1000
```
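
For intuition on the `badam_*` options: BAdam performs block coordinate descent, updating one block of transformer layers at a time with Adam while keeping the rest frozen. With `badam_switch_mode: ascending`, the active block advances every `badam_switch_interval` optimizer steps, starting from `badam_start_block`. A rough sketch of that schedule is below; the block count of 24 is an assumption about h2o-danube2-1.8b, and the actual implementation lives in BAdam/LLaMA-Factory, not in this snippet.

```python
# Rough sketch of BAdam's ascending block-switching schedule under the
# config above; illustrative only, not the actual LLaMA-Factory code.
NUM_BLOCKS = 24        # assumed number of transformer blocks in danube2-1.8b
SWITCH_INTERVAL = 50   # badam_switch_interval
START_BLOCK = 8        # badam_start_block

def active_block(step: int) -> int:
    """Index of the block whose parameters receive Adam updates at `step`."""
    return (START_BLOCK + step // SWITCH_INTERVAL) % NUM_BLOCKS

for step in (0, 49, 50, 100, 1200):
    print(f"step {step:>4}: updating block {active_block(step)}")
```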
|
|
|
### BAdam training results |
|
|
|
| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 0.7404        | 0.0559 | 1000  | 0.7784          |
| 0.7858        | 0.1118 | 2000  | 0.7702          |
| 0.7274        | 0.1677 | 3000  | 0.7604          |
| 0.6956        | 0.2236 | 4000  | 0.7570          |
| 0.7711        | 0.2795 | 5000  | 0.7541          |
| 0.7643        | 0.3354 | 6000  | 0.7518          |
| 0.8255        | 0.3913 | 7000  | 0.7496          |
| 0.7456        | 0.4472 | 8000  | 0.7483          |
| 0.7718        | 0.5031 | 9000  | 0.7447          |
| 0.6693        | 0.5590 | 10000 | 0.7445          |
| 0.7409        | 0.6149 | 11000 | 0.7433          |
| 0.7319        | 0.6709 | 12000 | 0.7424          |
| 0.7636        | 0.7268 | 13000 | 0.7415          |
| 0.7504        | 0.7827 | 14000 | 0.7414          |
| 0.7735        | 0.8386 | 15000 | 0.7374          |
| 0.7438        | 0.8945 | 16000 | 0.7375          |
| 0.839         | 0.9504 | 17000 | 0.7373          |
|
|