---
library_name: transformers
base_model: fahadqazi/testts1234
tags:
- generated_from_trainer
model-index:
- name: testts1234
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# testts1234
This model is a fine-tuned version of [fahadqazi/testts1234](https://huggingface.co/fahadqazi/testts1234) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4003
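Loading the checkpoint is not documented here; below is a minimal sketch, assuming the standard `transformers` Auto classes resolve this checkpoint. The exact task/head class depends on the architecture, which this card does not specify.

```python
# Minimal loading sketch (assumption: the generic Auto classes work for this
# checkpoint; swap in the task-specific class once the architecture is known).
from transformers import AutoProcessor, AutoModel

processor = AutoProcessor.from_pretrained("fahadqazi/testts1234")
model = AutoModel.from_pretrained("fahadqazi/testts1234")
```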
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 2
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- training_steps: 10000
- mixed_precision_training: Native AMP
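For reference, these values map onto the `transformers` `TrainingArguments` API roughly as follows. This is a sketch, not the original training script (which is not included in this card); `output_dir` is a placeholder, and the batch size is assumed to be per-device on a single GPU.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="testts1234",        # placeholder, not from the card
    learning_rate=1e-4,
    per_device_train_batch_size=64, # assumes single-device training
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=10000,
    fp16=True,                      # "Native AMP" mixed precision
)
```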
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-------:|:-----:|:---------------:|
| 0.4412 | 1.9231 | 500 | 0.4350 |
| 0.4381 | 3.8462 | 1000 | 0.4291 |
| 0.4361 | 5.7692 | 1500 | 0.4394 |
| 0.4382 | 7.6923 | 2000 | 0.4251 |
| 0.4304 | 9.6154 | 2500 | 0.4305 |
| 0.4321 | 11.5385 | 3000 | 0.4226 |
| 0.423 | 13.4615 | 3500 | 0.4271 |
| 0.4204 | 15.3846 | 4000 | 0.4234 |
| 0.4212 | 17.3077 | 4500 | 0.4218 |
| 0.4171 | 19.2308 | 5000 | 0.4209 |
| 0.4116 | 21.1538 | 5500 | 0.4145 |
| 0.4109 | 23.0769 | 6000 | 0.4083 |
| 0.4045 | 25.0 | 6500 | 0.4100 |
| 0.4058 | 26.9231 | 7000 | 0.4095 |
| 0.3992 | 28.8462 | 7500 | 0.4078 |
| 0.402 | 30.7692 | 8000 | 0.4072 |
| 0.3987 | 32.6923 | 8500 | 0.4036 |
| 0.396 | 34.6154 | 9000 | 0.4030 |
| 0.3912 | 36.5385 | 9500 | 0.4010 |
| 0.3937 | 38.4615 | 10000 | 0.4003 |
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
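To verify that a local environment matches these versions, one quick sanity check (package names as published on PyPI):

```python
# Compare installed package versions against those listed above.
from importlib.metadata import version

expected = {
    "transformers": "4.47.1",
    "torch": "2.5.1+cu121",
    "datasets": "3.2.0",
    "tokenizers": "0.21.0",
}
for pkg, want in expected.items():
    print(f"{pkg}: installed {version(pkg)}, card lists {want}")
```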