---
license: gpl-3.0
base_model: ckiplab/bert-base-chinese
tags:
- generated_from_trainer
model-index:
- name: bert-base-chinese-finetuned-QA-b8
  results: []
---

# bert-base-chinese-finetuned-QA-b8

This model is a fine-tuned version of [ckiplab/bert-base-chinese](https://huggingface.co/ckiplab/bert-base-chinese) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3405

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.9325        | 0.14  | 500   | 1.2076          |
| 1.1199        | 0.29  | 1000  | 1.0315          |
| 1.0118        | 0.43  | 1500  | 0.9836          |
| 0.9398        | 0.58  | 2000  | 0.9762          |
| 0.9526        | 0.72  | 2500  | 0.9374          |
| 0.9142        | 0.87  | 3000  | 0.8783          |
| 0.8265        | 1.01  | 3500  | 0.9919          |
| 0.6091        | 1.16  | 4000  | 0.9613          |
| 0.6303        | 1.3   | 4500  | 0.9769          |
| 0.6161        | 1.45  | 5000  | 0.9882          |
| 0.6109        | 1.59  | 5500  | 0.9160          |
| 0.5887        | 1.73  | 6000  | 0.9105          |
| 0.5811        | 1.88  | 6500  | 0.9812          |
| 0.5638        | 2.02  | 7000  | 1.0669          |
| 0.4174        | 2.17  | 7500  | 1.2101          |
| 0.3958        | 2.31  | 8000  | 1.2186          |
| 0.4032        | 2.46  | 8500  | 1.1691          |
| 0.4183        | 2.6   | 9000  | 1.0890          |
| 0.4247        | 2.75  | 9500  | 1.0721          |
| 0.3917        | 2.89  | 10000 | 1.1714          |
| 0.3738        | 3.04  | 10500 | 1.1794          |
| 0.29          | 3.18  | 11000 | 1.2494          |
| 0.326         | 3.32  | 11500 | 1.2822          |
| 0.3076        | 3.47  | 12000 | 1.3214          |
| 0.3071        | 3.61  | 12500 | 1.2968          |
| 0.2797        | 3.76  | 13000 | 1.3410          |
| 0.3192        | 3.9   | 13500 | 1.3405          |

Note that validation loss reaches its minimum (0.8783 at step 3000, around epoch 0.87) and climbs steadily afterward, so the final checkpoint is likely overfit relative to earlier ones.

### Framework versions

- Transformers 4.34.0
- PyTorch 1.13.1+cu116
- Datasets 2.14.5
- Tokenizers 0.14.1
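
## How to use

The card does not include a usage snippet. Below is a minimal sketch assuming, based on the model name, that this checkpoint carries an extractive (span-based) question-answering head (`BertForQuestionAnswering`); the model path is a placeholder for wherever this checkpoint is stored, not a confirmed Hub id.

```python
from transformers import pipeline

# Placeholder path: point this at the Hub repo or local directory
# that holds the fine-tuned checkpoint.
qa = pipeline("question-answering", model="bert-base-chinese-finetuned-QA-b8")

result = qa(
    question="台灣最高的山是哪一座?",              # "Which is Taiwan's highest mountain?"
    context="玉山海拔約3952公尺,是台灣最高的山峰。",  # "Yushan, about 3952 m, is Taiwan's highest peak."
)
print(result["answer"], result["score"])  # extracted span and its confidence
```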
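
## Reproducing the training configuration

For reference, the hyperparameters listed under "Training procedure" map onto `transformers.TrainingArguments` (Transformers 4.34) roughly as follows. The output directory and the evaluation cadence of every 500 steps (inferred from the results table) are assumptions, as is the rest of the surrounding `Trainer` setup, which the card does not describe.

```python
from transformers import TrainingArguments

# Sketch of the configuration reported on this card; output_dir and the
# steps-based evaluation schedule are assumptions, not from the card.
args = TrainingArguments(
    output_dir="bert-base-chinese-finetuned-QA-b8",
    learning_rate=3e-5,
    per_device_train_batch_size=2,  # train_batch_size: 2
    per_device_eval_batch_size=2,   # eval_batch_size: 2
    gradient_accumulation_steps=4,  # effective train batch size: 2 * 4 = 8
    num_train_epochs=4,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,                 # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,              # and epsilon=1e-08
    evaluation_strategy="steps",    # validation loss logged every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```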