---
license: apache-2.0
base_model: distilbert/distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: API_Detector_3_Distilbert
  results: []
---
# my_awesome_model

This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on a custom dataset (3 classes). It achieves the following results on the evaluation set:
- Loss: 0.6536
- Accuracy: 0.9437
## Model description
Due to a lack of training data, this model does not reliably distinguish between "order" and "orderline" in sentences. As a workaround, treat an "ORDERLINE" prediction as "MAOORDER" whenever the word "orderline" appears in the sentence. This takes only one extra 'if' condition in the downstream script, which is more efficient than collecting more training data and retraining; a minimal sketch follows.
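For illustration only, here is a minimal sketch of that post-processing step. The checkpoint reference, the helper name `detect_api`, and the exact label strings are assumptions based on the description above, not a confirmed interface of this model:

```python
from transformers import pipeline

# Hypothetical checkpoint reference; substitute the actual Hub repo id or a
# local path to the fine-tuned model.
classifier = pipeline("text-classification", model="API_Detector_3_Distilbert")

def detect_api(sentence: str) -> str:
    label = classifier(sentence)[0]["label"]
    # Workaround from the model description: when the word "orderline" occurs,
    # remap an "ORDERLINE" prediction to "MAOORDER".
    if "orderline" in sentence.lower() and label == "ORDERLINE":
        label = "MAOORDER"
    return label

print(detect_api("Update the orderline quantity for order 1001"))
```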
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
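A minimal sketch of how these values map to `transformers.TrainingArguments` is shown below; the output directory, `num_labels=3`, and per-epoch evaluation are assumptions based on this card, and the dataset/tokenization code is omitted because the custom dataset is not released:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert/distilbert-base-uncased", num_labels=3  # 3 classes, per this card
)

# Hyperparameters listed above; Adam with betas=(0.9, 0.999), epsilon=1e-08 and
# the linear scheduler are already the TrainingArguments defaults.
training_args = TrainingArguments(
    output_dir="API_Detector_3_Distilbert",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="epoch",  # assumed, to match the per-epoch results below
)

# training_args is then passed to transformers.Trainer together with the
# tokenized custom 3-class dataset, which is not included in this repository.
```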
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 15   | 0.8108          | 0.8187   |
| No log        | 2.0   | 30   | 0.6536          | 0.9437   |
### Framework versions
- Transformers 4.40.1
- Pytorch 2.1.0
- Datasets 2.19.0
- Tokenizers 0.19.1