DistilBERT base uncased, fine-tuned for NER using the conll03 English dataset. Note that this model is not case sensitive: "english" is treated the same as "English". For the case-sensitive version, please use elastic/distilbert-base-cased-finetuned-conll03-english.
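For reference, a minimal inference sketch using the transformers token-classification pipeline. The example sentence is arbitrary, and the aggregation_strategy argument assumes a more recent transformers release than the 4.3.1 listed below:

from transformers import pipeline

# Load the published checkpoint as a token-classification (NER) pipeline.
# aggregation_strategy="simple" groups word pieces back into whole entities.
ner = pipeline(
    "token-classification",
    model="elastic/distilbert-base-uncased-finetuned-conll03-english",
    aggregation_strategy="simple",
)

print(ner("My name is Clara and I live in Berkeley, California."))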
Versions
- Transformers version: 4.3.1
- Datasets version: 1.3.0
Training
$ python run_ner.py \
--model_name_or_path distilbert-base-uncased \
--label_all_tokens True \
--return_entity_level_metrics True \
--dataset_name conll2003 \
--output_dir /tmp/distilbert-base-uncased-finetuned-conll03-english \
--do_train \
--do_eval
After training, we update the labels in the model configuration to match the NER-specific labels from the conll2003 dataset, as sketched below.
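The exact relabeling step is not published here; the following is a minimal sketch of how the generic LABEL_0 … LABEL_8 names could be replaced with the conll2003 tag set. The label order is the one used by the Hugging Face conll2003 dataset and is an assumption on our part:

from transformers import AutoConfig, AutoModelForTokenClassification

# CoNLL-2003 NER tag set in the order used by the Hugging Face `conll2003` dataset
# (assumed here; verify against dataset.features["ner_tags"] before relying on it).
LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

model_dir = "/tmp/distilbert-base-uncased-finetuned-conll03-english"  # --output_dir from the command above

# Rewrite the id2label / label2id tables in the saved config, then re-save the model
# so downstream pipelines report entity names instead of LABEL_0 .. LABEL_8.
config = AutoConfig.from_pretrained(model_dir)
config.id2label = {i: label for i, label in enumerate(LABELS)}
config.label2id = {label: i for i, label in enumerate(LABELS)}

model = AutoModelForTokenClassification.from_pretrained(model_dir, config=config)
model.save_pretrained(model_dir)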
Evaluation results
Metrics on the conll2003 validation set (verified):
- Accuracy: 0.985
- Precision: 0.988
- Recall: 0.990
- F1: 0.989
- Loss: 0.067