Description

NonToxicCivilBert, or 'CivilBert', is a further fine-tuned version of bert-base-uncased trained only on the non-toxic examples of the Jigsaw Unintended Bias dataset, with the goal of making its predicted tokens less toxic. We are continuing to improve the model and further reduce its toxicity.

You can use it directly with the transformers library:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("Ashokajou51/NonToxicCivilBert")
model = AutoModelForMaskedLM.from_pretrained("Ashokajou51/NonToxicCivilBert")
```
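Since this is a masked language model, a simple way to try it out is the `fill-mask` pipeline. The sketch below is a minimal usage example (not from the model card itself); it assumes transformers and a backend such as PyTorch are installed, and that the model weights are downloaded from the Hugging Face Hub on first use. The example sentence is illustrative only.

```python
from transformers import pipeline

# Build a fill-mask pipeline backed by the model above.
fill_mask = pipeline("fill-mask", model="Ashokajou51/NonToxicCivilBert")

# The pipeline returns the top candidate tokens for the [MASK] position,
# each with a confidence score.
for prediction in fill_mask("The weather today is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Each prediction is a dict containing the filled-in token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).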
