UzRoBerta model.
A pretrained model for Uzbek (Cyrillic and Latin scripts) for masked language modeling and next sentence prediction.
Training data.
The UzRoBerta model was pretrained on ≈2M news articles (≈3 GB).
- text: "Is this review positive or negative? Review: Best cast iron skillet you will every buy." example_title: "Sentiment analysis"
- text: "Barack Obama nominated Hilary Clinton as his secretary of state on Monday. He chose her because she had ..." example_title: "Coreference resolution"