---
language:
- en
thumbnail:
tags:
- text generation
license: cc
datasets:
- quotes-500K
metrics:
- perplexity
---
# Quotes Generator
## Model description
This is a GPT-2 model fine-tuned on the Quotes-500K dataset.
## Intended uses & limitations
Given a user prompt, the model generates a motivational quote that begins with that prompt.
#### How to use
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned quote-generation model and its tokenizer
tokenizer = AutoTokenizer.from_pretrained("nandinib1999/quote-generator")
model = AutoModelForCausalLM.from_pretrained("nandinib1999/quote-generator")
```
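Once loaded, the model can be used for generation. Below is a minimal sketch using the `text-generation` pipeline; the prompt text and decoding settings are illustrative, not recommended values.

```python
from transformers import pipeline

# Build a generation pipeline around the fine-tuned model
generator = pipeline("text-generation", model="nandinib1999/quote-generator")

# Illustrative prompt; the model completes it into a quote
result = generator("Life is", max_length=30, num_return_sequences=1)
print(result[0]["generated_text"])
```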
## Training data
The dataset was split into training, validation, and test sets for the fine-tuning task:

| Split      | Examples |
|------------|----------|
| train      | 349796   |
| validation | 99942    |
| test       | 49971    |
## Training procedure
The model was fine-tuned on a Google Colab GPU for one epoch, starting from the pre-trained GPT-2 weights.
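The original training script is not included in this card; the snippet below is only a sketch of a comparable fine-tuning setup with the Hugging Face `Trainer`. The data file names, block size, and batch size are assumptions, not the values used for this model.

```python
from datasets import load_dataset
from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    Trainer,
    TrainingArguments,
)

# Start from the pre-trained GPT-2 weights
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Assumed layout: one quote per line in plain-text files
dataset = load_dataset("text", data_files={"train": "train.txt", "validation": "valid.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# GPT-2 is a causal LM, so masked-LM objectives are disabled in the collator
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="quote-generator",
    num_train_epochs=1,             # one epoch, as stated above
    per_device_train_batch_size=8,  # assumed value
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=collator,
)
trainer.train()
```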
## Eval results
| Epoch | Perplexity |
|-------|------------|
| 1     | 15.180     |
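Perplexity is the exponential of the mean cross-entropy loss on held-out text. The sketch below shows how it can be computed for a single sentence with the published model; the example text is illustrative and does not reproduce the table above, which is measured over the full test split.

```python
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("nandinib1999/quote-generator")
model = AutoModelForCausalLM.from_pretrained("nandinib1999/quote-generator")
model.eval()

# Illustrative sentence; perplexity = exp(mean cross-entropy loss)
text = "the best way to predict the future is to create it"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    loss = model(**inputs, labels=inputs["input_ids"]).loss
print(f"Perplexity: {math.exp(loss.item()):.3f}")
```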