Are you frequently asked google-able trivia questions and annoyed by it?
This text-generation model is GPT-2 Large (774M parameters), first fine-tuned on [Wizard of Wikipedia](https://parl.ai/projects/wizard_of_wikipedia/) for 40k steps (34 of 36 layers frozen), and then fine-tuned for another 40k steps on a parsed variant of [Natural Questions](https://ai.google.com/research/NaturalQuestions) (**also** with 34 of 36 layers frozen), which accidentally produced this model.
Note that because the model was originally trained for use in a [chatbot application](https://github.com/pszemraj/ai-msgbot), it uses a named conversational dialogue structure: _the questions are asked by person alpha and answered by person beta_. Even if you don't specify person alpha, it should hopefully respond to any question.
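A minimal sketch of what such a dialogue-structured prompt might look like. The exact speaker labels (`person alpha:` / `person beta:`) and newline layout are assumptions; only the roles (alpha asks, beta answers) come from this README:

```python
# Sketch of the named-dialogue prompt structure (labels and layout assumed,
# not confirmed by the README): wrap the trivia question as person alpha's
# turn, then leave person beta's turn open for the model to complete.
def build_prompt(question: str) -> str:
    """Wrap a trivia question in the assumed alpha/beta dialogue format."""
    return f"person alpha:\n{question}\n\nperson beta:\n"

prompt = build_prompt("who wrote the novel Dune?")
print(prompt)
```

Whatever string you feed the model, ending it at the start of person beta's turn is what invites the model to answer.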
## Example Prompt
- the default examples are not great
- you can type in any trivia question, or delete the example and write `what` or `when` in there, and it will generate the rest of the trivia question **and the answer**!
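To illustrate the second bullet: seeding generation with just a question word means the prompt ends mid-question, so the model completes both the question and its answer. A sketch, again assuming the `person alpha` label from the dialogue structure:

```python
# Sketch: seed the model with only a question word ("what", "when", ...)
# so it generates the rest of the trivia question and the answer itself.
# The speaker label is an assumption; substitute whatever the model expects.
def seed_prompt(question_word: str = "what") -> str:
    """Build a partial prompt that ends right after the question word."""
    return f"person alpha:\n{question_word}"

seed = seed_prompt("when")
print(seed)
```

Passing this partial string to the text-generation step (instead of a full question) is all the trick amounts to.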