---

language:
- en
tags:
- text-generation
- gpt2
- gpt
- trivia
- chatbot
license: mit

widget:
- text: "how many ping-pong balls fit inside a standard 747 jet aeroplane?\nperson beta:\n\n"
  example_title: "ping-pong"
- text: "What is the capital of Uganda?\nperson beta:\n\n"
  example_title: "geography"
- text: "What is the most popular TV show of all time?\nperson beta:\n\n"
  example_title: "pseudo-culture"
- text: "A man pushes his car to a hotel and tells the owner he’s bankrupt. Why?\nperson beta:\n\n"
  example_title: "brain teaser"

inference:
  parameters:
    min_length: 2
    max_length: 32
    no_repeat_ngram_size: 2
    do_sample: false
    num_beams: 4
    early_stopping: true
    repetition_penalty: 2.1
    

---

# Ballpark Trivia: Size XL

**Check out a demo on HF Spaces [here](https://huggingface.co/spaces/pszemraj/ballpark-trivia).**

Are you frequently asked google-able trivia questions and annoyed by it? Well, this is the model for you! Ballpark Trivia Bot answers any trivia question with something that sounds plausible but is probably not 100% correct. One might say... the answers are in the right ballpark.

This is by far the largest model trained for this project, and it should be _more_ credible in its answers, or at least able to handle more kinds of questions.

``` 
what is the temperature of dry ice in kelvin

person beta: 
194.65 K
```

## Training 
This text-generation model is GPT-2 XL (~1.5 B parameters), first fine-tuned on [Wizard of Wikipedia](https://parl.ai/projects/wizard_of_wikipedia/) for 40k steps (with **33**/36 layers frozen), and then fine-tuned for a further 40k steps on a parsed variant of [Natural Questions](https://ai.google.com/research/NaturalQuestions) (with **34**/36 layers frozen for the second stage), which accidentally produced this model.
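The layer-freezing used in the two fine-tuning stages can be sketched roughly as follows. The helper below is generic; the model ID and the (uncalled) usage function are illustrative only, and the actual training loop is not part of this card:

```python
def freeze_first_n(blocks, n):
    """Disable gradients for the first n transformer blocks; leave the rest trainable."""
    for i, block in enumerate(blocks):
        for param in block.parameters():
            param.requires_grad = i >= n


def usage_sketch():
    # Usage sketch only (requires `transformers` + `torch` and substantial RAM).
    from transformers import GPT2LMHeadModel

    model = GPT2LMHeadModel.from_pretrained("gpt2-xl")  # 36 transformer blocks
    freeze_first_n(model.transformer.h, 33)  # first stage: 33/36 frozen
    # ... fine-tune on Wizard of Wikipedia, then:
    freeze_first_n(model.transformer.h, 34)  # second stage: 34/36 frozen
    return model
```

Freezing most of the stack keeps fine-tuning of a 1.5 B-parameter model tractable while still adapting the top layers to the new data.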

Note that because the model was originally trained for use in a [chatbot application](https://github.com/pszemraj/ai-msgbot), it uses a named conversational dialogue structure, _i.e., questions are asked by person alpha and answered by person beta_. Even if you don't specify person alpha in the prompt, the model should still respond to any question.


## Example Prompt

- the default examples are not great
- you can type in any trivia question, or delete the example and type `what` or `when`, and the model will generate the rest of the trivia question **and the answer**!
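A minimal way to query the model in code with the dialogue format described above, reusing the generation settings from this card's header. The model ID passed to `pipeline` is hypothetical (substitute this repository's actual ID), and `example()` is a sketch, not run here:

```python
def build_prompt(question: str) -> str:
    """Wrap a trivia question in the 'person beta' format the model expects."""
    return f"{question}\nperson beta:\n\n"


def ask(pipe, question: str) -> str:
    # Generation settings mirror this card's `inference.parameters`.
    out = pipe(
        build_prompt(question),
        min_length=2,
        max_length=32,
        no_repeat_ngram_size=2,
        do_sample=False,
        num_beams=4,
        early_stopping=True,
        repetition_penalty=2.1,
    )
    return out[0]["generated_text"]


def example():
    # Requires `transformers` + `torch`; downloads several GB of weights.
    from transformers import pipeline

    pipe = pipeline("text-generation", model="pszemraj/ballpark-trivia-XL")  # hypothetical ID
    return ask(pipe, "What is the capital of Uganda?")
```

Beam search with `do_sample=False` makes the answers deterministic for a given prompt, which suits short factual-sounding replies.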