Tillmann Schwörer committed on
Commit 6a69b91 · Parent: 1cc15e4

Add application file

Files changed (1): app.py +10 -0
app.py ADDED
@@ -0,0 +1,10 @@
+ import gradio as gr
+
+ gr.load(
+     name="tillschwoerer/roberta-base-finetuned-toxic-comment-detection",
+     src="models",
+     title="Toxic comment detection",
+     description="Classify whether the input text is toxic.",
+     article="Check out the [model repo](https://huggingface.co/tillschwoerer/roberta-base-finetuned-toxic-comments-detection) that this demo is based on.",
+     examples=[["I hate you!"], ["I love you!"], ["I am a cat."]],
+ ).launch()
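For context, `gr.load(..., src="models")` builds a demo on top of the model's hosted inference endpoint. A rough local equivalent is running the same checkpoint through a `transformers` text-classification pipeline; the sketch below assumes that setup, and the example label names in the comments are assumptions, not taken from the model card.

```python
def format_result(predictions):
    """Render text-classification pipeline output, a list of
    {"label": ..., "score": ...} dicts, as a short verdict string."""
    best = max(predictions, key=lambda p: p["score"])
    return f"{best['label']} ({best['score']:.2%})"


if __name__ == "__main__":
    # Hypothetical local run -- downloads the checkpoint on first use.
    from transformers import pipeline

    clf = pipeline(
        "text-classification",
        model="tillschwoerer/roberta-base-finetuned-toxic-comment-detection",
    )
    # clf("...") returns a list like [{"label": ..., "score": ...}]
    print(format_result(clf("I hate you!")))
```

The `format_result` helper is purely illustrative; the Gradio demo above handles output rendering itself.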