nbeerbower (Nicholas Beerbower)
Various useful datasets with preference optimization

AI & ML interests: QLoRA finetuning and merging LLMs for fun
Recent Activity
- Liked a dataset 1 day ago: interstellarninja/json-mode-dpo-prompts
- Liked a dataset 1 day ago: interstellarninja/tool-calls-dpo
- Updated a dataset 2 days ago: nbeerbower/reddit-dpo
Models (137)
- nbeerbower/llama-3-gutenberg-8B · Text Generation · 144 · 8
- nbeerbower/SmolNemo-12B-FFT-experimental · Text Generation · 17
- nbeerbower/Nemo-Loony-12B-experimental · Text Generation · 17
- nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental · Text Generation · 20 · 1
- nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT · Text Generation · 53 · 2
- nbeerbower/Qwen2.5-Gutenberg-Doppel-14B · Text Generation · 144 · 11
- nbeerbower/Qwen2.5-Gutenberg-Doppel-32B · Text Generation · 139 · 6
- nbeerbower/Mistral-Nemo-Prism-12B · Text Generation · 46 · 3
- nbeerbower/Mistral-Nemo-Prism-12B-v2 · Text Generation · 69 · 3
- nbeerbower/Mistral-Nemo-Prism-12B-v3 · Text Generation · 13
Datasets (8)
- nbeerbower/reddit-dpo · Viewer · 76.9k · 6
- nbeerbower/cover-images · Viewer · 4 · 234 · 1
- nbeerbower/gutenberg-moderne-dpo · Viewer · 346 · 50 · 2
- nbeerbower/gutenberg2-dpo · Viewer · 293 · 69 · 18
- nbeerbower/Schule-DPO · Viewer · 34 · 33 · 1
- nbeerbower/Arkhaios-DPO · Viewer · 222 · 70 · 8
- nbeerbower/Purpura-DPO · Viewer · 230 · 45 · 7
- nbeerbower/bible-dpo · Viewer · 31.1k · 38