Commit 74d460b (verified) by cloudyu · Parent(s): eb2155c

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED

```diff
@@ -17,6 +17,6 @@ tags:
 * [DPO Trainer](https://huggingface.co/docs/trl/main/en/dpo_trainer)
 
 
-Metrics improved by DPO traingin on 100 step
+Metrics improved by DPO traingin after 100 steps
 ![Metrsc improment](mixtral-dpo.jpg)
```