Update README.md
README.md CHANGED
@@ -33,6 +33,10 @@ We evaluate GRM 2B on the [reward model benchmark](https://huggingface.co/spaces
## Usage

**Note: Please download the `model.py` file from this repository to ensure the structure is loaded correctly and verify that the `v_head` is properly initialized.**

If you use the usage example below, the warning "Some weights of the model checkpoint at ... were not used when initializing LlamaForCausalLM" can simply be ignored. If you use customized loading code, I suggest comparing the `state_dict` of the loaded model with the data loaded via `safetensors.safe_open(xx.safetensors)` or `torch.load(xx.bin)`. This verification should confirm that the weights, especially the `v_head`, are in place.
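As a concrete illustration of that check, here is a minimal sketch. The function name `verify_checkpoint_weights` and the exact comparison logic are illustrative additions, not part of this repository; pass in the model however you normally load it and a path to one of its `.safetensors` files.

```
import torch
from safetensors import safe_open


def verify_checkpoint_weights(model, checkpoint_file):
    """Report any tensor in a .safetensors shard that is missing from,
    or different in, the loaded model's state_dict."""
    state_dict = model.state_dict()
    with safe_open(checkpoint_file, framework="pt") as f:
        for key in f.keys():
            reference = f.get_tensor(key)
            loaded = state_dict.get(key)
            if loaded is None:
                print(f"MISSING in loaded model: {key}")
            elif not torch.equal(reference, loaded.detach().cpu().to(reference.dtype)):
                print(f"MISMATCH: {key}")
    # The v_head is the part most easily lost by generic loading code,
    # so list its entries explicitly.
    print("v_head entries:", [k for k in state_dict if "v_head" in k])
```

If the checkpoint is stored as a `.bin` file instead, load it with `torch.load("xx.bin", map_location="cpu")` and iterate over its items in the same way. Key names can differ when your loading code wraps the model, so treat mismatch reports as pointers for closer inspection rather than definite errors.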
```
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification