Added model_type in config.json

Hi @vishwasthedeveloper ,

I am still getting the error:

```
ValueError: Unrecognized model in FasterDecoding/medusa-vicuna-7b-v1.3. Should have a model_type key in its config.json, or contain one of the following strings in its name: albert, align...
```

even though I have changed the config by adding `model_type`. I loaded the config as follows:

```python
import json

with open("config.json", "r") as f:
    config = json.load(f)
print(config)
```

```
{'base_model_name_or_path': 'FasterDecoding/medusa-vicuna-7b-v1.3',
 'medusa_num_heads': 2,
 'medusa_num_layers': 1,
 'transformers_version': '4.31.0',
 'model_type': 'albert'}
```
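In case it matters, this is roughly how I wrote the key back to disk. The sample dict below just mirrors the printed config so the snippet runs standalone; in practice I edited the real `config.json` from the downloaded checkpoint:

```python
import json

# Sample config mirroring the one printed above (illustrative only;
# the real file comes from the FasterDecoding/medusa-vicuna-7b-v1.3 checkpoint).
sample = {
    "base_model_name_or_path": "FasterDecoding/medusa-vicuna-7b-v1.3",
    "medusa_num_heads": 2,
    "medusa_num_layers": 1,
    "transformers_version": "4.31.0",
}
with open("config.json", "w") as f:
    json.dump(sample, f)

# Add the model_type key that the ValueError complains about, then save.
# 'albert' is just the value I tried from the error message's list.
with open("config.json", "r") as f:
    config = json.load(f)
config["model_type"] = "albert"
with open("config.json", "w") as f:
    json.dump(config, f, indent=2)
```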

Then I load the model, passing the config:

```python
from transformers import AutoModel

model = AutoModel.from_pretrained("FasterDecoding/medusa-vicuna-7b-v1.3", config=config)
```

Any idea why this might be happening? @Gsunshine @jamesliu1 @tianlecai @yli3521
