Error when loading model
#5 opened by sjoshi-wm
I tried to get the transformers usage example working.
Traceback:
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
File ~/anaconda3/envs/conda_contextual_rep/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py:1038, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1037 try:
-> 1038 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1039 except KeyError:
File ~/anaconda3/envs/conda_contextual_rep/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py:740, in _LazyConfigMapping.__getitem__(self, key)
739 if key not in self._mapping:
--> 740 raise KeyError(key)
741 value = self._mapping[key]
KeyError: 'modernbert'
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
Cell In[10], line 20
17 documents = ["search_document: TSNE is a dimensionality reduction algorithm created by Laurens van Der Maaten"]
19 tokenizer = AutoTokenizer.from_pretrained("nomic-ai/modernbert-embed-base")
---> 20 model = AutoModel.from_pretrained("nomic-ai/modernbert-embed-base")
22 encoded_queries = tokenizer(queries, padding=True, truncation=True, return_tensors="pt")
23 encoded_documents = tokenizer(documents, padding=True, truncation=True, return_tensors="pt")
File ~/anaconda3/envs/conda_contextual_rep/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py:526, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
523 if kwargs.get("quantization_config", None) is not None:
524 _ = kwargs.pop("quantization_config")
--> 526 config, kwargs = AutoConfig.from_pretrained(
527 pretrained_model_name_or_path,
528 return_unused_kwargs=True,
529 trust_remote_code=trust_remote_code,
530 code_revision=code_revision,
531 _commit_hash=commit_hash,
532 **hub_kwargs,
533 **kwargs,
534 )
536 # if torch_dtype=auto was passed here, ensure to pass it on
537 if kwargs_orig.get("torch_dtype", None) == "auto":
File ~/anaconda3/envs/conda_contextual_rep/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py:1040, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1038 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1039 except KeyError:
-> 1040 raise ValueError(
1041 f"The checkpoint you are trying to load has model type `{config_dict['model_type']}` "
1042 "but Transformers does not recognize this architecture. This could be because of an "
1043 "issue with the checkpoint, or because your version of Transformers is out of date."
1044 )
1045 return config_class.from_dict(config_dict, **unused_kwargs)
1046 else:
1047 # Fallback: use pattern matching on the string.
1048 # We go from longer names to shorter names to catch roberta before bert (for instance)
ValueError: The checkpoint you are trying to load has model type `modernbert` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
Env:
torch 1.13.1
transformers 4.47.1
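For context on why this fails: `AutoConfig.from_pretrained` reads the checkpoint's `config.json`, takes its `model_type` field, and looks it up in a registry of architectures known to the installed transformers version. A minimal sketch (not the real transformers code; the registry contents here are illustrative) that reproduces the error path:

```python
# Illustrative registry of known model types. In transformers 4.47.1,
# "modernbert" is not in the real CONFIG_MAPPING, hence the KeyError
# that gets re-raised as the ValueError in the traceback above.
CONFIG_MAPPING = {
    "bert": "BertConfig",
    "roberta": "RobertaConfig",
}

def resolve_config_class(config_dict: dict) -> str:
    """Look up the config class for a checkpoint's model_type."""
    model_type = config_dict["model_type"]
    try:
        return CONFIG_MAPPING[model_type]
    except KeyError:
        raise ValueError(
            f"The checkpoint you are trying to load has model type `{model_type}` "
            "but Transformers does not recognize this architecture."
        )

# A known type resolves; an unknown one reproduces the ValueError above.
print(resolve_config_class({"model_type": "bert"}))
try:
    resolve_config_class({"model_type": "modernbert"})
except ValueError as e:
    print(e)
```

So the error is not a broken checkpoint: the `model_type` in the checkpoint's config is simply unknown to the installed transformers version.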
The ModernBERT architecture is so new that it's not yet included in a transformers
release. Because of this, until the next release, you have to install from source:
pip install -U git+https://github.com/huggingface/transformers.git
- Tom Aarsen
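After installing from source, you can sanity-check that the installed version is newer than the 4.47.1 in the environment above (assuming ModernBERT lands in the next release after it). The version-parsing helper below is an illustrative sketch; `packaging.version.Version` is the more robust choice:

```python
from importlib.metadata import PackageNotFoundError, version

def parse_version(v: str) -> tuple:
    # "4.48.0.dev0" -> (4, 48, 0): keep only the leading numeric parts
    parts = []
    for p in v.split("."):
        if p.isdigit():
            parts.append(int(p))
        else:
            break
    return tuple(parts)

def newer_than_4_47_1(installed: str) -> bool:
    # The failing environment has 4.47.1, so anything newer should
    # include ModernBERT support (assumption: it ships in the next release)
    return parse_version(installed) > (4, 47, 1)

try:
    print(newer_than_4_47_1(version("transformers")))
except PackageNotFoundError:
    print("transformers is not installed")
```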
zpn changed discussion status to closed