What kind of error is this?

#74
by programindz - opened

I have checked the transformers version, but there is still an issue. Here is the traceback:


KeyError Traceback (most recent call last)
File ~/miniconda3/envs/gputorch/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py:1034, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1033 try:
-> 1034 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1035 except KeyError:

File ~/miniconda3/envs/gputorch/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py:736, in _LazyConfigMapping.__getitem__(self, key)
735 if key not in self._mapping:
--> 736 raise KeyError(key)
737 value = self._mapping[key]

KeyError: 'modernbert'

During handling of the above exception, another exception occurred:

ValueError Traceback (most recent call last)
Cell In[13], line 5
3 model_id = "answerdotai/ModernBERT-base"
4 tokenizer = AutoTokenizer.from_pretrained(model_id)
----> 5 model = AutoModelForMaskedLM.from_pretrained(model_id)
7 text = "The capital of France is [MASK]."
8 inputs = tokenizer(text, return_tensors="pt")

File ~/miniconda3/envs/gputorch/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py:526, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
523 if kwargs.get("quantization_config", None) is not None:
524 _ = kwargs.pop("quantization_config")
--> 526 config, kwargs = AutoConfig.from_pretrained(
527 pretrained_model_name_or_path,
528 return_unused_kwargs=True,
529 trust_remote_code=trust_remote_code,
530 code_revision=code_revision,
531 _commit_hash=commit_hash,
532 **hub_kwargs,
533 **kwargs,
534 )
536 # if torch_dtype=auto was passed here, ensure to pass it on
537 if kwargs_orig.get("torch_dtype", None) == "auto":

File ~/miniconda3/envs/gputorch/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py:1036, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1034 config_class = CONFIG_MAPPING[config_dict["model_type"]]
1035 except KeyError:
-> 1036 raise ValueError(
1037 f"The checkpoint you are trying to load has model type {config_dict['model_type']} "
1038 "but Transformers does not recognize this architecture. This could be because of an "
1039 "issue with the checkpoint, or because your version of Transformers is out of date."
1040 )
1041 return config_class.from_dict(config_dict, **unused_kwargs)
1042 else:
1043 # Fallback: use pattern matching on the string.
1044 # We go from longer names to shorter names to catch roberta before bert (for instance)

ValueError: The checkpoint you are trying to load has model type modernbert but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
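The `KeyError: 'modernbert'` means the installed transformers build has no `modernbert` entry in `CONFIG_MAPPING`, i.e. it predates ModernBERT support (to my knowledge this landed around transformers 4.48.0; worth confirming against the release notes, after which `pip install -U transformers` should fix it). A minimal sketch of a version guard that turns the opaque `KeyError` into a clear message before loading, with the minimum-version constant being an assumption:

```python
# Sketch: fail early with a readable message if the installed transformers
# is too old to know the "modernbert" model type, instead of hitting the
# KeyError inside AutoConfig.from_pretrained.
from importlib.metadata import version as pkg_version

MIN_MODERNBERT = (4, 48, 0)  # assumed first release with ModernBERT support

def parse(v: str) -> tuple:
    # "4.48.0.dev0" -> (4, 48, 0); keep only the leading numeric components
    parts = []
    for p in v.split("."):
        if p.isdigit():
            parts.append(int(p))
        else:
            break
    return tuple(parts)

def supports_modernbert(installed: str) -> bool:
    return parse(installed) >= MIN_MODERNBERT

# Example guard before loading the model:
# if not supports_modernbert(pkg_version("transformers")):
#     raise RuntimeError("transformers too old for ModernBERT; "
#                        "run: pip install -U transformers")
```

After upgrading, `AutoModelForMaskedLM.from_pretrained("answerdotai/ModernBERT-base")` resolves the config normally because `modernbert` is present in the mapping.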

programindz changed discussion status to closed