ModuleNotFoundError: No module named 'transformers_modules.VisualPRM-8B-v1'
Can you provide the versions of the required packages?
Please set trust_remote_code=True when loading the model. If you load the model using multiple processes, a ModuleNotFoundError may occasionally occur, which is a known issue in the transformers library.
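For reference, a minimal loading sketch of the suggested call, assuming the hub repo id mentioned later in this thread (it was subsequently renamed, see below) and loading only the model and tokenizer:

```python
import torch
from transformers import AutoModel, AutoTokenizer

path = "OpenGVLab/VisualPRM-8B-v1.1"  # repo id at the time; later renamed, see below

# trust_remote_code=True lets transformers download and import the model's
# custom modeling code, which it places under the transformers_modules.* namespace.
model = AutoModel.from_pretrained(
    path,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
).eval()
tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
```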
I downloaded the model to `local_path` and loaded it with:

```python
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained(
    local_path,
    trust_remote_code=True,
    local_files_only=True,
    low_cpu_mem_usage=True,
    torch_dtype=torch.bfloat16,
).eval()
```

It raised `ModuleNotFoundError: No module named 'transformers_modules.VisualPRM-8B-v1'`. What should I change?
It seems the issue was caused by the repository name containing the special character `.`. We have renamed the repository from OpenGVLab/VisualPRM-8B-v1.1 to OpenGVLab/VisualPRM-8B-v1_1. Thanks for the reminder!
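A short sketch of loading the renamed repo, assuming the same imports and arguments as the snippet earlier in this thread; the underscore keeps the dynamically generated transformers_modules package name import-safe:

```python
# Load the renamed repo id; no '.' in the name, so the dynamic module
# transformers_modules.VisualPRM-8B-v1_1 imports cleanly.
model = AutoModel.from_pretrained(
    "OpenGVLab/VisualPRM-8B-v1_1",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
).eval()
```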