(
model_id: typing.Union[str, pathlib.Path]
from_transformers: bool = False
force_download: bool = False
use_auth_token: typing.Optional[str] = None
cache_dir: typing.Optional[str] = None
subfolder: typing.Optional[str] = ''
**model_kwargs
)
→ OptimizedModel
Parameters
model_id (Union[str, Path]) —
Can be either:
- A string, the model id of a pretrained model hosted inside a model repo on huggingface.co. Valid model ids can be located at the root level, like bert-base-uncased, or namespaced under a user or organization name, like dbmdz/bert-base-german-cased.
- A path to a directory containing a model saved using ~OptimizedModel.save_pretrained, e.g., ./my_model_directory/.
from_transformers (bool, optional, defaults to False) —
Defines whether the provided model_id contains a vanilla Transformers checkpoint.
force_download (bool, optional, defaults to False) —
Whether or not to force the (re-)download of the model weights and configuration files, overriding the
cached versions if they exist.
use_auth_token (str, optional, defaults to None) —
The token to use as HTTP bearer authorization for remote files. If True, will use the token generated
when running transformers-cli login (stored in ~/.huggingface).
cache_dir (str, optional, defaults to None) —
Path to a directory in which a downloaded pretrained model configuration should be cached if the
standard cache should not be used.
local_files_only (bool, optional, defaults to False) —
Whether or not to only look at local files (i.e., do not try to download the model).
str, optional, defaults to "") —
In case the relevant files are located inside a subfolder of the model repo either locally or on huggingface.co, you can
specify the folder name here.
Returns
OptimizedModel
The loaded optimized model.
Instantiate a pretrained model from a pre-trained model configuration.
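For illustration, a minimal usage sketch. It assumes a concrete subclass such as ORTModelForSequenceClassification from optimum.onnxruntime, since OptimizedModel itself is not instantiated directly; the model id and local path below are placeholders.

from optimum.onnxruntime import ORTModelForSequenceClassification

# Load a vanilla Transformers checkpoint from the Hub; from_transformers=True
# signals that model_id points to a regular Transformers checkpoint rather than
# an already-optimized one.
model = ORTModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased-finetuned-sst-2-english",
    from_transformers=True,
)

# Reload a model previously saved locally with save_pretrained.
model = ORTModelForSequenceClassification.from_pretrained("./my_model_directory/")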
( save_directory: typing.Union[str, os.PathLike] push_to_hub: bool = False **kwargs )
Parameters
save_directory (str or os.PathLike) —
Directory to which to save. Will be created if it doesn’t exist.
push_to_hub (bool, optional, defaults to False) —
Whether or not to push your model to the Hugging Face model hub after saving it.
Using push_to_hub=True will synchronize the repository you are pushing to with save_directory,
which requires save_directory to be a local clone of the repo you are pushing to if it’s an existing
folder. Pass along temp_dir=True to use a temporary directory instead.
Save a model and its configuration file to a directory, so that it can be re-loaded using the from_pretrained() class method.
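As a brief sketch, continuing the hypothetical ORTModelForSequenceClassification example above and assuming model is an already-loaded instance:

# Save the model weights and configuration to a local directory.
model.save_pretrained("./my_model_directory/")

# Save and also push the result to the Hugging Face Hub; extra keyword
# arguments (e.g. temp_dir=True, as noted above) are forwarded via **kwargs.
model.save_pretrained("./my_model_directory/", push_to_hub=True)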