Optimum documentation
OptimizedModel
from_pretrained
( model_id: typing.Union[str, pathlib.Path], from_transformers: bool = False, force_download: bool = False, use_auth_token: typing.Optional[str] = None, cache_dir: typing.Optional[str] = None, subfolder: typing.Optional[str] = '', **model_kwargs ) → OptimizedModel
Parameters
- model_id (Union[str, Path]) — Can be either:
  - A string, the model id of a pretrained model hosted inside a model repo on huggingface.co. Valid model ids can be located at the root level, like bert-base-uncased, or namespaced under a user or organization name, like dbmdz/bert-base-german-cased.
  - A path to a directory containing a model saved using ~OptimizedModel.save_pretrained, e.g., ./my_model_directory/.
- from_transformers (bool, optional, defaults to False) — Defines whether the provided model_id contains a vanilla Transformers checkpoint.
- force_download (bool, optional, defaults to False) — Whether or not to force the (re-)download of the model weights and configuration files, overriding the cached versions if they exist.
- use_auth_token (str, optional, defaults to None) — The token to use as HTTP bearer authorization for remote files. If True, will use the token generated when running transformers-cli login (stored in ~/.huggingface).
- cache_dir (str, optional, defaults to None) — Path to a directory in which a downloaded pretrained model configuration should be cached if the standard cache should not be used.
- local_files_only (bool, optional, defaults to False) — Whether or not to only look at local files (i.e., do not try to download the model).
- subfolder (str, optional, defaults to "") — In case the relevant files are located inside a subfolder of the model repo, either locally or on huggingface.co, you can specify the folder name here.
Returns
OptimizedModel
The loaded optimized model.
Instantiate a pretrained model from a pre-trained model configuration.
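OptimizedModel itself is an abstract base class, so in practice from_pretrained() is called on one of its concrete subclasses. The snippet below is a minimal sketch using the ONNX Runtime subclass ORTModelForSequenceClassification; the model id and local path are illustrative only.

from optimum.onnxruntime import ORTModelForSequenceClassification

# Load a vanilla Transformers checkpoint from the Hub and convert it on the fly
# (from_transformers=True); the model id is only an example.
model = ORTModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased-finetuned-sst-2-english",
    from_transformers=True,
)

# Load an already-optimized model saved locally with save_pretrained().
local_model = ORTModelForSequenceClassification.from_pretrained("./my_model_directory/")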
save_pretrained
( save_directory: typing.Union[str, os.PathLike], push_to_hub: bool = False, **kwargs )
Parameters
- save_directory (str or os.PathLike) — Directory to which to save. Will be created if it doesn't exist.
- push_to_hub (bool, optional, defaults to False) — Whether or not to push your model to the Hugging Face model hub after saving it. Using push_to_hub=True will synchronize the repository you are pushing to with save_directory, which requires save_directory to be a local clone of the repo you are pushing to if it's an existing folder. Pass along temp_dir=True to use a temporary directory instead.
Save a model and its configuration file to a directory, so that it can be re-loaded using the from_pretrained() class method.
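As a minimal sketch, continuing with the model instance from the from_pretrained() example above (the directory path is illustrative, and pushing to the Hub additionally assumes you are logged in or pass a token):

# Save the optimized model and its configuration file locally.
model.save_pretrained("./my_model_directory/")

# Optionally push to the Hub while saving; temp_dir=True avoids requiring
# save_directory to be a local clone of the target repository.
model.save_pretrained("./my_model_directory/", push_to_hub=True, temp_dir=True)

# The saved directory can later be re-loaded with from_pretrained().
reloaded = ORTModelForSequenceClassification.from_pretrained("./my_model_directory/")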