Python module
hf_utils
Utilities for interacting with HuggingFace Files/Repos.
HuggingFaceFile
class max.pipelines.hf_utils.HuggingFaceFile(repo_id: str, filename: str, revision: str | None = None)
A simple object for tracking Hugging Face model metadata. The repo_id will frequently be used to load a tokenizer, whereas the filename is used to download model weights.
download()
Download the file and return the file path where the data is saved locally.
exists()
exists() → bool
filename
filename: str
repo_id
repo_id: str
revision
revision: str | None = None
size()
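A minimal usage sketch (the weight filename below is illustrative and not guaranteed to exist in the repo): construct a HuggingFaceFile, check that it exists, and download it to a local path.

```python
from max.pipelines.hf_utils import HuggingFaceFile

# Illustrative repo and filename; substitute the repo and weight file you need.
hf_file = HuggingFaceFile(
    repo_id="modularai/llama-3.1",
    filename="model.safetensors",
)

if hf_file.exists():
    local_path = hf_file.download()  # path where the file is saved locally
    print(local_path)
```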
download_weight_files()
max.pipelines.hf_utils.download_weight_files(huggingface_model_id: str, filenames: list[str], revision: str | None = None, force_download: bool = False, max_workers: int = 8) → list[pathlib.Path]
Given a Hugging Face model ID and a list of filenames, download the weight files and return the list of local paths.
Parameters:
- huggingface_model_id – The Hugging Face model identifier, e.g. modularai/llama-3.1.
- filenames – A list of file paths relative to the root of the Hugging Face repo. If the requested files are already available locally, the download is skipped and the local files are used.
- revision – The Hugging Face revision to use. If provided, the local cache is checked directly without a call to Hugging Face, saving a network round trip.
- force_download – Whether the files should be redownloaded even if they are already available in the local cache or at a provided path.
- max_workers – The number of worker threads used to download files concurrently.
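A sketch of a typical call; the safetensors filenames are illustrative and must match files that actually exist in the target repo.

```python
from max.pipelines.hf_utils import download_weight_files

# Illustrative filenames; list the weight shards present in the target repo.
paths = download_weight_files(
    huggingface_model_id="modularai/llama-3.1",
    filenames=[
        "model-00001-of-00002.safetensors",
        "model-00002-of-00002.safetensors",
    ],
    max_workers=4,
)
for path in paths:
    print(path)
```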
get_architectures_from_huggingface_repo()
max.pipelines.hf_utils.get_architectures_from_huggingface_repo(model_path: str, trust_remote_code: bool = False) → list[str]
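Based on the signature above, this presumably returns the architecture names declared for the given model repo. A minimal sketch, with an illustrative model path and result:

```python
from max.pipelines.hf_utils import get_architectures_from_huggingface_repo

architectures = get_architectures_from_huggingface_repo("modularai/llama-3.1")
print(architectures)  # e.g. ["LlamaForCausalLM"]
```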