Hub Integration

Inference Providers is tightly integrated with the Hugging Face Hub. No matter which provider or client you use it from, usage and billing are centralized in your Hugging Face account.
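For example, the `huggingface_hub` Python library can route a request through a specific provider while authenticating with your Hugging Face token, so the call is billed to your account. This is a minimal sketch; the provider and model names are illustrative, and the token placeholder should be replaced with your own.

```python
from huggingface_hub import InferenceClient

# Route the request through a specific provider; billing is attributed
# to the Hugging Face account that owns the token.
client = InferenceClient(
    provider="fireworks-ai",   # illustrative provider name
    token="hf_xxx",            # your Hugging Face access token
)

response = client.chat_completion(
    model="deepseek-ai/DeepSeek-R1",  # illustrative model; use any model served by the provider
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```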

Model search

When listing models on the Hub, you can filter to only show models deployed on the inference provider of your choice. For example, to list all models deployed on Fireworks AI infrastructure: https://huggingface.co/models?inference_provider=fireworks-ai.

It is also possible to select multiple providers, or all of them, to filter models that are available on at least one provider: https://huggingface.co/models?inference_provider=all.
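The same filter can be used programmatically. Here is a minimal sketch, assuming the `inference_provider` argument of `huggingface_hub.list_models` is available in your installed version of the library:

```python
from huggingface_hub import list_models

# Models served by a single provider (assumes `inference_provider` is
# supported by your installed version of huggingface_hub).
fireworks_models = list_models(inference_provider="fireworks-ai", limit=10)
for model in fireworks_models:
    print(model.id)

# Models available on at least one provider.
any_provider_models = list_models(inference_provider="all", limit=10)
for model in any_provider_models:
    print(model.id)
```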

Features using Inference Providers

Several Hugging Face features use Inference Providers, and their usage counts towards your monthly credits. The monthly credits included with PRO and Enterprise plans should cover moderate usage of these features for most users.

User Settings

In your user account settings, you are able to:
