---
title: st.cache_resource
slug: /develop/api-reference/caching-and-state/st.cache_resource
description: st.cache_resource is used to cache functions that return shared global resources (e.g. database connections, ML models).
---

<Tip>

This page only contains information on the `st.cache_resource` API. For a deeper dive into caching and how to use it, check out [Caching](/develop/concepts/architecture/caching).

</Tip>

<Autofunction function="streamlit.cache_resource" oldName="streamlit.experimental_singleton" />

<Autofunction function="streamlit.cache_resource.clear" oldName="streamlit.experimental_singleton.clear" />

#### Example

In the example below, pressing the "Clear All" button clears _all_ `st.cache_resource` caches, i.e. the cached global resources from every function decorated with `@st.cache_resource`:

```python
import streamlit as st
from transformers import BertModel

@st.cache_resource
def get_database_session(url):
    # Create a database session object that points to the URL.
    return session

@st.cache_resource
def get_model(model_type):
    # Create a model of the specified type.
    return BertModel.from_pretrained(model_type)

if st.button("Clear All"):
    # Clears all st.cache_resource caches:
    st.cache_resource.clear()
```
<Autofunction function="CachedFunc.clear" /> | |

## Using Streamlit commands in cached functions

### Static elements

Since version 1.16.0, cached functions can contain Streamlit commands! For example, you can do this:

```python
import streamlit as st
from transformers import pipeline

@st.cache_resource
def load_model():
    model = pipeline("sentiment-analysis")
    st.success("Loaded NLP model from Hugging Face!")  # 👈 Show a success message
    return model
```

As we know, Streamlit only runs this function if it hasn't been cached before. On this first run, the `st.success` message will appear in the app. But what happens on subsequent runs? It still shows up! Streamlit realizes that there is an `st.` command inside the cached function, saves it during the first run, and replays it on subsequent runs. Replaying static elements works for both caching decorators.
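
Here is a rough sketch of how this plays out in a full app; the example sentence is purely illustrative. The model is downloaded on the first run, and on every rerun the cached pipeline is returned and the success message is replayed:

```python
import streamlit as st
from transformers import pipeline

@st.cache_resource
def load_model():
    model = pipeline("sentiment-analysis")
    st.success("Loaded NLP model from Hugging Face!")  # Replayed on every rerun
    return model

model = load_model()  # Downloads the model once, then reuses the cached object
st.write(model("Streamlit replays static elements in cached functions."))
```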

You can also use this functionality to cache entire parts of your UI:

```python
import streamlit as st
import torchvision
from torchvision.models import ResNet50_Weights

@st.cache_resource
def load_model():
    st.header("Data analysis")
    model = torchvision.models.resnet50(weights=ResNet50_Weights.DEFAULT)
    st.success("Loaded model!")
    st.write("Turning on evaluation mode...")
    model.eval()
    st.write("Here's the model:")
    return model
```

### Input widgets

You can also use [interactive input widgets](/develop/api-reference/widgets) like `st.slider` or `st.text_input` in cached functions. Widget replay is an experimental feature at the moment. To enable it, you need to set the `experimental_allow_widgets` parameter:

```python
import streamlit as st
import torchvision
from torchvision.models import ResNet50_Weights

@st.cache_resource(experimental_allow_widgets=True)  # 👈 Set the parameter
def load_model():
    pretrained = st.checkbox("Use pre-trained model:")  # 👈 Add a checkbox
    weights = ResNet50_Weights.DEFAULT if pretrained else None
    model = torchvision.models.resnet50(weights=weights)
    return model
```

Streamlit treats the checkbox like an additional input parameter to the cached function. If you uncheck it, Streamlit will check whether it has already cached the function for this checkbox state. If yes, it will return the cached value. If not, it will rerun the function with the new checkbox value.

Using widgets in cached functions is extremely powerful because it lets you cache entire parts of your app. But it can be dangerous! Since Streamlit treats the widget value as an additional input parameter, it can easily lead to excessive memory usage. Imagine your cached function has five sliders and returns a 100 MB DataFrame. Then we'll add 100 MB to the cache for _every permutation_ of these five slider values, even if the sliders do not influence the returned data! These additions can make your cache explode very quickly. Please be aware of this limitation if you use widgets in cached functions. We recommend using this feature only for isolated parts of your UI where the widgets directly influence the cached return value.
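
As a rough sketch of that recommendation (the slider and the layer listing below are purely illustrative): keep widgets whose values actually change the returned object inside the cached function, and keep display-only widgets outside of it:

```python
import streamlit as st
import torchvision
from torchvision.models import ResNet50_Weights

@st.cache_resource(experimental_allow_widgets=True)
def load_model():
    # This checkbox determines which object is returned, so caching one
    # entry per checkbox state is the intended behavior.
    pretrained = st.checkbox("Use pre-trained model:")
    weights = ResNet50_Weights.DEFAULT if pretrained else None
    return torchvision.models.resnet50(weights=weights)

# This slider only affects how much output is shown, so it stays outside
# the cached function and does not multiply the number of cache entries.
num_layers = st.slider("Layers to list", 1, 10, 3)
model = load_model()
layer_names = [name for name, _ in model.named_children()]
st.write(layer_names[:num_layers])
```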

<Warning>

Support for widgets in cached functions is currently experimental. We may change or remove it anytime without warning. Please use it with care!

</Warning>

<Note>

Two widgets are currently not supported in cached functions: `st.file_uploader` and `st.camera_input`. We may support them in the future. Feel free to [open a GitHub issue](https://github.com/streamlit/streamlit/issues) if you need them!

</Note>