---
viewer: false
tags: [uv-script]
---
# 🗺️ Atlas Export - One-Click Embedding Visualizations
Generate and deploy interactive embedding visualizations to HuggingFace Spaces with a single command.
## Quick Start
```bash
# Create a Space from any text dataset
uv run atlas-export.py stanfordnlp/imdb --space-name my-imdb-viz
# Your Space will be live at:
# https://huggingface.co/spaces/YOUR_USERNAME/my-imdb-viz
```
## Examples
### Image Datasets
```bash
# Visualize image datasets with CLIP
uv run atlas-export.py \
beans \
--space-name bean-disease-atlas \
--image-column image \
--model openai/clip-vit-base-patch32
```
### Custom Embeddings
```bash
# Use a specific embedding model
uv run atlas-export.py \
wikipedia \
--space-name wiki-viz \
--model nomic-ai/nomic-embed-text-v1.5 \
--text-column text \
--sample 50000
```
### Pre-computed Embeddings
```bash
# If you already have embeddings in your dataset
uv run atlas-export.py \
my-dataset-with-embeddings \
--space-name my-viz \
--no-compute-embeddings \
--x-column umap_x \
--y-column umap_y
```
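If your dataset has raw high-dimensional embeddings but no 2-D coordinates yet, you need to add an x/y projection before using `--no-compute-embeddings`. The sketch below is illustrative only: it uses a plain PCA projection (via NumPy's SVD) as a lightweight stand-in for UMAP, and the `project_2d` helper is hypothetical, not part of the script. The column names are arbitrary as long as they match the `--x-column` / `--y-column` flags.

```python
import numpy as np

def project_2d(embeddings: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Project high-dimensional embeddings to 2-D.

    PCA via SVD is used here as a simple stand-in for UMAP;
    any 2-D projection works as long as the resulting columns
    are passed via --x-column / --y-column.
    """
    centered = embeddings - embeddings.mean(axis=0)
    # Top-2 right singular vectors span the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    coords = centered @ vt[:2].T
    return coords[:, 0], coords[:, 1]

# Stand-in for real embeddings (e.g. 384-dim MiniLM vectors).
emb = np.random.default_rng(0).normal(size=(100, 384))
xs, ys = project_2d(emb)
```

Store `xs` and `ys` as two columns in your dataset (e.g. `umap_x` / `umap_y`), push it to the Hub, and the command above can skip embedding computation entirely.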
### GPU Acceleration (HF Jobs)
```bash
# First, get your HF token (if not already set)
python -c "from huggingface_hub import get_token; print(get_token())"
# Run on HF Jobs with GPU using experimental UV support
hf jobs uv run --flavor t4-small \
-s HF_TOKEN=your-token-here \
https://huggingface.co/datasets/uv-scripts/build-atlas/raw/main/atlas-export.py \
stanfordnlp/imdb \
--space-name imdb-viz \
--model sentence-transformers/all-mpnet-base-v2 \
--sample 10000
```
Note: Replace `your-token-here` with your actual token. Available GPU flavors: `t4-small`, `t4-medium`, `l4x1`, `a10g-small`.
## Key Options
| Option | Description | Default |
|--------|-------------|---------|
| `dataset_id` | HuggingFace dataset to visualize | Required |
| `--space-name` | Name for your Space | Required |
| `--model` | Embedding model to use | Auto-selected |
| `--text-column` | Column containing text | "text" |
| `--image-column` | Column containing images | None |
| `--sample` | Number of samples to visualize | All |
| `--split` | Dataset split to use | "train" |
| `--local-only` | Generate locally without deploying | False |
| `--output-dir` | Local output directory | Temp dir |
| `--hf-token` | HuggingFace API token | From env/CLI |
Run without arguments to see all options and more examples.
## How It Works
1. Loads dataset from HuggingFace Hub
2. Generates embeddings (or uses pre-computed)
3. Creates static web app with embedded data
4. Deploys to HF Space
The resulting visualization runs entirely in the browser using WebGPU acceleration.
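The "static web app with embedded data" step can be pictured as serializing every point (coordinates plus display metadata) into a blob the browser fetches at load time. The layout below is a hypothetical simplification, not the actual Embedding Atlas file format; it only illustrates why no server-side compute is needed once the Space is built.

```python
import json

# Hypothetical payload layout: one record per point, bundled with
# dataset-level metadata. The real Embedding Atlas format differs.
points = [
    {"x": 0.1, "y": -0.3, "label": "great movie", "split": "train"},
    {"x": 0.7, "y": 0.2, "label": "terrible plot", "split": "train"},
]
payload = {"dataset": "stanfordnlp/imdb", "n_points": len(points), "points": points}

# The Space ships this blob alongside static HTML/JS; the browser
# loads it once and renders everything client-side.
blob = json.dumps(payload)
restored = json.loads(blob)
```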
## Credits
Built on [Embedding Atlas](https://github.com/apple/embedding-atlas) by Apple. See the [documentation](https://apple.github.io/embedding-atlas/) for more details about the underlying technology.
---
Part of the [UV Scripts](https://huggingface.co/uv-scripts) collection 🚀