The Ultra-Scale Playbook: The ultimate guide to training LLMs on large GPU clusters (Space, 2.24k likes)
Fine-tuning LLMs to 1.58bit: extreme quantization made easy (Article, Sep 18, 2024, 225 upvotes)
Stable Diffusion XL on TPUv5e: Generate images from text prompts with various styles (Space, 1.9k likes)