GPU Cloud by VRAM
Browse cloud GPUs by VRAM tier. Find the right GPU for your model size.
Last updated April 19, 2026 · Data refreshed every 6 hours
16GB VRAM: Stable Diffusion, ~7B model inference, fine-tuning small models
24GB VRAM: 13B model inference, SDXL, lightweight training
40GB VRAM: 30B model inference, fine-tuning medium models
48GB VRAM: 34B inference, multi-batch serving, image generation pipelines
80GB VRAM: 70B model inference, fine-tuning large models
141GB VRAM: 70B at full precision, multi-batch 70B serving
180GB VRAM: 180B+ model inference, frontier model training
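
As a rough guide for matching a model to one of these tiers, a common back-of-the-envelope estimate is parameter count × bytes per parameter, plus a margin for KV cache, activations, and framework buffers. The sketch below assumes an illustrative ~20% overhead; real usage varies with batch size, context length, quantization, and serving framework, so treat it as a starting point rather than a sizing guarantee.

```python
# Rough VRAM estimate for transformer inference: weights + overhead.
# A back-of-the-envelope rule of thumb, not an exact formula; actual usage
# depends on context length, batch size, and the serving framework.

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "bf16": 2, "int8": 1, "int4": 0.5}

def estimate_inference_vram_gb(params_billions: float,
                               precision: str = "fp16",
                               overhead: float = 0.2) -> float:
    """Estimate GPU memory (GB) needed to serve a model.

    overhead: assumed fraction added for KV cache, activations, and
    framework buffers (~20% here; grows with batch size and context).
    """
    weights_gb = params_billions * BYTES_PER_PARAM[precision]
    return weights_gb * (1 + overhead)

if __name__ == "__main__":
    for size, precision in [(7, "fp16"), (13, "int8"), (70, "int4"), (70, "fp16")]:
        gb = estimate_inference_vram_gb(size, precision)
        print(f"{size}B @ {precision}: ~{gb:.0f} GB VRAM")
```

For example, a 7B model in FP16 comes out around 17 GB, close to the limit of the 16GB tier, while INT4 quantization roughly quarters the weight footprint and lets a 70B model fit on a single 48GB or 80GB card.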