
Cloud GPU Provider Inventory Report 2026: Who Has the Most GPUs?

We measured real GPU inventory across 54 cloud providers. GCP holds 40% of all listed instances. RunPod has 32 GPU models — the widest selection. Vast.ai and Verda beat everyone on H100 pricing. Full data inside.

April 1, 2026 · 9 min read
GPU Instance Inventory by Provider
[Chart] 5,124 live instances · 54 providers · April 2026: GCP 2,044 (39.9%), RunPod 712 (13.9%), AWS 619 (12.1%), Azure 460 (9.0%), OCI 340 (6.6%), Shadeform 183 (3.6%), Vast.ai 128 (2.5%), Lambda Labs 120 (2.3%), Others (46 providers) 518 (10.1%).

Which cloud GPU provider actually has inventory when you need it? We track 5,124 live instances across 54 providers in real time. The distribution is far more concentrated than most AI teams realize: four providers (GCP, RunPod, AWS, Azure) hold 75% of all listed instances.

The Full Provider Inventory Table

| Provider | Instances | GPU Models | Market Share |
| --- | --- | --- | --- |
| GCP | 2,044 | 17 | 39.9% |
| RunPod | 712 | 32 | 13.9% |
| AWS | 619 | 7 | 12.1% |
| Azure | 460 | 4 | 9.0% |
| OCI | 340 | 6 | 6.6% |
| Shadeform | 183 | 24 | 3.6% |
| Vast.ai | 128 | 21 | 2.5% |
| Lambda Labs | 120 | 6 | 2.3% |
| Vultr | 90 | 4 | 1.8% |
| Verda | 86 | 10 | 1.7% |
| Nebius | 48 | 4 | 0.9% |
| CloudRift | 40 | 3 | 0.8% |
| TensorDock | 11 | 6 | 0.2% |
| Crusoe | 11 | 6 | 0.2% |
| Hyperstack | 10 | 10 | 0.2% |

The remaining 39 providers hold about 4% combined (222 instances). Data: gputracker.dev, April 2026.
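The concentration figure in the intro (four providers holding roughly 75% of listed instances) falls straight out of the table. A minimal Python sketch, using the instance counts as published; the 5,124 total includes the long tail not shown in the table:

```python
# Instance counts from the April 2026 inventory table (top 15 providers).
inventory = {
    "GCP": 2044, "RunPod": 712, "AWS": 619, "Azure": 460, "OCI": 340,
    "Shadeform": 183, "Vast.ai": 128, "Lambda Labs": 120, "Vultr": 90,
    "Verda": 86, "Nebius": 48, "CloudRift": 40, "TensorDock": 11,
    "Crusoe": 11, "Hyperstack": 10,
}
TOTAL = 5124  # all 54 tracked providers, including the long tail

# Market share per provider, as a fraction of all tracked instances.
share = {p: n / TOTAL for p, n in inventory.items()}

top4 = sum(share[p] for p in ["GCP", "RunPod", "AWS", "Azure"])
print(f"Top-4 share: {top4:.1%}")  # ~75%, matching the headline figure
```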

Key Findings

1. GCP Dominates Instance Count — But Not Price

GCP lists 2,044 instances — 39.9% of the entire tracked market. But most of that inventory is older GPUs (T4, L4, V100) spread across many regions and configurations. For H100s, GCP's cheapest listed rate ($1.41/hr, preemptible) is 1.8-2.7x what Verda ($0.80/hr) and Vast.ai ($0.53/hr) charge.

2. RunPod Has the Widest GPU Selection

RunPod lists 32 distinct GPU models — a third more than the next competitor (Shadeform with 24). If you need a specific GPU — A40, RTX 6000 Ada, H200, B200, or even an older RTX 3090 — RunPod is the most likely place to find it.
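To illustrate why catalog breadth matters, here is a sketch of the lookup you end up doing when hunting for a specific model. The per-provider model sets below are illustrative placeholders, not the providers' complete catalogs:

```python
# Illustrative (not exhaustive) GPU model sets per provider; real catalogs
# change often, so treat these as placeholders for live inventory data.
catalogs = {
    "RunPod":    {"H100", "H200", "B200", "A40", "RTX 6000 Ada", "RTX 3090"},
    "AWS":       {"H100", "A100", "T4", "L4", "V100"},
    "Shadeform": {"H100", "H200", "A100", "L40S", "MI300X"},
}

def providers_with(gpu: str) -> list[str]:
    """Return providers whose catalog lists the given GPU model."""
    return sorted(p for p, models in catalogs.items() if gpu in models)

print(providers_with("H200"))      # ['RunPod', 'Shadeform']
print(providers_with("RTX 3090"))  # ['RunPod']
```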

3. Hyperscalers Have the Narrowest Selections

Despite listing 619 instances, AWS offers only 7 GPU models, and Azure just 4. Hyperscalers optimize for consistency, not variety. If your workload needs a GPU outside their lineup — or the cheapest H100 — a specialized provider is a better fit.

4. The Long Tail: 46 Providers Share 10%

The bottom 46 providers — including Datacrunch, Scaleway, OVHcloud, TensorDock, and 42 others — collectively account for ~10% of listed inventory. Many of these offer the best prices for niche GPUs (e.g., A40, L40S, MI300X) or specific regional requirements.

H100 Inventory: Where to Find It Cheapest

The H100 is available across 10+ providers. Here is the lowest listed price from each as of April 2026 (note that spot, marketplace, and preemptible rates are interruptible):

| Provider | H100 From |
| --- | --- |
| Vast.ai (spot / marketplace) | $0.53/hr |
| Verda (on-demand) | $0.80/hr |
| RunPod (community cloud) | $1.25/hr |
| Nebius (on-demand) | $1.25/hr |
| GCP (preemptible) | $1.41/hr |
| Crusoe (on-demand) | $1.60/hr |
| Shadeform (aggregated) | $1.66/hr |
| Latitude.sh (on-demand) | $1.68/hr |
| E2E Networks (on-demand) | $1.69/hr |
| Cudo Compute (on-demand) | $1.87/hr |
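To translate these hourly rates into budget terms, the sketch below prices a hypothetical fine-tuning run of 8 H100s for two weeks (2,688 GPU-hours) at a few of the floor rates above. The run size is an assumption, and the spot and preemptible entries are interruptible, so the cheapest line is not guaranteed capacity:

```python
h100_rates = {  # lowest listed $/GPU-hr, April 2026 (from the table above)
    "Vast.ai (spot)": 0.53, "Verda": 0.80, "RunPod (community)": 1.25,
    "Nebius": 1.25, "GCP (preemptible)": 1.41, "Crusoe": 1.60,
}

GPUS, HOURS = 8, 14 * 24   # hypothetical 8xH100 run for two weeks
gpu_hours = GPUS * HOURS   # 2,688 GPU-hours

# Total run cost at each provider's floor rate, cheapest first.
for provider, rate in sorted(h100_rates.items(), key=lambda kv: kv[1]):
    print(f"{provider:20s} ${rate * gpu_hours:,.0f}")
```

Even for this modest run the spread is large: about $1,425 at Vast.ai's spot floor versus about $4,301 at Crusoe's on-demand rate.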


