Cloud GPU Prices Vary 61x Across Providers

Someone built a comparison tool that exposes wild price differences for renting cloud GPUs: spreads of roughly 13x to 61x for the same card, depending on the provider.

Quick example: an H100 80GB ranges from $0.80/hr (VERDA) to $11.10/hr (LeaderGPU). Running that 24/7 means paying either $576/month or $7,992/month for identical hardware.
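The 24/7 math is just hourly rate times hours in a month; a quick sketch of that back-of-envelope calculation, using the H100 rates quoted above and assuming a 30-day month:

```python
HOURS_PER_MONTH = 24 * 30  # 720-hour (30-day) month, as in the example above

def monthly_cost(hourly_rate: float) -> float:
    """Cost of running one instance 24/7 for a 30-day month."""
    return hourly_rate * HOURS_PER_MONTH

print(f"${monthly_cost(0.80):,.0f}/month")   # cheapest quoted H100 rate
print(f"${monthly_cost(11.10):,.0f}/month")  # priciest quoted H100 rate
```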

The site tracks live pricing across 25 providers: https://gpuperhour.com

Some standout spreads they found:

  • V100 16GB: $0.05/hr to $3.06/hr (61x difference)
  • A100 80GB: $0.45/hr to $3.57/hr (8x difference)
  • RTX 4090: $0.33/hr to $3.30/hr (10x difference)
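The spread figures in that list are just the most expensive rate divided by the cheapest; a minimal check, with the low/high rates copied from the bullets above:

```python
# (low, high) hourly rates in USD, from the list above
spreads = {
    "V100 16GB": (0.05, 3.06),
    "A100 80GB": (0.45, 3.57),
    "RTX 4090": (0.33, 3.30),
}

for gpu, (low, high) in spreads.items():
    print(f"{gpu}: {high / low:.0f}x")  # e.g. V100 16GB: 61x
```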

Covers RunPod, Vast.ai, Lambda Labs, AWS, and a bunch of smaller providers most people haven't heard of. You can filter by VRAM, region, and spot vs. on-demand pricing.

Pretty useful for anyone doing fine-tuning or training who doesn't want to overpay by thousands of dollars a month.