Not everyone has a beefy GPU at home. Cloud platforms offer GPU access at various price points, from free to enterprise-grade. Here is a comparison of popular options.
### Platform Comparison
| Platform | GPU Options | Price Range | Best For |
|---|---|---|---|
| Google Colab | T4 (free), A100 (Pro) | Free to $50/mo | Learning, quick experiments |
| Lambda Cloud | A100, H100 | $1.10+/hr per GPU | Training, extended jobs |
| RunPod | A100, H100, RTX 4090 | $0.40+/hr | Flexible on-demand GPU |
| AWS (SageMaker) | A10G, A100, P5 (H100) | $1+/hr | Enterprise, production |
| Google Cloud (Vertex AI) | T4, A100, TPU v5 | $0.35+/hr | Enterprise, TPU workloads |
| Modal | A100, H100 | Pay per second | Serverless GPU functions |
| Vast.ai | Various consumer and datacenter GPUs | $0.10+/hr | Budget GPU rentals |
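To see how hourly rates translate into a project budget, the sketch below estimates the total cost of a hypothetical fine-tuning run at the starting rates listed in the table. The 10-hour duration is an illustrative assumption, not a benchmark:

```python
# Back-of-the-envelope cost estimate for a hypothetical 10-hour
# fine-tuning run, using the "starting at" hourly rates from the
# table above. Real jobs vary widely in length and GPU count.

hourly_rates = {
    "Vast.ai": 0.10,
    "RunPod": 0.40,
    "Lambda Cloud": 1.10,
}

def run_cost(rate_per_hour: float, hours: float) -> float:
    """Total cost of renting one GPU for the given number of hours."""
    return rate_per_hour * hours

hours = 10  # assumed job length, for illustration only
for platform, rate in hourly_rates.items():
    print(f"{platform}: ${run_cost(rate, hours):.2f} for {hours} h")
```

Even a 10x difference in hourly rate only matters once jobs run for many hours, which is why short experiments belong on free tiers and long training runs reward shopping around.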
### Key Insight: Start Free, Scale Up
To follow along with this textbook, start with Google Colab and its free T4 GPU. A T4 can run 7B models in 4-bit quantization and handle LoRA fine-tuning. When you need more power or longer sessions, move to RunPod or Lambda Cloud for on-demand A100 access. Reserve the enterprise clouds (AWS, GCP) for production deployments that need managed services, SLAs, and team collaboration features.
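A quick memory estimate shows why the free T4, with 16 GiB of VRAM, can hold a 7B model comfortably only in 4-bit. The sketch below counts weight memory alone and deliberately ignores activations, KV cache, and optimizer state, so real usage is higher:

```python
# Rough VRAM required just for the weights of a 7B-parameter model
# at different precisions. Overheads (activations, KV cache,
# optimizer state) are ignored, so actual usage is higher.

def weight_memory_gib(num_params: float, bits_per_param: int) -> float:
    """Memory in GiB for model weights stored at a given precision."""
    return num_params * bits_per_param / 8 / 1024**3

params = 7e9  # a "7B" model
for bits in (32, 16, 8, 4):
    print(f"{bits}-bit: {weight_memory_gib(params, bits):.1f} GiB")
```

At fp16 the weights alone take about 13 GiB, leaving almost no headroom on a 16 GiB T4; at 4-bit they shrink to roughly 3.3 GiB, which leaves room for LoRA adapters and activations.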
### Google Colab Quick Start
The cells below install the required packages in a Google Colab notebook, verify GPU availability, mount Google Drive for persistent storage, and load your Hugging Face token from Colab Secrets.
```python
# First cell: install libraries
!pip install -q transformers peft trl bitsandbytes accelerate datasets

# Second cell: check that a GPU is attached
!nvidia-smi

# Third cell: mount Google Drive for persistent storage
from google.colab import drive
drive.mount('/content/drive')

# Fourth cell: set your Hugging Face token
from google.colab import userdata
hf_token = userdata.get('HF_TOKEN')  # stored in Colab Secrets
```
Code Fragment D.5.1: A typical Google Colab setup sequence: install packages, verify the GPU, mount Google Drive for persistent storage, and load API tokens from Colab Secrets.