Canvas CloudAI

GPU

Difficulty: intermediate
Category: hardware

Definition

Graphics Processing Unit: specialized hardware designed for parallel processing that accelerates AI training and graphics rendering. It is like having thousands of workers doing calculations simultaneously instead of one at a time.

Real-World Example

Training large AI models on GPUs can be 100x faster than using regular CPUs because GPUs can process many calculations at once.
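In practice, frameworks such as PyTorch make using a GPU a one-line device choice. A minimal sketch, assuming PyTorch is installed (the tensors below are illustrative stand-ins for a layer's weights and a data batch); it falls back to the CPU when no GPU, or no PyTorch, is present:

```python
# Minimal sketch: run one training-style matrix multiply on a GPU when
# available, otherwise on the CPU. torch.device and
# torch.cuda.is_available are real PyTorch calls; the tensor shapes
# here are illustrative stand-ins, not a real model.
try:
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    x = torch.randn(64, 128, device=device)  # a batch of 64 samples
    w = torch.randn(128, 10, device=device)  # one layer's weights
    logits = x @ w  # a GPU runs this matmul's multiply-adds in parallel
    backend = logits.device.type
except ImportError:
    backend = "cpu"  # PyTorch not installed; nothing to accelerate

print(backend)
```

Training is dominated by exactly this kind of matrix arithmetic, which is why moving it to a GPU yields such large speedups.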

Cloud Provider Equivalencies

All major clouds provide GPUs primarily through virtual machine offerings. AWS, Azure, GCP, and OCI let you run GPU-backed instances/VMs for AI/ML training, inference, and graphics workloads. The main differences are available GPU models, instance families, networking/storage options, and regional availability.

AWS
Amazon EC2 Accelerated Computing (GPU instances, e.g., P5, P4d, G5, G6)
Azure
Azure Virtual Machines (GPU VMs, e.g., ND, NC, NV series)
GCP
Compute Engine GPU (NVIDIA GPUs attached to VMs; also A2 accelerator-optimized VMs)
OCI
OCI Compute GPU Instances (e.g., NVIDIA A100/H100-based shapes, depending on region)
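Provisioning looks much the same on each provider: pick a GPU-backed instance type or shape when creating the VM. A configuration sketch using the GCP and AWS CLIs (all names, zones, and image IDs below are placeholders, and real runs need credentials, quota, and a valid image):

```shell
# GCP: attach an NVIDIA T4 to a Compute Engine VM (placeholder name/zone).
gcloud compute instances create my-gpu-vm \
  --zone=us-central1-a \
  --machine-type=n1-standard-4 \
  --accelerator=type=nvidia-tesla-t4,count=1 \
  --maintenance-policy=TERMINATE

# AWS: launch a G5 GPU instance (placeholder AMI ID).
aws ec2 run-instances \
  --instance-type g5.xlarge \
  --image-id ami-xxxxxxxx \
  --count 1
```

On Azure and OCI the equivalent step is choosing an ND/NC/NV VM size or a GPU shape at creation time.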
