Canvas Cloud AI

Batch Inference

Intermediate · AI & ML

Definition

Processing large volumes of data through an AI model in bulk rather than one item at a time. Like grading a stack of exams in one sitting rather than grading each exam as it is handed in.
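A minimal sketch of the difference, assuming a generic model with a vectorized predict() method (the model, data, and scoring rule below are placeholders): online inference calls the model once per item, while batch inference pushes the whole dataset through in a few large chunks.

```python
import numpy as np

# Hypothetical stand-in for a trained model: any object with a
# vectorized predict() method (scikit-learn style) behaves the same way.
class DemoModel:
    def predict(self, X):
        return X.sum(axis=1) > 0.5  # placeholder scoring rule

model = DemoModel()
data = np.random.rand(100_000, 10)  # one row per item to score

# Online (one-at-a-time) inference: a separate call per item.
online_preds = [model.predict(row.reshape(1, -1))[0] for row in data[:100]]

# Batch inference: score the whole dataset in a few large chunks.
batch_preds = np.concatenate(
    [model.predict(chunk) for chunk in np.array_split(data, 10)]
)
```

The batch path is what lets managed services load the model once, parallelize the work across workers, and write all predictions out together.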

Real-World Example

An e-commerce company runs batch inference overnight to generate product recommendations for millions of users at once.

Cloud Provider Equivalencies

All four clouds support running offline (non-real-time) predictions over large datasets. AWS uses SageMaker Batch Transform jobs, Azure uses Azure ML batch endpoints, GCP uses Vertex AI Batch Prediction jobs, and OCI commonly runs batch scoring as scheduled Data Science Jobs that load a model and write predictions to object storage or a database.

AWS: Amazon SageMaker Batch Transform
Azure: Azure Machine Learning batch endpoints
GCP: Vertex AI Batch Prediction
OCI: OCI Data Science Jobs (batch scoring) or OCI Data Science Model Deployment (batch via jobs)
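As a concrete illustration of the AWS option, a Batch Transform job can be submitted from Python with boto3; this is a sketch, and the job name, model name, S3 paths, and instance type are placeholder assumptions (the model is assumed to already be registered in SageMaker). The other providers expose analogous job-submission calls in their own SDKs.

```python
import boto3

sagemaker = boto3.client("sagemaker", region_name="us-east-1")

# Placeholder names: the model must already be registered in SageMaker,
# and the S3 prefixes must exist in your account.
sagemaker.create_transform_job(
    TransformJobName="nightly-recommendations-2024-01-01",
    ModelName="recommendation-model",
    TransformInput={
        "DataSource": {
            "S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": "s3://example-bucket/users/",  # input records to score
            }
        },
        "ContentType": "text/csv",
        "SplitType": "Line",  # treat each line as one record
    },
    TransformOutput={"S3OutputPath": "s3://example-bucket/predictions/"},
    TransformResources={"InstanceType": "ml.m5.xlarge", "InstanceCount": 1},
)
```

The job reads records from the input prefix, scores them with the registered model, and writes the predictions to the output path, so no real-time endpoint needs to stay running.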
