Multi-model serving platform on OCI with canary deployments via OKE, A/B testing, OCI Cache feature store, and automatic model rollback.
Difficulty: intermediate
Tags: ai, mlops, model-serving, canary, feature-store, oci
A model serving platform hosts multiple ML models behind a unified API. It supports canary deployments (routing, say, 5% of traffic to a new model version), A/B testing to compare models on live traffic, and automatic rollback when a canary's error rate spikes. This OCI-native design runs each model as an independent deployment on OKE (Oracle Container Engine for Kubernetes), so models can be scaled, updated, and rolled back individually. An OCI Cache feature store serves the same feature values at training time and at inference time, avoiding the common train/serve skew problem.
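The canary split can be sketched as a weighted router in front of the two deployments. This is a minimal illustration, not the platform's actual routing layer; the deployment names and the 5% fraction are assumptions taken from the description above.

```python
import random

CANARY_FRACTION = 0.05  # 5% of traffic to the new model, as in the design

def pick_model(stable: str = "model-v1", canary: str = "model-v2",
               fraction: float = CANARY_FRACTION) -> str:
    """Return the deployment name that should serve this request.

    Deployment names are hypothetical placeholders for two OKE services.
    """
    return canary if random.random() < fraction else stable

# Rough check: over many requests, roughly 5% should hit the canary.
hits = sum(pick_model() == "model-v2" for _ in range(100_000))
print(hits / 100_000)  # approximately 0.05
```

In practice this split would be expressed declaratively (for example, as weighted backends in a service mesh or ingress on OKE) rather than in application code, but the routing logic is the same.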
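For A/B testing, requests are usually bucketed stickily so that each user always sees the same variant, which keeps per-variant metrics comparable. A minimal sketch, assuming variants are identified by deployment name (the names are illustrative):

```python
import hashlib

def assign_variant(user_id: str, variants=("model-a", "model-b")) -> str:
    """Deterministically map a user id to one model variant.

    Hashing the id gives a stable assignment: the same user is always
    routed to the same variant across requests.
    """
    digest = hashlib.sha256(user_id.encode()).digest()
    return variants[digest[0] % len(variants)]

print(assign_variant("user-123"))
print(assign_variant("user-123") == assign_variant("user-123"))  # True: sticky
```

Deterministic hashing (rather than random assignment) means no session state is needed to keep users pinned to their variant.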
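The automatic-rollback condition ("roll back if error rates spike") can be sketched as a sliding-window check on canary request outcomes. The window size, error threshold, and minimum sample count below are illustrative assumptions, not values from the design:

```python
from collections import deque

class RollbackMonitor:
    """Track recent canary outcomes and flag when the error rate spikes."""

    def __init__(self, window: int = 200, max_error_rate: float = 0.05,
                 min_samples: int = 50):
        self.outcomes = deque(maxlen=window)  # True = request failed
        self.max_error_rate = max_error_rate
        self.min_samples = min_samples

    def record(self, failed: bool) -> None:
        self.outcomes.append(failed)

    def should_roll_back(self) -> bool:
        # Avoid judging on too few requests right after the canary starts.
        if len(self.outcomes) < self.min_samples:
            return False
        rate = sum(self.outcomes) / len(self.outcomes)
        return rate > self.max_error_rate

monitor = RollbackMonitor()
for _ in range(100):
    monitor.record(False)      # healthy traffic
for _ in range(20):
    monitor.record(True)       # error spike begins
print(monitor.should_roll_back())  # True: 20/120 ≈ 0.17 > 0.05
```

When the check fires, the platform would shift the canary's traffic weight back to zero and restore the stable deployment, for example by patching the traffic split on OKE.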
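The feature-store interface can be sketched as a thin key-value layer. OCI Cache speaks the Redis protocol, so in production the `client` below would be a Redis connection; here a dict-backed stub stands in so the example is self-contained and runnable. The key layout and feature names are illustrative assumptions:

```python
import json

class DictClient:
    """Stand-in for a Redis-protocol client exposing get/set (OCI Cache)."""

    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)

class FeatureStore:
    def __init__(self, client):
        self.client = client

    def put_features(self, entity_id: str, features: dict) -> None:
        # The same writer runs in the training pipeline and in online
        # ingestion, so serving reads exactly the values training saw,
        # which is what prevents train/serve skew.
        self.client.set(f"features:{entity_id}", json.dumps(features))

    def get_features(self, entity_id: str):
        raw = self.client.get(f"features:{entity_id}")
        return json.loads(raw) if raw is not None else None

store = FeatureStore(DictClient())
store.put_features("user:42", {"avg_order_value": 31.5, "days_since_login": 2})
print(store.get_features("user:42"))
# {'avg_order_value': 31.5, 'days_since_login': 2}
```

Centralizing feature reads and writes behind one interface is the core of the skew guarantee: neither the training job nor the serving path computes features on its own.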