Pipeline that continuously ingests and processes data in real time as it flows in. Like a conveyor belt in a factory that moves items from one station to the next without stopping.
A ride-sharing app uses Amazon Kinesis Data Streams to capture millions of GPS location updates per second, routing them through processing stages that calculate ETAs, detect surge pricing zones, and update driver availability in real time.
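The producer side of that example can be sketched in Python with boto3 (the AWS SDK). This is a minimal sketch, not the app's actual code: the stream name `gps-updates` and the `make_gps_record` helper are hypothetical, but `put_record` with `StreamName`, `Data`, and `PartitionKey` is the real Kinesis Data Streams producer call. Using the driver ID as the partition key keeps each driver's updates ordered within one shard.

```python
import json
import time

def make_gps_record(driver_id: str, lat: float, lon: float) -> tuple[bytes, str]:
    """Serialize one GPS update and pick a partition key.

    Kinesis hashes the partition key to choose a shard, so reusing
    driver_id as the key keeps one driver's updates in order.
    """
    payload = {"driver_id": driver_id, "lat": lat, "lon": lon, "ts": time.time()}
    return json.dumps(payload).encode("utf-8"), driver_id

def send_update(kinesis_client, stream_name: str,
                driver_id: str, lat: float, lon: float):
    data, key = make_gps_record(driver_id, lat, lon)
    # PutRecord is the core single-record producer API; high-throughput
    # producers would batch with PutRecords instead.
    return kinesis_client.put_record(
        StreamName=stream_name, Data=data, PartitionKey=key
    )

# Usage (requires AWS credentials and a provisioned stream; names are illustrative):
# import boto3
# kinesis = boto3.client("kinesis", region_name="us-east-1")
# send_update(kinesis, "gps-updates", "driver-42", 47.6062, -122.3321)
```

Downstream consumers (e.g., a Lambda function or a Flink job) would read these records shard by shard to compute ETAs or surge zones.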
All four are managed event-stream ingestion services used as the front door of a data streaming pipeline. They accept high-throughput event data from producers and let multiple consumers process it in near real time. Processing is typically done with companion services (e.g., Amazon Kinesis Data Analytics/Glue/Lambda, Azure Stream Analytics/Functions/Databricks, GCP Dataflow/Cloud Functions, OCI Data Flow/Functions).