As you prepare to create a stream in Foundry, it is important to consider the latency and throughput expectations that define your stream. This page presents questions to consider about both latency and throughput performance for your stream use case.
Latency is the time it takes for a stream record to be processed, from when it enters the stream to when it arrives at its destination. Latency is a core performance characteristic of real-time streams, and how quickly records move through a stream can have real-world impact. For example, stream latency determines how quickly alerts are triggered for airline flight delays or supply chain issues that require immediate action. The factors that impact latency are multi-faceted, but some of the most significant considerations are listed below.
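As an illustration of the definition above, end-to-end latency for a single record can be computed as the difference between the time the record was produced and the time it arrived at its destination. This is a generic sketch, not a Foundry API; the function name and timestamps are hypothetical.

```python
from datetime import datetime, timezone

def end_to_end_latency_seconds(produced_at: datetime, arrived_at: datetime) -> float:
    """Latency for one record: elapsed time from production to arrival
    at the stream's destination."""
    return (arrived_at - produced_at).total_seconds()

# Hypothetical example: a flight-delay event produced at 12:00:00 UTC
# whose alert record lands at 12:00:08 UTC has 8 seconds of latency.
produced = datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)
arrived = datetime(2024, 1, 1, 12, 0, 8, tzinfo=timezone.utc)
print(end_to_end_latency_seconds(produced, arrived))  # 8.0
```

In practice, pipelines typically report a latency distribution (for example, p50 and p99 over a window of records) rather than a single value, since individual records can vary widely.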
A standard streaming pipeline can run through the following stages in under 15 seconds:
As indicated above, there are three major factors that influence the end-to-end latency of a streaming pipeline:
Throughput is the number of records that can be processed over a period of time. Throughput is often as important as latency when measuring the performance of a low-latency pipeline, and some of the most significant considerations are listed below:
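The definition above can be sketched as a simple rate calculation: records processed divided by the length of the measurement window. This is an illustrative helper, not a Foundry API; the function name and figures are hypothetical.

```python
def throughput_records_per_second(record_count: int, window_seconds: float) -> float:
    """Throughput: records processed per second over a measurement window."""
    if window_seconds <= 0:
        raise ValueError("Measurement window must be positive")
    return record_count / window_seconds

# Hypothetical example: 90,000 records processed over a 60-second window.
print(throughput_records_per_second(90_000, 60))  # 1500.0
```

Note that throughput and latency can trade off against each other: batching records often raises throughput while adding per-record latency, which is why both metrics matter when evaluating a pipeline.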