A Reference Guide to Stream Processing

The traditional approach to processing data at scale is batch processing, whose premise is that all the data is available in the system of record before processing starts. If a failure occurs, the whole job can simply be restarted.

While simple and robust, the batch approach introduces a large latency between gathering the data and being ready to act upon it. Stream processing overcomes this latency by processing live, raw data immediately as it arrives, while meeting the challenges of incremental processing, scalability, and fault tolerance.

In this white paper, sponsored by Hazelcast, you’ll learn:

  • Use cases that benefit from stream processing
  • The building blocks of a stream processing solution
  • Key concepts used when building a streaming pipeline
  • Hands-on examples based on Hazelcast Jet® (a first taste appears in the sketch below)
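
To give a flavor of what those hands-on examples look like, here is a minimal streaming pipeline sketch using Hazelcast Jet's Pipeline API (assuming Jet 4.x; TestSources.itemStream is Jet's built-in test source, and the map step shown is purely illustrative):

    import com.hazelcast.jet.Jet;
    import com.hazelcast.jet.JetInstance;
    import com.hazelcast.jet.pipeline.Pipeline;
    import com.hazelcast.jet.pipeline.Sinks;
    import com.hazelcast.jet.pipeline.test.TestSources;

    public class StreamingHelloWorld {
        public static void main(String[] args) {
            // Declare the pipeline: source -> transform -> sink
            Pipeline p = Pipeline.create();
            p.readFrom(TestSources.itemStream(10))   // emits 10 test events per second
             .withNativeTimestamps(0)                // use the source's own event timestamps
             .map(event -> "Processed: " + event)    // a trivial per-event transform
             .writeTo(Sinks.logger());               // log each result as it arrives

            // Start an embedded Jet node and run the job until cancelled
            JetInstance jet = Jet.newJetInstance();
            try {
                jet.newJob(p).join();
            } finally {
                jet.shutdown();
            }
        }
    }

Unlike a batch job, this pipeline runs continuously: each event is transformed and emitted within moments of its arrival, rather than after the full dataset has been gathered.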