Why Nstream

Streaming Data Applications

Turn streaming data into real-time state

Streaming data conveys information about change, but applications express business logic as a function of state. Reconstructing that state usually requires one or more database queries per incoming message, which comes at a high cost to latency, infrastructure load, and architectural complexity. By continuously tracking the real-time state of each entity, and updating that in-memory state upon receipt of new data, streaming data applications built with the Nstream platform can instantly interpret and act on incoming events without having to wait on external queries. The Nstream approach decreases latency, simplifies application architecture, reduces infrastructure footprint, and increases developer productivity.
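To make the contrast concrete, here is a minimal, hypothetical Python sketch of the idea (it is not Nstream's actual API): each entity's latest state lives in memory and is updated as messages arrive, so business logic can act on an event immediately instead of issuing a database query per message.

```python
# Illustrative sketch, not the Nstream API: keep each entity's latest
# state in memory so incoming events can be interpreted without a
# per-message database round trip.

class EntityState:
    """In-memory state of one entity, updated on each incoming message."""

    def __init__(self, entity_id):
        self.entity_id = entity_id
        self.fields = {}

    def update(self, message):
        # Merge the change conveyed by the event into the current state.
        self.fields.update(message)


class StateStore:
    """Tracks the real-time state of every entity seen on the stream."""

    def __init__(self):
        self.entities = {}

    def on_message(self, entity_id, message):
        state = self.entities.setdefault(entity_id, EntityState(entity_id))
        state.update(message)
        return state  # business logic can act on the new state immediately


store = StateStore()
store.on_message("sensor-1", {"temp": 21.5})
store.on_message("sensor-1", {"humidity": 40})
```

After the two messages, the in-memory state for `sensor-1` reflects both updates, with no query needed to reconstruct prior context.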

Continuously evaluate stateful business logic

Traditional applications use business logic to derive states from data. Anomaly detection, for example, represents the derived state of “something unexpected happening,” which may be derived from data indicating unusual sensor readings. Streaming data applications chain dependent state computations together, recomputing higher-order output states whenever any lower-order input state changes. Continuing the example, whenever any state that could indicate an anomalous condition changes, the anomaly detection logic is immediately re-run. The streaming data application approach ensures that the complete real-time state of a model, including all derived states, always reflects the real-time state of the real world.
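The chaining described above can be sketched in a few lines of Python. This is a hypothetical illustration of the pattern, not Nstream's implementation: each derived state registers itself with its inputs, and any change to an input immediately re-runs every dependent computation.

```python
# Hypothetical sketch of chained state computations: whenever a
# lower-order input state changes, every higher-order derived state
# that depends on it is recomputed immediately.

class InputState:
    """A lower-order state fed directly by incoming data."""

    def __init__(self, value):
        self.value = value
        self.dependents = []

    def set(self, value):
        self.value = value
        for derived in self.dependents:
            derived.recompute()


class DerivedState:
    """A higher-order state computed from one or more input states."""

    def __init__(self, compute, inputs):
        self.compute = compute    # business logic over the input values
        self.inputs = inputs
        self.dependents = []      # derived states can chain further
        self.value = None
        for state in inputs:
            state.dependents.append(self)
        self.recompute()

    def recompute(self):
        self.value = self.compute(*[s.value for s in self.inputs])
        for derived in self.dependents:
            derived.recompute()


temp = InputState(20.0)
# Derived "anomaly" state: re-evaluated the instant temp changes.
anomaly = DerivedState(lambda t: t > 90.0, [temp])
temp.set(95.0)  # anomaly detection logic is re-run automatically
```

Because recomputation is triggered by the change itself rather than by a periodic check, the derived anomaly state is never stale.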

Stream incremental updates to API clients

Query costs grow in proportion to how often clients poll for changes. Streaming APIs, on the other hand, push data as it happens, far more efficiently and at the latency of the network. This makes streaming data applications built on the Nstream platform vastly more performant, efficient, and scalable than polling-based architectures when low latency is desired. The fact that API clients and user interfaces update as fast as physics allows is just a bonus.
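The difference between polling and pushing can be sketched as follows. This is a conceptual Python illustration under assumed names (`StreamingEndpoint`, `subscribe`, `publish` are hypothetical, not Nstream's API): subscribers receive the current state once when they link, then only incremental deltas as changes occur, instead of re-fetching full snapshots on a timer.

```python
# Conceptual sketch contrasting streaming push with polling: each
# subscriber gets the current state on link, then only the incremental
# change for each subsequent update.

class StreamingEndpoint:
    """Pushes incremental state updates to every subscribed client."""

    def __init__(self):
        self.subscribers = []
        self.state = {}

    def subscribe(self, callback):
        self.subscribers.append(callback)
        callback(dict(self.state))   # initial snapshot on link

    def publish(self, key, value):
        self.state[key] = value
        delta = {key: value}         # push only the change, not a snapshot
        for notify in self.subscribers:
            notify(delta)


received = []
api = StreamingEndpoint()
api.subscribe(received.append)       # client links and starts receiving
api.publish("vehicles_online", 128)  # delta arrives at network latency
api.publish("vehicles_online", 131)
```

Each update costs one small message per subscriber, independent of how often clients would otherwise have polled, which is where the efficiency and scalability gains come from.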