Streaming APIs

Stream incremental updates to API clients

Observe real‑time outputs of business logic without having to poll for changes.

[Image: Nstream portal UI]

Why streaming APIs

Real‑time apps don't REST

Request-response APIs can only ever be as real-time as the rate at which clients poll for changes. Streaming data applications, by contrast, natively operate at the latency of the network.

Continuous consistency

Keep API clients synchronized at the latency of the network, completely eliminating polling overhead in the process.

Inversion of flow control

Dynamically subscribe to individual entities, and multiplex thousands of concurrent streams over a single connection.

Real‑time composition

Compose real-time microservices without multiplying polling intervals or consuming overly broad topics at every layer.

What streaming APIs enable

Event listeners for data

If a UI wants to know when a button is clicked, it doesn’t sit in a tight loop and repeatedly ask, “was the button clicked?” Streaming APIs make observing real-time changes to remote state as easy as registering an event handler.
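As a minimal illustration (assuming a hypothetical /api/orders/123/stream Server‑Sent Events endpoint, not Nstream's own API), subscribing to remote state reads like registering an event handler rather than running a polling loop:

```typescript
// Hypothetical SSE endpoint; any streaming transport (SSE, WebSocket, WARP)
// follows the same listener pattern.
const stream = new EventSource("/api/orders/123/stream");

// React to each pushed update the same way a UI reacts to a button click.
stream.addEventListener("message", (event: MessageEvent) => {
  const order = JSON.parse(event.data);
  console.log("order changed:", order);
});

// The polling alternative this replaces:
//   setInterval(() => fetch("/api/orders/123").then(res => res.json()), 5000);
```

The handler fires only when the server pushes a change, so update latency is bounded by the network rather than by a polling interval.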

Continuous state transfer.
Streaming APIs augment REST APIs with the ability to push ongoing responses to open requests for any endpoint URI.
Efficient multiplexing.
Thousands of concurrent streams can be tunneled through a single network connection, with minimal per‑stream overhead (see the sketch after this list).
Half‑ping latency.
State synchronization semantics enable streaming APIs to avoid buffering stale updates, ensuring every packet transmits the freshest data.
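The multiplexing point can be sketched in a few lines of TypeScript. The host URL and the per-frame { stream, body } wire format below are assumptions for illustration only, not the WARP protocol:

```typescript
// One physical WebSocket carries many logical streams, demultiplexed by key.
type Handler = (body: unknown) => void;

const handlers = new Map<string, Handler>();
const socket = new WebSocket("wss://example.com/streams"); // hypothetical host

// Route each incoming frame to the handler registered for its stream key.
socket.onmessage = (event: MessageEvent) => {
  const { stream, body } = JSON.parse(event.data);
  handlers.get(stream)?.(body);
};

// Subscribing to another entity costs one control frame and one map entry;
// no additional sockets or connection handshakes.
function subscribe(stream: string, handler: Handler): void {
  handlers.set(stream, handler);
  socket.send(JSON.stringify({ type: "subscribe", stream }));
}

socket.onopen = () => {
  subscribe("/vehicle/1234/location", (loc) => console.log("vehicle 1234:", loc));
  subscribe("/intersection/5678/phase", (p) => console.log("signal 5678:", p));
};
```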

How streaming APIs work

Point-to-point pub-sub

Instead of broadcasting homogeneous messages to all consumers, streaming APIs publish updates tailored to each individual subscriber, based on the client’s identity, credentials, and last known state.
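A server-side sketch of the idea, with hypothetical types and field-level entitlements standing in for real credentials (this is not Nstream's implementation):

```typescript
// Each subscriber receives its own tailored update, not a broadcast copy.
interface Subscriber {
  id: string;
  entitlements: Set<string>;    // which fields this client is allowed to see
  send(message: string): void;  // this client's individual stream
}

const subscribers = new Set<Subscriber>();

// Publish one state change, projected down to what each subscriber may see.
function publish(entityUri: string, state: Record<string, unknown>): void {
  for (const sub of subscribers) {
    const visible = Object.fromEntries(
      Object.entries(state).filter(([field]) => sub.entitlements.has(field)),
    );
    sub.send(JSON.stringify({ stream: entityUri, body: visible }));
  }
}
```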

Incremental updates.
Precise delta updates minimize bandwidth utilization and obviate expensive re-scans of entire datasets to determine what changed (see the delta sketch after this list).
Dynamic transformation.
On-demand generation of update messages eliminates the need for costly, high-latency bulk normalization steps.
Low fixed overhead.
Point-to-point streaming lets developers craft rich real‑time APIs that incur load only for what clients actually consume.
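The incremental-update idea can be made concrete with a small delta routine keyed on each subscriber's last known state. This is a simplified sketch under assumed data shapes; a production system would also handle field removals, ordering, and reconnection:

```typescript
type State = Record<string, unknown>;

// Last state acknowledged per subscriber, so each client gets only what changed.
const lastKnown = new Map<string, State>();

function delta(previous: State, current: State): State {
  const changed: State = {};
  for (const [field, value] of Object.entries(current)) {
    if (previous[field] !== value) {
      changed[field] = value;
    }
  }
  return changed;
}

// Push only the changed fields to one subscriber, then record the new baseline.
function pushUpdate(subscriberId: string, send: (msg: string) => void, current: State): void {
  const previous = lastKnown.get(subscriberId) ?? {};
  const changes = delta(previous, current);
  if (Object.keys(changes).length > 0) {
    send(JSON.stringify(changes));
    lastKnown.set(subscriberId, { ...previous, ...changes });
  }
}
```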
