Data Connectors

Incorporate Context from Any Source

Configure bidirectional connectors to in-motion and at-rest data sources.

Nstream portal UI

Why data connectors matter

Data is meaningless without context

Nstream data connectors make it easy to enrich stateful entities with related context from all relevant data sources. Giving each entity a comprehensive real-time picture of its real-world environment enables context-sensitive insights and situationally-aware automated action.

Messaging connectors

Subscribe to message broker topics and route incoming messages to relevant stateful entities for real-time processing.
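
To make the routing concrete, here is a minimal sketch of the entity side of a messaging connector, written against the open-source SwimOS Java API that Nstream builds on. The agent class and lane names ("addMessage", "latest") are illustrative, not Nstream's own connector API:

```java
// A minimal sketch of the entity side of a messaging connector, assuming the
// SwimOS Java API. The agent name and lane URIs are illustrative.
import swim.api.SwimLane;
import swim.api.agent.AbstractAgent;
import swim.api.lane.CommandLane;
import swim.api.lane.ValueLane;
import swim.structure.Value;

public class VehicleAgent extends AbstractAgent {
  // Cached, queryable state for this entity.
  @SwimLane("latest")
  ValueLane<Value> latest = this.<Value>valueLane();

  // Ingress point: the connector commands this lane with each decoded message.
  @SwimLane("addMessage")
  CommandLane<Value> addMessage = this.<Value>commandLane()
      .onCommand(msg -> this.latest.set(msg));
}
```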

Database connectors

Periodically query databases for related context, and cache the results in stateful entities for fast in-memory access.
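
A minimal sketch of this polling pattern, using plain JDBC and a scheduled executor; the connection URL, query, and the cacheInEntity() helper (standing in for a Web Agent lane update) are hypothetical:

```java
// A sketch of the database-connector pattern: poll slowly in the background,
// cache results in entities for fast in-memory reads.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class CustomerContextPoller {
  private final ScheduledExecutorService scheduler =
      Executors.newSingleThreadScheduledExecutor();

  public void start() {
    // Relational context changes infrequently, so a slow poll is enough.
    scheduler.scheduleAtFixedRate(this::pollOnce, 0, 5, TimeUnit.MINUTES);
  }

  private void pollOnce() {
    try (Connection conn = DriverManager.getConnection("jdbc:postgresql://db:5432/crm");
         Statement stmt = conn.createStatement();
         ResultSet rs = stmt.executeQuery("SELECT id, name, tier FROM customers")) {
      while (rs.next()) {
        // Route each row to the entity it describes; later reads are in-memory.
        cacheInEntity(rs.getString("id"), rs.getString("name"), rs.getString("tier"));
      }
    } catch (Exception e) {
      e.printStackTrace(); // in practice: log and retry with backoff
    }
  }

  private void cacheInEntity(String id, String name, String tier) {
    // Placeholder: in a real connector this updates the entity's lane state.
  }
}
```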

Egress connectors

Publish up-leveled synthetic events back to message brokers, or periodically sample and batch write derivative big data sets.
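
As an illustration of the broker-bound egress path, here is a hedged sketch using the standard kafka-clients producer; the broker address, topic name, and payload are placeholders:

```java
// A sketch of an egress connector publishing an up-leveled, synthetic event
// back to Kafka. Broker address, topic, and payload are illustrative.
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class InsightEgress {
  public static void main(String[] args) {
    final Properties props = new Properties();
    props.put("bootstrap.servers", "broker:9092");
    props.put("key.serializer", StringSerializer.class.getName());
    props.put("value.serializer", StringSerializer.class.getName());

    try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
      // Publish a derived insight, keyed by the entity it concerns.
      final String insight = "{\"vehicleId\":\"v-42\",\"state\":\"OVERHEATING\"}";
      producer.send(new ProducerRecord<>("vehicle-insights", "v-42", insight));
      producer.flush();
    }
  }
}
```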

What data connectors enable

Ubiquitous context

Continuously join multiple in-motion and at-rest data sources using stateful entities as deterministic points of intersection.

Kafka / Confluent

Consume high-volume partitioned topics, and selectively publish up-leveled and aggregated insights back to Kafka. Learn more about Kafka Ingress
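
Below is a sketch of what the consuming side could look like with the standard kafka-clients consumer; the topic, consumer group, and routeToEntity() helper (standing in for commanding a Web Agent) are assumptions for illustration:

```java
// A sketch of Kafka ingress: consume a partitioned topic and route each record
// to the entity identified by its key.
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaIngress {
  public static void main(String[] args) {
    final Properties props = new Properties();
    props.put("bootstrap.servers", "broker:9092");
    props.put("group.id", "nstream-ingress");
    props.put("key.deserializer", StringDeserializer.class.getName());
    props.put("value.deserializer", StringDeserializer.class.getName());

    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(List.of("vehicle-telemetry"));
      while (true) {
        final ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
        for (ConsumerRecord<String, String> record : records) {
          // Same key, same entity: the record's key determines its destination.
          routeToEntity(record.key(), record.value());
        }
      }
    }
  }

  private static void routeToEntity(String entityId, String payload) {
    // Placeholder: a real connector commands the Web Agent at a URI derived
    // from entityId, e.g. "/vehicle/" + entityId.
  }
}
```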

Pulsar

Keep up with low-latency Pulsar topics, and publish back high-level state changes that can't be computed by Pulsar Functions. Learn more about Pulsar Ingress

RabbitMQ

Coordinate with event-driven microservices and other enterprise systems that communicate via AMQP queues.
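
A sketch of that AMQP path with the standard RabbitMQ Java client; the queue name, host, and routeToEntity() helper are illustrative assumptions:

```java
// A sketch of consuming an AMQP queue and handing each message to the
// relevant entity.
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.DeliverCallback;
import java.nio.charset.StandardCharsets;

public class AmqpIngress {
  public static void main(String[] args) throws Exception {
    final ConnectionFactory factory = new ConnectionFactory();
    factory.setHost("rabbitmq");
    final Connection connection = factory.newConnection();
    final Channel channel = connection.createChannel();
    channel.queueDeclare("orders", true, false, false, null);

    final DeliverCallback onDeliver = (consumerTag, delivery) -> {
      final String body = new String(delivery.getBody(), StandardCharsets.UTF_8);
      // Route by routing key (or an ID parsed from the body) to the entity.
      routeToEntity(delivery.getEnvelope().getRoutingKey(), body);
    };
    channel.basicConsume("orders", true, onDeliver, consumerTag -> { });
  }

  private static void routeToEntity(String entityId, String payload) {
    // Placeholder for commanding the matching Web Agent.
  }
}
```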

JDBC

Periodically query infrequently changing relational context in the background and cache the results in stateful entities. Learn more about JDBC Ingress

HTTP

Poll REST APIs at low rates, and map the responses onto stateful entities where they can be efficiently accessed at high rates. Learn more about HTTP Ingress
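
For example, a low-rate polling loop might look like the following sketch built on java.net.http; the endpoint URL, interval, and cacheInEntity() helper are hypothetical:

```java
// A sketch of polling a REST API slowly and mapping the response onto an
// entity for high-rate in-memory reads.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class HttpIngress {
  private static final HttpClient CLIENT = HttpClient.newHttpClient();

  public static void main(String[] args) {
    final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    // Poll once a minute; downstream reads hit entity state, not the API.
    scheduler.scheduleAtFixedRate(HttpIngress::pollOnce, 0, 60, TimeUnit.SECONDS);
  }

  private static void pollOnce() {
    try {
      final HttpRequest request = HttpRequest.newBuilder(
          URI.create("https://api.example.com/weather/station-17")).GET().build();
      final HttpResponse<String> response =
          CLIENT.send(request, HttpResponse.BodyHandlers.ofString());
      cacheInEntity("station-17", response.body());
    } catch (Exception e) {
      e.printStackTrace(); // in practice: log and keep the last good value
    }
  }

  private static void cacheInEntity(String entityId, String payload) {
    // Placeholder for updating the entity's lane state.
  }
}
```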

File

Replay data from CSV files to get up and running quickly with snapshots of streaming and relational datasets.
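
A minimal replay sketch that reads a CSV snapshot line by line and routes each row by its identifier column; the file path, column layout, and routeToEntity() helper are assumptions:

```java
// A sketch of replaying a CSV snapshot, one row per message, into entities.
import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Path;

public class CsvReplay {
  public static void main(String[] args) throws Exception {
    try (BufferedReader reader = Files.newBufferedReader(Path.of("snapshot.csv"))) {
      reader.readLine(); // skip the header row
      String line;
      while ((line = reader.readLine()) != null) {
        final String[] fields = line.split(",");
        // Treat the first column as the entity identifier.
        routeToEntity(fields[0], line);
      }
    }
  }

  private static void routeToEntity(String entityId, String row) {
    // Placeholder for commanding the matching Web Agent.
  }
}
```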

Druid

Streamline real-time analytics with Apache Druid, harnessing its ability to ingest massive volumes of event data while enabling sub-second queries. Learn more about Druid Ingress

NATS

Leverage Nstream’s real-time data integration with our tailored NATS connectors, enabling lightweight, high-performance messaging for microservices, IoT, and distributed systems. Learn more about NATS Ingress

RabbitMQ

Efficiently ingest data from RabbitMQ queues into Nstream, processing real-time messages with custom logic and republishing refined insights. Learn more about RabbitMQ Ingress

How data connectors work

From data parallel to entity parallel

Traditional data-parallel architectures struggle to join multiple high-rate data sources at low latency because data is scattered inconsistently across machines. Nstream’s entity-parallel architecture, by contrast, deterministically maps ingested data onto stateful entities, enabling business logic to compute on joined state at memory latency.

Uniform partitioning.
All data about a given uniquely identifiable object gets routed to the same stateful entity, effectively joining all data sources together at the point of ingestion (see the sketch below).
Seamless serialization.
Automatic detection and decoding of JSON, XML, CSV, and Avro enables business logic to work directly with format-agnostic structured data.
Just another Web Agent.
Data connectors run as ordinary Web Agents, making them flexible and easy to implement, deploy and manage with a single set of tooling.
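
To illustrate uniform partitioning, the sketch below derives a deterministic Web Agent node URI from each record's unique identifier, so every source that mentions the same ID lands on the same entity; the URI scheme and field names are illustrative, not a prescribed Nstream convention:

```java
// A sketch of uniform partitioning: every record about a uniquely identifiable
// object is mapped to the same entity by deriving a deterministic node URI
// from its ID.
import java.util.Map;

public final class EntityRouter {
  // Deterministic: the same ID always yields the same node URI, so every
  // source that mentions "v-42" converges on the entity "/vehicle/v-42".
  static String nodeUriFor(String vehicleId) {
    return "/vehicle/" + vehicleId;
  }

  static String extractId(Map<String, Object> decodedRecord) {
    // Works on format-agnostic structured data, regardless of whether the
    // original payload arrived as JSON, XML, CSV, or Avro.
    return String.valueOf(decodedRecord.get("vehicleId"));
  }

  static void route(Map<String, Object> decodedRecord) {
    final String nodeUri = nodeUriFor(extractId(decodedRecord));
    // Placeholder: a real connector commands the Web Agent at nodeUri,
    // e.g. via an "addMessage" lane like the one sketched earlier.
  }
}
```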
