Blog

Engineering

Nstream and SwimOS: Understanding the Difference

In Q1 2023, our company rebranded from Swim.inc to Nstream. The goal of this shift was to better align the company, our culture, and our value proposition with the evolution of streaming data applications. While our name and brand have changed, at our core we continue to offer customers the ability to build streaming data applications in minutes, not months.

Engineering

How Nstream Lowers TCO by 70%

Companies require actionable insights from the vast amounts of data they gather. However, implementing a data processing technology stack that effectively delivers those insights and executes complex use cases in a timely manner soon becomes prohibitively expensive.

Engineering

Nstream Demo: Building a Real-Time Streaming Data Application

Are you looking for better ways to model your business’s ever-evolving state? Join Fred Patton, Nstream Developer Evangelist, as he walks you through a simple demo of how to build streaming data applications. He’ll demonstrate how to model your business, including all your digital and physical entities, whether you operate on-premises, in the Cloud, or both. The result: empowering you to make better, context-driven business decisions in real time. This demo will cover the following (a brief code sketch follows the list):

  • Overview of streaming data applications and how they can transform your business
  • Real-time visualization and understanding of your business’s state
  • Example of a real-time traffic application in Palo Alto
  • Steps to build your streaming application to model your business operations
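
To give a feel for what the modeling step looks like in practice, here is a minimal, hypothetical sketch of a SwimOS-style Web Agent for one intersection in a traffic application like the Palo Alto example; the agent name, lane names, and fields are illustrative assumptions, not the demo’s actual code:

    import swim.api.SwimLane;
    import swim.api.agent.AbstractAgent;
    import swim.api.lane.ValueLane;

    // Hypothetical sketch: one Web Agent per physical intersection, holding its live state.
    public class IntersectionAgent extends AbstractAgent {

      // Latest signal phase for this intersection ("red", "yellow", "green"),
      // streamed to any UI or downstream agent that links to this lane.
      @SwimLane("signalPhase")
      ValueLane<String> signalPhase = this.<String>valueLane()
          .didSet((newPhase, oldPhase) -> {
            // React in real time as the physical intersection changes state.
            System.out.println(nodeUri() + " phase: " + oldPhase + " -> " + newPhase);
          });

      // Rolling count of vehicles seen since the last phase change (illustrative only).
      @SwimLane("vehicleCount")
      ValueLane<Integer> vehicleCount = this.<Integer>valueLane();
    }

In a sketch like this, each physical intersection would get its own agent instance, so its live state can stream directly to a UI or to other agents.
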
Engineering

How Does My Business Benefit from Real-Time Streaming Applications?

Many use cases that exemplify how we think about the future of business revolve around organizations’ ability to make sophisticated decisions with agility and speed. Consider customer 360 initiatives. A company might offer a specific individual a coupon code while they’re browsing the website or at the store. To do so, that company must have real-time visibility into its customer and company data and be able to send that coupon instantaneously.

Engineering

How Real-Time Business Logic Leads to Real-Time Business Decisions

On its surface, business logic might look like a long list of tactical rules, procedures, guidelines, and best practices that guide thousands of small decisions a company makes daily. However, it is helpful to think of business logic like a Monet painting – all the small details add up. What might look like hundreds of tiny decisions up close reveals a larger picture of how a company operates and makes critical business and customer decisions.

Engineering

How to Achieve Real-Time Insights for Fraud Detection

As the volume of global digital activity and financial transactions continues to grow, instances of fraud have become more prevalent and more difficult to detect. Fraud detection is a major concern across industries because of what’s at stake. It’s critical for businesses to preserve reputation and revenue, reduce financial risk, and maintain high customer satisfaction.

Engineering

The State of Streaming Data

The application layer will be the next domino to fall in the ongoing transformation from batch to streaming. And one particular quirk of streaming data—the distinction between state and change—will play an outsized role in determining the outcome.

Engineering

Akka vs Nstream: The Key Differences

The Nstream platform simplifies the development and maintenance of real-time streaming applications. Akka gives developers the ability to easily create powerful, message-driven, concurrent, and distributed applications.

Engineering

What is Contextual Data? And, How is it Evolving?

Imagine you’re a credit card company and a charge comes in. All you know is that this charge is larger than the customer’s previous charge. Is it fraud or not? If it is and you allow it, you’ll lose money, and the fraud may continue; if it’s not and you deny it, you’ll frustrate or even lose a genuine customer.
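
To make the role of context concrete, here is a small, hypothetical sketch in plain Java (not taken from the post) contrasting an amount-only rule with one that also consults the customer’s usual spend and location:

    // Hypothetical illustration: the same charge, judged with and without context.
    public final class FraudCheck {

      record Charge(String customerId, double amount, String city) {}
      record CustomerContext(double typicalCharge, String homeCity, boolean traveling) {}

      // Amount-only rule: flags anything larger than the customer's previous charge.
      static boolean naiveFlag(Charge charge, double previousAmount) {
        return charge.amount() > previousAmount;
      }

      // Context-aware rule: a larger charge is fine if it fits the customer's
      // spending pattern and current situation.
      static boolean contextualFlag(Charge charge, CustomerContext ctx) {
        boolean unusualSize = charge.amount() > 3 * ctx.typicalCharge();
        boolean unusualPlace = !charge.city().equals(ctx.homeCity()) && !ctx.traveling();
        return unusualSize && unusualPlace;
      }

      public static void main(String[] args) {
        Charge charge = new Charge("cust-42", 180.00, "Palo Alto");
        CustomerContext ctx = new CustomerContext(95.00, "Palo Alto", false);

        System.out.println("amount-only rule flags it: " + naiveFlag(charge, 60.00));      // true
        System.out.println("context-aware rule flags it: " + contextualFlag(charge, ctx)); // false
      }
    }

The naive rule flags the legitimate charge; the context-aware rule lets it through because the amount and location fit what is already known about the customer.
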

Engineering

What are Digital Twins? And What's the Next Evolution?

Simply defined, digital twins are virtual representations of physical objects. Whether duplicating a machine (as every Tesla has a digital counterpart that allows closer monitoring of its systems) or a real-world process (such as tracking the changing traffic lights at an intersection), digital twins are created from streams of real-time data.
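
As a rough, hypothetical sketch in plain Java (independent of any particular twin platform), a digital twin is essentially a long-lived object whose state is continuously refreshed by its physical counterpart’s event stream:

    import java.time.Instant;

    // Hypothetical sketch: a digital twin of a traffic light, kept current by its event stream.
    public final class TrafficLightTwin {

      public enum Phase { RED, YELLOW, GREEN }

      // An observation reported by the physical intersection.
      public record SignalEvent(String lightId, Phase phase, Instant at) {}

      private final String lightId;
      private Phase phase = Phase.RED;
      private Instant lastUpdated = Instant.EPOCH;

      public TrafficLightTwin(String lightId) {
        this.lightId = lightId;
      }

      // Each incoming event mutates the twin so it mirrors the physical light.
      public void onEvent(SignalEvent event) {
        if (event.lightId().equals(this.lightId) && event.at().isAfter(this.lastUpdated)) {
          this.phase = event.phase();
          this.lastUpdated = event.at();
        }
      }

      public Phase currentPhase() {
        return this.phase;
      }
    }
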

Engineering

The Business Value of a Real-time User Interface

Latency is an issue that would upset a gamer playing online or frustrate a movie-watcher choosing between films. Unfortunately, it has become a normalized part of modern user interfaces in the business world. This issue is especially concerning for enterprise UIs, such as business intelligence dashboards, because decisions are made using supposedly fresh data and insights that are not truly real-time.

Engineering

Let Event Streams Auto-Build Your Dataflow Pipeline

Brokers don’t run applications - they are a buffer between the real world and an application that analyzes events. An event stream processor (in Apache Kafka/Pulsar terms) or dataflow pipeline is an application that, given the event schema, analyzes the stream continuously to derive insights.
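
As a minimal illustration, the sketch below assumes a Kafka Streams pipeline whose events are keyed by device id; the topic names and application id are hypothetical. Given that schema, the pipeline continuously counts readings per device and publishes the derived insight to another topic:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Materialized;
    import org.apache.kafka.streams.kstream.Produced;

    // Hypothetical dataflow pipeline: given events keyed by device id, continuously
    // count readings per device and publish the derived insight downstream.
    public final class ReadingCounts {
      public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "reading-counts");     // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // The broker only buffers these events; the pipeline derives the insight.
        KStream<String, String> readings = builder.stream("device-readings"); // hypothetical topic
        KTable<String, Long> countsPerDevice = readings
            .groupByKey()
            .count(Materialized.as("counts-per-device"));

        countsPerDevice.toStream()
            .to("device-reading-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
      }
    }
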

Engineering

Forget Data - It's State That Matters

Streaming data contains events that are updates to the states of applications, devices, or infrastructure. When choosing an architecture to process events, the role of the broker, such as Apache Kafka or Pulsar, is crucial - it has to scale and meet application performance needs - but it’s necessarily limited to the data domain. Even using a stream processing capability such as Kafka Streams, which triggers computation for events that match rules, leaves an enormous amount of complexity for the developer to manage - and that’s all about understanding the state of the system. Here’s why you care about the state of the system and not its raw data:
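
As a loose, hypothetical illustration in plain Java (not from the post): the useful question - "is this device overheating?" - is answered from state accumulated per entity, not from any single raw event, and maintaining that state is exactly the complexity left to the developer:

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch: folding raw temperature events into per-device state,
    // then asking questions of the state rather than of any single event.
    public final class DeviceStateTracker {

      public record TemperatureEvent(String deviceId, double celsius) {}

      // State accumulated for one device from its event stream.
      public static final class DeviceState {
        private double previousReading = Double.NaN;
        private double lastReading = Double.NaN;

        void apply(TemperatureEvent event) {
          previousReading = lastReading;
          lastReading = event.celsius();
        }

        // An insight no single event can answer: the last two readings were both hot.
        boolean overheating(double thresholdCelsius) {
          return previousReading > thresholdCelsius && lastReading > thresholdCelsius;
        }
      }

      private final Map<String, DeviceState> stateByDevice = new HashMap<>();

      // Fold each event into the state of the device it belongs to.
      public void onEvent(TemperatureEvent event) {
        stateByDevice.computeIfAbsent(event.deviceId(), id -> new DeviceState()).apply(event);
      }

      public boolean isOverheating(String deviceId, double thresholdCelsius) {
        DeviceState state = stateByDevice.get(deviceId);
        return state != null && state.overheating(thresholdCelsius);
      }
    }
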

Engineering

Brokers Aren't Databases

The rise of event streaming as a new class of enterprise data that demands continuous analysis is uncontroversial. What’s puzzling is the approaches being taken by the event streaming community to the storage of event data and the semantics they seek to achieve for event processing applications.