Confluent Cloud Tutorial
The Kafka Vehicle Tutorial demonstrated how to build an Nstream application against a user-managed Kafka broker. In this tutorial, we recreate the same application against a SaaS-managed broker, namely Confluent Cloud.
Recall that the Kafka tutorial:
- Efficiently consumes JSON-valued messages from a Kafka topic
- Transforms those messages into useful insights
- Serves those insights using Nstream Web Agents as real-time, webpage-subscribable streams
This tutorial further demonstrates how to:
- Configure Nstream to work against a Confluent Cloud topic
- Consume Avro-valued messages with the help of a Confluent Cloud Schema Registry
A standalone project containing all the discussed code can be found and cloned from here.
Nstream Library Dependencies
Gradle

```groovy
implementation 'io.nstream:nstream-adapter-common:4.13.21'
implementation 'io.nstream:nstream-adapter-confluent:4.13.21'
implementation 'io.nstream:nstream-adapter-runtime:4.13.21'
```
Maven

```xml
<dependency>
  <groupId>io.nstream</groupId>
  <artifactId>nstream-adapter-common</artifactId>
  <version>4.13.21</version>
  <type>module</type> <!-- Remove or comment this line for non-modular projects -->
</dependency>
<dependency>
  <groupId>io.nstream</groupId>
  <artifactId>nstream-adapter-confluent</artifactId>
  <version>4.13.21</version>
  <type>module</type> <!-- Remove or comment this line for non-modular projects -->
</dependency>
<dependency>
  <groupId>io.nstream</groupId>
  <artifactId>nstream-adapter-runtime</artifactId>
  <version>4.13.21</version>
  <type>module</type> <!-- Remove or comment this line for non-modular projects -->
</dependency>
```
Differences From Plain Kafka
The main, JSON-valued version of this application mimics the Kafka Vehicle Tutorial very closely (down to concrete class names), so we will not repeat any business logic explanations here. You will be able to follow along as long as you keep the following differences in mind:
- `KafkaIngestingPatch` is replaced with `ConfluentIngestingPatch`
- (simulator only) `KafkaPublishingAgent` is replaced with `ConfluentPublishingAgent`
- The Kafka tutorial contained a single `server.recon` configuration file for both the Swim server and the Kafka consumer. When using Confluent Cloud, the consumer configuration contains confidential secrets and thus should not be statically available; in a real deployment it would be loaded independently, e.g. via a Kubernetes Secret. To encourage this pattern, we have modified `server.recon` to dynamically load configuration portions from external files and variables, as sketched after this list.
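For a rough idea of what this dynamic loading looks like, the sketch below declares a named provision that reads consumer properties from an external file. The loader class and config keys here are assumptions based on common Nstream adapter conventions, not a verbatim excerpt; consult the standalone project for the exact syntax.

```
# server.recon (illustrative sketch; class and key names are assumptions)
provisions: {
  @provision("consumer-properties") {
    # Hypothetical loader class; see the standalone project for the real one.
    class: "nstream.adapter.common.provision.PropertiesProvisionLoader",
    config: {
      # Path to a file supplied at deploy time, e.g. a mounted Kubernetes Secret.
      path: "/run/secrets/secret.properties"
    }
  }
}
```

The ingesting agent can then reference the provision by name, keeping broker credentials out of the statically checked-in portions of `server.recon`.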
Schema Registry Variation
Using a Schema Registry enables, among other useful features:
- Easy use of nontrivial data formats such as Avro
- Data consistency even as schemas evolve over time
If we use Avro, Confluent Cloud requires the following:
- The `KafkaAvroDeserializer` to be on the classpath, achievable via the Gradle dependency `'io.confluent:kafka-avro-serializer:7.4.0'`
- A topic in Confluent Cloud configured with at least one of a key schema or a value schema (typically just the latter)
- An API key/secret pair for the topic
- A schema registry URL
- An API key/secret pair for the schema registry
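With these pieces in place, the consumer configuration selects the Avro deserializer for message values: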
```properties
# consumer.properties
key.deserializer=org.apache.kafka.common.serialization.IntegerDeserializer
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
auto.offset.reset=latest
group.id=fooGroup
```
Note: the properties beginning with `ccloud` below are a shorthand available when using basic authentication (the default when creating a Confluent Cloud topic). If you use a different authentication method, drop these properties in favor of what is outlined in the Kafka documentation.
```properties
# secret.properties
bootstrap.servers=...
ccloud.api.key=...
ccloud.api.secret=...
schema.registry.url=...
ccloud.schema.registry.key=...
ccloud.schema.registry.secret=...
```
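Since `secret.properties` holds credentials, it should stay out of source control. In a Kubernetes deployment, one way to supply it is as a Secret mounted into the pod; a minimal sketch, where the secret name `confluent-secrets` is illustrative:

```
kubectl create secret generic confluent-secrets --from-file=secret.properties
```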
That’s really it!
You are free to seamlessly use no-code or low-code `ConfluentIngestingPatch` variants as you wish.
Final Notes
Congratulations on building the backend for an end-to-end streaming application against SaaS-managed data!
You may verify your progress with the same general-purpose UI and `swim-cli` commands from the Kafka Vehicle Tutorial, linked here.
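For example, assuming the server runs locally on port 9001 and exposes the same vehicle agents as the Kafka tutorial (the node URI and lane name below are illustrative, not taken from this project), a subscription might look like:

```
swim-cli sync -h warp://localhost:9001 -n /vehicle/1 -l statuses
```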
Nstream is licensed under the Redis Source Available License 2.0 (RSALv2).