Confluent Cloud Tutorial
The Kafka Vehicle Tutorial demonstrated how to build an Nstream application against a user-managed Kafka broker. In this tutorial, we recreate the same application against a SaaS-managed broker, namely Confluent Cloud.
Recall that the Kafka tutorial:
- Efficiently consumes JSON-valued messages from a Kafka topic
- Transforms those messages into useful insights
- Serves those insights using Nstream Web Agents as real-time, webpage-subscribable streams
This tutorial further demonstrates how to:
- Configure Nstream to work against a Confluent Cloud topic
- Consume Avro-valued messages with the help of a Confluent Cloud Schema Registry
A standalone project containing all the discussed code can be found and cloned from here.
Nstream Library Dependencies
Gradle
implementation 'io.nstream:nstream-adapter-common:4.2.0.5'
implementation 'io.nstream:nstream-adapter-confluent:4.2.0.5'
implementation 'io.nstream:nstream-adapter-runtime:4.2.0.5'
Maven
<dependency>
    <groupId>io.nstream</groupId>
    <artifactId>nstream-adapter-common</artifactId>
    <version>4.2.0.5</version>
    <type>module</type>
</dependency>
<dependency>
    <groupId>io.nstream</groupId>
    <artifactId>nstream-adapter-confluent</artifactId>
    <version>4.2.0.5</version>
    <type>module</type>
</dependency>
<dependency>
    <groupId>io.nstream</groupId>
    <artifactId>nstream-adapter-runtime</artifactId>
    <version>4.2.0.5</version>
    <type>module</type>
</dependency>
Differences From Plain Kafka
The main, JSON-valued version of this application mimics the Kafka Vehicle Tutorial very closely (down to concrete class names), so we will not repeat any business logic explanations here. You will be able to follow along as long as you keep the following differences in mind:
- KafkaIngestingPatch is replaced with ConfluentIngestingPatch
- (simulator only) KafkaPublishingAgent is replaced with ConfluentPublishingAgent
- The Kafka tutorial contained a single server.recon configuration file for both the Swim server and the Kafka consumer. When using Confluent Cloud, the consumer configuration contains confidential secrets and thus should not be statically available (though it could be, if you are okay with the risks); in a real deployment it would instead be loaded independently, e.g. via a Kubernetes Secret. To encourage this pattern, we have split the original server.recon into a shorter server.recon plus a separate consumer.properties, as sketched below.
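Note that the Nstream runtime loads the consumer.properties provision for you; the following plain-Java sketch only illustrates the externalized-configuration pattern itself. The mount path and class name are hypothetical, not part of the Nstream API:

// Sketch: read an externally supplied consumer.properties at startup.
// In Kubernetes, the path would typically be a volume mounted from a
// Secret rather than a file baked into the container image.
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public final class ConsumerConfigLoader {

  public static Properties load(final Path path) throws Exception {
    final Properties props = new Properties();
    try (InputStream in = Files.newInputStream(path)) {
      props.load(in);
    }
    return props;
  }

  public static void main(String[] args) throws Exception {
    // Hypothetical mount point for a Kubernetes Secret
    final Properties props = load(Path.of("/etc/confluent/consumer.properties"));
    System.out.println("Loaded " + props.size() + " consumer properties");
  }
}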
Schema Registry Variation
Using a Schema Registry enables, among other useful features:
- Easy use of nontrivial data formats such as Avro
- Data consistency even if the schemas evolve over time
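To make schema evolution concrete, here is a sketch using Avro's SchemaBuilder. The Vehicle record and its fields are hypothetical, not the tutorial's actual schema; the point is that adding a field with a default keeps the evolved schema compatible with already-produced data:

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

public final class VehicleSchemaSketch {

  public static void main(String[] args) {
    // v1: a hypothetical original value schema
    final Schema v1 = SchemaBuilder.record("Vehicle").fields()
        .requiredString("id")
        .requiredDouble("latitude")
        .requiredDouble("longitude")
        .endRecord();
    // v2: an evolved schema; the new field carries a default, so it can
    // still read data produced under v1 (backward-compatible evolution,
    // which the Schema Registry's default compatibility mode accepts)
    final Schema v2 = SchemaBuilder.record("Vehicle").fields()
        .requiredString("id")
        .requiredDouble("latitude")
        .requiredDouble("longitude")
        .name("speed").type().doubleType().doubleDefault(0.0)
        .endRecord();
    System.out.println(v1);
    System.out.println(v2);
  }
}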
If we use Avro, Confluent Cloud requires the following:
- The KafkaAvroDeserializer to be on the classpath, achievable via the Gradle dependency 'io.confluent:kafka-avro-serializer:7.4.0'
- A topic in Confluent Cloud configured with either a key schema or a value schema (typically the latter)
- An API key/secret pair for the topic
- A schema registry URL
- An API key/secret pair for the schema registry
All of these may be supplied directly into the consumer.properties configuration as follows:
# consumer.properties
bootstrap.servers=pkc-rgm37.us-west-2.aws.confluent.cloud:9092
key.deserializer=org.apache.kafka.common.serialization.IntegerDeserializer
auto.offset.reset=latest
group.id=fooGroup
# ccloud
ccloud.api.key=FIXME: will resemble ALK1LKJ124LKAJD3
ccloud.api.secret=FIXME: will resemble BLahz+AKJHdjaAuexA/SnqXD+AKJHdjaAuex/aksjdn1927AAASJDNnadf121uB8
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
schema.registry.url=FIXME: will resemble https://abcd-ab1a1.us-east-2.aws.confluent.cloud
basic.auth.credentials.source=USER_INFO
# In the below property, FIXME1 and FIXME2 will resemble the ccloud properties above
# (and may possibly, but usually will not, be the same; it is good practice to
# have a separate Confluent Cloud API key for the Schema Registry).
basic.auth.user.info=FIXME1:FIXME2
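For intuition, here is roughly the plain Kafka client code that this configuration corresponds to. You will not write this in the tutorial (ConfluentIngestingPatch drives the consumer for you); note the assumption flagged in the comments, namely that the ccloud.api.* convenience properties stand in for the standard Confluent Cloud SASL settings spelled out below. The topic name is hypothetical:

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public final class AvroConsumeSketch {

  public static void main(String[] args) {
    final Properties props = new Properties();
    props.put("bootstrap.servers", "pkc-rgm37.us-west-2.aws.confluent.cloud:9092");
    props.put("key.deserializer",
        "org.apache.kafka.common.serialization.IntegerDeserializer");
    props.put("value.deserializer",
        "io.confluent.kafka.serializers.KafkaAvroDeserializer");
    props.put("group.id", "fooGroup");
    props.put("auto.offset.reset", "latest");
    // Standard Confluent Cloud authentication for a plain Kafka client; we
    // assume this is what Nstream's ccloud.api.* convenience properties
    // translate into under the hood.
    props.put("security.protocol", "SASL_SSL");
    props.put("sasl.mechanism", "PLAIN");
    props.put("sasl.jaas.config",
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
    // Schema Registry credentials, exactly as in consumer.properties
    props.put("schema.registry.url", "https://abcd-ab1a1.us-east-2.aws.confluent.cloud");
    props.put("basic.auth.credentials.source", "USER_INFO");
    props.put("basic.auth.user.info", "<SR_KEY>:<SR_SECRET>");

    try (KafkaConsumer<Integer, GenericRecord> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(List.of("vehicle-topic")); // hypothetical topic name
      while (true) {
        for (ConsumerRecord<Integer, GenericRecord> record
            : consumer.poll(Duration.ofSeconds(1))) {
          // Each value arrives as an org.apache.avro.generic.GenericRecord
          // decoded against the schema fetched from the registry.
          System.out.println(record.key() + ": " + record.value());
        }
      }
    }
  }
}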
That’s really it!
You are free to seamlessly use no-code or low-code ConfluentIngestingPatch variants as you wish.
Final Notes
Congratulations on building the backend for an end-to-end streaming application against SaaS-managed data!
You may verify your progress with the same general-purpose UI and swim-cli commands from the Kafka Vehicle Tutorial, linked here.