Confluent Cloud Tutorial


The Kafka Vehicle Tutorial demonstrated how to build an Nstream application against a user-managed Kafka broker. In this tutorial, we recreate the same application against a SaaS-managed broker, namely Confluent Cloud.

Recall that the Kafka tutorial:

This tutorial further demonstrates how to:

A standalone project containing all the discussed code can be found and cloned from here.

Nstream Library Dependencies

Dependencies

Gradle

implementation 'io.nstream:nstream-adapter-common:4.13.21'
implementation 'io.nstream:nstream-adapter-confluent:4.13.21'
implementation 'io.nstream:nstream-adapter-runtime:4.13.21'

Maven

<dependency>
  <groupId>io.nstream</groupId>
  <artifactId>nstream-adapter-common</artifactId>
  <version>4.13.21</version>
  <type>module</type> <!-- Remove or comment this line for non-modular projects -->
</dependency>
<dependency>
  <groupId>io.nstream</groupId>
  <artifactId>nstream-adapter-confluent</artifactId>
  <version>4.13.21</version>
  <type>module</type> <!-- Remove or comment this line for non-modular projects -->
</dependency>
<dependency>
  <groupId>io.nstream</groupId>
  <artifactId>nstream-adapter-runtime</artifactId>
  <version>4.13.21</version>
  <type>module</type> <!-- Remove or comment this line for non-modular projects -->
</dependency>

Differences From Plain Kafka

The main, JSON-valued version of this application mimics the Kafka Vehicle Tutorial very closely (down to concrete class names), so we will not repeat any business logic explanations here. You will be able to follow along as long as you keep the following differences in mind:

Schema Registry Variation

Using a Schema Registry enables:

and other useful features.

If we use Avro, Confluent Cloud requires the following:

  1. The KafkaAvroDeserializer to be on the classpath, achievable via Gradle dependency 'io.confluent:kafka-avro-serializer:7.4.0'
  2. A topic in Confluent Cloud configured with at least one of a key schema or a value schema (typically just the latter)
  3. An API key/secret pair for the topic
  4. A schema registry URL
  5. An API key/secret pair for the schema registry
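For Maven builds, the Gradle coordinate in step 1 corresponds to the following dependency (note that Confluent artifacts are typically served from Confluent's own Maven repository rather than Maven Central, so you may need to add that repository to your build):

```xml
<dependency>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-avro-serializer</artifactId>
  <version>7.4.0</version>
</dependency>
```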

# consumer.properties
key.deserializer=org.apache.kafka.common.serialization.IntegerDeserializer
auto.offset.reset=latest
group.id=fooGroup
value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer

Note: the properties beginning with ccloud below are a shorthand available when using basic authentication (the default when creating a Confluent Cloud topic). Drop these properties in favor of what is outlined in the Kafka documentation if you use other authentication methods.

# secret.properties
bootstrap.servers=...
ccloud.api.key=...
ccloud.api.secret=...
schema.registry.url=...
ccloud.schema.registry.key=...
ccloud.schema.registry.secret=...
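When configuring a Kafka client by hand, the two files above are typically merged into a single java.util.Properties instance, with the secrets overlaid on the base consumer settings. The sketch below is a hypothetical stdlib-only helper (not part of the Nstream API) illustrating that merge, with later files winning on conflicting keys:

```java
import java.io.IOException;
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public final class ClientConfig {

  private ClientConfig() {
  }

  // Loads each properties file in order into one Properties object.
  // Later files overwrite earlier ones on conflicting keys, so pass
  // the base config first and the secrets file last.
  public static Properties merge(Path... files) throws IOException {
    Properties merged = new Properties();
    for (Path file : files) {
      try (Reader reader = Files.newBufferedReader(file)) {
        merged.load(reader);
      }
    }
    return merged;
  }
}
```

A caller might then construct its consumer with `ClientConfig.merge(Path.of("consumer.properties"), Path.of("secret.properties"))`, assuming those filenames; Nstream's adapters handle this wiring for you when you use the configuration-driven approach.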

That’s really it! You are free to seamlessly use the no-code or low-code ConfluentIngestingPatch variants as you wish.

Final Notes

Congratulations on building the backend for an end-to-end streaming application against SaaS-managed data! You may verify your progress with the same general-purpose UI and swim-cli commands from the Kafka Vehicle Tutorial, linked here.


Nstream is licensed under the Redis Source Available License 2.0 (RSALv2).