Configuring and connecting Kafkacat with Red Hat OpenShift Streams for Apache Kafka

Guide
  • Red Hat OpenShift Streams for Apache Kafka 1
  • Updated 19 October 2021
  • Published 13 April 2021

Red Hat OpenShift Streams for Apache Kafka is currently available for Development Preview. Development Preview releases provide early access to a limited set of features that might not be fully tested and that might change in the final GA version. Users should not use Development Preview software in production or for business-critical workloads. Limited documentation is available for Development Preview releases and is typically focused on fundamental user goals.

As a developer of applications and services, you can use Kafkacat to test and debug your Kafka instances in Red Hat OpenShift Streams for Apache Kafka. Kafkacat is a command-line utility for messaging in Apache Kafka 0.8 and later. With Kafkacat, you can produce and consume messages for your Kafka instances directly from the command line, and list topic and partition information for your Kafka instances.

Kafkacat is an open source community tool. Kafkacat is not a part of OpenShift Streams for Apache Kafka and is therefore not supported by Red Hat.

You can install and use Kafkacat to test and debug your Kafka instances in OpenShift Streams for Apache Kafka.

Prerequisites
  • You have a Red Hat account.

  • You have a running Kafka instance in OpenShift Streams for Apache Kafka.

  • JDK 11 or later is installed.

  • For Windows, the latest version of Oracle JDK is installed.

  • You have installed the latest supported version of Kafkacat for your operating system.

    Verifying Kafkacat installation
    $ kafkacat -V
    
    kafkacat - Apache Kafka producer and consumer tool
    https://github.com/edenhill/kafkacat
    Copyright (c) 2014-2019, Magnus Edenhill
    Version 1.6.0 (JSON, Avro, Transactions, librdkafka 1.6.1 builtin.features=gzip,snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,zstd,sasl_oauthbearer)

Configuring Kafkacat to connect to a Kafka instance

To enable Kafkacat to access a Kafka instance, configure the connection using the bootstrap server endpoint for the instance and the generated credentials for your OpenShift Streams for Apache Kafka service account. With Kafkacat, you can configure the connection information either by passing options to the kafkacat command or by using a configuration file. The example in this task sets environment variables and then passes them to the kafkacat command.
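
If you prefer the configuration-file approach, you can keep the same connection settings in a file of librdkafka properties and pass that file to Kafkacat with the -F option (supported in Kafkacat 1.6.0 and later). The following is a minimal sketch only; the file name kafkacat.config is illustrative, and the placeholder values correspond to the environment variables used later in this task.

    Example Kafkacat configuration file (kafkacat.config)
    # librdkafka properties for connecting to the Kafka instance
    bootstrap.servers=<bootstrap_server>
    security.protocol=SASL_SSL
    sasl.mechanisms=PLAIN
    sasl.username=<client_id>
    sasl.password=<client_secret>

With this file in place, you can replace the repeated -X options in the commands in this guide with -F kafkacat.config, for example kafkacat -F kafkacat.config -t <topic> -P.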

For more information about Kafkacat configuration options, see Configuration in the Kafkacat documentation.

Kafkacat does not yet fully support SASL/OAUTHBEARER authentication, so this task connects to the Kafka instance with SASL/PLAIN authentication, which requires only the bootstrap server endpoint and the service account credentials.

Prerequisites
  • You have the bootstrap server endpoint for your Kafka instance. To find the server endpoint, select your Kafka instance in the OpenShift Streams for Apache Kafka web console, select the options menu (three vertical dots), and click Connection.

  • You have the generated credentials for your service account. To regenerate the credentials, use the Service Accounts page in the OpenShift Streams for Apache Kafka web console to find your service account and update the credentials.

  • You’ve set the permissions for your service account to access the Kafka instance resources. To verify the current permissions, select your Kafka instance in the OpenShift Streams for Apache Kafka web console and use the Access page to find your service account permission settings.

Procedure
  • On the command line, set the Kafka instance bootstrap server and client credentials as environment variables to be used by Kafkacat or other applications. Replace the values with your own server and credential information.

    Setting environment variables for server and credentials
    $ export BOOTSTRAP_SERVER=<bootstrap_server>
    $ export USER=<client_id>
    $ export PASSWORD=<client_secret>
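
  • Optionally, check that the endpoint and credentials work by listing broker and topic metadata with Kafkacat's -L option. This quick check uses the environment variables that you just set. If the connection and credentials are correct, Kafkacat prints the broker, topic, and partition metadata for the instance.

    Listing broker and topic metadata
    $ kafkacat -L -b "$BOOTSTRAP_SERVER" \
     -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
     -X sasl.username="$USER" \
     -X sasl.password="$PASSWORD"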

Producing messages in Kafkacat

You can use Kafkacat to produce messages to Kafka topics in several ways, such as reading them from standard input (stdin) directly on the command line or from a file. This example produces messages from input on the command line. For more examples of Kafkacat producer messaging, see the Examples in the Kafkacat documentation.
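
For example, to produce one message per line from a file instead of typing messages interactively, you can use Kafkacat's -l option. The following is a sketch only; it assumes the environment variables set in the previous section and a hypothetical file named messages.txt that contains one message per line.

    Producing messages from a file (sketch)
    $ kafkacat -t my-first-kafka-topic -b "$BOOTSTRAP_SERVER" \
     -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
     -X sasl.username="$USER" \
     -X sasl.password="$PASSWORD" -P -l messages.txt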

Prerequisites
  • Kafkacat is installed.

  • You have a running Kafka instance in OpenShift Streams for Apache Kafka.

  • You’ve set the Kafka bootstrap server endpoint and your service account credentials as environment variables.

Procedure
  1. On the command line, enter the following command to start Kafkacat in producer mode. This mode enables you to produce messages to your Kafka topic.

    This example uses the SASL/PLAIN authentication mechanism with the server and credential environment variables that you set previously. This example produces messages to a topic in OpenShift Streams for Apache Kafka named my-first-kafka-topic. Replace the topic name with the relevant topic as needed. The topic that you use in this command must already exist in OpenShift Streams for Apache Kafka.

    Starting Kafkacat in producer mode
    $ kafkacat -t my-first-kafka-topic -b "$BOOTSTRAP_SERVER" \
     -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
     -X sasl.username="$USER" \
     -X sasl.password="$PASSWORD" -P

     OpenShift Streams for Apache Kafka also supports the SASL/OAUTHBEARER mechanism for authentication, which is the recommended authentication mechanism to use. However, Kafkacat does not yet fully support OAUTHBEARER, so this example uses SASL/PLAIN.
  2. With Kafkacat running in producer mode, enter messages into Kafkacat that you want to produce to the Kafka topic.

    Example messages to produce to the Kafka topic
    First message
    Second message
    Third message
  3. Keep this producer running to use later when you create a consumer.

Verification
  • Verify that your producer is still running without any errors in the terminal.

Consuming messages in Kafkacat

You can use Kafkacat to consume messages from Kafka topics. This example consumes the messages that you produced previously with the Kafkacat producer.

Prerequisites
  • Kafkacat is installed.

  • You have a running Kafka instance in OpenShift Streams for Apache Kafka.

  • You’ve set the Kafka bootstrap server endpoint and your service account credentials as environment variables.

  • You used a producer to produce example messages to a topic.

Procedure
  1. On the command line, in a separate terminal from your producer, enter the following command to start Kafkacat in consumer mode. This mode enables you to consume messages from your Kafka topic.

    This example uses the SASL/PLAIN authentication mechanism with the server and credential environment variables that you set previously. This example consumes and displays the messages from the my-first-kafka-topic example topic, and states that it reached the end of partition 0 in the topic.

    Starting Kafkacat in consumer mode
    $ kafkacat -t my-first-kafka-topic -b "$BOOTSTRAP_SERVER" \
     -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
     -X sasl.username="$USER" \
     -X sasl.password="$PASSWORD" -C
    
    First message
    Second message
    Third message
    % Reached end of topic my-first-kafka-topic [0] at offset 3
  2. If your producer is still running in a separate terminal, continue entering messages in the producer terminal and observe the messages being consumed in the consumer terminal.

Verification
  1. Verify that your consumer is running without any errors in the terminal.

  2. Verify that the consumer displays the messages from the my-first-kafka-topic example topic.
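
If you later want to re-read the topic or inspect message metadata while debugging, you can also start the consumer from the beginning of the topic and print details for each message. The following is a sketch only; it reuses the environment variables and example topic from this guide, and the format string is illustrative. The -o, -e, and -f options tell Kafkacat to start at the beginning, exit when it reaches the end of the topic, and print each message with its topic, partition, and offset.

    Consuming from the beginning with message metadata (sketch)
    $ kafkacat -t my-first-kafka-topic -b "$BOOTSTRAP_SERVER" \
     -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
     -X sasl.username="$USER" \
     -X sasl.password="$PASSWORD" -C -o beginning -e \
     -f 'Topic %t [%p] at offset %o: %s\n'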