Configuring and connecting Kafka scripts with Red Hat OpenShift Streams for Apache Kafka

Guide
  • Red Hat OpenShift Streams for Apache Kafka 1
  • Updated 19 October 2021
  • Published 13 April 2021

Red Hat OpenShift Streams for Apache Kafka is currently available for Development Preview. Development Preview releases provide early access to a limited set of features that might not be fully tested and that might change in the final GA version. Users should not use Development Preview software in production or for business-critical workloads. Limited documentation is available for Development Preview releases and is typically focused on fundamental user goals.

As a developer of applications and services, you can use Kafka scripts to manage your Kafka instances in Red Hat OpenShift Streams for Apache Kafka. The Kafka scripts are a set of shell scripts that are included with the Apache Kafka distribution. With these scripts, you can produce and consume messages for your Kafka instances.

The Kafka scripts are part of the open source community version of Apache Kafka. The scripts are not a part of OpenShift Streams for Apache Kafka and are therefore not supported by Red Hat.

When you download and extract the Apache Kafka distribution, the bin/ directory (or the bin\windows\ directory if you’re using Windows) of the distribution contains a set of shell scripts that enable you to interact with your Kafka instance. With the scripts, you can produce and consume messages, and perform various operations against the Kafka APIs to administer topics, consumer groups, and other resources.

The command examples in this quick start demonstrate how to use the Kafka scripts on Linux and macOS. If you’re using Windows, use the Windows versions of the scripts. For example, instead of the <Kafka-distribution-dir>/bin/kafka-console-producer.sh script, use the <Kafka-distribution-dir>\bin\windows\kafka-console-producer.bat script.
Prerequisites
  • You have a Red Hat account.

  • You have a running Kafka instance in OpenShift Streams for Apache Kafka.

  • JDK 11 or later is installed.

  • For Windows, the latest version of Oracle JDK is installed.

  • You’ve downloaded the latest supported binary version of the Apache Kafka distribution.

    Verifying Kafka scripts
    $ ./kafka-console-producer.sh --version
    2.7.0 (Commit:448719dc99a19793)
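If you plan to run the scripts from outside the distribution directory, a small environment setup can help. The following is a sketch with an assumed extraction path (a kafka_2.13-2.7.0 directory under your home directory); substitute the location where you extracted the distribution.

```shell
# A convenience sketch: point KAFKA_HOME at the extracted distribution.
# The path below is a hypothetical example; adjust it to your own location.
export KAFKA_HOME="$HOME/kafka_2.13-2.7.0"

# Put the scripts on PATH so they can be run from any directory.
export PATH="$KAFKA_HOME/bin:$PATH"
```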

Configuring the Kafka scripts to connect to a Kafka instance

To enable the Kafka scripts to access a Kafka instance, you must configure the connection using the generated credentials for your OpenShift Streams for Apache Kafka service account. For the Kafka scripts, you define these values in a configuration file.

Prerequisites
  • You have the generated credentials for your service account. If you need to reset the credentials, use the Service Accounts page in the OpenShift Streams for Apache Kafka web console to find your service account and reset the credentials.

  • You’ve set the permissions for your service account to access the Kafka instance resources. To verify the current permissions, select your Kafka instance in the OpenShift Streams for Apache Kafka web console and use the Access page to find your service account permission settings.

Procedure
  1. In your Kafka distribution, navigate to the config/ directory.

  2. Create a file called app-services.properties.

  3. In the app-services.properties file, set the SASL connection mechanism and the Kafka instance client credentials. Replace the values with your own credential information.

    Setting server and credential values
    sasl.mechanism=PLAIN
    security.protocol=SASL_SSL
    
    sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
      username="<client_id>" \
      password="<client_secret>" ;

    OpenShift Streams for Apache Kafka also supports the SASL/OAUTHBEARER mechanism, which is the recommended authentication mechanism. However, the Kafka scripts do not yet fully support OAUTHBEARER, so this example uses SASL/PLAIN.
  4. Save the file. You will use it in the next task to connect to your Kafka instance and produce messages.
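The file above can also be generated from the command line. The following is a sketch, assuming the service account credentials are available in CLIENT_ID and CLIENT_SECRET shell variables (hypothetical names; they are not set by OpenShift Streams).

```shell
# Hypothetical variables holding the service account credentials.
CLIENT_ID="srvc-acct-example"
CLIENT_SECRET="example-secret"

# Write the same properties shown above, substituting the credentials.
cat > app-services.properties <<EOF
sasl.mechanism=PLAIN
security.protocol=SASL_SSL

sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \\
  username="${CLIENT_ID}" \\
  password="${CLIENT_SECRET}" ;
EOF
```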

Producing messages using Kafka scripts

You can use the kafka-console-producer script to produce messages to Kafka topics.

Prerequisites
  • You have a running Kafka instance in OpenShift Streams for Apache Kafka.

  • You have the bootstrap server endpoint for your Kafka instance. To find the server endpoint, select your Kafka instance in the OpenShift Streams for Apache Kafka web console, select the options menu (three vertical dots), and click Connection.

  • You’ve set the permissions for your service account to access the Kafka instance resources. To verify the current permissions, select your Kafka instance in the OpenShift Streams for Apache Kafka web console and use the Access page to find your service account permission settings.

  • You’ve created the app-services.properties file to store your service account credentials.

Procedure
  1. On the command line, from the bin/ directory, enter the following command to create a Kafka topic.

    This example uses the kafka-topics script to create the my-other-topic Kafka topic with the default settings.

    Using the kafka-topics script to create a Kafka topic
    $ ./kafka-topics.sh --create --topic my-other-topic --bootstrap-server <bootstrap_server> --command-config ../config/app-services.properties
    Created topic my-other-topic.
  2. Enter the following command to start the kafka-console-producer script.

    This example uses the SASL/PLAIN authentication mechanism with the credentials that you saved in the app-services.properties file. This example produces messages to the my-other-topic example topic that you created.

    Starting the kafka-console-producer script
    $ ./kafka-console-producer.sh --topic my-other-topic --bootstrap-server "<bootstrap_server>" --producer.config ../config/app-services.properties
  3. With the kafka-console-producer script running, enter messages that you want to produce to the Kafka topic.

    Example messages to produce to the Kafka topic
    >First message
    >Second message
    >Third message
  4. Keep the producer running. You will use it later when you create a consumer.
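Instead of typing messages interactively, you can also pipe them into the producer from a file. The following is a sketch; messages.txt is a hypothetical file name, and the producer command is the same one used in step 2, with the same placeholders.

```shell
# Write the example messages to a file, one message per line.
printf '%s\n' "First message" "Second message" "Third message" > messages.txt

# With a reachable Kafka instance, pipe the file into the producer
# (same placeholders as in the steps above):
# ./kafka-console-producer.sh --topic my-other-topic \
#   --bootstrap-server "<bootstrap_server>" \
#   --producer.config ../config/app-services.properties < messages.txt
```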

Verification
  • Verify that the kafka-console-producer script is still running without any errors in the terminal.

Consuming messages using Kafka scripts

You can use the kafka-console-consumer script to consume messages from Kafka topics. This example consumes the messages that you sent previously with the producer that you created with the kafka-console-producer script.

Prerequisites
  • You used the kafka-console-producer script to produce example messages to a topic.

Procedure
  1. On the command line in a separate terminal from your producer, enter the following command to start the kafka-console-consumer script.

    This example uses the SASL/PLAIN authentication mechanism with the credentials that you saved in the app-services.properties file. This example consumes and displays the messages from the my-other-topic example topic.

    Starting the kafka-console-consumer script
    $ ./kafka-console-consumer.sh --topic my-other-topic --bootstrap-server "<bootstrap_server>" --from-beginning --consumer.config ../config/app-services.properties
    First message
    Second message
    Third message
  2. If your producer is still running in a separate terminal, continue entering messages in the producer terminal and observe the messages being consumed in the consumer terminal.

Verification
  1. Verify that the kafka-console-consumer script is running without any errors in the terminal.

  2. Verify that the kafka-console-consumer script displays the messages from the my-other-topic example topic.