Using Node.js applications with Kafka instances in Red Hat OpenShift Streams for Apache Kafka

Guide
  • Red Hat OpenShift Streams for Apache Kafka 1
  • Updated 19 October 2021
  • Published 03 September 2021

Red Hat OpenShift Streams for Apache Kafka is currently available for Development Preview. Development Preview releases provide early access to a limited set of features that might not be fully tested and that might change in the final GA version. Users should not use Development Preview software in production or for business-critical workloads. Limited documentation is available for Development Preview releases and is typically focused on fundamental user goals.

As a developer of applications and services, you can connect Node.js applications to Kafka instances in Red Hat OpenShift Streams for Apache Kafka. Node.js is a server-side JavaScript runtime designed for building scalable network applications. Its event-driven, non-blocking I/O model makes it efficient for applications that handle many concurrent connections.
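
To illustrate this model with a generic example (unrelated to the sample repository's code), the following sketch starts an asynchronous file read and keeps executing while the operating system completes the I/O:

    Illustrative sketch of non-blocking I/O in Node.js
    const fs = require('fs');

    // The read is started here; the callback runs later, when the data is
    // ready. The event loop stays free for other work in the meantime.
    fs.readFile('./package.json', 'utf8', (err, data) => {
      if (err) throw err;
      console.log(`file contents arrived: ${data.length} characters`);
    });

    console.log('this line runs before the file contents arrive');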

Prerequisites

The example Node.js application in this quick start uses the KafkaJS client by default. If you want to use the node-rdkafka client, you must install some development tools locally on your computer, or use Docker to run a specified container image and configure a development environment. To learn more, see the documentation for the example Node.js application.

Importing the Node.js sample code

For this quick start, you’ll use sample code from the Nodeshift Application Starters reactive-example repository on GitHub. After you understand the concepts and tasks in this quick start, you can use your own Node.js applications with OpenShift Streams for Apache Kafka in the same way.

Procedure
  1. On the command line, clone the Nodeshift Application Starters reactive-example repository from GitHub.

    Cloning the reactive-example repository
    $ git clone https://github.com/nodeshift-starters/reactive-example.git
  2. In your IDE, open the reactive-example directory of the repository that you cloned.

Configuring the Node.js example application to connect to a Kafka instance

To enable your Node.js application to access a Kafka instance, you must configure a connection by specifying the following details:

  • The bootstrap server endpoint for your Kafka instance

  • The generated credentials for your OpenShift Streams for Apache Kafka service account

  • The Simple Authentication and Security Layer (SASL) mechanism that the client will use to authenticate with the Kafka instance

In this task, you’ll create a new configuration file called .env. In this file, you’ll set the required bootstrap server and client credentials as environment variables.

Prerequisites
  • You have the bootstrap server endpoint for your Kafka instance. To find the server endpoint, select your Kafka instance in the OpenShift Streams for Apache Kafka web console, select the options menu (three vertical dots), and click Connection.

  • You have the generated credentials for your service account. If you need to reset the credentials, use the Service Accounts page in the OpenShift Streams for Apache Kafka web console to find your service account and reset the credentials.

  • You’ve set the permissions for your service account to access the Kafka instance resources. To verify the current permissions, select your Kafka instance in the OpenShift Streams for Apache Kafka web console and use the Access page to find your service account permission settings.

Procedure
  1. In your IDE, create a new file. Save the file with the name .env at the root level of the reactive-example directory of the cloned repository.

  2. In the .env file, add the lines shown in the example. These lines set the bootstrap server and client credentials as environment variables to be used by the Node.js application.

    Setting environment variables in the .env file
    KAFKA_BOOTSTRAP_SERVER=<bootstrap_server>
    KAFKA_CLIENT_ID=<client_id>
    KAFKA_CLIENT_SECRET=<client_secret>
    KAFKA_SASL_MECHANISM=plain

    In the preceding example, replace the values in angle brackets (< >) with your own bootstrap server and client credential information.

    Observe that the Node.js application uses the SASL/PLAIN authentication mechanism (that is, the value of KAFKA_SASL_MECHANISM is set to plain). This means that the application authenticates with the Kafka instance by using only the client ID and client secret; the application doesn’t require an authentication token. A sketch of how a client might consume these values follows this procedure.

  3. Save the .env file.
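
The example application reads these variables at run time. As an illustration only (not the repository’s actual code), the following sketch shows how a KafkaJS client could consume the values; the dotenv package, the client ID string, and the connection options shown here are assumptions based on the public KafkaJS documentation.

    Illustrative KafkaJS client configuration
    // Load the variables from the .env file into process.env.
    require('dotenv').config();

    const { Kafka } = require('kafkajs');

    const kafka = new Kafka({
      clientId: 'reactive-example',                   // hypothetical client name
      brokers: [process.env.KAFKA_BOOTSTRAP_SERVER],  // bootstrap server endpoint
      ssl: true,                                      // managed Kafka instances are TLS-enabled
      sasl: {
        mechanism: process.env.KAFKA_SASL_MECHANISM,  // 'plain' in this quick start
        username: process.env.KAFKA_CLIENT_ID,        // service account client ID
        password: process.env.KAFKA_CLIENT_SECRET     // service account client secret
      }
    });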

Creating a Kafka topic in OpenShift Streams for Apache Kafka

The Node.js application in this quick start uses a Kafka topic called countries to produce and consume messages. In this task, you’ll create the topic in your Kafka instance.

Prerequisites
  • You’ve created a Kafka instance in OpenShift Streams for Apache Kafka and the instance is in the Ready state.

Procedure
  1. In the OpenShift Streams for Apache Kafka web console, go to Streams for Apache Kafka > Kafka Instances and click the name of the Kafka instance that you want to add a topic to.

  2. Click Create topic and follow the guided steps to define the topic details. Click Next to complete each step and click Finish to complete the setup.

    Figure 1. Guided steps to define topic details
    • Topic name: Enter countries as the topic name.

    • Partitions: Set the number of partitions for this topic. This example sets the number of partitions to 1. Partitions are distinct lists of messages within a topic and enable parts of a topic to be distributed over multiple brokers in the cluster. A topic can contain one or more partitions, enabling producer and consumer loads to be scaled.

    • Message retention: Set the message retention time and size to the relevant value and increment. This example sets the retention time to 7 days and the retention size to Unlimited. Message retention time is the amount of time that messages are retained in a topic before they are deleted or compacted, depending on the cleanup policy. Retention size is the maximum total size of all log segments in a partition before they are deleted or compacted.

    • Replicas: For this release of OpenShift Streams for Apache Kafka, the replicas are preconfigured. The number of partition replicas for the topic is set to 3 and the minimum number of follower replicas that must be in sync with a partition leader is set to 2. Replicas are copies of partitions in a topic. Partition replicas are distributed over multiple brokers in the cluster to ensure topic availability if a broker fails. When a follower replica is in sync with a partition leader, the follower replica can become the new partition leader if needed.

      After you complete the topic setup, the new Kafka topic is listed in the topics table for your Kafka instance. You can now run the Node.js application to start producing and consuming messages. If you prefer to create topics programmatically, see the sketch after the verification step.

Verification
  • Verify that the countries topic is listed in the topics table.
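
The web console is the way this quick start creates the topic. For reference only, the following sketch shows how the same countries topic could be created with the KafkaJS admin API; the kafka client object is the illustrative one from the configuration sketch earlier, and the retention and replication values mirror the wizard settings described above.

    Illustrative topic creation with the KafkaJS admin API
    const admin = kafka.admin();

    async function createCountriesTopic() {
      await admin.connect();
      await admin.createTopics({
        topics: [{
          topic: 'countries',
          numPartitions: 1,        // single partition, as in the wizard
          replicationFactor: 3,    // preconfigured value in OpenShift Streams
          configEntries: [
            { name: 'retention.ms', value: '604800000' }  // 7 days in milliseconds
          ]
        }]
      });
      await admin.disconnect();
    }

    createCountriesTopic().catch(console.error);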

Running the Node.js example application

After you configure your Node.js application to connect to a Kafka instance, and you create the required Kafka topic, you’re ready to run the application.

In this task, you’ll run the following components of the Node.js application:

  • A producer-backend component that generates random country names and sends these names to the Kafka topic.

  • A consumer-backend component that consumes the country names from the Kafka topic.

Prerequisites
  • You’ve configured the Node.js example application to connect to a Kafka instance.

  • You’ve created the countries Kafka topic.

Procedure
  1. On the command line, navigate to the reactive-example directory of the repository that you cloned.

    Navigating to the reactive-example directory
    $ cd reactive-example
  2. Navigate to the directory for the consumer component. Use Node Package Manager (npm) to install the dependencies for this component.

    Installing dependencies for the consumer component
    $ cd consumer-backend
    $ npm install
  3. Run the consumer component.

    Running the consumer component
    $ node consumer.js

    You should see the Node.js application start to run and connect to the Kafka instance. However, because you haven’t yet run the producer component, the consumer has no country names to display.

    If the application fails to run, review the error log in the command-line window and address any problems. Also, review the steps in this quick start to ensure that the application and Kafka topic are configured correctly.

  4. Open a second command-line window or tab.

  5. On the second command line, navigate to the reactive-example directory of the repository that you cloned.

    Navigating to the reactive-example directory
    $ cd reactive-example
  6. Navigate to the directory for the producer component. Use Node Package Manager to install the dependencies for this component.

    Installing dependencies for the producer component
    $ cd producer-backend
    $ npm install
  7. Run the producer component.

    Running the producer component
    $ node producer.js

    You should see output like that shown in the example.

    Example output from the producer component
    $ node producer.js
    Ghana
    Réunion
    Guatemala
    Luxembourg
    Mayotte
    Syria
    United Kingdom
    Bolivia
    Haiti

    As shown in the example, the producer component starts to run and generate messages that represent country names.

  8. Switch back to the first command-line window that you opened.

    You should now see that the consumer component displays the same country names generated by the producer, and in the same order, as shown in the example.

    Example output from the consumer component
    $ node consumer.js
    Ghana
    Réunion
    Guatemala
    Luxembourg
    Mayotte
    Syria
    United Kingdom
    Bolivia
    Haiti

    The output from both components confirms that they successfully connected to the Kafka instance. The components are using the Kafka topic that you created to produce and consume messages.

  9. In your IDE, in the producer-backend directory of the repository that you cloned, open the producer.js file.

    Observe that the producer component is configured to process environment variables from the .env file that you created. The values of these environment variables are the bootstrap server endpoint and client credentials that the component used to connect to the Kafka instance. (The general pattern is sketched at the end of this quick start.)

  10. In the consumer-backend directory, open the consumer.js file.

    Observe that the consumer component is also configured to process environment variables from the .env file that you created. Illustrative sketches of both the producer and consumer patterns follow.
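
To make the patterns in producer.js and consumer.js concrete without reproducing the repository’s code, the following sketches show the general shape of a KafkaJS producer and consumer for the countries topic. The connection options, group ID, and country list here are illustrative assumptions, not the example application’s actual values.

    Illustrative sketch of a KafkaJS producer
    require('dotenv').config();
    const { Kafka } = require('kafkajs');

    const kafka = new Kafka({
      brokers: [process.env.KAFKA_BOOTSTRAP_SERVER],
      ssl: true,
      sasl: {
        mechanism: 'plain',
        username: process.env.KAFKA_CLIENT_ID,
        password: process.env.KAFKA_CLIENT_SECRET
      }
    });

    const producer = kafka.producer();

    async function produceCountries() {
      await producer.connect();
      // Placeholder list; the example application generates random country names.
      for (const name of ['Ghana', 'Réunion', 'Guatemala']) {
        console.log(name);
        await producer.send({ topic: 'countries', messages: [{ value: name }] });
      }
      await producer.disconnect();
    }

    produceCountries().catch(console.error);

A matching consumer subscribes to the same topic and logs each message as it arrives:

    Illustrative sketch of a KafkaJS consumer
    require('dotenv').config();
    const { Kafka } = require('kafkajs');

    const kafka = new Kafka({
      brokers: [process.env.KAFKA_BOOTSTRAP_SERVER],
      ssl: true,
      sasl: {
        mechanism: 'plain',
        username: process.env.KAFKA_CLIENT_ID,
        password: process.env.KAFKA_CLIENT_SECRET
      }
    });

    const consumer = kafka.consumer({ groupId: 'country-consumer' });  // hypothetical group ID

    async function consumeCountries() {
      await consumer.connect();
      await consumer.subscribe({ topic: 'countries', fromBeginning: true });
      await consumer.run({
        // eachMessage is invoked once per record as it arrives.
        eachMessage: async ({ message }) => {
          console.log(message.value.toString());
        }
      });
    }

    consumeCountries().catch(console.error);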