Red Hat OpenShift Connectors Reference Guide

Red Hat OpenShift Connectors 1
  • Updated 04 March 2023
  • Published 04 August 2022

With Red Hat OpenShift Connectors, you can create and configure connections between Red Hat OpenShift Streams for Apache Kafka and third-party systems.

Amazon CloudWatch Metrics sink

Send data to Amazon CloudWatch metrics.

Configuration properties

The following table describes the configuration properties for the Amazon CloudWatch Metrics sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Cloud Watch Namespace*

aws_cw_namespace

The CloudWatch metric namespace.

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
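
For orientation, the property keys above can be assembled into a connector configuration. The following Python sketch is illustrative only: the keys come from the table, but the values and the exact data_shape structure are assumptions, not values prescribed by the service.

```python
# Illustrative Amazon CloudWatch Metrics sink configuration (placeholder values).
cloudwatch_metrics_sink = {
    "aws_cw_namespace": "my-application-metrics",  # CloudWatch metric namespace (required)
    "aws_access_key": "<AWS access key>",          # string, or an opaque secret reference
    "aws_secret_key": "<AWS secret key>",          # string, or an opaque secret reference
    "aws_region": "eu-west-1",                     # AWS region to access (required)
    "kafka_topic": "metrics",                      # comma-separated list of Kafka topics
    # data_shape is documented only as an object; this structure is an assumption.
    "data_shape": {"consumes": {"format": "application/json"}},
}
```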

Amazon DynamoDB sink

Send data to Amazon DynamoDB NoSQL database service. The sent data inserts, updates, or deletes an item in the specified Amazon DynamoDB table.

Configuration properties

The following table describes the configuration properties for the Amazon DynamoDB sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Table*

aws_table

The name of the DynamoDB table.

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Operation

aws_operation

The operation to perform. The options are PutItem, UpdateItem, or DeleteItem.

string

PutItem

PutItem

Write Capacity

aws_write_capacity

The provisioned throughput to reserve for writing resources to your table.

integer

1

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Amazon DynamoDB Stream source

Receive data from Amazon DynamoDB Streams.

Configuration properties

The following table describes the configuration properties for the Amazon DynamoDB Stream source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Table*

aws_table

The name of the DynamoDB table.

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Stream Iterator Type

aws_stream_iterator_type

Defines where in the DynamoDB stream to start getting records. The possible values are FROM_LATEST and FROM_START. Note that using FROM_START can cause a significant delay before the stream has caught up to real-time.

string

FROM_LATEST

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Delay

aws_delay

The number of milliseconds before the next poll from the database.

integer

500

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
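
A hedged sketch of a DynamoDB Stream source configuration built from the keys above; the values and the data_shape structure are placeholders rather than prescribed settings.

```python
# Illustrative Amazon DynamoDB Stream source configuration (placeholder values).
dynamodb_stream_source = {
    "aws_table": "orders",                      # DynamoDB table whose stream is read (required)
    "aws_access_key": "<AWS access key>",
    "aws_secret_key": "<AWS secret key>",
    "aws_region": "us-east-1",
    "aws_stream_iterator_type": "FROM_LATEST",  # FROM_START replays history but may lag
    "aws_delay": 500,                           # milliseconds between polls (default 500)
    "kafka_topic": "orders-changes",            # a single topic name for a source connector
    "data_shape": {"produces": {"format": "application/json"}},  # assumed structure
}
```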

Amazon Kinesis sink

Send data to Amazon Kinesis.

Configuration properties

The following table describes the configuration properties for the Amazon Kinesis sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Stream Name*

aws_stream

The Kinesis stream that you want to access. The Kinesis stream that you specify must already exist.

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Amazon Kinesis source

Receive data from Amazon Kinesis.

Configuration properties

The following table describes the configuration properties for the Amazon Kinesis source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Stream Name*

aws_stream

The Kinesis stream that you want to access. The Kinesis stream that you specify must already exist.

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

eu-west-1

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Delay

aws_delay

The number of milliseconds before the next poll of the selected stream.

integer

500

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

Amazon Lambda sink

Send data to an Amazon Lambda function.

Configuration properties

The following table describes the configuration properties for the Amazon Lambda sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Function Name*

aws_function

The Lambda Function name.

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Amazon Redshift sink

Send data to an Amazon Redshift Database.

Configuration properties

The following table describes the configuration properties for the Amazon Redshift sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

sql_server_name

The server name for the data source.

string

localhost

Server Port

sql_server_port

The server port for the AWS Redshift data source.

string

5439

Username*

sql_username

The username to access a secured AWS Redshift Database.

string

Password*

sql_password

The password to access a secured AWS Redshift Database. / An opaque reference to the sql_password

string / object

Query*

sql_query

The query to execute against the AWS Redshift Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

sql_database_name

The name of the AWS RedShift Database.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Amazon Redshift source

Query data from an Amazon Redshift Database.

Configuration properties

The following table describes the configuration properties for the Amazon Redshift source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

sql_server_name

The server name for the data source.

string

localhost

Server Port

sql_server_port

The server port for the data source.

string

5439

Username*

sql_username

The username to access a secured AWS RedShift Database.

string

Password*

sql_password

The password to access a secured AWS RedShift Database. / An opaque reference to the sql_password

string / object

Query*

sql_query

The query to execute against the AWS RedShift Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

sql_database_name

The name of the AWS RedShift Database.

string

Consumed Query

sql_consumed_query

The query to run for each consumed row (tuple).

string

DELETE FROM accounts where user_id = :#user_id

Delay

sql_delay

The number of milliseconds before the next poll from the AWS RedShift database.

integer

500

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
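
The query properties above use named parameters of the form :#name. The sketch below shows how sql_query and sql_consumed_query might be combined in a poll-and-remove pattern; the table and column names are invented and the data_shape structure is an assumption.

```python
# Illustrative Amazon Redshift source configuration (placeholder values).
redshift_source = {
    "sql_server_name": "redshift-cluster.example.com",
    "sql_server_port": "5439",                    # default Redshift port
    "sql_username": "etl_user",
    "sql_password": "<password>",                 # string, or an opaque secret reference
    "sql_database_name": "analytics",
    "sql_query": "SELECT user_id, username, city FROM accounts",
    # Optional query run against each consumed row, e.g. to remove it once published.
    "sql_consumed_query": "DELETE FROM accounts WHERE user_id = :#user_id",
    "sql_delay": 500,                             # milliseconds between polls
    "kafka_topic": "accounts",
    "data_shape": {"produces": {"format": "application/json"}},  # assumed structure
}
```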

Amazon S3 sink

Send data to an Amazon S3 bucket.

Configuration properties

The following table describes the configuration properties for the Amazon S3 sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Bucket Name*

aws_bucket_name_or_arn

The S3 Bucket name or Amazon Resource Name (ARN).

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Autocreate Bucket

aws_auto_create_bucket

Specifies to automatically create the S3 bucket.

boolean

False

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Key Name

aws_key_name

The key name for saving an element in the bucket.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
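
A sketch of an S3 sink configuration using the keys above; the bucket, key name, and data_shape structure are illustrative assumptions.

```python
# Illustrative Amazon S3 sink configuration (placeholder values).
s3_sink = {
    "aws_bucket_name_or_arn": "my-backup-bucket",  # bucket name or ARN (required)
    "aws_access_key": "<AWS access key>",
    "aws_secret_key": "<AWS secret key>",
    "aws_region": "eu-central-1",
    "aws_auto_create_bucket": False,               # default: do not create the bucket
    "aws_key_name": "exports/record",              # optional key name for stored elements
    "kafka_topic": "exports",                      # comma-separated list of Kafka topics
    "data_shape": {"consumes": {"format": "application/octet-stream"}},  # assumed
}
```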

Amazon S3 source

Receive data from an Amazon S3 bucket.

Configuration properties

The following table describes the configuration properties for the Amazon S3 source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Bucket Name*

aws_bucket_name_or_arn

The S3 Bucket name or Amazon Resource Name (ARN).

string

Auto-delete Objects

aws_delete_after_read

Specifies to delete objects after consuming them.

boolean

True

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Autocreate Bucket

aws_auto_create_bucket

Specifies to automatically create the S3 bucket.

boolean

False

Include Body

aws_include_body

If true, the exchange is consumed and put into the body and closed. If false, the S3Object stream is put raw into the body and the headers are set with the S3 object metadata.

boolean

True

Prefix

aws_prefix

The AWS S3 bucket prefix to consider while searching.

string

folder/

Ignore Body

aws_ignore_body

If true, the S3 Object body is ignored. Setting this to true overrides any behavior defined by the includeBody option. If false, the S3 object is put in the body.

boolean

False

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Delay

aws_delay

The number of milliseconds before the next poll of the selected bucket.

integer

500

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
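
The sketch below combines the S3 source polling options documented above; all values, and the data_shape structure, are placeholders.

```python
# Illustrative Amazon S3 source configuration (placeholder values).
s3_source = {
    "aws_bucket_name_or_arn": "incoming-files",
    "aws_access_key": "<AWS access key>",
    "aws_secret_key": "<AWS secret key>",
    "aws_region": "eu-west-1",
    "aws_prefix": "folder/",            # only consider objects under this prefix
    "aws_delete_after_read": True,      # delete objects after consuming (default True)
    "aws_include_body": True,           # put the object content into the message body
    "aws_ignore_body": False,           # True would override aws_include_body
    "aws_delay": 500,                   # milliseconds between polls of the bucket
    "kafka_topic": "files-in",
    "data_shape": {"produces": {"format": "application/octet-stream"}},  # assumed
}
```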

Amazon Simple Email Service sink

Send email through the Amazon Simple Email Service (SES).

Configuration properties

The following table describes the configuration properties for the Amazon Simple Email Service sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

From*

aws_from

From address

string

user@example.com

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Amazon Simple Notification Service sink

Send data to an Amazon Simple Notification Service (SNS) topic.

Configuration properties

The following table describes the configuration properties for the Amazon Simple Notification Service sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Topic Name*

aws_topic_name_or_arn

The SNS topic name or Amazon Resource Name (ARN).

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Autocreate Topic

aws_auto_create_topic

Specifies whether to automatically create the SNS topic.

boolean

False

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Amazon Simple Queue Service sink

Send data to Amazon Simple Queue Service (SQS).

Configuration properties

The following table describes the configuration properties for the Amazon Simple Queue Service sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Queue Name*

aws_queue_name_or_arn

The SQS Queue name or Amazon Resource Name (ARN).

string

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Autocreate Queue

aws_auto_create_queue

Automatically create the SQS queue.

boolean

False

AWS Host

aws_amazon_a_w_s_host

The hostname of the Amazon AWS cloud.

string

amazonaws.com

Protocol

aws_protocol

The underlying protocol used to communicate with SQS.

string

https

http or https

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Amazon Simple Queue Service source

Receive data from Amazon Simple Queue Service (SQS).

Configuration properties

The following table describes the configuration properties for the Amazon Simple Queue Service source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Queue Name*

aws_queue_name_or_arn

The SQS Queue name or Amazon Resource Name (ARN).

string

Auto-delete Messages

aws_delete_after_read

Specifies to delete messages after consuming them.

boolean

True

Access Key*

aws_access_key

The access key obtained from AWS. / An opaque reference to the aws_access_key

string / object

Secret Key*

aws_secret_key

The secret key obtained from AWS. / An opaque reference to the aws_secret_key

string / object

AWS Region*

aws_region

The AWS region to access.

string

Autocreate Queue

aws_auto_create_queue

Specifies whether to automatically create the SQS queue.

boolean

False

AWS Host

aws_amazon_a_w_s_host

The hostname of the Amazon AWS cloud.

string

amazonaws.com

Protocol

aws_protocol

The underlying protocol used to communicate with SQS

string

https

http or https

Queue URL

aws_queue_u_r_l

The full SQS Queue URL (required if using KEDA)

string

Overwrite Endpoint URI

aws_uri_endpoint_override

The overriding endpoint URI. To use this option, you must also select the overrideEndpoint option.

string

Endpoint Overwrite

aws_override_endpoint

Select this option to override the endpoint URI. To use this option, you must also provide a URI for the uriEndpointOverride option.

boolean

False

Delay

aws_delay

The number of milliseconds before the next poll of the selected queue.

integer

500

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
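
A sketch of an SQS source configuration; note the full queue URL, which the table marks as required when using KEDA. All values, including the account ID and region, are invented, and the data_shape structure is an assumption.

```python
# Illustrative Amazon SQS source configuration (placeholder values).
sqs_source = {
    "aws_queue_name_or_arn": "orders-queue",
    "aws_delete_after_read": True,                 # remove messages once consumed
    "aws_access_key": "<AWS access key>",
    "aws_secret_key": "<AWS secret key>",
    "aws_region": "us-east-1",
    "aws_amazon_a_w_s_host": "amazonaws.com",      # default AWS host
    "aws_protocol": "https",                       # http or https
    # Full queue URL (required if using KEDA); placeholder account ID and region.
    "aws_queue_u_r_l": "https://sqs.us-east-1.amazonaws.com/123456789012/orders-queue",
    "aws_delay": 500,                              # milliseconds between polls
    "kafka_topic": "orders",
    "data_shape": {"produces": {"format": "application/json"}},  # assumed structure
}
```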

Azure Event Hubs sink

Send data to Azure Event Hubs.

Configuration properties

The following table describes the configuration properties for the Azure Event Hubs sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Eventhubs Namespace*

azure_namespace_name

The Event Hubs namespace.

string

Eventhubs Name*

azure_eventhub_name

The Event Hub name.

string

Share Access Name*

azure_shared_access_name

The Event Hubs SAS key name.

string

Share Access Key*

azure_shared_access_key

The key for the Event Hubs SAS key name. / An opaque reference to the azure_shared_access_key

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Azure Event Hubs source

Receive data from Azure Event Hubs.

Configuration properties

The following table describes the configuration properties for the Azure Event Hubs source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Eventhubs Namespace*

azure_namespace_name

The Event Hubs namespace.

string

Eventhubs Name*

azure_eventhub_name

The Event Hub name.

string

Share Access Name*

azure_shared_access_name

The Event Hubs SAS key name.

string

Share Access Key*

azure_shared_access_key

The key for the Event Hubs SAS key name. / An opaque reference to the azure_shared_access_key

string / object

Azure Storage Blob Account Name*

azure_blob_account_name

The name of the Storage Blob account.

string

Azure Storage Blob Container Name*

azure_blob_container_name

The name of the Storage Blob container.

string

Azure Storage Blob Access Key*

azure_blob_access_key

The key for the Azure Storage Blob service that is associated with the Storage Blob account name. / An opaque reference to the azure_blob_access_key

string / object

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
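
A sketch of an Azure Event Hubs source configuration. The Storage Blob account, container, and key are supplied alongside the Event Hubs settings as the table requires; all values are placeholders, and the data_shape structure is an assumption.

```python
# Illustrative Azure Event Hubs source configuration (placeholder values).
eventhubs_source = {
    "azure_namespace_name": "my-eventhubs-namespace",
    "azure_eventhub_name": "telemetry",
    "azure_shared_access_name": "RootManageSharedAccessKey",
    "azure_shared_access_key": "<SAS key>",            # or an opaque secret reference
    "azure_blob_account_name": "mystorageaccount",
    "azure_blob_container_name": "eventhub-data",
    "azure_blob_access_key": "<storage access key>",   # or an opaque secret reference
    "kafka_topic": "telemetry",
    "data_shape": {"produces": {"format": "application/json"}},  # assumed structure
}
```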

Azure Functions sink

Send data to Azure Functions.

Configuration properties

The following table describes the configuration properties for the Azure Functions sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

URL*

azure_url

The Azure Functions URL you want to send the data to.

string

https://azure-function-demo-12234.azurewebsites.net/api/httpexample

Method

azure_method

The HTTP method to use.

string

POST

Key

azure_key

The function-specific API key, required if the authLevel of the function is FUNCTION, or the master key if the authLevel is ADMIN. / An opaque reference to the azure_key

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Azure Blob Storage sink

Send data to Azure Blob storage.

Configuration properties

The following table describes the configuration properties for the Azure Blob Storage sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Account Name*

azure_account_name

The Azure Storage Blob account name.

string

Container Name*

azure_container_name

The Azure Storage Blob container name.

string

Access Key*

azure_access_key

The Azure Storage Blob access key. / An opaque reference to the azure_access_key

string / object

Operation name

azure_operation

The operation to perform.

string

uploadBlockBlob

Credential Type

azure_credential_type

Determines the credential strategy to adopt. Possible values are SHARED_ACCOUNT_KEY, SHARED_KEY_CREDENTIAL and AZURE_IDENTITY

string

SHARED_ACCOUNT_KEY

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Azure Blob Storage source

Receive data from Azure Blob storage.

Configuration properties

The following table describes the configuration properties for the Azure Blob Storage source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Period between Polls*

azure_period

The interval (in milliseconds) between fetches to the Azure Storage Container.

integer

10000

Account Name*

azure_account_name

The Azure Storage Blob account name.

string

Container Name*

azure_container_name

The Azure Storage Blob container name.

string

Access Key*

azure_access_key

The Azure Storage Blob access key. / An opaque reference to the azure_access_key

string / object

Credential Type

azure_credential_type

Determines the credential strategy to adopt. Possible values are SHARED_ACCOUNT_KEY, SHARED_KEY_CREDENTIAL and AZURE_IDENTITY

string

SHARED_ACCOUNT_KEY

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

Azure Queue Storage sink

Send data to Azure Queue Storage.

Configuration properties

The following table describes the configuration properties for the Azure Queue Storage sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Account Name*

azure_account_name

The Azure Storage Queue account name.

string

Queue Name*

azure_queue_name

The Azure Storage Queue container name.

string

Access Key*

azure_access_key

The Azure Storage Queue access key. / An opaque reference to the azure_access_key

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Azure Queue Storage source

Receive data from Azure Queue Storage.

Configuration properties

The following table describes the configuration properties for the Azure Queue Storage source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Account Name*

azure_account_name

The Azure Storage Queue account name.

string

Queue Name*

azure_queue_name

The Azure Storage Queue container name.

string

Access Key*

azure_access_key

The Azure Storage Queue access key. / An opaque reference to the azure_access_key

string / object

Maximum Messages

azure_max_messages

The maximum number of messages to get. You can specify a value between 1 and 32. The default is 1 (one message). If there are fewer than the maximum number of messages in the queue, then all the messages are returned.

integer

1

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

Apache Cassandra sink

Send data to an Apache Cassandra cluster.

Configuration properties

The following table describes the configuration properties for the Apache Cassandra sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Connection Host*

cassandra_connection_host

The hostname(s) for the Cassandra server(s). Use a comma to separate multiple hostnames.

string

localhost

Connection Port*

cassandra_connection_port

The port number(s) of the Cassandra server(s). Use a comma to separate multiple port numbers.

string

9042

Keyspace*

cassandra_keyspace

The keyspace to use.

string

customers

Username

cassandra_username

The username for accessing a secured Cassandra cluster.

string

Password

cassandra_password

The password for accessing a secured Cassandra cluster. / An opaque reference to the cassandra_password

string / object

Consistency Level

cassandra_consistency_level

The consistency level to use. Set the value to one of these options - ANY, ONE, TWO, THREE, QUORUM, ALL, LOCAL_QUORUM, EACH_QUORUM, SERIAL, LOCAL_SERIAL, or LOCAL_ONE.

string

ANY

Prepare Statements

cassandra_prepare_statements

If true, specifies to use PreparedStatements as the query. If false, specifies to use regular Statements as the query.

boolean

True

Query*

cassandra_query

The query to execute against the Cassandra cluster table.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Apache Cassandra source

Retrieve data by sending a query to an Apache Cassandra cluster table.

Configuration properties

The following table describes the configuration properties for the Apache Cassandra source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Connection Host*

cassandra_connection_host

The hostname(s) for the Cassandra server(s). Use a comma to separate multiple hostnames.

string

localhost

Connection Port*

cassandra_connection_port

The port number(s) of the Cassandra server(s). Use a comma to separate multiple port numbers.

string

9042

Keyspace*

cassandra_keyspace

The keyspace to use.

string

customers

Username

cassandra_username

The username for accessing a secured Cassandra cluster.

string

Password

cassandra_password

The password for accessing a secured Cassandra cluster. / An opaque reference to the cassandra_password

string / object

Result Strategy

cassandra_result_strategy

The strategy to convert the result set of the query. Possible values are ALL, ONE, LIMIT_10, or LIMIT_100.

string

ALL

Consistency Level

cassandra_consistency_level

The consistency level to use. Possible values are ANY, ONE, TWO, THREE, QUORUM, ALL, LOCAL_QUORUM, EACH_QUORUM, SERIAL, LOCAL_SERIAL, or LOCAL_ONE.

string

QUORUM

Query*

cassandra_query

The query to execute against the Cassandra cluster table.

string

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
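
A sketch of a Cassandra source configuration; the keyspace, table, and query are invented, and the data_shape structure is an assumption.

```python
# Illustrative Apache Cassandra source configuration (placeholder values).
cassandra_source = {
    "cassandra_connection_host": "cassandra-0.example.com",  # comma-separate multiple hosts
    "cassandra_connection_port": "9042",                     # comma-separate multiple ports
    "cassandra_keyspace": "customers",
    "cassandra_username": "reader",              # only needed for a secured cluster
    "cassandra_password": "<password>",          # string, or an opaque secret reference
    "cassandra_result_strategy": "ALL",          # ALL, ONE, LIMIT_10, or LIMIT_100
    "cassandra_consistency_level": "QUORUM",     # default for the source connector
    "cassandra_query": "SELECT id, name, city FROM customers.accounts",
    "kafka_topic": "cassandra-accounts",
    "data_shape": {"produces": {"format": "application/json"}},  # assumed structure
}
```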

Data Generator source

A data generator (for development and testing purposes).

Configuration properties

The following table describes the configuration properties for the Data Generator source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Period

timer_period

The interval (in milliseconds) to wait between producing the next message.

integer

1000

Message*

timer_message

The message to generate.

string

hello world

Content Type

timer_content_type

The content type of the generated message.

string

text/plain

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
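
Because the Data Generator needs no external system, it is a convenient way to smoke-test a topic. A minimal sketch with placeholder values and an assumed data_shape structure:

```python
# Illustrative Data Generator source configuration (placeholder values).
data_generator_source = {
    "timer_period": 1000,                          # milliseconds between generated messages
    "timer_message": '{"greeting": "hello world"}',
    "timer_content_type": "application/json",      # default is text/plain
    "kafka_topic": "test-topic",
    "data_shape": {"produces": {"format": "application/json"}},  # assumed structure
}
```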

Debezium MongoDB Connector

Configuration properties

The following table describes the configuration properties for the Debezium MongoDB Connector Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Topic prefix*

topic.prefix

The topic prefix that identifies and provides a namespace for the particular database server or cluster from which the connector is capturing changes. The topic prefix should be unique across all other connectors because it is used as a prefix for all Kafka topic names that receive events emitted by this connector. Only alphanumeric characters, hyphens, dots, and underscores are accepted.

string

Hosts

mongodb.hosts

The hostname and port pairs (in the form 'host' or 'host:port') of the MongoDB server(s) in the replica set.

string

User

mongodb.user

Database user for connecting to MongoDB, if necessary.

string

Password

mongodb.password

Password to be used when connecting to MongoDB, if necessary.

string / object

Enable SSL connection to MongoDB

mongodb.ssl.enabled

Whether the connector should use SSL to connect to MongoDB instances.

boolean

False

Credentials Database

mongodb.authsource

Database containing user credentials.

string

admin

Include Databases

database.include.list

A comma-separated list of regular expressions that match the database names for which changes are to be captured

string

Exclude Databases

database.exclude.list

A comma-separated list of regular expressions that match the database names for which changes are to be excluded

string

Include Collections

collection.include.list

A comma-separated list of regular expressions that match the collection names for which changes are to be captured

string

Exclude Collections

collection.exclude.list

A comma-separated list of regular expressions that match the collection names for which changes are to be excluded

string

Exclude Fields

field.exclude.list

A comma-separated list of the fully-qualified names of fields that should be excluded from change event message values

string

Snapshot mode

snapshot.mode

The criteria for running a snapshot upon startup of the connector. Options include: 'initial' (the default) to specify the connector should always perform an initial sync when required; 'never' to specify the connector should never perform an initial sync

string

initial

Query fetch size

query.fetch.size

The maximum number of records that should be loaded into memory while streaming. A value of '0' uses the default JDBC fetch size.

integer

0

Change event batch size

max.batch.size

Maximum size of each batch of source records. Defaults to 2048.

integer

2048

Change event buffer size

max.queue.size

Maximum size of the queue for change events read from the database log but not yet recorded or forwarded. Defaults to 8192, and should always be larger than the maximum batch size.

integer

8192

Kafka topic name

schema.history.internal.kafka.topic

The name of the topic for the database schema history.

string

Kafka bootstrap servers

schema.history.internal.kafka.bootstrap.servers

A list of host/port pairs that the connector will use for establishing the initial connection to the Kafka cluster for retrieving database schema history previously stored by the connector. This should point to the same Kafka cluster used by the Kafka Connect process.

string

Kafka Message Key Format

data_shape

The serialization format for the Kafka message key.

object

Kafka Message Value Format

data_shape

The serialization format for the Kafka message value.

object

Debezium MySQL Connector

Configuration properties

The following table describes the configuration properties for the Debezium MySQL Connector Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Topic prefix*

topic.prefix

The topic prefix that identifies and provides a namespace for the particular database server or cluster from which the connector is capturing changes. The topic prefix should be unique across all other connectors because it is used as a prefix for all Kafka topic names that receive events emitted by this connector. Only alphanumeric characters, hyphens, dots, and underscores are accepted.

string

Cluster ID*

database.server.id

A numeric ID of this database client, which must be unique across all currently-running database processes in the cluster. This connector joins the MySQL database cluster as another server (with this unique ID) so it can read the binlog.

integer

Hostname*

database.hostname

Resolvable hostname or IP address of the database server.

string

Port

database.port

Port of the database server.

integer

3306

User*

database.user

Name of the database user to be used when connecting to the database.

string

Password

database.password

Password of the database user to be used when connecting to the database.

string / object

Include Databases

database.include.list

The databases for which changes are to be captured

string

Exclude Databases

database.exclude.list

A comma-separated list of regular expressions that match database names to be excluded from monitoring

string

Include Tables

table.include.list

The tables for which changes are to be captured

string

Exclude Tables

table.exclude.list

A comma-separated list of regular expressions that match the fully-qualified names of tables to be excluded from monitoring

string

Include Columns

column.include.list

Regular expressions matching columns to include in change events

string

Exclude Columns

column.exclude.list

Regular expressions matching columns to exclude from change events

string

Snapshot mode

snapshot.mode

The criteria for running a snapshot upon startup of the connector. Options include: 'when_needed' to specify that the connector run a snapshot upon startup whenever it deems it necessary; 'schema_only' to only take a snapshot of the schema (table structures) but no actual data; 'initial' (the default) to specify the connector can run a snapshot only when no offsets are available for the logical server name; 'initial_only' same as 'initial' except the connector should stop after completing the snapshot and before it would normally read the binlog; and 'never' to specify the connector should never run a snapshot and that upon first startup the connector should read from the beginning of the binlog. The 'never' mode should be used with care, and only when the binlog is known to contain all history.

string

initial

Decimal Handling

decimal.handling.mode

Specify how DECIMAL and NUMERIC columns should be represented in change events, including: 'precise' (the default) uses java.math.BigDecimal to represent values, which are encoded in the change events using a binary representation and Kafka Connect’s 'org.apache.kafka.connect.data.Decimal' type; 'string' uses string to represent values; 'double' represents values using Java’s 'double', which may not offer the precision but will be far easier to use in consumers.

string

precise

Columns PK mapping

message.key.columns

A semicolon-separated list of expressions that match fully-qualified tables and column(s) to be used as message key. Each expression must match the pattern '<fully-qualified table name>:<key columns>', where the table names could be defined as (DB_NAME.TABLE_NAME) or (SCHEMA_NAME.TABLE_NAME), depending on the specific connector, and the key columns are a comma-separated list of columns representing the custom key. For any table without an explicit key configuration the table’s primary key column(s) will be used as message key. Example: dbserver1.inventory.orderlines:orderId,orderLineId;dbserver1.inventory.orders:id

string

Query fetch size

query.fetch.size

The maximum number of records that should be loaded into memory while streaming. A value of '0' uses the default JDBC fetch size.

integer

0

Change event batch size

max.batch.size

Maximum size of each batch of source records. Defaults to 2048.

integer

2048

Change event buffer size

max.queue.size

Maximum size of the queue for change events read from the database log but not yet recorded or forwarded. Defaults to 8192, and should always be larger than the maximum batch size.

integer

8192

Kafka topic name

schema.history.internal.kafka.topic

The name of the topic for the database schema history.

string

Kafka bootstrap servers

schema.history.internal.kafka.bootstrap.servers

A list of host/port pairs that the connector will use for establishing the initial connection to the Kafka cluster for retrieving database schema history previously stored by the connector. This should point to the same Kafka cluster used by the Kafka Connect process.

string

Kafka Message Key Format

data_shape

The serialization format for the Kafka message key.

object

Kafka Message Value Format

data_shape

The serialization format for the Kafka message value.

object
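
Unlike the other connectors in this guide, the Debezium connectors use dotted property names. The following sketch pulls the MySQL properties above into one configuration; every value is a placeholder, and the data_shape structure for the key and value formats is an assumption.

```python
# Illustrative Debezium MySQL connector configuration (placeholder values).
debezium_mysql = {
    "topic.prefix": "dbserver1",               # unique prefix for all emitted topic names
    "database.server.id": 184054,              # unique numeric client ID in the MySQL cluster
    "database.hostname": "mysql.example.com",
    "database.port": 3306,                     # default MySQL port
    "database.user": "debezium",
    "database.password": "<password>",
    "database.include.list": "inventory",
    "table.include.list": "inventory.orders,inventory.customers",
    "snapshot.mode": "initial",                # default
    "decimal.handling.mode": "precise",        # default
    "max.batch.size": 2048,
    "max.queue.size": 8192,                    # should stay larger than max.batch.size
    "schema.history.internal.kafka.topic": "schema-changes.inventory",
    "schema.history.internal.kafka.bootstrap.servers": "my-kafka-bootstrap:9092",
    # Key and value serialization formats; the exact object structure is assumed.
    "data_shape": {
        "key": {"format": "application/json"},
        "value": {"format": "application/json"},
    },
}
```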

Debezium PostgreSQL Connector

Configuration properties

The following table describes the configuration properties for the Debezium PostgreSQL Connector Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Topic prefix*

topic.prefix

The topic prefix that identifies and provides a namespace for the particular database server or cluster from which the connector is capturing changes. The topic prefix should be unique across all other connectors because it is used as a prefix for all Kafka topic names that receive events emitted by this connector. Only alphanumeric characters, hyphens, dots, and underscores are accepted.

string

Hostname*

database.hostname

Resolvable hostname or IP address of the database server.

string

Port

database.port

Port of the database server.

integer

5432

User*

database.user

Name of the database user to be used when connecting to the database.

string

Password

database.password

Password of the database user to be used when connecting to the database.

string / object

Slot

slot.name

The name of the Postgres logical decoding slot created for streaming changes from a plugin. Defaults to 'debezium'.

string

debezium

Publication

publication.name

The name of the Postgres 10+ publication used for streaming changes from a plugin. Defaults to 'dbz_publication'

string

dbz_publication

Publication Auto Create Mode

publication.autocreate.mode

Applies only when streaming changes using pgoutput. Determines how creation of a publication should work; the default is all_tables. DISABLED - the connector does not attempt to create a publication at all. The expectation is that the user has created the publication up-front. If the publication is not found to exist upon startup, the connector throws an exception and stops. ALL_TABLES - if no publication exists, the connector creates a new publication for all tables. Note that this requires that the configured user has access. If the publication already exists, it is used, for example, CREATE PUBLICATION <publication_name> FOR ALL TABLES. FILTERED - if no publication exists, the connector creates a new publication for all tables matching the current filter configuration (see the table/database include/exclude list properties). If the publication already exists, it is used, for example, CREATE PUBLICATION <publication_name> FOR TABLE <tbl1, tbl2, etc>.

string

all_tables

Include Schemas

schema.include.list

The schemas for which events should be captured

string

Exclude Schemas

schema.exclude.list

The schemas for which events must not be captured

string

Include Tables

table.include.list

The tables for which changes are to be captured

string

Exclude Tables

table.exclude.list

A comma-separated list of regular expressions that match the fully-qualified names of tables to be excluded from monitoring

string

Include Columns

column.include.list

Regular expressions matching columns to include in change events

string

Exclude Columns

column.exclude.list

Regular expressions matching columns to exclude from change events

string

Snapshot mode

snapshot.mode

The criteria for running a snapshot upon startup of the connector. Options include: 'always' to specify that the connector run a snapshot each time it starts up; 'initial' (the default) to specify the connector can run a snapshot only when no offsets are available for the logical server name; 'initial_only' same as 'initial' except the connector should stop after completing the snapshot and before it would normally start emitting changes; 'never' to specify the connector should never run a snapshot and that upon first startup the connector should read from the last position (LSN) recorded by the server; 'exported' (deprecated; use 'initial' instead); and 'custom' to specify a custom class with 'snapshot.custom_class' which will be loaded and used to determine the snapshot, see docs for more details.

string

initial

Decimal Handling

decimal.handling.mode

Specify how DECIMAL and NUMERIC columns should be represented in change events, including: 'precise' (the default) uses java.math.BigDecimal to represent values, which are encoded in the change events using a binary representation and Kafka Connect’s 'org.apache.kafka.connect.data.Decimal' type; 'string' uses string to represent values; 'double' represents values using Java’s 'double', which may not offer the precision but will be far easier to use in consumers.

string

precise

Columns PK mapping

message.key.columns

A semicolon-separated list of expressions that match fully-qualified tables and column(s) to be used as message key. Each expression must match the pattern '<fully-qualified table name>:<key columns>', where the table names could be defined as (DB_NAME.TABLE_NAME) or (SCHEMA_NAME.TABLE_NAME), depending on the specific connector, and the key columns are a comma-separated list of columns representing the custom key. For any table without an explicit key configuration the table’s primary key column(s) will be used as message key. Example: dbserver1.inventory.orderlines:orderId,orderLineId;dbserver1.inventory.orders:id

string

Query fetch size

query.fetch.size

The maximum number of records that should be loaded into memory while streaming. A value of '0' uses the default JDBC fetch size.

integer

0

Change event batch size

max.batch.size

Maximum size of each batch of source records. Defaults to 2048.

integer

2048

Change event buffer size

max.queue.size

Maximum size of the queue for change events read from the database log but not yet recorded or forwarded. Defaults to 8192, and should always be larger than the maximum batch size.

integer

8192

Kafka topic name

schema.history.internal.kafka.topic

The name of the topic for the database schema history.

string

Kafka bootstrap servers

schema.history.internal.kafka.bootstrap.servers

A list of host/port pairs that the connector will use for establishing the initial connection to the Kafka cluster for retrieving database schema history previously stored by the connector. This should point to the same Kafka cluster used by the Kafka Connect process.

string

Kafka Message Key Format

data_shape

The serialization format for the Kafka message key.

object

Kafka Message Value Format

data_shape

The serialization format for the Kafka message value.

object

Debezium SQLServer Connector

Configuration properties

The following table describes the configuration properties for the Debezium SQLServer Connector Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Topic prefix*

topic.prefix

The topic prefix that identifies and provides a namespace for the particular database server or cluster from which the connector is capturing changes. The topic prefix should be unique across all other connectors because it is used as a prefix for all Kafka topic names that receive events emitted by this connector. Only alphanumeric characters, hyphens, dots, and underscores are accepted.

string

Hostname*

database.hostname

Resolvable hostname or IP address of the database server.

string

Port

database.port

Port of the database server.

integer

1433

User

database.user

Name of the database user to be used when connecting to the database.

string

Password

database.password

Password of the database user to be used when connecting to the database.

string / object

Databases

database.names

The names of the databases from which the connector should capture changes

string

Include Tables

table.include.list

The tables for which changes are to be captured

string

Exclude Tables

table.exclude.list

A comma-separated list of regular expressions that match the fully-qualified names of tables to be excluded from monitoring

string

Include Columns

column.include.list

Regular expressions matching columns to include in change events

string

Exclude Columns

column.exclude.list

Regular expressions matching columns to exclude from change events

string

Snapshot mode

snapshot.mode

The criteria for running a snapshot upon startup of the connector. Options include: 'initial' (the default) to specify the connector should run a snapshot only when no offsets are available for the logical server name; 'schema_only' to specify the connector should run a snapshot of the schema when no offsets are available for the logical server name.

string

initial

Decimal Handling

decimal.handling.mode

Specify how DECIMAL and NUMERIC columns should be represented in change events, including: 'precise' (the default) uses java.math.BigDecimal to represent values, which are encoded in the change events using a binary representation and Kafka Connect’s 'org.apache.kafka.connect.data.Decimal' type; 'string' uses string to represent values; 'double' represents values using Java’s 'double', which may not offer the precision but will be far easier to use in consumers.

string

precise

Columns PK mapping

message.key.columns

A semicolon-separated list of expressions that match fully-qualified tables and column(s) to be used as message key. Each expression must match the pattern '<fully-qualified table name>:<key columns>', where the table names could be defined as (DB_NAME.TABLE_NAME) or (SCHEMA_NAME.TABLE_NAME), depending on the specific connector, and the key columns are a comma-separated list of columns representing the custom key. For any table without an explicit key configuration the table’s primary key column(s) will be used as message key. Example: dbserver1.inventory.orderlines:orderId,orderLineId;dbserver1.inventory.orders:id

string

Query fetch size

query.fetch.size

The maximum number of records that should be loaded into memory while streaming. A value of '0' uses the default JDBC fetch size.

integer

0

Change event batch size

max.batch.size

Maximum size of each batch of source records. Defaults to 2048.

integer

2048

Change event buffer size

max.queue.size

Maximum size of the queue for change events read from the database log but not yet recorded or forwarded. Defaults to 8192, and should always be larger than the maximum batch size.

integer

8192

Kafka topic name

schema.history.internal.kafka.topic

The name of the topic for the database schema history.

string

Kafka bootstrap servers

schema.history.internal.kafka.bootstrap.servers

A list of host/port pairs that the connector will use for establishing the initial connection to the Kafka cluster for retrieving database schema history previously stored by the connector. This should point to the same Kafka cluster used by the Kafka Connect process.

string

Kafka Message Key Format

data_shape

The serialization format for the Kafka message key.

object

Kafka Message Value Format

data_shape

The serialization format for the Kafka message value.

object

Elasticsearch sink

Store JSON-formatted data into ElasticSearch.

Configuration properties

The following table describes the configuration properties for the Elasticsearch sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Username

elasticsearch_user

The username to connect to ElasticSearch.

string

Password

elasticsearch_password

The password to connect to ElasticSearch. / An opaque reference to the elasticsearch_password

string / object

Enable SSL

elasticsearch_enable_s_s_l

Specifies to connect by using SSL.

boolean

True

Host Addresses*

elasticsearch_host_addresses

A comma-separated list of remote transport addresses in ip:port format.

string

quickstart-es-http:9200

ElasticSearch Cluster Name*

elasticsearch_cluster_name

The name of the ElasticSearch cluster.

string

quickstart

Index in ElasticSearch

elasticsearch_index_name

The name of the ElasticSearch index.

string

data

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
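
A sketch of an Elasticsearch sink configuration built from the examples in the table; the index name, credentials, and data_shape structure are placeholders.

```python
# Illustrative Elasticsearch sink configuration (placeholder values).
elasticsearch_sink = {
    "elasticsearch_host_addresses": "quickstart-es-http:9200",  # ip:port, comma-separated
    "elasticsearch_cluster_name": "quickstart",
    "elasticsearch_index_name": "data",            # target index for incoming documents
    "elasticsearch_user": "elastic",               # only for a secured cluster
    "elasticsearch_password": "<password>",        # string, or an opaque secret reference
    "elasticsearch_enable_s_s_l": True,            # default True
    "kafka_topic": "documents",
    "data_shape": {"consumes": {"format": "application/json"}},  # assumed structure
}
```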

FTPS sink

Send data to an FTPS Server.

Configuration properties

The following table describes the configuration properties for the FTPS sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Connection Host*

ftps_connection_host

The hostname of the FTPS server.

string

Connection Port*

ftps_connection_port

The port of the FTPS server.

string

21

Username*

ftps_username

The username to access the FTPS server.

string

Password*

ftps_password

The password to access the FTPS server. / An opaque reference to the ftps_password

string / object

Directory Name*

ftps_directory_name

The starting directory.

string

Passive Mode

ftps_passive_mode

Set the passive mode connection.

boolean

False

File Existence

ftps_file_exist

Specifies how the Kamelet behaves if the file already exists. Possible values are Override, Append, Fail, or Ignore.

string

Override

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

FTPS source

Retrieve data from an FTPS Server.

Configuration properties

The following table describes the configuration properties for the FTPS source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Connection Host*

ftps_connection_host

The hostname of the FTPS server.

string

Connection Port*

ftps_connection_port

The port of the FTPS server.

string

21

Username*

ftps_username

The username to access the FTPS server.

string

Password*

ftps_password

The password to access the FTPS server. / An opaque reference to the ftps_password

string / object

Directory Name*

ftps_directory_name

The starting directory.

string

Passive Mode

ftps_passive_mode

Specifies to use passive mode connection.

boolean

False

Recursive

ftps_recursive

If a directory, look for files in all sub-directories as well.

boolean

False

Idempotency

ftps_idempotent

Skip already-processed files.

boolean

True

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

Google BigQuery sink

Send data to a Google BigQuery table.

Configuration properties

The following table describes the configuration properties for the Google BigQuery sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Google Cloud Project Id*

gcp_project_id

The Google Cloud Project ID.

string

Big Query Dataset Id*

gcp_dataset

The Big Query Dataset ID.

string

Big Query Table Id*

gcp_table

The Big Query Table ID.

string

Google Cloud Platform Credential File*

gcp_credentials_file_location

The credential for accessing Google Cloud Platform API services. This value must be a path to a service account key file.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Google Cloud Functions sink

Send data to Google Cloud Functions.

Configuration properties

The following table describes the configuration properties for the Google Cloud Functions sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Project Id*

gcp_project_id

The Google Cloud Functions Project ID.

string

Region*

gcp_region

The region where Google Cloud Functions has been deployed.

string

Function Name*

gcp_function_name

The Function name.

string

Service Account Key*

gcp_service_account_key

The path to the service account key file that provides credentials for the Google Cloud Functions platform. / An opaque reference to the gcp_service_account_key

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Google Cloud Pub/Sub sink

Send data to Google Cloud Pub/Sub.

Configuration properties

The following table describes the configuration properties for the Google Cloud Pub/Sub sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Project Id*

gcp_project_id

The Google Cloud Pub/Sub Project ID.

string

Destination Name*

gcp_destination_name

The destination name.

string

Service Account Key*

gcp_service_account_key

The service account key to use as credentials for the Pub/Sub publisher/subscriber. You must encode this value in base64. / An opaque reference to the gcp_service_account_key

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Google Cloud Pub/Sub source

Receive data from Google Cloud Pub/Sub.

Configuration properties

The following table describes the configuration properties for the Google Cloud Pub/Sub source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Project Id*

gcp_project_id

The Google Cloud Pub/Sub Project ID.

string

Subscription Name*

gcp_subscription_name

The subscription name.

string

Service Account Key*

gcp_service_account_key

The service account key to use as credentials for the Pub/Sub publisher/subscriber. You must encode this value in base64. / An opaque reference to the gcp_service_account_key

string / object

Synchronous Pull

gcp_synchronous_pull

Specifies to synchronously pull batches of messages.

boolean

False

Max Messages Per Poll

gcp_max_messages_per_poll

The maximum number of messages to receive from the server in a single API call.

integer

1

Concurrent Consumers

gcp_concurrent_consumers

The number of parallel streams to consume from the subscription.

integer

1

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
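
As an illustrative sketch only (Python, placeholder values that are assumptions, not values from this guide), the Google Cloud Pub/Sub source properties above could be grouped as follows; the data_shape property is omitted.

# Hypothetical Pub/Sub source configuration; keys are the property names above.
pubsub_source_config = {
    "gcp_project_id": "my-gcp-project",
    "gcp_subscription_name": "my-subscription",
    "gcp_service_account_key": "<base64-encoded service account key>",
    "gcp_synchronous_pull": False,    # default
    "gcp_max_messages_per_poll": 1,   # default
    "gcp_concurrent_consumers": 1,    # default
    "kafka_topic": "pubsub-events",
}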

Google Cloud Storage sink

Upload data to Google Cloud Storage.

Configuration properties

The following table describes the configuration properties for the Google Cloud Storage sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Bucket Name Or ARN*

gcp_bucket_name_or_arn

The Google Cloud Storage bucket name or Bucket Amazon Resource Name (ARN).

string

Service Account Key*

gcp_service_account_key

The service account key to use as credentials for Google Cloud Storage access. You must encode this value in base64. / An opaque reference to the gcp_service_account_key

string / object

Autocreate Bucket

gcp_auto_create_bucket

Specifies to automatically create the Google Cloud Storage bucket.

boolean

False

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Google Cloud Storage source

Retrieve data from Google Cloud Storage.

Configuration properties

The following table describes the configuration properties for the Google Cloud Storage source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Bucket Name Or ARN*

gcp_bucket_name_or_arn

The Google Cloud Storage bucket name or Bucket Amazon Resource Name (ARN).

string

Service Account Key*

gcp_service_account_key

The service account key to use as credentials for Google Cloud Storage access. You must encode this value in base64. / An opaque reference to the gcp_service_account_key

string / object

Auto-delete Objects

gcp_delete_after_read

Specifies to delete objects after consuming them.

boolean

True

Autocreate Bucket

gcp_auto_create_bucket

Specifies to automatically create the Google Cloud Storage bucket.

boolean

False

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
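
For orientation only, here is a minimal Python sketch of the Google Cloud Storage source properties above; every value is an assumed placeholder and the data_shape property is omitted.

# Hypothetical Google Cloud Storage source configuration.
gcs_source_config = {
    "gcp_bucket_name_or_arn": "my-bucket",
    "gcp_service_account_key": "<base64-encoded service account key>",
    "gcp_delete_after_read": True,    # default: delete objects after consuming
    "gcp_auto_create_bucket": False,  # default
    "kafka_topic": "storage-objects",
}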

HTTP sink

Send data to an HTTP endpoint.

Configuration properties

The following table describes the configuration properties for the HTTP sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

URL*

http_url

The URL to which you want to send data.

string

https://my-service/path

Method

http_method

The HTTP method to use.

string

POST

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
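
A minimal sketch, in Python, of how the HTTP sink properties above might be combined; the URL and topic names are placeholders, not values from this guide.

# Hypothetical HTTP sink configuration.
http_sink_config = {
    "http_url": "https://my-service/path",
    "http_method": "POST",            # default
    "kafka_topic": "orders,payments",
}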

Jira Add Comment sink

Add a new comment to an existing issue in Jira.

Configuration properties

The following table describes the configuration properties for the Jira Add Comment sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Jira URL*

jira_jira_url

The URL of your instance of Jira

string

http://my_jira.com:8081

Username*

jira_username

The username to access Jira

string

Password*

jira_password

The password or the API Token to access Jira / An opaque reference to the jira_password

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Jira Add Issue sink

Add a new issue to Jira.

Configuration properties

The following table describes the configuration properties for the Jira Add Issue sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Jira URL*

jira_jira_url

The URL of your instance of Jira

string

http://my_jira.com:8081

Username*

jira_username

The username to access Jira

string

Password*

jira_password

The password or the API Token to access Jira / An opaque reference to the jira_password

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Jira source

Receive notifications about new issues from Jira.

Configuration properties

The following table describes the configuration properties for the Jira source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Jira URL*

jira_jira_url

The URL of your instance of Jira.

string

http://my_jira.com:8081

Username*

jira_username

The username to access Jira.

string

Password*

jira_password

The password or the API Token to access Jira. / An opaque reference to the jira_password

string / object

JQL

jira_jql

A query to filter issues.

string

project=MyProject

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
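
For illustration only, the Jira source properties above could be grouped as in the following Python sketch; credentials and topic name are assumed placeholders, and the data_shape property is omitted.

# Hypothetical Jira source configuration.
jira_source_config = {
    "jira_jira_url": "http://my_jira.com:8081",
    "jira_username": "jira-user",
    "jira_password": "<password or API token>",
    "jira_jql": "project=MyProject",  # optional issue filter
    "kafka_topic": "jira-issues",
}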

JMS AMQP 1.0 sink

Send data to any AMQP 1.0 compliant message broker by using the Apache Qpid JMS client.

Configuration properties

The following table describes the configuration properties for the JMS AMQP 1.0 sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Destination Type

jms_amqp_destination_type

The JMS destination type (queue or topic).

string

queue

Destination Name*

jms_amqp_destination_name

The JMS destination name.

string

Broker URL*

jms_amqp_remote_u_r_i

The JMS URL.

string

amqp://my-host:31616

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
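
The following Python sketch is a hedged example of the JMS AMQP 1.0 sink properties above; all values are placeholders rather than defaults from this guide.

# Hypothetical JMS AMQP 1.0 sink configuration.
jms_amqp_sink_config = {
    "jms_amqp_destination_type": "queue",  # "queue" (default) or "topic"
    "jms_amqp_destination_name": "orders",
    "jms_amqp_remote_u_r_i": "amqp://my-host:31616",
    "kafka_topic": "orders,payments",
}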

JMS AMQP 1.0 source

Receive data from any AMQP 1.0 compliant message broker by using the Apache Qpid JMS client.

Configuration properties

The following table describes the configuration properties for the JMS AMQP 1.0 source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Destination Type

jms_amqp_destination_type

The JMS destination type (queue or topic).

string

queue

Destination Name*

jms_amqp_destination_name

The JMS destination name.

string

Broker URL*

jms_amqp_remote_u_r_i

The JMS URL.

string

amqp://my-host:31616

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

JMS Apache Artemis sink

Send data to an Apache Artemis message broker by using JMS.

Configuration properties

The following table describes the configuration properties for the JMS Apache Artemis sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Destination Type

jms_artemis_destination_type

The JMS destination type (queue or topic).

string

queue

Destination Name*

jms_artemis_destination_name

The JMS destination name.

string

person

Broker URL*

jms_artemis_broker_u_r_l

The JMS URL.

string

tcp://my-host:61616

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

JMS Apache Artemis source

Receive data from an Apache Artemis message broker by using JMS.

Configuration properties

The following table describes the configuration properties for the JMS Apache Artemis source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Destination Type

jms_artemis_destination_type

The JMS destination type (queue or topic).

string

queue

Destination Name*

jms_artemis_destination_name

The JMS destination name.

string

Broker URL*

jms_artemis_broker_u_r_l

The JMS URL.

string

tcp://my-host:61616

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

MariaDB sink

Send data to a MariaDB Database.

Configuration properties

The following table describes the configuration properties for the MariaDB sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

db_server_name

The server name for the data source.

string

localhost

Server Port

db_server_port

The server port for the data source.

string

3306

Username*

db_username

The username to access a secured MariaDB Database.

string

Password*

db_password

The password to access a secured MariaDB Database. / An opaque reference to the db_password

string / object

Query*

db_query

The query to execute against the MariaDB Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

db_database_name

The name of the MariaDB Database.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
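
As a sketch only (Python, with placeholder values that are assumptions rather than values from this guide), the MariaDB sink properties above might be assembled like this; the data_shape property is omitted.

# Hypothetical MariaDB sink configuration.
mariadb_sink_config = {
    "db_server_name": "localhost",
    "db_server_port": "3306",         # default
    "db_username": "db-user",
    "db_password": "<password>",
    "db_query": "INSERT INTO accounts (username,city) VALUES (:#username,:#city)",
    "db_database_name": "accounts_db",
    "kafka_topic": "accounts",
}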

MariaDB source

Query data from a MariaDB Database.

Configuration properties

The following table describes the configuration properties for the MariaDB source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

db_server_name

The server name for the data source.

string

localhost

Server Port

db_server_port

The server port for the data source.

string

3306

Username*

db_username

The username to access a secured MariaDB Database.

string

Password*

db_password

The password to access a secured MariaDB Database. / An opaque reference to the db_password

string / object

Query*

db_query

The query to execute against the MariaDB Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

db_database_name

The name of the MariaDB Database.

string

Consumed Query

db_consumed_query

A query to run on each consumed tuple.

string

DELETE FROM accounts where user_id = :#user_id

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

MinIO sink

Send data to MinIO.

Configuration properties

The following table describes the configuration properties for the MinIO sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Bucket Name*

minio_bucket_name

The Minio Bucket name.

string

Access Key*

minio_access_key

The access key obtained from MinIO. / An opaque reference to the minio_access_key

string / object

Secret Key*

minio_secret_key

The secret key obtained from MinIO. / An opaque reference to the minio_secret_key

string / object

Endpoint*

minio_endpoint

The MinIO Endpoint. You can specify a URL, domain name, IPv4 address, or IPv6 address.

string

http://localhost:9000

Autocreate Bucket

minio_auto_create_bucket

Specifies to automatically create the MinIO bucket.

boolean

False

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

MinIO source

Retrieve data from MinIO.

Configuration properties

The following table describes the configuration properties for the MinIO source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Bucket Name*

minio_bucket_name

The MinIO Bucket name.

string

Auto-delete Objects

minio_delete_after_read

Delete objects after consuming them.

boolean

True

Access Key*

minio_access_key

The access key obtained from MinIO. / An opaque reference to the minio_access_key

string / object

Secret Key*

minio_secret_key

The secret key obtained from MinIO. / An opaque reference to the minio_secret_key

string / object

Endpoint*

minio_endpoint

The MinIO Endpoint. You can specify a URL, domain name, IPv4 address, or IPv6 address.

string

http://localhost:9000

Autocreate Bucket

minio_auto_create_bucket

Specifies to automatically create the MinIO bucket.

boolean

False

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
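
For orientation only, the MinIO source properties above could be grouped as in the following Python sketch; keys, access keys, and topic names are assumed placeholders.

# Hypothetical MinIO source configuration.
minio_source_config = {
    "minio_bucket_name": "my-bucket",
    "minio_access_key": "<access key>",
    "minio_secret_key": "<secret key>",
    "minio_endpoint": "http://localhost:9000",
    "minio_delete_after_read": True,    # default
    "minio_auto_create_bucket": False,  # default
    "kafka_topic": "minio-objects",
}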

MongoDB sink

Send data to MongoDB.

Configuration properties

The following table describes the configuration properties for the MongoDB sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

MongoDB Hosts*

mongodb_hosts

A comma-separated list of MongoDB host addresses in host:port format.

string

MongoDB Collection*

mongodb_collection

Sets the name of the MongoDB collection to bind to this endpoint.

string

MongoDB Password

mongodb_password

User password for accessing MongoDB. / An opaque reference to the mongodb_password

string / object

MongoDB Username

mongodb_username

Username for accessing MongoDB.

string

MongoDB Database*

mongodb_database

Sets the name of the MongoDB database to target.

string

Write Concern

mongodb_write_concern

The level of acknowledgment requested from MongoDB for write operations. Possible values are ACKNOWLEDGED, W1, W2, W3, UNACKNOWLEDGED, JOURNALED, and MAJORITY.

string

Collection

mongodb_create_collection

Create the collection during initialization if it does not exist.

boolean

False

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

MongoDB source

Retrieve data from MongoDB.

Configuration properties

The following table describes the configuration properties for the MongoDB source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

MongoDB Hosts*

mongodb_hosts

A comma-separated list of MongoDB host addresses in host:port format.

string

MongoDB Collection*

mongodb_collection

The name of the MongoDB collection to bind to this endpoint.

string

MongoDB Password

mongodb_password

The user password for accessing MongoDB. / An opaque reference to the mongodb_password

string / object

MongoDB Username

mongodb_username

The username for accessing MongoDB. The username must be present in MongoDB's authentication database (authenticationDatabase). By default, the MongoDB authenticationDatabase is 'admin'.

string

MongoDB Database*

mongodb_database

The name of the MongoDB database.

string

MongoDB Persistent Tail Tracking

mongodb_persistent_tail_tracking

Specifies to enable persistent tail tracking, which is a mechanism for keeping track of the last consumed data across system restarts. The next time the system starts, the endpoint resumes the cursor from the point where it last stopped consuming data. This option works only on capped collections.

boolean

False

MongoDB Tail Track Increasing Field

mongodb_tail_track_increasing_field

The correlation field in the incoming data that increases monotonically and is used to position the tailing cursor each time it is regenerated.

string

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
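
The following Python sketch is an illustrative grouping of the MongoDB source properties above; all values are assumed placeholders, and the data_shape property is omitted.

# Hypothetical MongoDB source configuration.
mongodb_source_config = {
    "mongodb_hosts": "mongo-1:27017,mongo-2:27017",
    "mongodb_collection": "events",
    "mongodb_username": "mongo-user",            # optional
    "mongodb_password": "<password>",            # optional
    "mongodb_database": "app_db",
    "mongodb_persistent_tail_tracking": False,   # default; capped collections only
    "kafka_topic": "mongodb-events",
}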

MySQL sink

Send data to a MySQL Database.

Configuration properties

The following table describes the configuration properties for the MySQL sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

db_server_name

The server name for the data source.

string

localhost

Server Port

db_server_port

The server port for the data source.

string

3306

Username*

db_username

The username to access a secured MySQL Database.

string

Password*

db_password

The password to access a secured MySQL Database. / An opaque reference to the db_password

string / object

Query*

db_query

The query to execute against the MySQL Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

db_database_name

The name of the MySQL Database.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

MySQL source

Query data from a MySQL Database.

Configuration properties

The following table describes the configuration properties for the MySQL source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

db_server_name

The server name for the data source.

string

localhost

Server Port

db_server_port

The server port for the data source.

string

3306

Username*

db_username

The username to access a secured MySQL Database.

string

Password*

db_password

The password to access a secured MySQL Database. / An opaque reference to the db_password

string / object

Query*

db_query

The query to execute against the MySQL Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

db_database_name

The name of the MySQL Database.

string

Consumed Query

db_consumed_query

A query to run on each consumed tuple.

string

DELETE FROM accounts where user_id = :#user_id

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

PostgreSQL sink

Send data to a PostgreSQL Database.

Configuration properties

The following table describes the configuration properties for the PostgreSQL sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

db_server_name

The server name for the data source.

string

localhost

Server Port

db_server_port

The server port for the data source.

string

5432

Username*

db_username

The username to access a secured PostgreSQL Database.

string

Password*

db_password

The password to access a secured PostgreSQL Database. / An opaque reference to the db_password

string / object

Query*

db_query

The query to execute against the PostgreSQL Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

db_database_name

The name of the PostgreSQL Database.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

PostgreSQL source

Query data from a PostgreSQL Database.

Configuration properties

The following table describes the configuration properties for the PostgreSQL source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

db_server_name

The server name for the data source.

string

localhost

Server Port

db_server_port

The server port for the data source.

string

5432

Username*

db_username

The username to access a secured PostgreSQL Database.

string

Password*

db_password

The password to access a secured PostgreSQL Database. / An opaque reference to the db_password

string / object

Query*

db_query

The query to execute against the PostgreSQL Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

db_database_name

The name of the PostgreSQL Database.

string

Consumed Query

db_consumed_query

A query to run on each consumed tuple.

string

DELETE FROM accounts where user_id = :#user_id

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
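
As a hedged example only, the PostgreSQL source properties above could be assembled as in the following Python sketch; the query shown is a generic placeholder, not an example taken from this guide.

# Hypothetical PostgreSQL source configuration.
postgresql_source_config = {
    "db_server_name": "localhost",
    "db_server_port": "5432",          # default
    "db_username": "db-user",
    "db_password": "<password>",
    "db_query": "SELECT * FROM accounts",  # placeholder query
    "db_database_name": "accounts_db",
    "db_consumed_query": "DELETE FROM accounts where user_id = :#user_id",  # optional
    "kafka_topic": "accounts",
}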

Salesforce Create sink

Create an object in Salesforce.

Configuration properties

The following table describes the configuration properties for the Salesforce Create sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Object Name

salesforce_s_object_name

The type of the object.

string

Contact

Login URL

salesforce_login_url

The Salesforce instance login URL.

string

https://login.salesforce.com

Consumer Key*

salesforce_client_id

The Salesforce application consumer key.

string

Consumer Secret*

salesforce_client_secret

The Salesforce application consumer secret. / An opaque reference to the salesforce_client_secret

string / object

Username*

salesforce_user_name

The Salesforce username.

string

Password*

salesforce_password

The Salesforce user password. / An opaque reference to the salesforce_password

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Salesforce Delete sink

Delete an object in Salesforce.

Configuration properties

The following table describes the configuration properties for the Salesforce Delete sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Login URL

salesforce_login_url

The Salesforce instance login URL.

string

https://login.salesforce.com

Consumer Key*

salesforce_client_id

The Salesforce application consumer key.

string

Consumer Secret*

salesforce_client_secret

The Salesforce application consumer secret. / An opaque reference to the salesforce_client_secret

string / object

Username*

salesforce_user_name

The Salesforce username.

string

Password*

salesforce_password

The Salesforce user password. / An opaque reference to the salesforce_password

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

Salesforce Streaming source

Receive updates from Salesforce.

Configuration properties

The following table describes the configuration properties for the Salesforce Streaming source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

objectName*

salesforce_object_name

The Salesforce sObject name.

string

Login URL

salesforce_login_url

The Salesforce instance login URL

string

https://login.salesforce.com

Consumer Key*

salesforce_client_id

The Salesforce application consumer key

string

Consumer Secret*

salesforce_client_secret

The Salesforce application consumer secret / An opaque reference to the salesforce_client_secret

string / object

Username*

salesforce_user_name

The Salesforce username

string

Password*

salesforce_password

The Salesforce user password / An opaque reference to the salesforce_password

string / object

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
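
For orientation only, here is a Python sketch of the Salesforce Streaming source properties above; credentials and topic name are assumed placeholders, and the data_shape property is omitted.

# Hypothetical Salesforce Streaming source configuration.
salesforce_streaming_source_config = {
    "salesforce_object_name": "Contact",
    "salesforce_login_url": "https://login.salesforce.com",
    "salesforce_client_id": "<consumer key>",
    "salesforce_client_secret": "<consumer secret>",
    "salesforce_user_name": "user@example.com",
    "salesforce_password": "<password>",
    "kafka_topic": "salesforce-updates",
}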

Salesforce Update sink

Update an object in Salesforce.

Configuration properties

The following table describes the configuration properties for the Salesforce Update sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Object Name*

salesforce_s_object_name

The type of the Salesforce object. Required if using a key-value pair.

string

Contact

Object Id*

salesforce_s_object_id

The ID of the Salesforce object. Required if using a key-value pair.

string

Login URL

salesforce_login_url

The Salesforce instance login URL.

string

https://login.salesforce.com

Consumer Key*

salesforce_client_id

The Salesforce application consumer key.

string

Consumer Secret*

salesforce_client_secret

The Salesforce application consumer secret. / An opaque reference to the salesforce_client_secret

string / object

Username*

salesforce_user_name

The Salesforce username.

string

Password*

salesforce_password

The Salesforce user password. / An opaque reference to the salesforce_password

string / object

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

SFTP sink

Send data to an SFTP Server.

Configuration properties

The following table describes the configuration properties for the SFTP sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Connection Host*

sftp_connection_host

The hostname of the SFTP server.

string

Connection Port*

sftp_connection_port

The port of the SFTP server.

string

22

Username*

sftp_username

The username to access the SFTP server.

string

Password*

sftp_password

The password to access the SFTP server. / An opaque reference to the sftp_password

string / object

Directory Name*

sftp_directory_name

The starting directory.

string

Passive Mode

sftp_passive_mode

Specifies to use passive mode connection.

boolean

False

File Existence

sftp_file_exist

How to behave if a file with the same name already exists. Possible values are Override, Append, Fail, or Ignore.

string

Override

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object

SFTP source

Retrieve data from an SFTP Server.

Configuration properties

The following table describes the configuration properties for the SFTP source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Connection Host*

sftp_connection_host

The hostname of the SFTP server.

string

Connection Port*

sftp_connection_port

The port of the SFTP server.

string

22

Username*

sftp_username

The username to access the SFTP server.

string

Password*

sftp_password

The password to access the SFTP server. / An opaque reference to the sftp_password

string / object

Directory Name*

sftp_directory_name

The starting directory.

string

Passive Mode

sftp_passive_mode

Sets the passive mode connection.

boolean

False

Recursive

sftp_recursive

If the path is a directory, look for files in all subdirectories as well.

boolean

False

Idempotency

sftp_idempotent

Skip already-processed files.

boolean

True

Ignore File Not Found Or Permission Error

sftp_ignore_file_not_found_or_permission_error

Whether to ignore a file or directory that does not exist, or that cannot be accessed due to a permission error, when listing directories or downloading files. By default, an exception is thrown when a directory or file does not exist or permissions are insufficient. Set this option to true to ignore such errors instead.

boolean

False

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object
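
The following Python sketch groups the SFTP source properties above for illustration; host, credentials, and topic name are assumed placeholders.

# Hypothetical SFTP source configuration.
sftp_source_config = {
    "sftp_connection_host": "sftp.example.com",
    "sftp_connection_port": "22",      # default
    "sftp_username": "sftp-user",
    "sftp_password": "<password>",
    "sftp_directory_name": "incoming",
    "sftp_recursive": False,           # default
    "sftp_idempotent": True,           # default: skip already-processed files
    "kafka_topic": "sftp-files",
}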

Slack sink

Send messages to a Slack channel.

Configuration properties

The following table describes the configuration properties for the Slack sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Channel*

slack_channel

The Slack channel to send messages to.

string

#myroom

Webhook URL*

slack_webhook_url

The webhook URL used by the Slack channel to handle incoming messages. / An opaque reference to the slack_webhook_url

string / object

Icon Emoji

slack_icon_emoji

Use a Slack emoji as an avatar.

string

Icon URL

slack_icon_url

The avatar to use when sending a message to a channel or user.

string

Username

slack_username

The username for the bot when it sends messages to a channel or user.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
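
As an orientation aid only, the Slack sink properties above might be combined as in the following Python sketch; the webhook URL, emoji, and topic names are assumed placeholders.

# Hypothetical Slack sink configuration.
slack_sink_config = {
    "slack_channel": "#myroom",
    "slack_webhook_url": "<incoming webhook URL>",
    "slack_icon_emoji": ":robot_face:",  # optional
    "slack_username": "connector-bot",   # optional
    "kafka_topic": "alerts",
}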

Slack source

Receive messages from a Slack channel.

Configuration properties

The following table describes the configuration properties for the Slack source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Channel*

slack_channel

The Slack channel to receive messages from.

string

#myroom

Token*

slack_token

The Bot User OAuth Access Token to access Slack. A Slack app that has the following permissions is required: channels:history, groups:history, im:history, mpim:history, channels:read, groups:read, im:read, and mpim:read. / An opaque reference to the slack_token

string / object

Delay

slack_delay

The delay between polls.

string

1s

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

Splunk sink

Send data to Splunk.

Configuration properties

The following table describes the configuration properties for the Splunk sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Splunk Server Address*

splunk_server_hostname

The address of your Splunk server.

string

my_server_splunk.com

Splunk Server Port

splunk_server_port

The port of your Splunk server.

integer

8089

Username*

splunk_username

The username to authenticate to Splunk Server.

string

Password*

splunk_password

The password to authenticate to Splunk Server. / An opaque reference to the splunk_password

string / object

Index

splunk_index

Splunk index to write to.

string

Protocol

splunk_protocol

Connection Protocol to Splunk server.

string

https

Source

splunk_source

The source named field of the data.

string

Source Type

splunk_source_type

The source type named field of the data.

string

Splunk App

splunk_app

The app name in Splunk.

string

Connection Timeout

splunk_connection_timeout

The timeout, in milliseconds, when connecting to the Splunk server.

integer

5000

Mode

splunk_mode

The mode to publish events to Splunk.

string

stream

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
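
For illustration only, the Splunk sink properties above could be grouped as in the following Python sketch; every value is an assumed placeholder rather than a default from this guide, and the data_shape property is omitted.

# Hypothetical Splunk sink configuration.
splunk_sink_config = {
    "splunk_server_hostname": "my_server_splunk.com",
    "splunk_server_port": 8089,        # default
    "splunk_username": "splunk-user",
    "splunk_password": "<password>",
    "splunk_index": "main",            # optional
    "splunk_protocol": "https",        # default
    "splunk_mode": "stream",           # default
    "kafka_topic": "audit-events",
}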

SQL Server sink

Send data to a SQL Server Database.

Configuration properties

The following table describes the configuration properties for the SQL Server sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

db_server_name

The server name for the data source.

string

localhost

Server Port

db_server_port

The server port for the data source.

string

1433

Username*

db_username

The username to access a secured SQL Server Database.

string

Password*

db_password

The password to access a secured SQL Server Database. / An opaque reference to the db_password

string / object

Query*

db_query

The query to execute against the SQL Server Database.

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

db_database_name

The name of the SQL Server Database.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
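
A minimal Python sketch of the SQL Server sink properties above, with placeholder values that are assumptions rather than values from this guide; the data_shape property is omitted.

# Hypothetical SQL Server sink configuration.
sqlserver_sink_config = {
    "db_server_name": "localhost",
    "db_server_port": "1433",          # default
    "db_username": "db-user",
    "db_password": "<password>",
    "db_query": "INSERT INTO accounts (username,city) VALUES (:#username,:#city)",
    "db_database_name": "accounts_db",
    "kafka_topic": "accounts",
}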

SQL Server source

Query data from a SQL Server Database.

Configuration properties

The following table describes the configuration properties for the SQL Server source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Server Name*

db_server_name

The server name for the data source.

string

localhost

Server Port

db_server_port

The server port for the data source.

string

1433

Username*

db_username

The username to access a secured SQL Server Database

string

Password*

db_password

The password to access a secured SQL Server Database / An opaque reference to the db_password

string / object

Query*

db_query

The query to execute against the SQL Server Database

string

INSERT INTO accounts (username,city) VALUES (:#username,:#city)

Database Name*

db_database_name

The name of the SQL Server Database.

string

Consumed Query

db_consumed_query

A query to run on each consumed tuple.

string

DELETE FROM accounts where user_id = :#user_id

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object

Telegram sink

Send a message to a Telegram chat using your Telegram bot as sender. To create a bot, use your Telegram app to contact the @botfather account.

Configuration properties

The following table describes the configuration properties for the Telegram sink Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Token*

telegram_authorization_token

The token to access your bot on Telegram. You can obtain it from the Telegram @botfather. / An opaque reference to the telegram_authorization_token

string / object

Chat ID

telegram_chat_id

The Chat ID to where you want to send messages by default.

string

Topic Names*

kafka_topic

A comma-separated list of Kafka topic names.

string

Data Shape

data_shape

The format of the data that Kafka sends to the sink connector.

object
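
For orientation only, the Telegram sink properties above might be assembled as in the following Python sketch; the token and chat ID are assumed placeholders.

# Hypothetical Telegram sink configuration.
telegram_sink_config = {
    "telegram_authorization_token": "<bot token from @botfather>",
    "telegram_chat_id": "123456789",   # optional default chat
    "kafka_topic": "notifications",
}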

Telegram source

Receive all messages that people send to your Telegram bot.

Configuration properties

The following table describes the configuration properties for the Telegram source Connector.

Fields marked with an asterisk (*) are mandatory.
Name Property Description Type Default Example

Token*

telegram_authorization_token

The token to access your bot on Telegram. You can obtain it from the Telegram @botfather. / An opaque reference to the telegram_authorization_token

string / object

Topic Name*

kafka_topic

The name of the Kafka Topic to use.

string

Data Shape

data_shape

The format of the data that the source connector sends to Kafka.

object