With Red Hat OpenShift Connectors, you can create and configure connections between Red Hat OpenShift Streams for Apache Kafka and third-party systems.
Amazon CloudWatch Metrics sink
Send data to Amazon CloudWatch metrics.
Configuration properties
The following table describes the configuration properties for the Amazon CloudWatch Metrics sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Cloud Watch Namespace* | | The CloudWatch metric namespace. | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
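For orientation, the following is a minimal sketch of how values from this table might be assembled into a connector configuration. The property keys shown here (for example, cloudwatch_namespace and aws_access_key) are illustrative placeholders, not the exact property names, which are listed in the Property column of the full product reference.

```python
# Hypothetical sketch of an Amazon CloudWatch Metrics sink configuration.
# The keys below are placeholders; substitute the actual property names
# from the Property column when creating the connector.
cloudwatch_sink_config = {
    "cloudwatch_namespace": "my-app-metrics",   # Cloud Watch Namespace (required)
    "aws_access_key": "AKIA...",                # or an opaque reference object
    "aws_secret_key": "...",                    # or an opaque reference object
    "aws_region": "us-east-1",                  # AWS Region (required)
    "kafka_topic": "metrics-in",                # Topic Names: comma-separated list
    "data_shape": {"consumes": {"format": "application/json"}},  # assumed format value
}
```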
Amazon DynamoDB sink
Send data to Amazon DynamoDB NoSQL database service. The sent data inserts, updates, or deletes an item in the specified Amazon DynamoDB table.
Configuration properties
The following table describes the configuration properties for the Amazon DynamoDB sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Table* | | The name of the DynamoDB table. | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Operation | | The operation to perform. The options are PutItem, UpdateItem, or DeleteItem. | string | PutItem | PutItem |
Write Capacity | | The provisioned throughput to reserve for writing resources to your table. | integer | 1 | |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Amazon DynamoDB Stream source
Receive data from Amazon DynamoDB Streams.
Configuration properties
The following table describes the configuration properties for the Amazon DynamoDB Stream source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Table* | | The name of the DynamoDB table. | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Stream Iterator Type | | Defines where in the DynamoDB stream to start getting records. The value can be FROM_LATEST or FROM_START. Note that using FROM_START can cause a significant delay before the stream has caught up to real-time. | string | FROM_LATEST | |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Delay | | The number of milliseconds before the next poll from the database. | integer | 500 | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Amazon Kinesis sink
Send data to Amazon Kinesis.
Configuration properties
The following table describes the configuration properties for the Amazon Kinesis sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Stream Name* | | The Kinesis stream that you want to access. The Kinesis stream that you specify must already exist. | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
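Because the sink requires the Kinesis stream to exist before the connector starts, you typically create the stream ahead of time. A minimal sketch using the AWS SDK for Python (boto3), assuming your AWS credentials are already configured locally; the stream name, region, and shard count are example values:

```python
import boto3

# Create the stream that the Kinesis sink connector will write to.
kinesis = boto3.client("kinesis", region_name="us-east-1")
kinesis.create_stream(StreamName="my-connector-stream", ShardCount=1)

# Wait until the stream becomes ACTIVE before starting the connector.
waiter = kinesis.get_waiter("stream_exists")
waiter.wait(StreamName="my-connector-stream")
```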
Amazon Kinesis source
Receive data from Amazon Kinesis.
Configuration properties
The following table describes the configuration properties for the Amazon Kinesis source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Stream Name* | | The Kinesis stream that you want to access. The Kinesis stream that you specify must already exist. | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | eu-west-1 |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Delay | | The number of milliseconds before the next poll of the selected stream. | integer | 500 | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Amazon Lambda sink
Send data to an Amazon Lambda function.
Configuration properties
The following table describes the configuration properties for the Amazon Lambda sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Function Name* | | The Lambda Function name. | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Amazon Redshift sink
Send data to an Amazon Redshift Database.
Configuration properties
The following table describes the configuration properties for the Amazon Redshift sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* | | The server name for the data source. | string | | localhost |
Server Port | | The server port for the AWS Redshift data source. | string | 5439 | |
Username* | | The username to access a secured AWS Redshift Database. | string | | |
Password* | | The password to access a secured AWS Redshift Database. / An opaque reference to the sql_password | string / object | | |
Query* | | The query to execute against the AWS Redshift Database. | string | | INSERT INTO accounts (username,city) VALUES (:#username,:#city) |
Database Name* | | The name of the AWS Redshift Database. | string | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
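The :#username and :#city placeholders in the example query are filled from fields of the incoming Kafka record. As a rough illustration of that mapping (not the connector's internal implementation), a JSON record and the configured query pair up as follows:

```python
import json

# An incoming Kafka record value (JSON) and the configured query.
record_value = json.loads('{"username": "alice", "city": "Berlin"}')
query = "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"

# Conceptually, each :#name placeholder is bound to the record field of the
# same name, so this record produces:
#   INSERT INTO accounts (username,city) VALUES ('alice', 'Berlin')
params = {name: record_value[name] for name in ("username", "city")}
print(params)  # {'username': 'alice', 'city': 'Berlin'}
```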
Amazon Redshift source
Query data from an Amazon Redshift Database.
Configuration properties
The following table describes the configuration properties for the Amazon Redshift source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* | | The server name for the data source. | string | | localhost |
Server Port | | The server port for the data source. | string | 5439 | |
Username* | | The username to access a secured AWS Redshift Database. | string | | |
Password* | | The password to access a secured AWS Redshift Database. / An opaque reference to the sql_password | string / object | | |
Query* | | The query to execute against the AWS Redshift Database. | string | | INSERT INTO accounts (username,city) VALUES (:#username,:#city) |
Database Name* | | The name of the AWS Redshift Database. | string | | |
Consumed Query | | A query to run on each consumed tuple. | string | | DELETE FROM accounts where user_id = :#user_id |
Delay | | The number of milliseconds before the next poll from the AWS Redshift database. | integer | 500 | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
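The source connector polls the database with the configured Query and, when a Consumed Query is set, runs that second statement once per row it has consumed (for example, to delete or flag processed rows). The following is a schematic sketch of that poll cycle, assuming a generic DB-API connection that returns dict-like rows; publish_to_kafka is a hypothetical helper, and this is an illustration of the behavior rather than the connector's code:

```python
def poll_once(conn, query, consumed_query, publish_to_kafka):
    """Illustrative poll cycle: fetch rows, emit them, then mark each as consumed."""
    with conn.cursor() as cur:
        cur.execute(query)                      # e.g. a SELECT over the source table
        rows = cur.fetchall()
        for row in rows:
            publish_to_kafka(row)               # send the row to the configured Kafka topic
            if consumed_query:
                # e.g. DELETE FROM accounts WHERE user_id = %(user_id)s
                cur.execute(consumed_query, dict(row))
    conn.commit()
    return rows
```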
Amazon S3 sink
Send data to an Amazon S3 bucket.
Configuration properties
The following table describes the configuration properties for the Amazon S3 sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Bucket Name* | | The S3 Bucket name or Amazon Resource Name (ARN). | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Autocreate Bucket | | Specifies to automatically create the S3 bucket. | boolean | False | |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Key Name | | The key name for saving an element in the bucket. | string | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Amazon S3 source
Receive data from an Amazon S3 bucket.
Configuration properties
The following table describes the configuration properties for the Amazon S3 source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Bucket Name* | | The S3 Bucket name or Amazon Resource Name (ARN). | string | | |
Auto-delete Objects | | Specifies to delete objects after consuming them. | boolean | True | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Autocreate Bucket | | Specifies to automatically create the S3 bucket. | boolean | False | |
Include Body | | If true, the exchange is consumed and put into the body and closed. If false, the S3Object stream is put raw into the body and the headers are set with the S3 object metadata. | boolean | True | |
Prefix | | The AWS S3 bucket prefix to consider while searching. | string | | folder/ |
Ignore Body | | If true, the S3 Object body is ignored. Setting this to true overrides any behavior defined by the Include Body option. | boolean | False | |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Delay | | The number of milliseconds before the next poll of the selected bucket. | integer | 500 | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Amazon Simple Email Service sink
Send email through the Amazon Simple Email Service (SES).
Configuration properties
The following table describes the configuration properties for the Amazon Simple Email Service sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
From* | | The From email address. | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Amazon Simple Notification Service sink
Send data to an Amazon Simple Notification Service (SNS) topic.
Configuration properties
The following table describes the configuration properties for the Amazon Simple Notification Service sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Topic Name* | | The SNS topic name or Amazon Resource Name (ARN). | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Autocreate Topic | | Specifies to automatically create the SNS topic. | boolean | False | |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Amazon Simple Queue Service sink
Send data to Amazon Simple Queue Service (SQS).
Configuration properties
The following table describes the configuration properties for the Amazon Simple Queue Service sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Queue Name* | | The SQS Queue name or Amazon Resource Name (ARN). | string | | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Autocreate Queue | | Automatically create the SQS queue. | boolean | False | |
AWS Host | | The hostname of the Amazon AWS cloud. | string | amazonaws.com | |
Protocol | | The underlying protocol used to communicate with SQS. | string | https | http or https |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Amazon Simple Queue Service source
Receive data from Amazon Simple Queue Service (SQS).
Configuration properties
The following table describes the configuration properties for the Amazon Simple Queue Service source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Queue Name* | | The SQS Queue name or Amazon Resource Name (ARN). | string | | |
Auto-delete Messages | | Delete messages after consuming them. | boolean | True | |
Access Key* | | The access key obtained from AWS. / An opaque reference to the aws_access_key | string / object | | |
Secret Key* | | The secret key obtained from AWS. / An opaque reference to the aws_secret_key | string / object | | |
AWS Region* | | The AWS region to access. | string | | |
Autocreate Queue | | Specifies to automatically create the SQS queue. | boolean | False | |
AWS Host | | The hostname of the Amazon AWS cloud. | string | amazonaws.com | |
Protocol | | The underlying protocol used to communicate with SQS. | string | https | http or https |
Queue URL | | The full SQS Queue URL (required if using KEDA). | string | | |
Overwrite Endpoint URI | | The overriding endpoint URI. To use this option, you must also select the Endpoint Overwrite option. | string | | |
Endpoint Overwrite | | Select this option to override the endpoint URI. To use this option, you must also provide a URI for the Overwrite Endpoint URI option. | boolean | False | |
Delay | | The number of milliseconds before the next poll of the selected queue. | integer | 500 | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Azure Event Hubs sink
Send data to Azure Event Hubs.
Configuration properties
The following table describes the configuration properties for the Azure Event Hubs sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Eventhubs Namespace* | | The Event Hubs namespace. | string | | |
Eventhubs Name* | | The Event Hub name. | string | | |
Share Access Name* | | The Event Hubs SAS key name. | string | | |
Share Access Key* | | The key for the Event Hubs SAS key name. / An opaque reference to the azure_shared_access_key | string / object | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Azure Event Hubs source
Receive data from Azure Event Hubs.
Configuration properties
The following table describes the configuration properties for the Azure Event Hubs source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Eventhubs Namespace* | | The Event Hubs namespace. | string | | |
Eventhubs Name* | | The Event Hub name. | string | | |
Share Access Name* | | The Event Hubs SAS key name. | string | | |
Share Access Key* | | The key for the Event Hubs SAS key name. / An opaque reference to the azure_shared_access_key | string / object | | |
Azure Storage Blob Account Name* | | The name of the Storage Blob account. | string | | |
Azure Storage Blob Container Name* | | The name of the Storage Blob container. | string | | |
Azure Storage Blob Access Key* | | The key for the Azure Storage Blob service that is associated with the Storage Blob account name. / An opaque reference to the azure_blob_access_key | string / object | | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Azure Functions sink
Send data to Azure Functions.
Configuration properties
The following table describes the configuration properties for the Azure Functions sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
URL* | | The Azure Functions URL you want to send the data to. | string | | https://azure-function-demo-12234.azurewebsites.net/api/httpexample |
Method | | The HTTP method to use. | string | POST | |
Key | | The function-specific API key (required if the authLevel of the function is FUNCTION) or the master key (required if the authLevel is ADMIN). / An opaque reference to the azure_key | string / object | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Azure Blob Storage sink
Send data to Azure Blob storage.
Configuration properties
The following table describes the configuration properties for the Azure Blob Storage sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Account Name* | | The Azure Storage Blob account name. | string | | |
Container Name* | | The Azure Storage Blob container name. | string | | |
Access Key* | | The Azure Storage Blob access key. / An opaque reference to the azure_access_key | string / object | | |
Operation name | | The operation to perform. | string | uploadBlockBlob | |
Credential Type | | Determines the credential strategy to adopt. Possible values are SHARED_ACCOUNT_KEY, SHARED_KEY_CREDENTIAL, and AZURE_IDENTITY. | string | SHARED_ACCOUNT_KEY | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Azure Blob Storage source
Receive data from Azure Blob storage.
Configuration properties
The following table describes the configuration properties for the Azure Blob Storage source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Period between Polls* | | The interval (in milliseconds) between fetches to the Azure Storage Container. | integer | 10000 | |
Account Name* | | The Azure Storage Blob account name. | string | | |
Container Name* | | The Azure Storage Blob container name. | string | | |
Access Key* | | The Azure Storage Blob access key. / An opaque reference to the azure_access_key | string / object | | |
Credential Type | | Determines the credential strategy to adopt. Possible values are SHARED_ACCOUNT_KEY, SHARED_KEY_CREDENTIAL, and AZURE_IDENTITY. | string | SHARED_ACCOUNT_KEY | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Azure Queue Storage sink
Send data to Azure Queue Storage.
Configuration properties
The following table describes the configuration properties for the Azure Queue Storage sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Account Name* | | The Azure Storage Queue account name. | string | | |
Queue Name* | | The Azure Storage Queue name. | string | | |
Access Key* | | The Azure Storage Queue access key. / An opaque reference to the azure_access_key | string / object | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Azure Queue Storage source
Receive data from Azure Queue Storage.
Configuration properties
The following table describes the configuration properties for the Azure Queue Storage source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Account Name* | | The Azure Storage Queue account name. | string | | |
Queue Name* | | The Azure Storage Queue name. | string | | |
Access Key* | | The Azure Storage Queue access key. / An opaque reference to the azure_access_key | string / object | | |
Maximum Messages | | The maximum number of messages to get. You can specify a value between 1 and 32. The default is 1 (one message). If there are fewer than the maximum number of messages in the queue, then all the messages are returned. | integer | 1 | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Apache Cassandra sink
Send data to an Apache Cassandra cluster.
Configuration properties
The following table describes the configuration properties for the Apache Cassandra sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Connection Host* | | The hostname(s) for the Cassandra server(s). Use a comma to separate multiple hostnames. | string | | localhost |
Connection Port* | | The port number(s) of the Cassandra server(s). Use a comma to separate multiple port numbers. | string | | 9042 |
Keyspace* | | The keyspace to use. | string | | customers |
Username | | The username for accessing a secured Cassandra cluster. | string | | |
Password | | The password for accessing a secured Cassandra cluster. / An opaque reference to the cassandra_password | string / object | | |
Consistency Level | | The consistency level to use. Set the value to one of these options - ANY, ONE, TWO, THREE, QUORUM, ALL, LOCAL_QUORUM, EACH_QUORUM, SERIAL, LOCAL_SERIAL, or LOCAL_ONE. | string | ANY | |
Prepare Statements | | If true, specifies to use PreparedStatements as the query. If false, specifies to use regular Statements as the query. | boolean | True | |
Query* | | The query to execute against the Cassandra cluster table. | string | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Apache Cassandra source
Retrieve data by sending a query to an Apache Cassandra cluster table.
Configuration properties
The following table describes the configuration properties for the Apache Cassandra source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Connection Host* | | The hostname(s) for the Cassandra server(s). Use a comma to separate multiple hostnames. | string | | localhost |
Connection Port* | | The port number(s) of the Cassandra server(s). Use a comma to separate multiple port numbers. | string | | 9042 |
Keyspace* | | The keyspace to use. | string | | customers |
Username | | The username for accessing a secured Cassandra cluster. | string | | |
Password | | The password for accessing a secured Cassandra cluster. / An opaque reference to the cassandra_password | string / object | | |
Result Strategy | | The strategy to convert the result set of the query. Possible values are ALL, ONE, LIMIT_10, or LIMIT_100. | string | ALL | |
Consistency Level | | The consistency level to use. Possible values are ANY, ONE, TWO, THREE, QUORUM, ALL, LOCAL_QUORUM, EACH_QUORUM, SERIAL, LOCAL_SERIAL, or LOCAL_ONE. | string | QUORUM | |
Query* | | The query to execute against the Cassandra cluster table. | string | | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Data Generator source
A data generator (for development and testing purposes).
Configuration properties
The following table describes the configuration properties for the Data Generator source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Period | | The interval (in milliseconds) to wait between producing the next message. | integer | 1000 | |
Message* | | The message to generate. | string | | hello world |
Content Type | | The content type of the generated message. | string | text/plain | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Debezium MongoDB Connector
Configuration properties
The following table describes the configuration properties for the Debezium MongoDB Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Topic prefix* | | Topic prefix that identifies and provides a namespace for the particular database server/cluster that is capturing changes. The topic prefix should be unique across all other connectors, since it is used as a prefix for all Kafka topic names that receive events emitted by this connector. Use only alphanumeric characters, hyphens, dots, and underscores. | string | | |
Hosts | | The hostname and port pairs (in the form 'host' or 'host:port') of the MongoDB server(s) in the replica set. | string | | |
User | | Database user for connecting to MongoDB, if necessary. | string | | |
Password | | Password to be used when connecting to MongoDB, if necessary. | string / object | | |
Enable SSL connection to MongoDB | | Specifies whether the connector uses SSL to connect to MongoDB instances. | boolean | False | |
Credentials Database | | Database containing user credentials. | string | admin | |
Include Databases | | A comma-separated list of regular expressions that match the database names for which changes are to be captured. | string | | |
Exclude Databases | | A comma-separated list of regular expressions that match the database names for which changes are to be excluded. | string | | |
Include Collections | | A comma-separated list of regular expressions that match the collection names for which changes are to be captured. | string | | |
Exclude Collections | | A comma-separated list of regular expressions that match the collection names for which changes are to be excluded. | string | | |
Exclude Fields | | A comma-separated list of the fully-qualified names of fields that should be excluded from change event message values. | string | | |
Snapshot mode | | The criteria for running a snapshot upon startup of the connector. Options include: 'initial' (the default) to specify the connector should always perform an initial sync when required; 'never' to specify the connector should never perform an initial sync. | string | initial | |
Query fetch size | | The maximum number of records that should be loaded into memory while streaming. A value of '0' uses the default JDBC fetch size. | integer | 0 | |
Change event batch size | | Maximum size of each batch of source records. Defaults to 2048. | integer | 2048 | |
Change event buffer size | | Maximum size of the queue for change events read from the database log but not yet recorded or forwarded. Defaults to 8192, and should always be larger than the maximum batch size. | integer | 8192 | |
Kafka topic name | | The name of the topic for the database schema history. | string | | |
Kafka bootstrap servers | | A list of host/port pairs that the connector will use for establishing the initial connection to the Kafka cluster for retrieving database schema history previously stored by the connector. This should point to the same Kafka cluster used by the Kafka Connect process. | string | | |
Kafka Message Key Format | | The serialization format for the Kafka message key. | object | | |
Kafka Message Value Format | | The serialization format for the Kafka message value. | object | | |
Debezium MySQL Connector
Configuration properties
The following table describes the configuration properties for the Debezium MySQL Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Topic prefix* | | Topic prefix that identifies and provides a namespace for the particular database server/cluster that is capturing changes. The topic prefix should be unique across all other connectors, since it is used as a prefix for all Kafka topic names that receive events emitted by this connector. Use only alphanumeric characters, hyphens, dots, and underscores. | string | | |
Cluster ID* | | A numeric ID of this database client, which must be unique across all currently-running database processes in the cluster. This connector joins the MySQL database cluster as another server (with this unique ID) so it can read the binlog. | integer | | |
Hostname* | | Resolvable hostname or IP address of the database server. | string | | |
Port | | Port of the database server. | integer | 3306 | |
User* | | Name of the database user to be used when connecting to the database. | string | | |
Password | | Password of the database user to be used when connecting to the database. | string / object | | |
Include Databases | | The databases for which changes are to be captured. | string | | |
Exclude Databases | | A comma-separated list of regular expressions that match database names to be excluded from monitoring. | string | | |
Include Tables | | The tables for which changes are to be captured. | string | | |
Exclude Tables | | A comma-separated list of regular expressions that match the fully-qualified names of tables to be excluded from monitoring. | string | | |
Include Columns | | Regular expressions matching columns to include in change events. | string | | |
Exclude Columns | | Regular expressions matching columns to exclude from change events. | string | | |
Snapshot mode | | The criteria for running a snapshot upon startup of the connector. Options include: 'when_needed' to specify that the connector run a snapshot upon startup whenever it deems it necessary; 'schema_only' to only take a snapshot of the schema (table structures) but no actual data; 'initial' (the default) to specify the connector can run a snapshot only when no offsets are available for the logical server name; 'initial_only' same as 'initial' except the connector should stop after completing the snapshot and before it would normally read the binlog; and 'never' to specify the connector should never run a snapshot and that upon first startup the connector should read from the beginning of the binlog. The 'never' mode should be used with care, and only when the binlog is known to contain all history. | string | initial | |
Decimal Handling | | Specify how DECIMAL and NUMERIC columns should be represented in change events, including: 'precise' (the default) uses java.math.BigDecimal to represent values, which are encoded in the change events using a binary representation and Kafka Connect's 'org.apache.kafka.connect.data.Decimal' type; 'string' uses string to represent values; 'double' represents values using Java's 'double', which may not offer the precision but will be far easier to use in consumers. | string | precise | |
Columns PK mapping | | A semicolon-separated list of expressions that match fully-qualified tables and column(s) to be used as message key. Each expression must match the pattern '<fully-qualified table name>:<key columns>', where the table names could be defined as (DB_NAME.TABLE_NAME) or (SCHEMA_NAME.TABLE_NAME), depending on the specific connector, and the key columns are a comma-separated list of columns representing the custom key. For any table without an explicit key configuration, the table's primary key column(s) will be used as message key. Example: dbserver1.inventory.orderlines:orderId,orderLineId;dbserver1.inventory.orders:id | string | | |
Query fetch size | | The maximum number of records that should be loaded into memory while streaming. A value of '0' uses the default JDBC fetch size. | integer | 0 | |
Change event batch size | | Maximum size of each batch of source records. Defaults to 2048. | integer | 2048 | |
Change event buffer size | | Maximum size of the queue for change events read from the database log but not yet recorded or forwarded. Defaults to 8192, and should always be larger than the maximum batch size. | integer | 8192 | |
Kafka topic name | | The name of the topic for the database schema history. | string | | |
Kafka bootstrap servers | | A list of host/port pairs that the connector will use for establishing the initial connection to the Kafka cluster for retrieving database schema history previously stored by the connector. This should point to the same Kafka cluster used by the Kafka Connect process. | string | | |
Kafka Message Key Format | | The serialization format for the Kafka message key. | object | | |
Kafka Message Value Format | | The serialization format for the Kafka message value. | object | | |
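Debezium-based connectors write change events to topics whose names are derived from the configured topic prefix; for the MySQL connector this is typically `<topic prefix>.<database>.<table>`. The following is a brief sketch of consuming one such topic with the kafka-python client; the topic name, bootstrap server, and field layout shown are placeholders and assumptions, not values taken from this reference:

```python
import json
from kafka import KafkaConsumer

# Consume change events for a hypothetical `inventory.customers` table captured
# under the topic prefix `dbserver1`.
consumer = KafkaConsumer(
    "dbserver1.inventory.customers",
    bootstrap_servers="my-kafka-bootstrap:9092",
    value_deserializer=lambda v: json.loads(v) if v else None,  # tombstone records have a null value
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    if event is None:
        continue  # skip tombstones
    # With JSON serialization and schemas enabled, the change details sit under a
    # "payload" envelope; otherwise the event itself carries before/after/op.
    payload = event.get("payload", event)
    print(payload.get("op"), payload.get("after"))
```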
Debezium PostgreSQL Connector
Configuration properties
The following table describes the configuration properties for the Debezium PostgreSQL Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Topic prefix* | | Topic prefix that identifies and provides a namespace for the particular database server/cluster that is capturing changes. The topic prefix should be unique across all other connectors, since it is used as a prefix for all Kafka topic names that receive events emitted by this connector. Use only alphanumeric characters, hyphens, dots, and underscores. | string | | |
Hostname* | | Resolvable hostname or IP address of the database server. | string | | |
Port | | Port of the database server. | integer | 5432 | |
User* | | Name of the database user to be used when connecting to the database. | string | | |
Password | | Password of the database user to be used when connecting to the database. | string / object | | |
Slot | | The name of the Postgres logical decoding slot created for streaming changes from a plugin. Defaults to 'debezium'. | string | debezium | |
Publication | | The name of the Postgres 10+ publication used for streaming changes from a plugin. Defaults to 'dbz_publication'. | string | dbz_publication | |
Publication Auto Create Mode | | Applies only when streaming changes using pgoutput. Determines how creation of a publication should work; the default is all_tables. DISABLED - the connector will not attempt to create a publication at all. The expectation is that the user has created the publication up-front. If the publication isn't found to exist upon startup, the connector will throw an exception and stop. ALL_TABLES - if no publication exists, the connector will create a new publication for all tables. Note this requires that the configured user has access. If the publication already exists, it will be used, i.e. CREATE PUBLICATION <publication_name> FOR ALL TABLES. FILTERED - if no publication exists, the connector will create a new publication for all those tables matching the current filter configuration (see table/database include/exclude list properties). If the publication already exists, it will be used, i.e. CREATE PUBLICATION <publication_name> FOR TABLE <tbl1, tbl2, etc>. | string | all_tables | |
Include Schemas | | The schemas for which events should be captured. | string | | |
Exclude Schemas | | The schemas for which events must not be captured. | string | | |
Include Tables | | The tables for which changes are to be captured. | string | | |
Exclude Tables | | A comma-separated list of regular expressions that match the fully-qualified names of tables to be excluded from monitoring. | string | | |
Include Columns | | Regular expressions matching columns to include in change events. | string | | |
Exclude Columns | | Regular expressions matching columns to exclude from change events. | string | | |
Snapshot mode | | The criteria for running a snapshot upon startup of the connector. Options include: 'always' to specify that the connector run a snapshot each time it starts up; 'initial' (the default) to specify the connector can run a snapshot only when no offsets are available for the logical server name; 'initial_only' same as 'initial' except the connector should stop after completing the snapshot and before it would normally start emitting changes; 'never' to specify the connector should never run a snapshot and that upon first startup the connector should read from the last position (LSN) recorded by the server; 'exported' (deprecated, use 'initial' instead); and 'custom' to specify a custom class with 'snapshot.custom_class' which will be loaded and used to determine the snapshot, see docs for more details. | string | initial | |
Decimal Handling | | Specify how DECIMAL and NUMERIC columns should be represented in change events, including: 'precise' (the default) uses java.math.BigDecimal to represent values, which are encoded in the change events using a binary representation and Kafka Connect's 'org.apache.kafka.connect.data.Decimal' type; 'string' uses string to represent values; 'double' represents values using Java's 'double', which may not offer the precision but will be far easier to use in consumers. | string | precise | |
Columns PK mapping | | A semicolon-separated list of expressions that match fully-qualified tables and column(s) to be used as message key. Each expression must match the pattern '<fully-qualified table name>:<key columns>', where the table names could be defined as (DB_NAME.TABLE_NAME) or (SCHEMA_NAME.TABLE_NAME), depending on the specific connector, and the key columns are a comma-separated list of columns representing the custom key. For any table without an explicit key configuration, the table's primary key column(s) will be used as message key. Example: dbserver1.inventory.orderlines:orderId,orderLineId;dbserver1.inventory.orders:id | string | | |
Query fetch size | | The maximum number of records that should be loaded into memory while streaming. A value of '0' uses the default JDBC fetch size. | integer | 0 | |
Change event batch size | | Maximum size of each batch of source records. Defaults to 2048. | integer | 2048 | |
Change event buffer size | | Maximum size of the queue for change events read from the database log but not yet recorded or forwarded. Defaults to 8192, and should always be larger than the maximum batch size. | integer | 8192 | |
Kafka topic name | | The name of the topic for the database schema history. | string | | |
Kafka bootstrap servers | | A list of host/port pairs that the connector will use for establishing the initial connection to the Kafka cluster for retrieving database schema history previously stored by the connector. This should point to the same Kafka cluster used by the Kafka Connect process. | string | | |
Kafka Message Key Format | | The serialization format for the Kafka message key. | object | | |
Kafka Message Value Format | | The serialization format for the Kafka message value. | object | | |
Debezium SQLServer Connector
Configuration properties
The following table describes the configuration properties for the Debezium SQLServer Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Topic prefix* | | Topic prefix that identifies and provides a namespace for the particular database server/cluster that is capturing changes. The topic prefix should be unique across all other connectors, since it is used as a prefix for all Kafka topic names that receive events emitted by this connector. Use only alphanumeric characters, hyphens, dots, and underscores. | string | | |
Hostname* | | Resolvable hostname or IP address of the database server. | string | | |
Port | | Port of the database server. | integer | 1433 | |
User | | Name of the database user to be used when connecting to the database. | string | | |
Password | | Password of the database user to be used when connecting to the database. | string / object | | |
Databases | | The names of the databases from which the connector should capture changes. | string | | |
Include Tables | | The tables for which changes are to be captured. | string | | |
Exclude Tables | | A comma-separated list of regular expressions that match the fully-qualified names of tables to be excluded from monitoring. | string | | |
Include Columns | | Regular expressions matching columns to include in change events. | string | | |
Exclude Columns | | Regular expressions matching columns to exclude from change events. | string | | |
Snapshot mode | | The criteria for running a snapshot upon startup of the connector. Options include: 'initial' (the default) to specify the connector should run a snapshot only when no offsets are available for the logical server name; 'schema_only' to specify the connector should run a snapshot of the schema when no offsets are available for the logical server name. | string | initial | |
Decimal Handling | | Specify how DECIMAL and NUMERIC columns should be represented in change events, including: 'precise' (the default) uses java.math.BigDecimal to represent values, which are encoded in the change events using a binary representation and Kafka Connect's 'org.apache.kafka.connect.data.Decimal' type; 'string' uses string to represent values; 'double' represents values using Java's 'double', which may not offer the precision but will be far easier to use in consumers. | string | precise | |
Columns PK mapping | | A semicolon-separated list of expressions that match fully-qualified tables and column(s) to be used as message key. Each expression must match the pattern '<fully-qualified table name>:<key columns>', where the table names could be defined as (DB_NAME.TABLE_NAME) or (SCHEMA_NAME.TABLE_NAME), depending on the specific connector, and the key columns are a comma-separated list of columns representing the custom key. For any table without an explicit key configuration, the table's primary key column(s) will be used as message key. Example: dbserver1.inventory.orderlines:orderId,orderLineId;dbserver1.inventory.orders:id | string | | |
Query fetch size | | The maximum number of records that should be loaded into memory while streaming. A value of '0' uses the default JDBC fetch size. | integer | 0 | |
Change event batch size | | Maximum size of each batch of source records. Defaults to 2048. | integer | 2048 | |
Change event buffer size | | Maximum size of the queue for change events read from the database log but not yet recorded or forwarded. Defaults to 8192, and should always be larger than the maximum batch size. | integer | 8192 | |
Kafka topic name | | The name of the topic for the database schema history. | string | | |
Kafka bootstrap servers | | A list of host/port pairs that the connector will use for establishing the initial connection to the Kafka cluster for retrieving database schema history previously stored by the connector. This should point to the same Kafka cluster used by the Kafka Connect process. | string | | |
Kafka Message Key Format | | The serialization format for the Kafka message key. | object | | |
Kafka Message Value Format | | The serialization format for the Kafka message value. | object | | |
Elasticsearch sink
Store JSON-formatted data into ElasticSearch.
Configuration properties
The following table describes the configuration properties for the Elasticsearch sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Username | | The username to connect to ElasticSearch. | string | | |
Password | | The password to connect to ElasticSearch. / An opaque reference to the elasticsearch_password | string / object | | |
Enable SSL | | Specifies to connect by using SSL. | boolean | True | |
Host Addresses* | | A comma-separated list of remote transport addresses in ip:port format. | string | | quickstart-es-http:9200 |
ElasticSearch Cluster Name* | | The name of the ElasticSearch cluster. | string | | quickstart |
Index in ElasticSearch | | The name of the ElasticSearch index. | string | | data |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
FTPS sink
Send data to an FTPS Server.
Configuration properties
The following table describes the configuration properties for the FTPS sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Connection Host* | | The hostname of the FTPS server. | string | | |
Connection Port* | | The port of the FTPS server. | string | 21 | |
Username* | | The username to access the FTPS server. | string | | |
Password* | | The password to access the FTPS server. / An opaque reference to the ftps_password | string / object | | |
Directory Name* | | The starting directory. | string | | |
Passive Mode | | Specifies to use passive mode connection. | boolean | False | |
File Existence | | Specifies how the Kamelet behaves if the file already exists. Possible values are Override, Append, Fail, or Ignore. | string | Override | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
FTPS source
Retrieve data from an FTPS Server.
Configuration properties
The following table describes the configuration properties for the FTPS source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Connection Host* | | The hostname of the FTPS server. | string | | |
Connection Port* | | The port of the FTPS server. | string | 21 | |
Username* | | The username to access the FTPS server. | string | | |
Password* | | The password to access the FTPS server. / An opaque reference to the ftps_password | string / object | | |
Directory Name* | | The starting directory. | string | | |
Passive Mode | | Specifies to use passive mode connection. | boolean | False | |
Recursive | | If a directory, look for files in all sub-directories as well. | boolean | False | |
Idempotency | | Skip already-processed files. | boolean | True | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Google BigQuery sink
Send data to a Google Big Query table.
Configuration properties
The following table describes the configuration properties for the Google BigQuery sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Google Cloud Project Id* | | The Google Cloud Project ID. | string | | |
Big Query Dataset Id* | | The Big Query Dataset ID. | string | | |
Big Query Table Id* | | The Big Query Table ID. | string | | |
Google Cloud Platform Credential File* | | The credential for accessing Google Cloud Platform API services. This value must be a path to a service account key file. | string | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Google Cloud Functions sink
Send data to Google Functions.
Configuration properties
The following table describes the configuration properties for the Google Cloud Functions sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Project Id* | | The Google Cloud Functions Project ID. | string | | |
Region* | | The region where Google Cloud Functions has been deployed. | string | | |
Function Name* | | The Function name. | string | | |
Service Account Key* | | The path to the service account key file that provides credentials for the Google Cloud Functions platform. / An opaque reference to the aws_access_key | string / object | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Google Cloud Pub/Sub sink
Send data to Google Cloud Pub/Sub.
Configuration properties
The following table describes the configuration properties for the Google Cloud Pub/Sub sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Project Id* | | The Google Cloud Pub/Sub Project ID. | string | | |
Destination Name* | | The destination name. | string | | |
Service Account Key* | | The service account key to use as credentials for the Pub/Sub publisher/subscriber. You must encode this value in base64. / An opaque reference to the aws_access_key | string / object | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
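The service account key must be supplied as a base64-encoded value. The following small sketch (Python standard library only) reads a downloaded service account JSON key file and prints the encoded string to paste into the configuration; the file path is a placeholder:

```python
import base64

# Read the service account key file downloaded from the Google Cloud console
# and base64-encode its contents for the Service Account Key property.
with open("service-account-key.json", "rb") as f:
    encoded_key = base64.b64encode(f.read()).decode("ascii")

print(encoded_key)
```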
Google Cloud Pub/Sub source
Receive data from Google Cloud Pub/Sub.
Configuration properties
The following table describes the configuration properties for the Google Cloud Pub/Sub source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Project Id* | | The Google Cloud Pub/Sub Project ID. | string | | |
Subscription Name* | | The subscription name. | string | | |
Service Account Key* | | The service account key to use as credentials for the Pub/Sub publisher/subscriber. You must encode this value in base64. / An opaque reference to the aws_access_key | string / object | | |
Synchronous Pull | | Specifies to synchronously pull batches of messages. | boolean | False | |
Max Messages Per Poll | | The maximum number of messages to receive from the server in a single API call. | integer | 1 | |
Concurrent Consumers | | The number of parallel streams to consume from the subscription. | integer | 1 | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
Google Cloud Storage sink
Upload data to Google Cloud Storage.
Configuration properties
The following table describes the configuration properties for the Google Cloud Storage sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Bucket Name Or ARN* | | The Google Cloud Storage bucket name or Bucket Amazon Resource Name (ARN). | string | | |
Service Account Key* | | The service account key to use as credentials for Google Cloud Storage access. You must encode this value in base64. / An opaque reference to the aws_access_key | string / object | | |
Autocreate Bucket | | Specifies to automatically create the Google Cloud Storage bucket. | boolean | False | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Google Cloud Storage source
Receive data from Google Cloud Storage.
Configuration properties
The following table describes the configuration properties for the Google Cloud Storage source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Bucket Name Or ARN* | | The Google Cloud Storage bucket name or Bucket Amazon Resource Name (ARN). | string | | |
Service Account Key* | | The service account key to use as credentials for Google Cloud Storage access. You must encode this value in base64. / An opaque reference to the aws_access_key | string / object | | |
Auto-delete Objects | | Specifies to delete objects after consuming them. | boolean | True | |
Autocreate Bucket | | Specifies to automatically create the Google Cloud Storage bucket. | boolean | False | |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
HTTP sink
Send data to an HTTP endpoint.
Configuration properties
The following table describes the configuration properties for the HTTP sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
URL* | | The URL to which you want to send data. | string | | |
Method | | The HTTP method to use. | string | POST | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Jira Add Comment sink
Add a new comment to an existing issue in Jira.
Configuration properties
The following table describes the configuration properties for the Jira Add Comment sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Jira URL* | | The URL of your instance of Jira. | string | | |
Username* | | The username to access Jira. | string | | |
Password* | | The password or the API Token to access Jira. / An opaque reference to the jira_password | string / object | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Jira Add Issue sink
Add a new issue to Jira.
Configuration properties
The following table describes the configuration properties for the Jira Add Issue sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Jira URL* | | The URL of your instance of Jira. | string | | |
Username* | | The username to access Jira. | string | | |
Password* | | The password or the API Token to access Jira. / An opaque reference to the jira_password | string / object | | |
Topic Names* | | A comma-separated list of Kafka topic names. | string | | |
Data Shape | | The format of the data that Kafka sends to the sink connector. | object | | |
Jira source
Receive notifications about new issues from Jira.
Configuration properties
The following table describes the configuration properties for the Jira source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Jira URL* | | The URL of your instance of Jira. | string | | |
Username* | | The username to access Jira. | string | | |
Password* | | The password or the API Token to access Jira. / An opaque reference to the jira_password | string / object | | |
JQL | | A query to filter issues. | string | | project=MyProject |
Topic Name* | | The name of the Kafka Topic to use. | string | | |
Data Shape | | The format of the data that the source connector sends to Kafka. | object | | |
JMS AMQP 1.0 sink
Send data to any AMQP 1.0 compliant message broker by using the Apache Qpid JMS client.
Configuration properties
The following table describes the configuration properties for the JMS AMQP 1.0 sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Destination Type |
|
The JMS destination type (queue or topic). |
string |
queue |
|
Destination Name* |
|
The JMS destination name. |
string |
||
Broker URL* |
|
The JMS URL. |
string |
amqp://my-host:31616 |
|
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
JMS AMQP 1.0 source
Receive data from any AMQP 1.0 compliant message broker by using the Apache Qpid JMS client.
Configuration properties
The following table describes the configuration properties for the JMS AMQP 1.0 source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Destination Type |
|
The JMS destination type (queue or topic). |
string |
queue |
|
Destination Name* |
|
The JMS destination name. |
string |
||
Broker URL* |
|
The JMS URL. |
string |
amqp://my-host:31616 |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
JMS Apache Artemis sink
Send data to an Apache Artemis message broker by using JMS.
Configuration properties
The following table describes the configuration properties for the JMS Apache Artemis sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Destination Type |
|
The JMS destination type (queue or topic). |
string |
queue |
|
Destination Name* |
|
The JMS destination name. |
string |
person |
|
Broker URL* |
|
The JMS URL. |
string |
tcp://my-host:61616 |
|
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
JMS Apache Artemis source
Receive data from an Apache Artemis message broker by using JMS.
Configuration properties
The following table describes the configuration properties for the JMS Apache Artemis source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Destination Type |
|
The JMS destination type (queue or topic). |
string |
queue |
|
Destination Name* |
|
The JMS destination name. |
string |
||
Broker URL* |
|
The JMS URL. |
string |
tcp://my-host:61616 |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
MariaDB sink
Send data to a MariaDB Database.
Configuration properties
The following table describes the configuration properties for the MariaDB sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* |
|
The server name for the data source. |
string |
localhost |
|
Server Port |
|
The server port for the data source. |
string |
3306 |
|
Username* |
|
The username to access a secured MariaDB Database. |
string |
||
Password* |
|
The password to access a secured MariaDB Database. / An opaque reference to the db_password |
string / object |
||
Query* |
|
The query to execute against the MariaDB Database. |
string |
INSERT INTO accounts (username,city) VALUES (:#username,:#city) |
|
Database Name* |
|
The name of the MariaDB Database. |
string |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
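The example query above uses named parameters such as :#username and :#city. The following producer sketch assumes, for illustration only, that each :#name placeholder is resolved from a field of the same name in the incoming JSON record; the broker address, topic name, and field values are placeholders. The same idea applies to the MySQL, PostgreSQL, and SQL Server sinks described later.

# Sketch: produce a JSON record whose fields match the :#username and :#city
# placeholders in the example INSERT query. Broker and topic names are placeholders.
import json
from kafka import KafkaProducer   # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="my-kafka-bootstrap:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("accounts", value={"username": "jdoe", "city": "Boston"})
producer.flush()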
MariaDB source
Query data from a MariaDB Database.
Configuration properties
The following table describes the configuration properties for the MariaDB source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* |
|
The server name for the data source. |
string |
localhost |
|
Server Port |
|
The server port for the data source. |
string |
3306 |
|
Username* |
|
The username to access a secured MariaDB Database. |
string |
||
Password* |
|
The password to access a secured MariaDB Database. / An opaque reference to the db_password |
string / object |
||
Query* |
|
The query to execute against the MariaDB Database. |
string |
SELECT * FROM accounts |
|
Database Name* |
|
The name of the MariaDB Database. |
string |
||
Consumed Query |
|
A query to run on a tuple consumed. |
string |
DELETE FROM accounts where user_id = :#user_id |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
MinIO sink
Send data to MinIO.
Configuration properties
The following table describes the configuration properties for the MinIO sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Bucket Name* |
|
The Minio Bucket name. |
string |
||
Access Key* |
|
The access key obtained from MinIO. / An opaque reference to the minio_access_key |
string / object |
||
Secret Key* |
|
The secret key obtained from MinIO. / An opaque reference to the minio_secret_key |
string / object |
||
Endpoint* |
|
The MinIO endpoint. You can specify a URL, domain name, IPv4 address, or IPv6 address. |
string |
||
Autocreate Bucket |
|
Specifies whether to automatically create the MinIO bucket. |
boolean |
False |
|
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
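As a side note, the endpoint, access key, and secret key are ordinary MinIO client credentials. The following sketch assumes the minio Python client and placeholder values; it shows the same three values in client form and mirrors the optional bucket auto-creation.

# Sketch: connect to the same MinIO endpoint with the same credentials and make sure
# the target bucket exists. Endpoint, keys, and bucket name are placeholders.
from minio import Minio   # pip install minio

client = Minio(
    "minio.example.com:9000",   # endpoint: URL host, domain name, or IP address
    access_key="my-access-key",
    secret_key="my-secret-key",
    secure=True,                # set to False for plain HTTP endpoints
)
if not client.bucket_exists("my-bucket"):
    client.make_bucket("my-bucket")   # equivalent of the Autocreate Bucket option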
MinIO source
Retrieve data from MinIO.
Configuration properties
The following table describes the configuration properties for the MinIO source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Bucket Name* |
|
The MinIO Bucket name. |
string |
||
Auto-delete Objects |
|
Specifies whether to delete objects after consuming them. |
boolean |
True |
|
Access Key* |
|
The access key obtained from MinIO. / An opaque reference to the minio_access_key |
string / object |
||
Secret Key* |
|
The secret key obtained from MinIO. / An opaque reference to the minio_secret_key |
string / object |
||
Endpoint* |
|
The MinIO endpoint. You can specify a URL, domain name, IPv4 address, or IPv6 address. |
string |
||
Autocreate Bucket |
|
Specifies whether to automatically create the MinIO bucket. |
boolean |
False |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
MongoDB sink
Send data to MongoDB.
Configuration properties
The following table describes the configuration properties for the MongoDB sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
MongoDB Hosts* |
|
A comma-separated list of MongoDB host addresses in host:port format. |
string |
||
MongoDB Collection* |
|
Sets the name of the MongoDB collection to bind to this endpoint. |
string |
||
MongoDB Password |
|
User password for accessing MongoDB. / An opaque reference to the mongodb_password |
string / object |
||
MongoDB Username |
|
Username for accessing MongoDB. |
string |
||
MongoDB Database* |
|
Sets the name of the MongoDB database to target. |
string |
||
Write Concern |
|
The level of acknowledgment requested from MongoDB for write operations. Possible values are ACKNOWLEDGED, W1, W2, W3, UNACKNOWLEDGED, JOURNALED, and MAJORITY. |
string |
||
Collection |
|
Creates the collection during initialization if it does not exist. |
boolean |
False |
|
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
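For context, the hosts value is a plain host:port list and the write concern values map onto standard MongoDB write concerns. A minimal sketch follows, assuming the pymongo client and placeholder host, database, and collection names.

# Sketch: connect to the same MongoDB hosts and write one document with an explicit
# write concern. All names below are placeholders.
from pymongo import MongoClient
from pymongo.write_concern import WriteConcern

client = MongoClient("mongodb://mongo-0.example.com:27017,mongo-1.example.com:27017")
collection = client["my-database"].get_collection(
    "my-collection",
    write_concern=WriteConcern(w="majority"),   # corresponds to the MAJORITY option
)
collection.insert_one({"username": "jdoe", "city": "Boston"})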
MongoDB source
Retrieve data from MongoDB.
Configuration properties
The following table describes the configuration properties for the MongoDB source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
MongoDB Hosts* |
|
A comma-separated list of MongoDB host addresses in host:port format. |
string |
||
MongoDB Collection* |
|
The name of the MongoDB collection to bind to this endpoint. |
string |
||
MongoDB Password |
|
The user password for accessing MongoDB. / An opaque reference to the mongodb_password |
string / object |
||
MongoDB Username |
|
The username for accessing MongoDB. The username must be present in MongoDB's authentication database. |
string |
||
MongoDB Database* |
|
The name of the MongoDB database. |
string |
||
MongoDB Persistent Tail Tracking |
|
Specifies whether to enable persistent tail tracking, which is a mechanism for keeping track of the last consumed data across system restarts. The next time the system is up, the endpoint recovers the cursor from the point where it last stopped consuming data. This option works only on capped collections. |
boolean |
False |
|
MongoDB Tail Track Increasing Field |
|
The correlation field in the incoming data. The field must be increasing in nature and is used to position the tailing cursor every time the cursor is generated. |
string |
||
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
MySQL sink
Send data to a MySQL Database.
Configuration properties
The following table describes the configuration properties for the MySQL sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* |
|
The server name for the data source. |
string |
localhost |
|
Server Port |
|
The server port for the data source. |
string |
3306 |
|
Username* |
|
The username to access a secured MySQL Database. |
string |
||
Password* |
|
The password to access a secured MySQL Database. / An opaque reference to the db_password |
string / object |
||
Query* |
|
The query to execute against the MySQL Database. |
string |
INSERT INTO accounts (username,city) VALUES (:#username,:#city) |
|
Database Name* |
|
The name of the MySQL Database. |
string |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
MySQL source
Query data from a MySQL Database.
Configuration properties
The following table describes the configuration properties for the MySQL source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* |
|
The server name for the data source. |
string |
localhost |
|
Server Port |
|
The server port for the data source. |
string |
3306 |
|
Username* |
|
The username to access a secured MySQL Database. |
string |
||
Password* |
|
The password to access a secured MySQL Database. / An opaque reference to the db_password |
string / object |
||
Query* |
|
The query to execute against the MySQL Database. |
string |
SELECT * FROM accounts |
|
Database Name* |
|
The name of the MySQL Database. |
string |
||
Consumed Query |
|
A query to run on a tuple consumed. |
string |
DELETE FROM accounts where user_id = :#user_id |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
PostgreSQL sink
Send data to a PostgreSQL Database.
Configuration properties
The following table describes the configuration properties for the PostgreSQL sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* |
|
The server name for the data source. |
string |
localhost |
|
Server Port |
|
The server port for the data source. |
string |
5432 |
|
Username* |
|
The username to access a secured PostgreSQL Database. |
string |
||
Password* |
|
The password to access a secured PostgreSQL Database. / An opaque reference to the db_password |
string / object |
||
Query* |
|
The query to execute against the PostgreSQL Database. |
string |
INSERT INTO accounts (username,city) VALUES (:#username,:#city) |
|
Database Name* |
|
The name of the PostgreSQL Database. |
string |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
PostgreSQL source
Query data from a PostgreSQL Database.
Configuration properties
The following table describes the configuration properties for the PostgreSQL source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* |
|
The server name for the data source. |
string |
localhost |
|
Server Port |
|
The server port for the data source. |
string |
5432 |
|
Username* |
|
The username to access a secured PostgreSQL Database. |
string |
||
Password* |
|
The password to access a secured PostgreSQL Database. / An opaque reference to the db_password |
string / object |
||
Query* |
|
The query to execute against the PostgreSQL Database. |
string |
SELECT * FROM accounts |
|
Database Name* |
|
The name of the PostgreSQL Database. |
string |
||
Consumed Query |
|
A query to run on a tuple consumed. |
string |
DELETE FROM accounts where user_id = :#user_id |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
Salesforce Create sink
Create an object in Salesforce.
Configuration properties
The following table describes the configuration properties for the Salesforce Create sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Object Name |
|
The type of the object. |
string |
Contact |
|
Login URL |
|
The Salesforce instance login URL. |
string |
||
Consumer Key* |
|
The Salesforce application consumer key. |
string |
||
Consumer Secret* |
|
The Salesforce application consumer secret. / An opaque reference to the salesforce_client_secret |
string / object |
||
Username* |
|
The Salesforce username. |
string |
||
Password* |
|
The Salesforce user password. / An opaque reference to the salesforce_password |
string / object |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
Salesforce Delete sink
Delete an object in Salesforce.
Configuration properties
The following table describes the configuration properties for the Salesforce Delete sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Login URL |
|
The Salesforce instance login URL. |
string |
||
Consumer Key* |
|
The Salesforce application consumer key. |
string |
||
Consumer Secret* |
|
The Salesforce application consumer secret. / An opaque reference to the salesforce_client_secret |
string / object |
||
Username* |
|
The Salesforce username. |
string |
||
Password* |
|
The Salesforce user password. / An opaque reference to the salesforce_password |
string / object |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
Salesforce Streaming source
Receive updates from Salesforce.
Configuration properties
The following table describes the configuration properties for the Salesforce Streaming source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Object Name* |
|
The Salesforce object name (sObjectName). |
string |
||
Login URL |
|
The Salesforce instance login URL. |
string |
||
Consumer Key* |
|
The Salesforce application consumer key. |
string |
||
Consumer Secret* |
|
The Salesforce application consumer secret. / An opaque reference to the salesforce_client_secret |
string / object |
||
Username* |
|
The Salesforce username. |
string |
||
Password* |
|
The Salesforce user password. / An opaque reference to the salesforce_password |
string / object |
||
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
Salesforce Update sink
Update an object in Salesforce.
Configuration properties
The following table describes the configuration properties for the Salesforce Update sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Object Name* |
|
The type of the Salesforce object. Required if using a key-value pair. |
string |
Contact |
|
Object Id* |
|
The ID of the Salesforce object. Required if using a key-value pair. |
string |
||
Login URL |
|
The Salesforce instance login URL. |
string |
||
Consumer Key* |
|
The Salesforce application consumer key. |
string |
||
Consumer Secret* |
|
The Salesforce application consumer secret. / An opaque reference to the salesforce_client_secret |
string / object |
||
Username* |
|
The Salesforce username. |
string |
||
Password* |
|
The Salesforce user password. / An opaque reference to the salesforce_password |
string / object |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
SFTP sink
Send data to an SFTP Server.
Configuration properties
The following table describes the configuration properties for the SFTP sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Connection Host* |
|
The hostname of the SFTP server. |
string |
||
Connection Port* |
|
The port of the SFTP server. |
string |
22 |
|
Username* |
|
The username to access the SFTP server. |
string |
||
Password* |
|
The password to access the SFTP server. / An opaque reference to the sftp_password |
string / object |
||
Directory Name* |
|
The starting directory. |
string |
||
Passive Mode |
|
Specifies whether to use a passive mode connection. |
boolean |
False |
|
File Existence |
|
Specifies how to behave if the file already exists. Possible values are Override, Append, Fail, or Ignore. |
string |
Override |
|
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
SFTP source
Retrieve data from an SFTP Server.
Configuration properties
The following table describes the configuration properties for the SFTP source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Connection Host* |
|
The hostname of the SFTP server. |
string |
||
Connection Port* |
|
The port of the SFTP server. |
string |
22 |
|
Username* |
|
The username to access the SFTP server. |
string |
||
Password* |
|
The password to access the SFTP server. / An opaque reference to the sftp_password |
string / object |
||
Directory Name* |
|
The starting directory. |
string |
||
Passive Mode |
|
Specifies whether to use a passive mode connection. |
boolean |
False |
|
Recursive |
|
If the directory name refers to a directory, look for files in all subdirectories as well. |
boolean |
False |
|
Idempotency |
|
Skip already-processed files. |
boolean |
True |
|
Ignore File Not Found Or Permission Error |
|
Specifies whether to ignore a directory or file that does not exist or that cannot be accessed because of a permission error (for example, when listing files in a directory or when downloading a file). By default, an exception is thrown when a directory or file does not exist or when permissions are insufficient. Set this option to true to ignore these errors instead. |
boolean |
False |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
Slack sink
Send messages to a Slack channel.
Configuration properties
The following table describes the configuration properties for the Slack sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Channel* |
|
The Slack channel to send messages to. |
string |
#myroom |
|
Webhook URL* |
|
The webhook URL used by the Slack channel to handle incoming messages. / An opaque reference to the slack_webhook_url |
string / object |
||
Icon Emoji |
|
Use a Slack emoji as an avatar. |
string |
||
Icon URL |
|
The avatar to use when sending a message to a channel or user. |
string |
||
Username |
|
The username for the bot when it sends messages to a channel or user. |
string |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
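For context, the webhook URL is a standard Slack incoming webhook. The sketch below posts a message to such a webhook directly, with a placeholder URL, to show the kind of call the connector ultimately makes for each Kafka record.

# Sketch: post one message to a Slack incoming webhook. The URL is a placeholder.
import json
import urllib.request

webhook_url = "https://hooks.slack.com/services/T000/B000/XXXX"   # placeholder
req = urllib.request.Request(
    webhook_url,
    data=json.dumps({"text": "Hello from the Slack sink example"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)   # Slack answers 200 with the body "ok" on success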
Slack source
Receive messages from a Slack channel.
Configuration properties
The following table describes the configuration properties for the Slack source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Channel* |
|
The Slack channel to receive messages from. |
string |
#myroom |
|
Token* |
|
The Bot User OAuth Access Token to access Slack. A Slack app that has the following permissions is required: |
string / object |
||
Delay |
|
The delay between polls. |
string |
1s |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
Splunk sink
Send data to Splunk.
Configuration properties
The following table describes the configuration properties for the Splunk sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Splunk Server Address* |
|
The address of your Splunk server. |
string |
my_server_splunk.com |
|
Splunk Server Port |
|
The port of your Splunk server. |
integer |
8089 |
|
Username* |
|
The username to authenticate to Splunk Server. |
string |
||
Password* |
|
The password to authenticate to Splunk Server. / An opaque reference to the splunk_password |
string / object |
||
Index |
|
The Splunk index to write to. |
string |
||
Protocol |
|
The connection protocol to the Splunk server. |
string |
https |
|
Source |
|
The source named field of the data. |
string |
||
Source Type |
|
The source type named field of the data. |
string |
||
Splunk App |
|
The app name in Splunk. |
string |
||
Connection Timeout |
|
The timeout, in milliseconds, when connecting to the Splunk server. |
integer |
5000 |
|
Mode |
|
The mode to publish events to Splunk. |
string |
stream |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
SQL Server sink
Send data to a SQL Server Database.
Configuration properties
The following table describes the configuration properties for the SQL Server sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* |
|
The server name for the data source. |
string |
localhost |
|
Server Port |
|
The server port for the data source. |
string |
1433 |
|
Username* |
|
The username to access a secured SQL Server Database. |
string |
||
Password* |
|
The password to access a secured SQL Server Database. / An opaque reference to the db_password |
string / object |
||
Query* |
|
The query to execute against the SQL Server Database. |
string |
INSERT INTO accounts (username,city) VALUES (:#username,:#city) |
|
Database Name* |
|
The name of the SQL Server Database. |
string |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
SQL Server source
Query data from a SQL Server Database.
Configuration properties
The following table describes the configuration properties for the SQL Server source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Server Name* |
|
The server name for the data source. |
string |
localhost |
|
Server Port |
|
The server port for the data source. |
string |
1433 |
|
Username* |
|
The username to access a secured SQL Server Database. |
string |
||
Password* |
|
The password to access a secured SQL Server Database. / An opaque reference to the db_password |
string / object |
||
Query* |
|
The query to execute against the SQL Server Database. |
string |
SELECT * FROM accounts |
|
Database Name* |
|
The name of the SQL Server Database. |
string |
||
Consumed Query |
|
A query to run on a tuple consumed. |
string |
DELETE FROM accounts where user_id = :#user_id |
|
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |
Telegram sink
Send a message to a Telegram chat using your Telegram bot as sender. To create a bot, use your Telegram app to contact the @botfather account.
Configuration properties
The following table describes the configuration properties for the Telegram sink Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Token* |
|
The token to access your bot on Telegram. You can obtain it from the Telegram @botfather. / An opaque reference to the telegram_authorization_token |
string / object |
||
Chat ID |
|
The Chat ID to where you want to send messages by default. |
string |
||
Topic Names* |
|
A comma-separated list of Kafka topic names. |
string |
||
Data Shape |
|
The format of the data that Kafka sends to the sink connector. |
object |
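For context, the token and chat ID correspond directly to the Telegram Bot API. The sketch below sends one message with placeholder values for both; the connector performs an equivalent call for each record it consumes.

# Sketch: send a message through the Telegram Bot API with the same bot token and
# chat ID the connector uses. Both values are placeholders.
import json
import urllib.parse
import urllib.request

token = "123456:ABC-DEF"     # placeholder token obtained from @botfather
chat_id = "123456789"        # placeholder chat ID

url = f"https://api.telegram.org/bot{token}/sendMessage"
data = urllib.parse.urlencode({"chat_id": chat_id, "text": "Hello from the Telegram sink example"}).encode()
with urllib.request.urlopen(urllib.request.Request(url, data=data)) as resp:
    print(json.load(resp)["ok"])   # the Bot API returns {"ok": true, ...} on success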
Telegram source
Receive all messages that people send to your Telegram bot.
Configuration properties
The following table describes the configuration properties for the Telegram source Connector.
Name | Property | Description | Type | Default | Example |
---|---|---|---|---|---|
Token* |
|
The token to access your bot on Telegram. You can obtain it from the Telegram @botfather. / An opaque reference to the telegram_authorization_token |
string / object |
||
Topic Name* |
|
The name of the Kafka Topic to use. |
string |
||
Data Shape |
|
The format of the data that the source connector sends to Kafka. |
object |