The Kafka Connect S3 connector by Confluent enables you to move data from an Aiven Kafka cluster to Amazon S3 for long-term storage.

Configuration

An example minimal configuration

name=demo-s3-sink
connector.class=io.confluent.connect.s3.S3SinkConnector
tasks.max=10
topics=demo_s3_input_topic
s3.region=us-west-2
s3.bucket.name=aiven-demo-bucket
flush.size=1

s3.credentials.provider.class=io.aiven.kafka.connect.util.AivenAWSCredentialsProvider
s3.credentials.provider.access_key_id=xxx
s3.credentials.provider.secret_access_key=xxx

storage.class=io.confluent.connect.s3.storage.S3Storage
format.class=io.confluent.connect.s3.format.bytearray.ByteArrayFormat
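A connector configuration like this is typically submitted to a Kafka Connect worker as JSON over its REST API. As a sketch, the following Python snippet converts the properties above into the payload shape the API expects (the worker URL in the note below is a placeholder assumption):

```python
import json

# The connector properties from the example above, as key=value lines.
properties = """
connector.class=io.confluent.connect.s3.S3SinkConnector
tasks.max=10
topics=demo_s3_input_topic
s3.region=us-west-2
s3.bucket.name=aiven-demo-bucket
flush.size=1
storage.class=io.confluent.connect.s3.storage.S3Storage
format.class=io.confluent.connect.s3.format.bytearray.ByteArrayFormat
"""

# Parse each "key=value" line into a dict; split on the first "=" only,
# since values such as class names may themselves be plain strings.
config = dict(
    line.split("=", 1)
    for line in properties.strip().splitlines()
    if line and not line.startswith("#")
)

# Kafka Connect's REST API expects {"name": ..., "config": {...}}.
payload = {"name": "demo-s3-sink", "config": config}
print(json.dumps(payload, indent=2))
```

You could then POST this payload to your Connect worker, for example to http://connect-host:8083/connectors (hostname is hypothetical; use your cluster's Connect endpoint).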

S3 connector specific settings

s3.region and s3.bucket.name specify the target region and bucket.

You can supply credentials via the AivenAWSCredentialsProvider helper using the s3.credentials.provider.access_key_id and s3.credentials.provider.secret_access_key settings.
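If you prefer not to embed access keys in the connector configuration, the Confluent connector can also resolve credentials through the AWS SDK's standard provider chain (environment variables, instance profiles, and so on). A sketch, assuming the SDK's default provider class:

```properties
# Alternative sketch: use the AWS SDK's default credentials chain
# instead of inline access keys.
s3.credentials.provider.class=com.amazonaws.auth.DefaultAWSCredentialsProviderChain
```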

storage.class must be set to io.confluent.connect.s3.storage.S3Storage.

format.class specifies the output format used when writing data to S3. The example above uses ByteArrayFormat, which writes record values as raw bytes.
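Besides ByteArrayFormat, the Confluent connector ships with other formatters; which are available depends on the connector version, so check its documentation for the exact set. A sketch of commonly used classes:

```properties
# Raw bytes (as in the example above)
format.class=io.confluent.connect.s3.format.bytearray.ByteArrayFormat
# Newline-delimited JSON
format.class=io.confluent.connect.s3.format.json.JsonFormat
# Avro container files
format.class=io.confluent.connect.s3.format.avro.AvroFormat
```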

Further information

You can find additional documentation at https://docs.confluent.io/5.0.0/connect/kafka-connect-s3/index.html.
