The S3 sink connector enables you to move data from an Aiven for Apache Kafka cluster to Amazon S3 for long-term storage.

Data Format

S3 Object names

The S3 connector stores a series of files in the specified bucket. Each object is named using the following pattern:


The aws_s3_prefix can use the template placeholders {{ utc_date }} and {{ local_date }}. The .gz extension is used if gzip compression is enabled; see output_compression below.
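As an illustrative sketch only (not the connector's actual implementation), the placeholder expansion could look like the following, assuming a YYYY-MM-DD date format; expand_prefix is a hypothetical helper:

```python
from datetime import datetime, timezone

def expand_prefix(prefix: str, now: datetime) -> str:
    # Hypothetical sketch of expanding the documented placeholders
    # {{ utc_date }} and {{ local_date }} in an aws_s3_prefix value.
    return (prefix
            .replace("{{ utc_date }}", now.astimezone(timezone.utc).strftime("%Y-%m-%d"))
            .replace("{{ local_date }}", now.astimezone().strftime("%Y-%m-%d")))

# A prefix of "logs/{{ utc_date }}/" would expand to e.g. "logs/2021-03-14/"
print(expand_prefix("logs/{{ utc_date }}/",
                    datetime(2021, 3, 14, 12, 0, tzinfo=timezone.utc)))
```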

The connector creates one file per Kafka Connect offset flush interval for partitions that have received new messages during that period. The interval defaults to 60 seconds.

Data file format

Data is stored as one record per line in S3. Each line contains comma-separated fields, as specified by the output_fields configuration option. If the key and value fields are selected, they are written out in Base64-encoded form. For example, an output_fields setting of value,key,timestamp results in rows looking something like this:
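To make the layout concrete, here is a hedged Python sketch of how such a row could be produced; format_row is a hypothetical helper, not part of the connector:

```python
import base64

def format_row(value: bytes, key: bytes, timestamp_ms: int) -> str:
    # Sketch of the documented row layout for output_fields=value,key,timestamp:
    # key and value are Base64 encoded, fields are comma separated.
    return ",".join([
        base64.b64encode(value).decode("ascii"),
        base64.b64encode(key).decode("ascii"),
        str(timestamp_ms),
    ])

print(format_row(b"message body", b"key-0", 1617889332000))
# → bWVzc2FnZSBib2R5,a2V5LTA=,1617889332000
```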



S3 permissions

The S3 connector needs the following permissions on the specified bucket:

  • s3:GetObject
  • s3:PutObject
  • s3:AbortMultipartUpload
  • s3:ListMultipartUploadParts
  • s3:ListBucketMultipartUploads
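For illustration, these permissions could be granted with an IAM policy along the following lines; the bucket name aiven-example is a placeholder, and note that s3:ListBucketMultipartUploads applies to the bucket itself while the other actions apply to the objects in it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:AbortMultipartUpload",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": "arn:aws:s3:::aiven-example/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucketMultipartUploads",
      "Resource": "arn:aws:s3:::aiven-example"
    }
  ]
}
```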

If you encounter an Access Denied error, verify that all of the above permissions have been granted to the credentials in use.

Connector Configuration

  aws_access_key_id: AWS Access Key ID for accessing the S3 bucket. Mandatory.
  aws_secret_access_key: AWS S3 Secret Access Key. Mandatory.
  aws_s3_bucket: Name of an existing bucket for storing the records. Mandatory.
  aws_s3_region: Name of the region for the bucket used for storing the records. Defaults to us-east-1.
  connector.class: Connector class name, in this case: io.aiven.kafka.connect.s3.AivenKafkaConnectS3SinkConnector
  key.converter: Connector-specific key encoding, must be set to org.apache.kafka.connect.converters.ByteArrayConverter
  output_compression: Compression type for output files. Supported algorithms are gzip and none. Defaults to gzip.
  output_fields: A comma-separated list of fields to include in the output. Supported values are: key, offset, timestamp and value. Defaults to value.
  topics: Topics to subscribe to. See the Kafka Connect documentation for details. E.g. demo_topic,another_topic
  value.converter: Connector-specific value encoding, must be set to org.apache.kafka.connect.converters.ByteArrayConverter


The following information will be required in the configuration process:

  • Kafka Connect URL from the Kafka service

S3 connector definition example:

curl -X POST "$KAFKA_CONNECT_URL/connectors" \
  -H "Content-Type: application/json" -d @- << EOF
{
  "name": "example-s3-sink",
  "config": {
    "connector.class": "io.aiven.kafka.connect.s3.AivenKafkaConnectS3SinkConnector",
    "aws_access_key_id": "AKI...",
    "aws_secret_access_key": "SECRET_ACCESS_KEY",
    "aws_s3_bucket": "aiven-example",
    "aws_s3_prefix": "example-s3-sink/",
    "aws_s3_region": "us-east-1",
    "output_compression": "gzip",
    "output_fields": "value,key,timestamp",
    "tasks.max": 1,
    "topics": "source_topic,another_topic"
  }
}
EOF

(Note: Please substitute the appropriate value for the Kafka Connect URL.)
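Once the connector is writing objects, the rows can be decoded by reversing the format described above. The following is a self-contained Python sketch (read_rows is a hypothetical helper, shown here with an in-memory gzip payload rather than a real S3 download), assuming output_fields=value,key,timestamp and output_compression=gzip:

```python
import base64
import gzip
import io

def read_rows(gz_bytes: bytes):
    # Decode rows written with output_fields=value,key,timestamp and
    # output_compression=gzip back into (value, key, timestamp) tuples.
    with gzip.open(io.BytesIO(gz_bytes), "rt") as fh:
        for line in fh:
            value_b64, key_b64, ts = line.rstrip("\n").split(",")
            yield base64.b64decode(value_b64), base64.b64decode(key_b64), int(ts)

# Usage with an in-memory example object:
data = gzip.compress(b"bWVzc2FnZSBib2R5,a2V5LTA=,1617889332000\n")
print(list(read_rows(data)))
# → [(b'message body', b'key-0', 1617889332000)]
```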

Please see the connector documentation for additional details.
