Amazon S3

Push events to an Amazon S3 bucket.

The Amazon S3 destination lets you push events to S3 object storage. Multiple events are batched into each uploaded object file. Exporting to S3 is useful for archiving events to cheaper storage or for importing them into external services that support S3.


Overview

Streamfold supports uploading events to Amazon S3 buckets. This is useful for archiving events to cheaper storage for later retrieval, or for importing them into systems that are compatible with S3. Events are batched in JSON-encoded format and uploaded as compressed files.

Streamfold partitions uploads by event timestamp, using the format YYYY/MM/DD. All events received within the same day are batched together in successive staging files and uploaded to that partition folder. Streamfold stages files locally on disk and uploads them when the full batch size is reached or a maximum file age is exceeded. Each uploaded batch file is assigned a random file name under the partition folder.
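For illustration only, here is a minimal sketch of how such an object key could be derived from an event's timestamp; the function and variable names are assumptions, not Streamfold internals:

```python
from datetime import datetime, timezone
from uuid import uuid4

def object_key(key_prefix: str, event_ts: datetime) -> str:
    # Partition folder is the event's timestamp day, in YYYY/MM/DD form.
    day = event_ts.astimezone(timezone.utc).strftime("%Y/%m/%d")
    # Each uploaded batch file gets a random name under that folder.
    return f"{key_prefix}/{day}/{uuid4().hex}.json.gz"

# object_key("events", datetime.now(timezone.utc))
# -> "events/<YYYY>/<MM>/<DD>/<random hex>.json.gz"
```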

The partition key cannot currently be changed.

Configuration

Users can control how events are uploaded as objects to S3.

Common options

  • Bucket name: Name of the S3 bucket to upload to
  • AWS region: AWS region the S3 bucket is located in
  • Key prefix: Prefix to use for object key names. The full key name will be <key prefix>/YYYY/MM/DD/<file name>.json.gz (the .gz suffix applies when compression is enabled)
  • Compression: Whether to compress files. Options are none or gzip; the default is gzip (see the sketch after this list)
  • Role ARN: ARN of the AWS IAM role to assume when writing to this bucket
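To make these options concrete, the following is a hedged sketch of an equivalent upload using boto3. It assumes events are serialized as newline-delimited JSON; the function and variable names are illustrative, not Streamfold's implementation:

```python
import gzip
import json

import boto3

def upload_batch(events, bucket: str, region: str, key: str) -> None:
    # JSON-encode the batch (one event per line is assumed here),
    # then gzip it, matching the default compression option.
    body = gzip.compress(
        "\n".join(json.dumps(event) for event in events).encode("utf-8")
    )
    # Write the compressed batch to the configured bucket and key,
    # e.g. "<key prefix>/YYYY/MM/DD/<file name>.json.gz".
    s3 = boto3.client("s3", region_name=region)
    s3.put_object(Bucket=bucket, Key=key, Body=body)
```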

Advanced options

  • Batch max idle time: Maximum time a staged file can go without new entries before it is uploaded
  • Batch max open time: Maximum time a staged file can remain open before it is uploaded (both limits are illustrated in the sketch below)
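As a rough sketch of how these two timers could interact when deciding whether to flush a staged file; the class and its structure are assumptions for illustration, not Streamfold internals:

```python
import time

class StagedFile:
    def __init__(self, max_idle_s: float, max_open_s: float):
        self.max_idle_s = max_idle_s       # Batch max idle time
        self.max_open_s = max_open_s       # Batch max open time
        self.opened_at = time.monotonic()  # when the file was created
        self.last_write = self.opened_at   # when an entry last arrived

    def record_write(self) -> None:
        # Call on every appended entry to reset the idle timer.
        self.last_write = time.monotonic()

    def should_upload(self) -> bool:
        now = time.monotonic()
        idle_expired = now - self.last_write >= self.max_idle_s
        open_expired = now - self.opened_at >= self.max_open_s
        # Upload when either limit is hit; the batch-size trigger
        # mentioned in the Overview is omitted here for brevity.
        return idle_expired or open_expired
```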

Authentication

Currently, the only supported authentication method is assume role.

Assume role

Streamfold uses the Assume Role authentication method to gain permission to write to the S3 bucket. When setting up a new S3 destination, follow the guide in the UI to create a role that Streamfold can assume.

The use of an External ID when assuming a role is not supported at the moment.
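For reference, assuming a role with the AWS SDK looks roughly like the sketch below. The role ARN and session name are placeholders, and no ExternalId is passed, in line with the limitation noted above:

```python
import boto3

# Exchange the configured Role ARN for temporary credentials.
sts = boto3.client("sts")
resp = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/streamfold-s3-writer",  # placeholder
    RoleSessionName="streamfold-s3-upload",  # placeholder
    # Note: no ExternalId, per the limitation above.
)
creds = resp["Credentials"]

# Use the temporary credentials to write to the bucket.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```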
