S3

OpenMeter can ingest data from any S3-compatible object store, making it easier to integrate with existing data pipelines. Popular S3-compatible object stores include Amazon S3, MinIO, and Cloudflare R2.

This guide will show you how to collect data from an S3-compatible object store and ingest it into OpenMeter.

Prerequisites

There are several strategies for collecting data from an S3-compatible object store, and all of them depend on how the data is stored. Since we cannot cover every possible scenario, we will provide a solution for the most common one: batches of data indexed by timestamps.
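
A bucket laid out like the following fits this approach; the bucket, prefixes, and object names below are hypothetical, with each prefix being an hourly Unix timestamp:

my-bucket/
  1704110400/
    events-0001.json
    events-0002.json
  1704114000/
    events-0001.json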

If your data is not structured that way, check out the aws_s3 Benthos input documentation for more options.

Configuration

First, create a new YAML file for the collector configuration. You will have to use the aws_s3 Benthos input:

input:
  aws_s3:
    bucket: my-bucket
    region: us-east-1
    prefix: ${!timestamp_unix().ts_round("1h".parse_duration()).ts_unix()}/

The section above tells Benthos to read data from your S3 bucket in the specified region and to look for objects under the given prefix (an hourly Unix timestamp). You will have to run the collector as a cron job every hour to ingest the data into OpenMeter (see the crontab sketch below).
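
For example, around 13:00 UTC on January 1, 2024 the interpolation resolves to 1704114000/, so the collector lists objects under s3://my-bucket/1704114000/ (the timestamp and bucket name are illustrative).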

Feel free to tweak the prefix to suit your needs.
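
A minimal crontab sketch for running the collector hourly, assuming the Benthos binary lives at /usr/local/bin/benthos and the configuration above is saved as /etc/openmeter/s3.yaml (both paths are assumptions):

# Run the S3 collector at the top of every hour
0 * * * * /usr/local/bin/benthos -c /etc/openmeter/s3.yaml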

Next, you need to configure the mapping from your schema to CloudEvents using Bloblang:

pipeline:
  processors:
    - mapping: |
        root = {
          "id": this.id,
          "specversion": "1.0",
          "type": "your-usage-event-type",
          "source": "s3",
          "time": this.time,
          "subject": this.subject_field,
          "data": {
            "data": this.data_field,
          },
        }
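
To make the mapping concrete, here is a hypothetical message and the CloudEvent it produces; the field names follow the mapping above, while the values and event type are illustrative.

A message read from the bucket (assuming one JSON document per message):

{"id": "evt-123", "time": "2024-01-01T13:00:00Z", "subject_field": "customer-1", "data_field": 42}

The resulting CloudEvent sent to OpenMeter:

{
  "id": "evt-123",
  "specversion": "1.0",
  "type": "your-usage-event-type",
  "source": "s3",
  "time": "2024-01-01T13:00:00Z",
  "subject": "customer-1",
  "data": {
    "data": 42
  }
}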

Finally, you need to configure the OpenMeter output:

output:
  openmeter:
    url: https://openmeter.cloud # optional
    token: '<YOUR OPENMETER CLOUD TOKEN>'
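
If you prefer not to store the token in the configuration file, Benthos supports environment variable interpolation; a sketch assuming the token is exported as OPENMETER_TOKEN:

output:
  openmeter:
    url: https://openmeter.cloud # optional
    token: '${OPENMETER_TOKEN}'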

Read more about configuring Benthos in the Benthos collector guide.

Installation

Check out the Benthos collector guide for installation instructions.