Usage Metering
Ingestion Options
AWS SQS
Write events directly to an Amberflo-supplied and secured AWS SQS queue.

Amberflo provides you with an AWS SQS queue (FIFO) with access rights and controls for writing meters. Amberflo automatically picks up the meters for processing as they arrive in the queue. Please contact us to get the SQS queue and S3 bucket provisioned for your account.

Format

The meter records you send to the SQS queue should use the same standardized format as accepted by the Ingest Meter Records API. Here is an example:

```json
[{
  "customerId": "customer-123",
  "meterApiName": "ComputeHours",
  "meterValue": 5,
  "meterTimeInMillis": 1619445706909,
  "dimensions": {
    "region": "us-west-2",
    "az": "az1"
  }
}]
```

We also support the NDJSON format (https://github.com/ndjson/ndjson-spec), i.e. JSON objects separated by newlines:

```json
{ "customerId": "customer-123", "meterApiName": "ComputeHours", "meterValue": 5, "meterTimeInMillis": 1619445706909 }
{ "customerId": "customer-321", "meterApiName": "ComputeHours", "meterValue": 4, "meterTimeInMillis": 1619445712341 }
{ "customerId": "customer-123", "meterApiName": "ComputeHours", "meterValue": 1, "meterTimeInMillis": 1619445783456 }
```

Code example

```python
import json
from uuid import uuid1

import boto3

# A batch of meter records in the standard ingest format.
records_to_send = [{
    'customerId': 'customer-123',
    'meterApiName': 'ComputeHours',
    'meterValue': 5,
    'meterTimeInMillis': 1619445706909,
    'dimensions': {
        'region': 'us-west-2',
        'az': 'az1'
    }
}]

# The FIFO queue URL provisioned for your account by Amberflo.
queue_url = 'https://sqs.us-west-2.amazonaws.com/624335419252/62-ingest.fifo'

sqs = boto3.client('sqs')
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps(records_to_send),
    # FIFO queues require a message group ID and a deduplication ID.
    MessageGroupId=str(uuid1()),
    MessageDeduplicationId=str(uuid1()),
)
```

Troubleshooting

The records sent to the queue are saved in an S3 bucket before being processed (see the AWS S3 ingestion documentation), so troubleshooting follows the same pattern: a file will be created in S3 with the failure reason. We also set up a dead-letter queue in case there are issues writing to the S3 bucket.
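For the NDJSON variant, each record is serialized as one JSON object per line and the resulting string is sent as the SQS message body. A minimal sketch of building such a payload with the standard library (the `to_ndjson` helper and the sample records are illustrative, not part of any Amberflo SDK):

```python
import json

def to_ndjson(records):
    """Serialize a list of meter records as NDJSON: one JSON object per line."""
    return "\n".join(json.dumps(r, separators=(",", ":")) for r in records)

records = [
    {"customerId": "customer-123", "meterApiName": "ComputeHours",
     "meterValue": 5, "meterTimeInMillis": 1619445706909},
    {"customerId": "customer-321", "meterApiName": "ComputeHours",
     "meterValue": 4, "meterTimeInMillis": 1619445712341},
]

body = to_ndjson(records)

# Sanity check: every line must be an independent, valid JSON document.
for line in body.splitlines():
    json.loads(line)

# The string can then be passed as MessageBody in place of the JSON-array
# payload, e.g.:
#   sqs.send_message(QueueUrl=queue_url, MessageBody=body, ...)
```

This is handy when records are produced one at a time, since lines can be appended to the batch without re-serializing the whole array.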