Getting Started › Amberflo SDKs

# Python
https://github.com/amberflo/metering-python

## Features

- Add and update customers
- Assign and update product plans to customers
- Send meter events
  - in asynchronous batches for high throughput (with optional flush on demand), or synchronously
  - using the Amberflo API or the Amberflo-supplied AWS S3 bucket
- Query usage
- Fine-grained logging control

## Quick start

### 1. Sign up for free and get an API key

Sign up at https://ui.amberflo.io/ and get an API key.

### 2. Install the SDK

```shell
pip install amberflo-metering-python
```

### 3. Create a customer

```python
import os
from metering.customer import CustomerApiClient, create_customer_payload

client = CustomerApiClient(os.environ["API_KEY"])

message = create_customer_payload(
    customer_id="sample-customer-123",
    customer_name="Sample Customer",
)
customer = client.add_or_update(message)
```

### 4. Ingest meter events

```python
import os
from time import time
from metering.ingest import create_ingest_client

client = create_ingest_client(api_key=os.environ["API_KEY"])

dimensions = {"region": "us-east-1"}
customer_id = "sample-customer-123"

client.meter(
    meter_api_name="sample-meter",
    meter_value=5,
    meter_time_in_millis=int(time() * 1000),
    customer_id=customer_id,
    dimensions=dimensions,
)
```

### 5. Query usage

```python
import os
from time import time
from metering.usage import (AggregationType, Take, TimeGroupingInterval,
                            TimeRange, UsageApiClient, create_usage_query)

client = UsageApiClient(os.environ.get("API_KEY"))

since_two_days_ago = TimeRange(int(time()) - 60 * 60 * 24 * 2)

query = create_usage_query(
    meter_api_name="my_meter",
    aggregation=AggregationType.SUM,
    time_grouping_interval=TimeGroupingInterval.DAY,
    time_range=since_two_days_ago,
    group_by=["customerId"],
    usage_filter={"customerId": ["some-customer-321", "sample-customer-123"]},
    take=Take(limit=10, is_ascending=False),
)

report = client.get(query)
```
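The quick start's ingest step passes `meter_time_in_millis` as milliseconds since the Unix epoch. If your events carry `datetime` timestamps rather than `time.time()` values, a small conversion helper avoids off-by-1000 mistakes. `to_millis` below is an illustrative helper, not part of the SDK:

```python
from datetime import datetime, timezone

def to_millis(dt: datetime) -> int:
    # The ingest API expects event time as whole milliseconds since the epoch.
    return int(dt.timestamp() * 1000)

# A fixed, timezone-aware UTC timestamp keeps the result deterministic.
event_time = datetime(2023, 5, 1, 12, 0, 0, tzinfo=timezone.utc)
print(to_millis(event_time))  # 1682942400000
```

You would then pass the result as `meter_time_in_millis=to_millis(event_time)` when calling `client.meter(...)`.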
## High throughput ingestion

Amberflo.io libraries are built to support high-throughput environments. That means you can safely send hundreds of meter records per second. For example, you can choose to deploy it on a web server that is serving hundreds of requests per second.

However, every call does not result in an HTTP request: records are queued in memory instead. Messages are batched and flushed in the background, allowing for much faster operation. The size of the batches and the rate of flushing can be customized.

### Flush on demand

At the end of your program, you'll want to flush the queue to make sure there's nothing left in it. Calling `client.flush()` blocks the calling thread until there are no messages left in the queue, so use it as part of your cleanup scripts and avoid using it as part of the request lifecycle.

### Error handling

The SDK allows you to set up an `on_error` callback function for handling errors when trying to send a batch.

Here is a complete example, showing the default values of all options:

```python
def on_error_callback(error, batch):
    ...

client = create_ingest_client(
    api_key=API_KEY,
    max_queue_size=100000,       # max number of items in the queue before rejecting new items
    threads=2,                   # number of worker threads doing the sending
    retries=2,                   # max number of retries after failures
    batch_size=100,              # max number of meter records in a batch
    send_interval_in_secs=0.5,   # wait time before sending an incomplete batch
    sleep_interval_in_secs=0.1,  # wait time after failure to send or queue empty
    on_error=on_error_callback,  # handle failures to send a batch
)

client.meter(...)

client.flush()  # block and make sure all messages are sent
```
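A concrete `on_error` callback might log the failure and keep the rejected records for later replay. This is a minimal sketch assuming only the `(error, batch)` signature shown above; the logger name and the replay list are illustrative, not part of the SDK:

```python
import logging

logger = logging.getLogger("my_app.metering")  # illustrative logger name

# Batches the SDK gave up on, kept around for inspection or later replay.
failed_batches = []

def on_error_callback(error, batch):
    # Invoked by the SDK's background worker once a batch has exhausted its
    # retries; `batch` holds the meter records that were not sent.
    logger.error("failed to send %d meter records: %s", len(batch), error)
    failed_batches.append(batch)
```

Because the callback runs on a background worker thread, keep it fast and non-blocking; defer any heavyweight recovery to a separate process or a later cleanup step.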
### What happens if there are just too many messages?

If the module detects that it can't flush faster than it's receiving messages, it simply stops accepting new messages. This allows your program to keep running without ever crashing due to a backed-up metering queue.

## Ingesting through the S3 bucket

The SDK provides a `metering.ingest.IngestS3Client` so you can send your meter records to Amberflo via the S3 bucket. This feature is enabled if you install the library with the `s3` option:

```shell
pip install amberflo-metering-python[s3]
```

Just pass the S3 bucket credentials to the factory function:

```python
import os
from metering.ingest import create_ingest_client

client = create_ingest_client(
    bucket_name=os.environ.get("BUCKET_NAME"),
    access_key=os.environ.get("ACCESS_KEY"),
    secret_key=os.environ.get("SECRET_KEY"),
)
```

## Documentation

General documentation on how to use Amberflo is available in the Onboarding Walkthrough. The full REST API documentation is available in the Getting Started guide.

## Samples

Code samples covering different scenarios are available in the [samples folder](https://github.com/amberflo/metering-python/blob/main/samples/README.md) of the repository.

## Reference

### API clients

Ingest meter records:

```python
from metering.ingest import (
    create_ingest_payload,
    create_ingest_client,
)
```

Create a customer:

```python
from metering.customer import (
    CustomerApiClient,
    create_customer_payload,
)
```

Query the usage data:

```python
from metering.usage import (
    AggregationType,
    Take,
    TimeGroupingInterval,
    TimeRange,
    UsageApiClient,
    create_usage_query,
    create_all_usage_query,
)
```

Create a session:

```python
from metering.customer_portal_session import (
    CustomerPortalSessionApiClient,
    create_customer_portal_session_payload,
)
```

Order a one-time or a recurring prepaid card by price:

```python
from metering.customer_prepaid_order import (
    BillingPeriod,
    BillingPeriodUnit,
    CustomerPrepaidOrderApiClient,
    create_customer_prepaid_order_payload,
)
```

Get an invoice:

```python
from metering.customer_product_invoice import (
    CustomerProductInvoiceApiClient,
    create_all_invoices_query,
    create_latest_invoice_query,
    create_invoice_query,
)
```

Assign a product plan to a customer:

```python
from metering.customer_product_plan import (
    CustomerProductPlanApiClient,
    create_customer_product_plan_payload,
)
```

### Exceptions

```python
from metering.exceptions import ApiError
```

## Logging

`amberflo-metering-python` uses the standard Python logging framework. By default, logging is enabled and set at the WARNING level.

The following loggers are used:

- `metering.ingest.producer`
- `metering.ingest.s3_client`
- `metering.ingest.consumer`
- `metering.session.ingest_session`
- `metering.session.api_session`
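Because the SDK uses the standard logging framework, you can tune verbosity per logger rather than globally. The dotted names below are reconstructed from the logger list above as Python module paths, so treat the exact spelling as an assumption; a minimal sketch:

```python
import logging

# Keep the SDK quiet overall, but surface full detail from the background
# consumer that sends the batches, and informational output from the API session.
logging.basicConfig(level=logging.WARNING)
logging.getLogger("metering.ingest.consumer").setLevel(logging.DEBUG)
logging.getLogger("metering.session.api_session").setLevel(logging.INFO)
```

Loggers you don't configure explicitly inherit the root level (WARNING here), so this raises verbosity only where you need it.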