# Amberflo Python SDK
https://github.com/amberflo/metering-python

## Features

- Add and update customers
- Assign or update product plans for customers
- Send meter events:
  - in asynchronous batches (high throughput),
  - synchronously via Amberflo's API,
  - or through an AWS S3 bucket
- Query usage data
- Fine-grained logging and error handling

## Quick start

1. Go to **Settings** and copy your API key.

2. Install the SDK:

```sh
pip install amberflo-metering-python
```

3. Create a customer. A minimal sketch using the `metering.customer` client (the exact payload fields and method name are assumptions; see the reference section below):

```python
import os
from metering.customer import CustomerApiClient, create_customer_payload

client = CustomerApiClient(os.environ.get("API_KEY"))

message = create_customer_payload(
    customer_id="sample-customer-123",
    customer_name="Sample Customer 123",
    traits={"region": "us-east-1"},
)

# add_or_update is assumed here as the call that creates or updates the customer
customer = client.add_or_update(message)
```

4. Ingest meter events:

```python
import os
from time import time
from metering.ingest import create_ingest_client

client = create_ingest_client(api_key=os.environ["API_KEY"])

dimensions = {"region": "us-east-1"}
customer_id = "sample-customer-123"

client.meter(
    meter_api_name="sample-meter",
    meter_value=5,
    meter_time_in_millis=int(time() * 1000),
    customer_id=customer_id,
    dimensions=dimensions,
)
```

5. Query usage:

```python
import os
from time import time
from metering.usage import (
    AggregationType,
    Take,
    TimeGroupingInterval,
    TimeRange,
    UsageApiClient,
    create_usage_query,
)

client = UsageApiClient(os.environ.get("API_KEY"))

since_two_days_ago = TimeRange(int(time()) - 60 * 60 * 24 * 2)

query = create_usage_query(
    meter_api_name="my-meter",
    aggregation=AggregationType.SUM,
    time_grouping_interval=TimeGroupingInterval.DAY,
    time_range=since_two_days_ago,
    group_by=["customerId"],
    usage_filter={"customerId": ["some-customer-321", "sample-customer-123"]},
    take=Take(limit=10, is_ascending=False),
)

report = client.get(query)
```

## High-throughput ingestion

The Amberflo Python SDK is designed for high-throughput environments: you can safely send hundreds of meter records per second. For example, it works well in a web server that handles hundreds of requests each second.

Each meter event is queued in memory rather than sent immediately. Events are batched and flushed in the background to improve performance, and both the batch size and the flush rate can be customized.

### Flush on demand

At the end of your program, you may want to flush the queue to ensure that all events are sent. This is a blocking call that waits until the queue is empty, so it is best used in cleanup scripts and should be avoided during regular request handling.

### Error handling

You can define an `on_error` callback to handle errors that occur during batch sends. Here is a complete example, showing the default values of all options:

```python
def on_error_callback(error, batch):
    ...


client = create_ingest_client(
    api_key=api_key,
    max_queue_size=100000,  # max number of items in the queue before rejecting new items
    threads=2,  # number of worker threads doing the sending
    retries=2,  # max number of retries after failures
    batch_size=100,  # max number of meter records in a batch
    send_interval_in_secs=0.5,  # wait time before sending an incomplete batch
    sleep_interval_in_secs=0.1,  # wait time after failure to send or queue empty
    on_error=on_error_callback,  # handle failures to send a batch
)

client.meter(...)

client.flush()  # block and make sure all messages are sent
```

### Handling message overload

If the SDK detects that it cannot flush messages as quickly as they are being added, it will stop accepting new messages. This prevents your program from crashing due to a backed-up metering queue and allows it to continue running smoothly.
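Putting the pieces above together, the following is a minimal sketch of the intended usage pattern: one shared client created at startup, meter events queued per request, and a flush registered for process exit. The `handle_request` function and its arguments are illustrative, not part of the SDK.

```python
import atexit
import os
from time import time

from metering.ingest import create_ingest_client

# One shared client for the whole process; it batches and flushes in the background.
client = create_ingest_client(api_key=os.environ["API_KEY"])

# Flush any queued meter records at shutdown (blocking, so keep it out of request handling).
atexit.register(client.flush)


def handle_request(customer_id, units):
    # Illustrative request handler: queue a meter event and return immediately.
    client.meter(
        meter_api_name="sample-meter",
        meter_value=units,
        meter_time_in_millis=int(time() * 1000),
        customer_id=customer_id,
        dimensions={"region": "us-east-1"},
    )
```

Because `flush` blocks until the queue is empty, registering it with `atexit` keeps it out of the request path while still ensuring no events are lost when the process exits.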
## Ingesting through the S3 bucket

The SDK includes a `metering.ingest.IngestS3Client` that allows you to send meter records via an AWS S3 bucket.

To enable this feature, install the SDK with the `s3` option:

```sh
pip install amberflo-metering-python[s3]
```

Then, pass your S3 bucket credentials to the factory function to send records:

```python
client = create_ingest_client(
    bucket_name=os.environ.get("BUCKET_NAME"),
    access_key=os.environ.get("ACCESS_KEY"),
    secret_key=os.environ.get("SECRET_KEY"),
)
```

## Documentation

General documentation on how to use Amberflo is available in the Onboarding Walkthrough. The full REST API documentation is available in Getting Started.

## Samples

Code samples covering different scenarios are available [here](https://github.com/amberflo/metering-python/blob/main/samples/README.md).

## Reference

### API clients

#### Ingest meter records

```python
from metering.ingest import (
    create_ingest_payload,
    create_ingest_client,
)
```

#### Create a customer

```python
from metering.customer import (
    CustomerApiClient,
    create_customer_payload,
)
```

#### Query the usage data

```python
from metering.usage import (
    AggregationType,
    Take,
    TimeGroupingInterval,
    TimeRange,
    UsageApiClient,
    create_usage_query,
    create_all_usage_query,
)
```

#### Create a session

```python
from metering.customer_portal_session import (
    CustomerPortalSessionApiClient,
    create_customer_portal_session_payload,
)
```

#### Order a one-time or a recurring prepaid card by price

```python
from metering.customer_prepaid_order import (
    BillingPeriod,
    BillingPeriodUnit,
    CustomerPrepaidOrderApiClient,
    create_customer_prepaid_order_payload,
)
```

#### Get an invoice

```python
from metering.customer_product_invoice import (
    CustomerProductInvoiceApiClient,
    create_all_invoices_query,
    create_latest_invoice_query,
    create_invoice_query,
)
```

#### Assign a product plan to a customer

```python
from metering.customer_product_plan import (
    CustomerProductPlanApiClient,
    create_customer_product_plan_payload,
)
```

### Exceptions

```python
from metering.exceptions import ApiError
```

## Logging

`amberflo-metering-python` uses the standard Python logging framework. By default, logging is enabled and set at the WARNING level.

The following loggers are used:

- `metering.ingest.producer`
- `metering.ingest.s3_client`
- `metering.ingest.consumer`
- `metering.session.ingest_session`
- `metering.session.api_session`
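If you need more or less detail from a particular component, you can tune these loggers with the standard `logging` module. A minimal sketch, assuming the logger names listed above:

```python
import logging

# Show SDK log output on the console at the default WARNING level.
logging.basicConfig(level=logging.WARNING)

# Example: turn on verbose output for the background ingestion workers only.
logging.getLogger("metering.ingest.producer").setLevel(logging.DEBUG)
logging.getLogger("metering.ingest.consumer").setLevel(logging.DEBUG)
```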