# Elastic Logstash
## Meter Ingestion via Logstash

Metering data can be integrated into Amberflo using open-source Logstash. The integration uses the Logstash S3 output plugin to write metering records to the Amberflo S3 bucket, and you can use Logstash's rich filter language to parse the metering data out of your logs and repositories. Logstash can extract data from files, Elasticsearch, JDBC, S3, MongoDB, and other systems using its various [input plugins](https://www.elastic.co/guide/en/logstash/current/input-plugins.html).

## Example Logstash Configuration

The example below shows a sample Logstash configuration that reads log lines from a file and writes the relevant metering records to Amberflo's S3 bucket. We tail new files locally from `/Users/demoaccount/input/`. The sample file format:

```
log line 1
log line 2
myMeter 2 myCustomerId amberflo_meter
log line 3
```

Logstash reads any new file and transforms its lines into metering records using the mutate logic. In this example we ignore any line that does not contain the string `amberflo_meter`. We split the line on the space character `" "` and treat `meterApiName`, `meterValue`, and `customerId` as the first, second, and third tokens respectively:

```
input {
  file {
    path => "/Users/demoaccount/input/*"
  }
}
filter {
  if "amberflo_meter" in [message] {
    ruby {
      code => "event.set('time', ((event.get('@timestamp').to_f * 1000).to_i).to_s)"
    }
    mutate {
      split => { "message" => " " }
      add_field => { "meterApiName" => "%{[message][0]}" }
      add_field => { "meterValue" => "%{[message][1]}" }
      add_field => { "customerId" => "%{[message][2]}" }
      remove_field => ["host","message","@version","@timestamp"]
    }
  } else {
    drop {}
  }
}
output {
  s3 {
    access_key_id => "xxxxxx"
    secret_access_key => "yyyyyyyyyyyyyy"
    region => "us-west-2"
    bucket => "demo-amberflo"
    size_file => 2048
    time_file => 5
    codec => "json"
    canned_acl => "bucket-owner-full-control"
  }
}
```

The resulting file (meter record) will be:

```json
{
  "meterApiName": "myMeter",
  "meterValue": "2",
  "customerId": "myCustomerId",
  "time": "1621619742810",
  "path": "/Users/demoaccount/input/sample.txt"
}
```

## JDBC Source

You can use the Logstash [JDBC input](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html) to extract the metering info from your repository:

```
input {
  jdbc {
    statement => "SELECT id, mycolumn1, mycolumn2 FROM my_table WHERE id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    schedule => "* * * * *"
    # other configuration bits
  }
}
```

## Elasticsearch Input

You can use the Elasticsearch input if you have metering data in your logs. An example input plugin looks like:

```
input {
  elasticsearch {
    hosts => "search-myes-cluster.us-west-2.es.amazonaws.com:443"
    index => "cwl-aws-lambda-meter-definition-api-lambda-2021-07"
    query => '{
      "query": {
        "bool": {
          "filter": [
            {
              "range": {
                "@timestamp": {
                  "from": "now-1d/d",
                  "to": "now/d",
                  "include_lower": true,
                  "include_upper": true,
                  "format": "epoch_millis",
                  "boost": 1
                }
              }
            },
            {
              "query_string": {
                "query": "*amberflo_meter*",
                "default_field": "@message",
                "fields": [],
                "type": "best_fields",
                "default_operator": "or",
                "max_determinized_states": 10000,
                "enable_position_increments": true,
                "fuzziness": "AUTO",
                "fuzzy_prefix_length": 0,
                "fuzzy_max_expansions": 50,
                "phrase_slop": 0,
                "escape": false,
                "auto_generate_synonyms_phrase_query": true,
                "fuzzy_transpositions": true,
                "boost": 1
              }
            }
          ],
          "adjust_pure_negative": true,
          "boost": 1
        }
      },
      "aggregations": {}
    }'
    size => 500
    scroll => "5m"
    docinfo => true
    ssl => true
    user => "myuser"
    password => "mypwd"
    #schedule => "*/1 * * * *"
  }
}
```