AI - GOVERNANCE & CONTROL
AI Gateway Integration
I have an AI Gateway
Amberflo integrates with third-party AI gateways. Gateways act as the control point for all model traffic, capturing the metadata needed to understand usage, attribute costs, and govern AI workloads. When connected to Amberflo, a gateway becomes significantly more powerful: its raw traffic is transformed into structured, attributed, real-time usage data that unlocks budgets, guardrails, attribution, multi-source aggregation, and AI governance across your organization.

Amberflo supports connecting multiple gateways to a single Amberflo account. You can run different gateways for different teams, environments, or use cases, and all usage is unified inside Amberflo.

## Supported AI gateways

- LiteLLM
- Coming soon: Cloudflare, Apigee, Kong Gateway

This guide covers how to connect LiteLLM when it is already deployed in your environment. If you do not already have LiteLLM running, or if your gateway is not currently supported, follow the guide below to deploy LiteLLM from scratch and integrate it with Amberflo:

- Quickstart: Deploy an AI Gateway and connect it to Amberflo

## Enable the Amberflo integration for an existing LiteLLM deployment

This guide assumes you already have LiteLLM deployed and operational. You will:

1. Install the Amberflo callback.
2. Add a configuration entry to enable it.
3. Update your deployment command.
4. Redeploy and verify that usage flows into Amberflo.

## Overview

LiteLLM handles and proxies LLM requests. Amberflo provides real-time usage metering, attribution, and cost governance. The Amberflo callback attaches to LiteLLM and pushes metering events directly to the Amberflo ingestion API. Once enabled, model usage appears immediately in the Amberflo AI Control Tower.

## Prerequisites

- A running LiteLLM deployment (Docker, Kubernetes, VM, etc.)
- A Postgres database used by LiteLLM, required for teams, virtual keys, organizations, and routing rules
- The Amberflo environment file (`.env`)
- Your existing LiteLLM configuration file (`litellm_config.yaml`)

Note: LiteLLM can technically run without Postgres, but many required features (virtual keys, teams, routing rules) will not function. Postgres is required for the Amberflo integration.

## Download the Amberflo callback package

Amberflo provides a package that collects the necessary metadata and sends it to Amberflo.

1. Download `amberflo.zip` from Amberflo.
2. Unzip it into the same directory as your LiteLLM config file.

Your directory should look like this:

```
your-directory/
├── amberflo/
│   ├── __init__.py
│   └── (other Amberflo callback files)
├── litellm_config.yaml
└── .env (provided by Amberflo)
```

Note: Do not modify any callback source files.

## Add the Amberflo callback to litellm_config.yaml

Edit your LiteLLM config and add:

```yaml
litellm_settings:
  callbacks:
    - "amberflo.litellm_callback"
```

If you already have other callbacks, add `"amberflo.litellm_callback"` as an additional entry. This tells LiteLLM to load the callback code from the `amberflo/` directory.

## Add the Amberflo environment file

Use the `.env` file provided by Amberflo. It contains:

- Your Amberflo API key
- The Amberflo ingest endpoint
- The customer account identifier
- Batch size, retry settings, and other callback-related variables

Place the `.env` file in the same directory where you run your Docker command.

Note: Do not commit this file to Git.

## Update your deployment command (Docker example)

Modify your existing Docker command to include:

- `--env-file .env`
- A volume mount for the Amberflo callback directory
- A volume mount for your LiteLLM config file

Example:

```bash
docker run \
  --env-file .env \
  --volume ./amberflo:/app/amberflo:ro \
  --volume ./litellm_config.yaml:/app/config.yaml:ro \
  --publish 4000:4000 \
  ghcr.io/berriai/litellm:v1.79.0-stable \
  --config /app/config.yaml
```
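If you run LiteLLM with Docker Compose rather than a raw `docker run` command, the same settings can be expressed as a Compose service. The sketch below is illustrative only: it mirrors the command above and assumes the same paths, port, and image tag, so adapt it to your existing Compose file.

```yaml
# Illustrative Compose sketch mirroring the docker run command above.
# The service name, paths, and image tag are assumptions -- match them to your deployment.
services:
  litellm:
    image: ghcr.io/berriai/litellm:v1.79.0-stable
    command: ["--config", "/app/config.yaml"]
    env_file:
      - .env                                      # Amberflo credentials and ingest settings
    ports:
      - "4000:4000"
    volumes:
      - ./amberflo:/app/amberflo:ro               # Amberflo callback package
      - ./litellm_config.yaml:/app/config.yaml:ro # LiteLLM config, mounted read-only
```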
Key points:

- `./amberflo` maps to `/app/amberflo` inside the container; LiteLLM automatically discovers callbacks in this folder.
- The config file is mounted read-only.
- The `.env` file supplies Amberflo credentials and ingest information.
- No additional installation is required.

## Redeploy the gateway

Stop your current LiteLLM container and redeploy using the updated command. Check the logs:

```bash
docker logs <container-id>
```

You should see log messages indicating that:

- The Amberflo callback was loaded
- LiteLLM connected to Postgres
- The environment variables were detected

## Test the integration

### Step 1: Call the gateway

Use a virtual key assigned to a team (a sketch for creating one appears at the end of this guide):

```bash
curl http://<vm-ip-address>:4000/chat/completions \
  -H "Authorization: Bearer {virtual_key}" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "{model_identifier}",
    "messages": [
      {"role": "user", "content": "How are tokens calculated?"}
    ]
  }'
```

### Step 2: Verify in Amberflo

Log in to Amberflo and check:

- The AI Spend dashboard
- Business Units
- Cost breakdown pages

Events should appear in near real time.

## Automatic Business Unit creation

Amberflo automatically creates a new Business Unit the first time a virtual key is used, mapping:

- LiteLLM team name → Business Unit name
- LiteLLM team ID → Business Unit ID

All future events for that key are attributed to that Business Unit. You can rename Business Units later if needed.
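If you do not yet have a team and virtual key to test with, you can create them through LiteLLM's management API using your master key. The sketch below is based on LiteLLM's `/team/new` and `/key/generate` endpoints; exact field names and responses can vary between LiteLLM versions, so check the LiteLLM proxy documentation for your release.

```bash
# Create a team; its name and ID become the Business Unit name and ID in Amberflo.
curl http://<vm-ip-address>:4000/team/new \
  -H "Authorization: Bearer {litellm_master_key}" \
  -H "Content-Type: application/json" \
  -d '{"team_alias": "example-team"}'

# Generate a virtual key for that team, using the team_id returned by the call above.
curl http://<vm-ip-address>:4000/key/generate \
  -H "Authorization: Bearer {litellm_master_key}" \
  -H "Content-Type: application/json" \
  -d '{"team_id": "{team_id_from_previous_response}"}'
```

Requests made with the generated key are then attributed to the corresponding Business Unit, as described above.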
