LiteLLM

LiteLLM's proxy exposes an OpenAI-compatible API, so you can call it from BAML with the openai-generic provider by overriding base_url to point at your proxy.

See OpenAI Generic for more details about parameters.

Setup

  1. Set up LiteLLM Proxy server

  2. Set up LiteLLM Client in BAML files

  3. Use it in a BAML function!
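For step 1, a minimal sketch of starting a LiteLLM proxy locally (the model name here is an assumption; by default the proxy listens on port 4000, matching the base_url below):

```shell
# Install LiteLLM with proxy support (assumes Python/pip are available)
pip install 'litellm[proxy]'

# Start the proxy for a chosen model; it serves an
# OpenAI-compatible API on http://0.0.0.0:4000 by default
litellm --model gpt-4o
```

See the LiteLLM documentation for production setups (config files, multiple models, virtual keys).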

BAML

```baml
client<llm> MyClient {
  provider "openai-generic"
  options {
    base_url "http://0.0.0.0:4000"
    api_key env.LITELLM_API_KEY
    model "gpt-4o"
  }
}
```
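For step 3, a minimal sketch of a BAML function that uses the client above (the function name, signature, and prompt are illustrative assumptions, not part of any required API):

```baml
// Hypothetical example function — any BAML function can
// reference MyClient to route requests through LiteLLM
function ExtractName(text: string) -> string {
  client MyClient
  prompt #"
    Extract the person's name from the following text:
    {{ text }}
  "#
}
```

Calling this function from your application sends the request to the LiteLLM proxy at base_url, which forwards it to the configured model.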