Clients are used to configure how LLMs are called, like so:

```baml
function MakeHaiku(topic: string) -> string {
  client "openai/gpt-4o"
  prompt #"
    Write a haiku about {{ topic }}.
  "#
}
```

This `<provider>/<model>` shorthand expands to:

```baml
client<llm> MyClient {
  provider "openai"
  options {
    model "gpt-4o"
    // api_key defaults to env.OPENAI_API_KEY
  }
}

function MakeHaiku(topic: string) -> string {
  client MyClient
  prompt #"
    Write a haiku about {{ topic }}.
  "#
}
```

Consult the provider documentation for a list of supported providers and models, and the default options.

If you want to override options such as `api_key` to read from a different environment variable, or to point `base_url` at a different endpoint, use the latter form.
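For example, a client that reads its key from a custom environment variable and targets a self-hosted, OpenAI-compatible endpoint might look like this (the environment variable name and proxy URL are placeholders):

```baml
client<llm> ProxiedClient {
  provider "openai-generic"
  options {
    model "gpt-4o"
    // Placeholder names: substitute your own env var and endpoint.
    api_key env.MY_OPENAI_KEY
    base_url "https://llm-proxy.example.com/v1"
  }
}
```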

If you want to choose which client to use at runtime from your Python/TS/Ruby code, you can use the client registry.

This can come in handy if you’re trying to, say, send 10% of your requests to a different model.
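A minimal sketch of that kind of traffic split, independent of any particular SDK (the client names here are hypothetical placeholders; in practice you would register the chosen name with the client registry before calling your BAML function):

```python
import random

def pick_client(experiment_fraction: float = 0.10) -> str:
    """Route roughly `experiment_fraction` of requests to an experimental client.

    The client names are hypothetical placeholders; pass the chosen name
    to your client registry before invoking the BAML function.
    """
    if random.random() < experiment_fraction:
        return "ExperimentalClient"
    return "DefaultClient"
```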

Fields

provider
string (required)

Specifies which provider to use. The provider is responsible for making the actual API calls to the LLM service.

This setting determines the URL and shape of the request the BAML runtime makes.

| Provider Name | Docs | Notes |
| --- | --- | --- |
| `anthropic` | Anthropic | |
| `aws-bedrock` | AWS Bedrock | |
| `azure-openai` | Azure OpenAI | |
| `google-ai` | Google AI | |
| `openai` | OpenAI | |
| `openai-generic` | OpenAI (generic) | Any model provider that supports an OpenAI-compatible API |
| `vertex-ai` | Vertex AI | |

We also have some special providers that allow composing clients together:

| Provider Name | Docs | Notes |
| --- | --- | --- |
| `fallback` | Fallback | Used to chain models, conditional on failures |
| `round-robin` | Round Robin | Used to load-balance requests across models |
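As a sketch (the component client names are placeholders assumed to be defined elsewhere in your BAML files), a fallback client tries each listed client in order on failure, while a round-robin client alternates between them:

```baml
client<llm> Resilient {
  provider fallback
  options {
    // Tried in order; PrimaryClient and BackupClient are placeholder names.
    strategy [PrimaryClient, BackupClient]
  }
}

client<llm> Balanced {
  provider round-robin
  options {
    strategy [ClientA, ClientB]
  }
}
```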
options
dict[str, Any] (required)

These vary per provider; see the provider-specific documentation for details. Generally, they are pass-through options added to the body of the POST request made to the LLM provider.
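For instance, with the `openai` provider you might forward sampling parameters directly; the option names below are standard OpenAI request-body fields, but check the provider docs for what each provider accepts:

```baml
client<llm> Tuned {
  provider "openai"
  options {
    model "gpt-4o"
    // Pass-through options included in the POST body; names are provider-specific.
    temperature 0.7
    max_tokens 256
  }
}
```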

retry_policy

The name of the retry policy. See Retry Policy.
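As a sketch of how the pieces fit together (the field names follow the retry-policy docs; the specific values here are illustrative):

```baml
retry_policy Exponential {
  max_retries 3
  strategy {
    type exponential_backoff
    delay_ms 200
    multiplier 1.5
  }
}

client<llm> MyClient {
  provider "openai"
  retry_policy Exponential
  options {
    model "gpt-4o"
  }
}
```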