Switching LLMs
BAML supports getting structured output from all major providers, as well as any OpenAI-API-compatible open-source model. See the LLM Providers Reference for how to set each one up.
BAML can get structured output from any open-source model, with better performance than other techniques, even when the model isn't officially supported via a tool-use API (like o1-preview) or fine-tuned for it! Read more about how BAML does this.
Using client "<provider>/<model>"
Using openai/model-name or anthropic/model-name assumes you have the OPENAI_API_KEY or ANTHROPIC_API_KEY environment variable set, respectively.
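For example, a minimal BAML function using the shorthand syntax (the function name and prompt are illustrative):

```baml
function WriteHaiku(topic: string) -> string {
  // Shorthand: provider and model in a single string. The API key
  // is read from the OPENAI_API_KEY environment variable.
  client "openai/gpt-4o"
  prompt #"
    Write a haiku about {{ topic }}.
  "#
}
```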
Using a named client
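A named client is declared once and referenced by name, which lets you pin options like the model and temperature in one place. A minimal sketch (the client name, model, and option values are illustrative):

```baml
client<llm> MyClient {
  provider anthropic
  options {
    model "claude-3-5-sonnet-20240620"
    temperature 0.5
  }
}

function WriteHaiku(topic: string) -> string {
  // Reference the named client instead of a "<provider>/<model>" string.
  client MyClient
  prompt #"
    Write a haiku about {{ topic }}.
  "#
}
```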
Consult the provider documentation for a list of supported providers and models, the default options, and how to set retry policies.
Dynamically switching LLMs at runtime
If you want to specify which client to use at runtime in your Python/TS/Ruby code, you can use the client registry. This comes in handy if, say, you want to send 10% of your requests to a different model.
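A minimal Python sketch using baml_py's ClientRegistry, assuming a generated client named b that exposes the WriteHaiku function from the examples above; the 10% routing logic is illustrative:

```python
import random

from baml_py import ClientRegistry
from baml_client import b  # generated client; module name is project-specific

def make_registry() -> ClientRegistry:
    cr = ClientRegistry()
    # Register a client at runtime; options mirror a named client's
    # options block in your .baml files.
    cr.add_llm_client(
        name="ExperimentClient",
        provider="anthropic",
        options={"model": "claude-3-5-sonnet-20240620"},
    )
    # Send ~10% of requests to the experimental model. If set_primary
    # is never called, the client declared in the BAML function is used.
    if random.random() < 0.1:
        cr.set_primary("ExperimentClient")
    return cr

haiku = b.WriteHaiku("autumn", baml_options={"client_registry": make_registry()})
```

Because the registry is passed per call, you can make the routing decision wherever it fits best: per request, per user, or per deployment.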