The openai-generic provider supports any API that uses OpenAI’s request and response formats, including Groq, Hugging Face, Ollama, OpenRouter, and Together AI.

Example:

```baml
client<llm> MyClient {
  provider "openai-generic"
  options {
    base_url "https://api.provider.com"
    model "<provider-specified-format>"
  }
}
```

Non-forwarded options

base_url
string

The base URL for the API.

Default: https://api.openai.com/v1
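
For example, to point a client at a local Ollama server, you can override `base_url` (this sketch assumes Ollama’s default OpenAI-compatible endpoint, `http://localhost:11434/v1`, and the client name is hypothetical):

```baml
// Hypothetical client targeting a local Ollama instance.
// http://localhost:11434/v1 is Ollama's default OpenAI-compatible endpoint.
client<llm> MyLocalClient {
  provider "openai-generic"
  options {
    base_url "http://localhost:11434/v1"
    model "llama2"
  }
}
```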

default_role
string

The default role for any prompts that don’t specify a role.

We don’t do any validation of this field, so you can pass any string you wish.

Default: system
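
Some APIs reject the `system` role, so you may want a different default. A minimal sketch (the client name and endpoint are placeholders):

```baml
client<llm> MyClient {
  provider "openai-generic"
  options {
    base_url "https://api.provider.com"
    model "<provider-specified-format>"
    // Prompt lines without an explicit role will be sent as "user" messages.
    default_role "user"
  }
}
```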

headers
object

Additional headers to send with the request.

Example:

```baml
client<llm> MyClient {
  provider "openai-generic"
  options {
    base_url "https://api.provider.com"
    model "<provider-specified-format>"
    headers {
      "X-My-Header" "my-value"
    }
  }
}
```

Forwarded options

messages
DO NOT USE

BAML will auto-construct this field for you from the prompt.

stream
DO NOT USE

BAML will auto-construct this field for you based on how you call the client in your code.

model
string

The model to use.

For OpenAI, this might be "gpt-4o-mini"; for Ollama, this might be "llama2". The exact value depends on your API provider’s documentation: BAML forwards it as-is.

For all other options, see the official OpenAI API documentation.
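
Any other option you set is forwarded in the request body unchanged. As a sketch, `temperature` and `max_tokens` are standard OpenAI chat-completion parameters (whether your provider honors them is up to that provider; the client name and endpoint here are placeholders):

```baml
client<llm> MyClient {
  provider "openai-generic"
  options {
    base_url "https://api.provider.com"
    model "<provider-specified-format>"
    // Forwarded as-is in the request body.
    temperature 0.2
    max_tokens 1024
  }
}
```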