llama-api

Llama API supports the OpenAI client, allowing you to use the `openai-generic` provider with an overridden `base_url`.

Note that to call Llama, you must use its OpenAI-compatible `/compat/v1` endpoint. See Llama's OpenAI compatibility documentation.

```baml
client<llm> LlamaAPI {
  provider openai-generic
  retry_policy Exponential
  options {
    base_url "https://llama-api.meta.com/compat/v1"
    model "Llama-3.3-8B-Instruct"
    api_key env.LLAMA_API_KEY
    // see openai-generic docs for more options
  }
}
```