Prompting in BAML

We recommend reading the installation instructions first

BAML functions are special definitions that get converted into real code (Python, TS, etc.) that calls LLMs. Think of them as a way to define AI-powered functions that are type-safe and easy to use in your application.

What BAML Functions Actually Do

When you write a BAML function like this:

```baml
function ExtractResume(resume_text: string) -> Resume {
  client "openai/gpt-4o"
  // The prompt uses Jinja syntax. More on this soon.
  prompt #"
    Extract info from this text.

    {# special macro to print the output schema + instructions #}
    {{ ctx.output_format }}

    Resume:
    ---
    {{ resume_text }}
    ---
  "#
}
```

BAML converts it into code that:

  1. Takes your input (resume_text)
  2. Sends a request to OpenAI's GPT-4o API with your prompt
  3. Parses the JSON response into your Resume type (sketched below)
  4. Returns a type-safe object you can use in your code

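To make this work, Resume itself is defined as a BAML class. A minimal sketch of what that might look like (the field names here are illustrative assumptions, not taken from the example above):

```baml
// Hypothetical Resume class, for illustration only.
// Define whatever fields your application actually needs.
class Resume {
  name string
  skills string[]
  education string[]
}
```
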
Prompt Preview + seeing the CURL request

For maximum transparency, you can inspect the exact API request BAML makes to the LLM provider using the VSCode extension. Below is the Prompt Preview, which shows the full rendered prompt (once you add a test case):

Prompt preview

Note how the {{ ctx.output_format }} macro is replaced with the output schema instructions.
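
For the hypothetical Resume class sketched above, the injected section would look roughly like this (the exact wording depends on your BAML version and any customization):

```
Answer in JSON using this schema:
{
  name: string,
  skills: string[],
  education: string[],
}
```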

The Playground will also show you the Raw CURL request (if you click on the “curl” checkbox):

Raw CURL request

Always include the {{ ctx.output_format }} macro in your prompt. This injects your output schema into the prompt, which helps the LLM produce output that actually matches your types. You can also customize what it prints; see the sketch below.
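
For example, ctx.output_format accepts optional arguments that tweak the rendered instructions. A sketch, assuming the `prefix` argument (check the BAML docs for the full argument list in your version):

```baml
prompt #"
  Extract info from this text.

  {# `prefix` replaces the default preamble shown before the schema #}
  {{ ctx.output_format(prefix="Respond with JSON matching this schema:\n") }}

  Resume:
  ---
  {{ resume_text }}
  ---
"#
```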

One of our design philosophies is to never hide the prompt from you. You control and can always see the entire prompt.

Calling the function

Recall that BAML will generate a baml_client directory in the language of your choice using the parameters in your generator config. This contains the function and types you defined.
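
If you haven't configured one yet, a generator block lives alongside your BAML files. A sketch, assuming a Python target (the name my_client and the version shown are placeholders; adjust output_type and version to your setup):

```baml
generator my_client {
  // Other targets like "typescript" are also supported
  output_type "python/pydantic"
  // Where the baml_client directory gets written
  output_dir "../"
  // Should match the BAML package version you installed
  version "0.66.0"
}
```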

Now we can call the function, which will make a request to the LLM and return the Resume object:

```python
# Import the baml client (we call it `b` for short)
from baml_client import b
# Import the Resume type, which is now a Pydantic model!
from baml_client.types import Resume

def main():
    resume_text = """Jason Doe\nPython, Rust\nUniversity of California, Berkeley, B.S.\nin Computer Science, 2020\nAlso an expert in Tableau, SQL, and C++\n"""

    # This function comes from the autogenerated "baml_client".
    # It calls the LLM you specified and handles the parsing.
    resume = b.ExtractResume(resume_text)

    # Fully type-checked and validated!
    assert isinstance(resume, Resume)
```

Do not modify any code inside baml_client, as it’s autogenerated.

Next steps

Check out PromptFiddle to see various interactive BAML function examples, or view the example prompts.

Read the next guide to learn more about choosing different LLM providers and running tests in the VSCode extension.