Pre-requisites

Follow the installation instructions and run baml init in a new project.

The starting project structure will look something like this:

Overview

Before you call an LLM, ask yourself what kind of input or output you're expecting. If you want the LLM to generate text, then you probably want a string, but if you're trying to get it to collect user details, you may want it to return a complex type like UserDetails.

Thinking this way can help you decompose large, complex prompts into smaller, more measurable functions, and will also help you build more complex workflows and agents.
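For instance, the payoff of targeting a type instead of free text can be sketched in plain Python. This is a minimal illustration, not BAML code, and the UserDetails fields here are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical fields -- the point is that the LLM's output becomes a
# concrete type you can check, rather than an unstructured string.
@dataclass
class UserDetails:
    name: str
    email: str

def parse_user_details(llm_output: dict) -> UserDetails:
    # Fails loudly if the response is missing a field, instead of
    # silently passing malformed text downstream.
    return UserDetails(name=llm_output["name"], email=llm_output["email"])

details = parse_user_details({"name": "Ada", "email": "ada@example.com"})
```

Each small, typed function like this is easy to test on its own, which is what makes decomposed prompts measurable.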

Extracting a resume from text

The best way to learn BAML is to run an example in our web playground — PromptFiddle.com.

But at a high level, BAML is simple to use — prompts are built using Jinja syntax to make working with strings easier. We extended Jinja to add type support and static analysis of your template variables, along with a real-time side-by-side playground for VSCode.
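As a rough illustration of the templating idea, here is how a prompt like the one below would render with the stock jinja2 library (plain Jinja, without BAML's type extensions):

```python
from jinja2 import Template

# A simplified version of the prompt body: the {{ resume_text }}
# variable is substituted at render time.
prompt_template = Template(
    "Parse the following resume:\n"
    "---\n"
    "{{ resume_text }}\n"
    "---"
)

prompt = prompt_template.render(resume_text="Jason Doe\nPython, Rust")
```

BAML's extended Jinja additionally checks that `resume_text` exists and has the declared type before the prompt is ever sent.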

We’ll write out an example from PromptFiddle here:

baml_src/main.baml
// Declare the Resume type we want the AI function to return
class Resume {
  name string
  education Education[] @description("Extract in the same order listed")
  skills string[] @description("Only include programming languages")
}

class Education {
  school string
  degree string
  year int
}

// Declare the function signature, with the prompt that will be used to make the AI function work
function ExtractResume(resume_text: string) -> Resume {
  // An LLM client we define elsewhere, with some parameters and our API key
  client GPT4Turbo

  // The prompt uses Jinja syntax
  prompt #"
    Parse the following resume and return a structured representation of the data in the schema below.

    Resume:
    ---
    {{ resume_text }}
    ---

    {# special macro to print the output instructions. #}
    {{ ctx.output_format }}

    JSON:
  "#
}

That’s it! If you use the VSCode extension, every time you save this .baml file, it will convert this configuration file into a usable Python or TypeScript function in milliseconds, with full types.

All your types become Pydantic models in Python, or type definitions in TypeScript (soon we’ll support generating Zod types).
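Roughly, the Python types generated for the Resume schema above would behave like the following Pydantic models (a sketch for illustration, not the actual generated code):

```python
from pydantic import BaseModel

# Mirrors the Education class declared in main.baml.
class Education(BaseModel):
    school: str
    degree: str
    year: int

# Mirrors the Resume class: nested lists are typed all the way down.
class Resume(BaseModel):
    name: str
    education: list[Education]
    skills: list[str]

# Construction validates every field, so a malformed LLM response
# raises an error instead of producing a half-filled object.
resume = Resume(
    name="Jason Doe",
    education=[Education(school="UC Berkeley",
                         degree="B.S. in Computer Science",
                         year=2020)],
    skills=["Python", "Rust"],
)
```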

Usage in Python or TypeScript

Our VSCode extension automatically generates a baml_client in your language of choice.

from baml_client import baml as b
# BAML types get converted to Pydantic models
from baml_client.baml_types import Resume
import asyncio

async def main():
    resume_text = """Jason Doe
Python, Rust
University of California, Berkeley, B.S.
in Computer Science, 2020
Also an expert in Tableau, SQL, and C++
"""

    # this function comes from the autogenerated "baml_client".
    # It calls the LLM you specified and handles the parsing.
    resume = await b.ExtractResume(resume_text)
    
    # Fully type-checked and validated!
    assert isinstance(resume, Resume)


if __name__ == "__main__":
    asyncio.run(main())

Further reading