Tools / Function Calling

“Function calling” is a technique for getting an LLM to choose a function to call for you.

The way it works is:

  1. You define a task along with the function(s) the LLM may call
  2. Ask the LLM to choose which function to call
  3. Get the function parameters from the LLM for the function it chose
  4. Call the function in your code with those parameters
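
The steps above can be sketched in plain Python. This is a minimal illustration, not BAML code: the tool registry, the `fake_llm_choose` stub (standing in for a real LLM call), and the function names are all hypothetical.

```python
import json

# Step 1: define the function(s) the LLM may call.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def fake_llm_choose(user_message: str) -> str:
    # Stand-in for a real LLM call (steps 2-3): in practice the model
    # returns the chosen function and its parameters as a JSON blob.
    return json.dumps({"function": "get_weather",
                       "parameters": {"city": "San Francisco"}})

def run_tool_call(user_message: str) -> str:
    choice = json.loads(fake_llm_choose(user_message))
    fn = TOOLS[choice["function"]]
    # Step 4: call the chosen function in your own code.
    return fn(**choice["parameters"])

print(run_tool_call("What's the weather in San Francisco?"))
# -> Sunny in San Francisco
```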

In BAML, you can represent a tool or a function you want to call as a BAML class, and make the function's return type be that class.

BAML

```baml
class WeatherAPI {
  city string @description("the user's city")
  timeOfDay string @description("As an ISO8601 timestamp")
}

function UseTool(user_message: string) -> WeatherAPI {
  client GPT4Turbo
  prompt #"
    Extract the info from this message
    ---
    {{ user_message }}
    ---

    {# special macro to print the output schema. #}
    {{ ctx.output_format }}

    JSON:
  "#
}
```

Call the function like this:

```python
from baml_client import b
from baml_client.types import WeatherAPI

def main():
    weather_info = b.UseTool("What's the weather like in San Francisco?")
    print(weather_info)
    assert isinstance(weather_info, WeatherAPI)
    print(f"City: {weather_info.city}")
    print(f"Time of Day: {weather_info.timeOfDay}")

if __name__ == '__main__':
    main()
```

Choosing multiple Tools

To choose ONE tool out of many, you can use a union:

BAML

```baml
function UseTool(user_message: string) -> WeatherAPI | MyOtherAPI {
  .... // same thing
}
```

If you use the VSCode Playground, you can see exactly what we inject into the prompt, with full transparency.

Call the function like this:

```python
from baml_client import b
from baml_client.types import WeatherAPI, MyOtherAPI

def main():
    tool = b.UseTool("What's the weather like in San Francisco?")
    print(tool)

    if isinstance(tool, WeatherAPI):
        print("Weather API called:")
        print(f"City: {tool.city}")
        print(f"Time of Day: {tool.timeOfDay}")
    elif isinstance(tool, MyOtherAPI):
        print("MyOtherAPI called:")
        # Handle MyOtherAPI-specific attributes here

if __name__ == '__main__':
    main()
```

Choosing N Tools

To choose many tools, you can use a list of a union:

BAML

```baml
function UseTool(user_message: string) -> (WeatherAPI | MyOtherAPI)[] {
  .... // same thing
}
```

Call the function like this:

```python
from baml_client import b
from baml_client.types import WeatherAPI, MyOtherAPI

def main():
    tools = b.UseTool("What's the weather like in San Francisco and New York?")
    print(tools)

    for tool in tools:
        if isinstance(tool, WeatherAPI):
            print("Weather API called:")
            print(f"City: {tool.city}")
            print(f"Time of Day: {tool.timeOfDay}")
        elif isinstance(tool, MyOtherAPI):
            print("MyOtherAPI called:")
            # Handle MyOtherAPI-specific attributes here

if __name__ == '__main__':
    main()
```

Function-calling APIs vs Prompting

Injecting your function schemas into the prompt, as BAML does, outperforms the native function-calling APIs of major providers across benchmarks (see our Berkeley FC Benchmark results with BAML).

Amongst other limitations, function-calling APIs will at times:

  1. Return a schema when you don't want one (you want an error)
  2. Not work for tools with more than 100 parameters
  3. Use many more tokens than prompting

Keep in mind that "JSON mode" is nearly the same thing as prompting, except it forces the LLM response to be ONLY a JSON blob. BAML does not use JSON mode, since it prevents better prompting techniques like chain-of-thought, which let the LLM express its reasoning before printing out the actual schema. BAML's parser can find the JSON schema(s) in free-form text for you. Read more about different approaches to structured generation here.
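
To make the idea concrete, here is a naive sketch of pulling a JSON blob out of free-form chain-of-thought text. This is only an illustration: BAML's real parser is far more robust (it is schema-aware and handles markdown fences, trailing commas, partial output, and more), and the `reply` text below is made up.

```python
import json
import re

def extract_json(text: str) -> dict:
    # Naive stand-in for a schema-aware parser: grab the span from the
    # first "{" to the last "}" and parse it as JSON.
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found")
    return json.loads(match.group(0))

# Hypothetical LLM reply that reasons first, then emits the schema.
reply = """Let me think step by step.
The user is in San Francisco, so:
{"city": "San Francisco", "timeOfDay": "2024-05-01T14:00:00Z"}"""

print(extract_json(reply)["city"])
# -> San Francisco
```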

BAML will also support native function-calling APIs in the future (please let us know more about your use case so we can prioritize accordingly).