In this guide we’ll build a small chatbot that takes in user messages and generates responses.

chat-history.baml
```baml
class MyUserMessage {
  role "user" | "assistant"
  content string
}

function ChatWithLLM(messages: MyUserMessage[]) -> string {
  client "openai/gpt-4o"
  prompt #"
    Answer the user's questions based on the chat history:
    {% for message in messages %}
    {{ _.role(message.role) }}
    {{ message.content }}
    {% endfor %}

    Answer:
  "#
}

test TestName {
  functions [ChatWithLLM]
  args {
    messages [
      {
        role "user"
        content "Hello!"
      }
      {
        role "assistant"
        content "Hi!"
      }
    ]
  }
}
```

Code

```python
from baml_client import b
from baml_client.types import MyUserMessage

def main():
    messages: list[MyUserMessage] = []

    while True:
        content = input("Enter your message (or 'quit' to exit): ")
        if content.lower() == 'quit':
            break

        messages.append(MyUserMessage(role="user", content=content))

        agent_response = b.ChatWithLLM(messages=messages)
        print(f"AI: {agent_response}")
        print()

        # Add the agent's response to the chat history
        messages.append(MyUserMessage(role="assistant", content=agent_response))

if __name__ == "__main__":
    main()
```
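One practical note: the loop above appends every exchange, so `messages` grows without bound and the prompt gets longer on each call. A simple mitigation is to trim the history to the most recent messages before calling the function. Below is a minimal sketch; the `trim_history` helper and the `Message` stand-in type are hypothetical illustrations, not part of `baml_client`:

```python
from typing import TypedDict

class Message(TypedDict):
    # Stand-in for MyUserMessage, for illustration only.
    role: str
    content: str

def trim_history(messages: list[Message], max_messages: int = 20) -> list[Message]:
    """Keep only the most recent max_messages entries of the chat history."""
    return messages[-max_messages:]
```

In the loop, you would call `b.ChatWithLLM(messages=trim_history(messages))` while still appending to the full list, so the model sees a bounded window but the program retains the complete transcript. More elaborate strategies (summarizing older turns, token-based budgets) follow the same pattern.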