Building a Chatbot with BAML React Hooks

In this tutorial, you’ll build a real-time streaming chatbot using BAML React hooks. By following along, you’ll learn how to:

  • Create a BAML function for chat completions
  • Use BAML’s React hooks for streaming responses
  • Build a modern chat interface
  • Handle loading states and errors

Prerequisites

Before starting, ensure you have:

  • Completed the Quick Start Guide
  • A Next.js project (version 15 or higher) with BAML set up
  • An OpenAI API key
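
The shorthand openai client used later in this guide reads the key from the OPENAI_API_KEY environment variable by default, so a local env file is usually enough. A minimal sketch, shown here as .env.local (the Next.js convention; adjust to however your project loads environment variables):

.env.local
# Read by BAML's openai provider when the app runs server-side
OPENAI_API_KEY=sk-...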

Step 1: Define the Chat Function

First, create a new BAML function for the chat completion:

baml_src/chat.baml
class Message {
  role "user" | "assistant"
  content string
}

function Chat(messages: Message[]) -> string {
  client "openai/gpt-5-mini"
  prompt #"
    You are a helpful and knowledgeable AI assistant engaging in a conversation.
    Your responses should be:
    - Clear and concise
    - Accurate and informative
    - Natural and conversational in tone
    - Focused on addressing the user's needs

    {{ ctx.output_format }}

    {% for m in messages %}
    {{ _.role(m.role) }}
    {{ m.content }}
    {% endfor %}
  "#
}

test TestName {
  functions [Chat]
  args {
    messages [
      {
        role "user"
        content "help me understand Chobani's success"
      }
    ]
  }
}

Generate the BAML client to create the React hooks:

$ baml-cli generate
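
Alongside the React hooks, the generated baml_client also exposes a plain TypeScript client, so the same function can be called outside React (for example from a route handler). A minimal sketch, assuming the default @/baml_client output path:

// Hypothetical server-side helper using the generated client
import { b } from "@/baml_client";
import type { Message } from "@/baml_client/types";

export async function replyTo(history: Message[]): Promise<string> {
  // b.Chat mirrors the BAML function defined above
  return b.Chat(history);
}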

Step 2: Implement the Chat Interface

The useChat hook’s data property holds the assistant’s in-progress streaming response (a string), not the conversation history, so you need to maintain your own message state:

app/components/chat-interface.tsx
'use client'

import { useChat } from "@/baml_client/react/hooks";
import { useState, useEffect } from "react";
import type { Message } from "@/baml_client/types";

export function ChatInterface() {
  const [messages, setMessages] = useState<Message[]>([]);
  const [input, setInput] = useState("");

  const chat = useChat();

  // When the assistant responds, add the response to the message history
  useEffect(() => {
    if (chat.isSuccess && chat.finalData) {
      setMessages((prev) => [
        ...prev,
        { role: "assistant", content: chat.finalData! },
      ]);
    }
  }, [chat.isSuccess, chat.finalData]);

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim() || chat.isLoading) return;

    const userMessage: Message = { role: "user", content: input };
    const newMessages = [...messages, userMessage];

    // Update local state immediately
    setMessages(newMessages);
    setInput("");

    // Send the full message history to the Chat function
    await chat.mutate(newMessages);
  };

  return (
    <div>
      <div>
        {messages.map((message, i) => (
          <div key={i}>
            <strong>{message.role}:</strong> {message.content}
          </div>
        ))}
        {chat.isLoading && (
          <div>
            <strong>assistant:</strong> {chat.data ?? "Generating..."}
          </div>
        )}
      </div>

      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
        />
        <button type="submit" disabled={chat.isLoading}>
          Send
        </button>
      </form>
    </div>
  );
}
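
To try the component, render it from a page. A minimal App Router page, assuming the component path used above:

app/page.tsx
import { ChatInterface } from "./components/chat-interface";

export default function Home() {
  return (
    <main>
      <ChatInterface />
    </main>
  );
}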

Using Callbacks for Fine-Grained Control

You can use the hook’s callbacks for more control over streaming events. This is useful for logging, analytics, or custom state management. The callback configuration passed to useChat and the streamingContent state are the additions to the base example:

app/components/chat-interface.tsx
'use client'

import { useChat } from "@/baml_client/react/hooks";
import { useState } from "react";
import type { Message } from "@/baml_client/types";

export function ChatInterface() {
  const [messages, setMessages] = useState<Message[]>([]);
  const [streamingContent, setStreamingContent] = useState("");
  const [input, setInput] = useState("");

  const chat = useChat({
    // Called on each streaming partial update
    onStreamData: (partial) => {
      if (partial) {
        setStreamingContent(partial);
      }
    },
    // Called when streaming completes with the final response
    onFinalData: (final) => {
      if (final) {
        setMessages((prev) => [
          ...prev,
          { role: "assistant", content: final },
        ]);
        setStreamingContent("");
      }
    },
  });

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    if (!input.trim() || chat.isLoading) return;

    const userMessage: Message = { role: "user", content: input };
    const newMessages = [...messages, userMessage];

    setMessages(newMessages);
    setInput("");

    await chat.mutate(newMessages);
  };

  return (
    <div>
      <div>
        {messages.map((message, i) => (
          <div key={i}>
            <strong>{message.role}:</strong> {message.content}
          </div>
        ))}
        {chat.isLoading && (
          <div>
            <strong>assistant:</strong> {streamingContent || "Generating..."}
          </div>
        )}
      </div>

      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
        />
        <button type="submit" disabled={chat.isLoading}>
          Send
        </button>
      </form>
    </div>
  );
}

With callbacks, you can:

  • Track streaming progress with onStreamData
  • Handle completion with onFinalData
  • Add error handling with onError (see the sketch after this list)
  • Integrate analytics or logging on each event
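
As an example of the error case, an onError callback can surface failures in the UI instead of letting them fail silently. A minimal sketch that slots into the component above (the exact shape of the error argument may vary by BAML version):

const [errorMessage, setErrorMessage] = useState<string | null>(null);

const chat = useChat({
  // Called if the request or the stream fails
  onError: (error) => {
    setErrorMessage(error instanceof Error ? error.message : "Something went wrong");
  },
});

// ...and render it somewhere in the JSX:
// {errorMessage && <div role="alert">{errorMessage}</div>}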

Next Steps

To enhance your chatbot, you could:

  • Add error handling for different types of errors
  • Add chat history persistence (see the sketch below)
  • Implement different chat models or configurations
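
For instance, chat history persistence can start as simply as mirroring the messages state into localStorage; swap in a database call for anything durable. A minimal sketch to drop into ChatInterface (remember to import useEffect from "react" in the callbacks version):

// Restore a saved conversation on mount
useEffect(() => {
  const saved = localStorage.getItem("chat-history");
  if (saved) setMessages(JSON.parse(saved) as Message[]);
}, []);

// Save the conversation whenever it changes
useEffect(() => {
  localStorage.setItem("chat-history", JSON.stringify(messages));
}, [messages]);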

For more information, check out: