Building a Chatbot with BAML React Hooks

In this tutorial, you’ll build a real-time streaming chatbot using BAML React hooks. By following along, you’ll learn how to:

  • Create a BAML function for chat completions
  • Use BAML’s React hooks for streaming responses
  • Build a modern chat interface
  • Handle loading states and errors

Prerequisites

Before starting, ensure you have:

  • Completed the Quick Start Guide
  • A Next.js project with BAML set up
  • An OpenAI API key

Step 1: Define the Chat Function

First, create a new BAML function for the chat completion:

baml_src/chat.baml
class Message {
  role "user" | "assistant"
  content string
}

function Chat(messages: Message[]) -> string {
  client "openai/gpt-4o-mini"
  prompt #"
    You are a helpful and knowledgeable AI assistant engaging in a conversation.
    Your responses should be:
    - Clear and concise
    - Accurate and informative
    - Natural and conversational in tone
    - Focused on addressing the user's needs

    {{ ctx.output_format }}

    {% for m in messages %}
    {{ _.role(m.role) }}
    {{ m.content }}
    {% endfor %}
  "#
}

test TestName {
  functions [Chat]
  args {
    messages [
      {
        role "user"
        content "help me understand Chobani's success"
      }
    ]
  }
}

Generate the BAML client to create the React hooks:

$ baml-cli generate
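
If you want to sanity-check the generated client before wiring up any UI, you can call the function directly from server-side code. Below is a minimal sketch, assuming the default @/baml_client output path from the Quick Start Guide; the file name is just an example.

// app/sanity-check.ts (any server-side module works; the path is an example)
import { b } from "@/baml_client";
import { Message } from "@/baml_client/types";

export async function sanityCheck() {
  const messages: Message[] = [
    { role: "user", content: "Say hello in one sentence." },
  ];

  // Non-streaming call: resolves to Chat's full string response.
  const reply = await b.Chat(messages);
  console.log(reply);
}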

Step 2: Implement the Chat Interface

You can implement the chat interface in two ways:

Option A: Using the Generated Hook Directly

The simplest approach is to use the generated hook directly:

app/components/chat-interface.tsx
'use client'

import { useChat } from "@/baml_client/react/hooks";
import { useState, type FormEvent } from "react";

export function ChatInterface() {
  const [input, setInput] = useState("");

  const chat = useChat();

  const handleSubmit = async (e: FormEvent<HTMLFormElement>) => {
    // Prevent the default form submission from reloading the page.
    e.preventDefault();

    // Append the new user message to the existing conversation (if any).
    const newMessages = [
      ...(chat.data?.messages ?? []),
      { role: "user" as const, content: input },
    ];

    setInput("");

    await chat.mutate({ messages: newMessages });
  };

  return (
    <div>
      <div>
        {chat.data?.messages.map((message, i) => (
          <div key={i}>
            {message.content}
          </div>
        ))}
        {chat.isLoading && <div>Generating...</div>}
      </div>

      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type your message..."
        />
        <button type="submit" disabled={chat.isLoading}>
          Send
        </button>
      </form>
    </div>
  );
}
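
To try it out, render the component from a page. A minimal sketch, assuming the app/components/chat-interface.tsx path used above:

// app/page.tsx
import { ChatInterface } from "./components/chat-interface";

export default function Page() {
  return (
    <main>
      <ChatInterface />
    </main>
  );
}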

Option B: Using a Custom Server Action

Alternatively, you can create a custom server action for more control over the server-side implementation:

'use server'

import { b } from "@/baml_client";
import { Message } from "@/baml_client/types";

export async function streamChat(messages: Message[]) {
  // authUser() is a placeholder for your own authentication helper.
  const user = await authUser();

  if (!user) {
    throw new Error("User not authenticated");
  }

  // Stream the BAML Chat function and return it in a client-consumable form.
  return b.stream.Chat(messages).toStreamable();
}

The server action approach is useful when you need to (a combined sketch follows this list):

  • Add custom server-side logic
  • Handle authentication
  • Add logging or monitoring
  • Implement rate limiting
  • Add custom error handling
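
For example, here is a hedged sketch of the streamChat action above, extended with basic rate limiting and logging. The authUser and checkRateLimit helpers are placeholders standing in for whatever auth and rate-limiting utilities your project already uses.

'use server'

import { b } from "@/baml_client";
import { Message } from "@/baml_client/types";

// Placeholder helpers: replace with your own auth and rate-limiting logic.
async function authUser(): Promise<{ id: string } | null> {
  return { id: "demo-user" };
}

async function checkRateLimit(_userId: string): Promise<boolean> {
  return true; // always allow in this sketch
}

export async function streamChat(messages: Message[]) {
  const user = await authUser();
  if (!user) {
    throw new Error("User not authenticated");
  }

  if (!(await checkRateLimit(user.id))) {
    throw new Error("Too many requests, please try again later");
  }

  // Basic monitoring: log who is chatting and how long the conversation is.
  console.log(`Chat request from ${user.id} with ${messages.length} messages`);

  return b.stream.Chat(messages).toStreamable();
}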

Next Steps

To enhance your chatbot, you could:

  • Add error handling for different types of errors
  • Add chat history persistence (a small localStorage sketch follows this list)
  • Implement different chat models or configurations
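
For persistence, one low-effort option is the browser's localStorage. A minimal sketch, assuming you keep the conversation as a Message[] on the client; the chat-history key name is arbitrary:

import { Message } from "@/baml_client/types";

// Arbitrary storage key for this sketch.
const STORAGE_KEY = "chat-history";

// Call these from client components only; localStorage is not available on the server.
export function saveHistory(messages: Message[]) {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(messages));
}

export function loadHistory(): Message[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as Message[]) : [];
}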
