Building a Chatbot with BAML React Hooks
In this tutorial, you’ll build a real-time streaming chatbot using BAML React hooks. By following along, you’ll learn how to:
- Create a BAML function for chat completions
- Use BAML’s React hooks for streaming responses
- Build a modern chat interface
- Handle loading states and errors
Prerequisites
Before starting, ensure you have:
- Completed the Quick Start Guide
- A Next.js project (version 15 or higher) with BAML set up
- An OpenAI API key
Step 1: Define the Chat Function
First, create a new BAML function for the chat completion:
baml_src/chat.baml
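A minimal sketch of what this file could contain is shown below. The `Message` class, the model string (`"openai/gpt-4o-mini"`), and the prompt wording are placeholder choices; the function is named `Chat` so that the generated hook is called `useChat`. Adjust the client and prompt to your needs.

```baml
// Shape of each turn in the conversation (placeholder definition).
class Message {
  role string
  content string
}

// Streaming chat completion over the full conversation history.
function Chat(messages: Message[]) -> string {
  client "openai/gpt-4o-mini"
  prompt #"
    {{ _.role("system") }}
    You are a helpful, concise assistant.

    {% for message in messages %}
    {{ _.role(message.role) }}
    {{ message.content }}
    {% endfor %}
  "#
}
```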
Generate the BAML client to create the React hooks:
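Assuming the BAML CLI from the `@boundaryml/baml` package and a generator configured to emit the TypeScript/React client, the command is:

```bash
npx baml-cli generate
```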
Step 2: Implement the Chat Interface
The useChat hook’s data property contains the assistant’s streaming response (a string), not the messages array. You need to maintain your own message state:
app/components/chat-interface.tsx
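Here is a minimal sketch of the component. It assumes the generated hooks live at `@/baml_client/react/hooks`, that `mutate` accepts the BAML function's arguments and resolves with the final response, and that `data`, `isLoading`, and `error` are exposed on the hook; adjust the import path and return shape to match your generated client.

```tsx
'use client';

import { useState, type FormEvent } from 'react';
// Hypothetical import path -- depends on where your generator writes the client.
import { useChat } from '@/baml_client/react/hooks';

type ChatMessage = { role: 'user' | 'assistant'; content: string };

export default function ChatInterface() {
  // The hook only exposes the current streaming response,
  // so the full conversation lives in local state.
  const [messages, setMessages] = useState<ChatMessage[]>([]);
  const [input, setInput] = useState('');

  const chat = useChat();

  async function handleSubmit(e: FormEvent<HTMLFormElement>) {
    e.preventDefault();
    if (!input.trim() || chat.isLoading) return;

    const nextMessages: ChatMessage[] = [...messages, { role: 'user', content: input }];
    setMessages(nextMessages);
    setInput('');

    // Assumption: mutate takes the BAML function's arguments and
    // resolves with the final (complete) response.
    const reply = await chat.mutate(nextMessages);
    setMessages((prev) => [...prev, { role: 'assistant', content: reply ?? '' }]);
  }

  return (
    <div>
      <ul>
        {messages.map((message, i) => (
          <li key={i}>
            <strong>{message.role}:</strong> {message.content}
          </li>
        ))}
        {/* While streaming, `data` holds the partial assistant response. */}
        {chat.isLoading && (
          <li>
            <strong>assistant:</strong> {chat.data}
          </li>
        )}
      </ul>

      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Ask something..."
        />
        <button type="submit" disabled={chat.isLoading}>
          Send
        </button>
      </form>

      {chat.error && <p>Something went wrong: {String(chat.error)}</p>}
    </div>
  );
}
```

While the request streams, the partial `data` value is rendered as a temporary assistant bubble; once the call completes, the finished message is appended to local state so it persists across turns.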
Using Callbacks for Fine-Grained Control
You can use the hook’s callbacks for more control over streaming events. This is useful for logging, analytics, or custom state management. The example below shows the additions to the base component:
app/components/chat-interface.tsx
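A sketch of the updated component follows. It assumes the hook accepts these callbacks as an options object (the exact signatures may differ in your generated client); the final message is now committed in `onFinalData` instead of from `mutate`'s return value, and the rendered JSX is the same as in the base component.

```tsx
'use client';

import { useState, type FormEvent } from 'react';
// Hypothetical import path -- depends on where your generator writes the client.
import { useChat } from '@/baml_client/react/hooks';

type ChatMessage = { role: 'user' | 'assistant'; content: string };

export default function ChatInterface() {
  const [messages, setMessages] = useState<ChatMessage[]>([]);
  const [input, setInput] = useState('');

  const chat = useChat({
    // Fires on each streamed chunk with the partial response so far.
    onStreamData: (partial) => {
      console.debug('chunk:', partial);
    },
    // Fires once with the complete response when the stream finishes.
    onFinalData: (final) => {
      setMessages((prev) => [...prev, { role: 'assistant', content: final ?? '' }]);
    },
    // Fires if the request or the stream fails.
    onError: (err) => {
      console.error('chat failed:', err);
    },
  });

  async function handleSubmit(e: FormEvent<HTMLFormElement>) {
    e.preventDefault();
    if (!input.trim() || chat.isLoading) return;

    const nextMessages: ChatMessage[] = [...messages, { role: 'user', content: input }];
    setMessages(nextMessages);
    setInput('');

    // The final message is appended in onFinalData, so the
    // return value of mutate is not needed here.
    await chat.mutate(nextMessages);
  }

  return (
    <div>
      <ul>
        {messages.map((message, i) => (
          <li key={i}>
            <strong>{message.role}:</strong> {message.content}
          </li>
        ))}
        {/* While streaming, `data` holds the partial assistant response. */}
        {chat.isLoading && (
          <li>
            <strong>assistant:</strong> {chat.data}
          </li>
        )}
      </ul>

      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Ask something..."
        />
        <button type="submit" disabled={chat.isLoading}>
          Send
        </button>
      </form>

      {chat.error && <p>Something went wrong: {String(chat.error)}</p>}
    </div>
  );
}
```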
With callbacks, you can:
- Track streaming progress with onStreamData
- Handle completion with onFinalData
- Add error handling with onError
- Integrate analytics or logging on each event
Next Steps
To enhance your chatbot, you could:
- Add error handling for different types of errors
- Add chat history persistence
- Implement different chat models or configurations
For more information, check out the BAML documentation.