Error Handling
When BAML raises an exception, it will be an instance of a subclass of BamlError. This allows you to catch all BAML-specific exceptions with a single except block.
Example
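A minimal sketch of this pattern in Python, assuming the generated client is importable as b from baml_client, the error classes live in baml_py.errors, and ExtractResume stands in for one of your generated functions:

```python
from baml_client import b   # generated BAML client (sync client assumed)
from baml_py import errors

try:
    # ExtractResume is a placeholder; call any of your generated functions here.
    resume = b.ExtractResume("...")
except errors.BamlValidationError as e:
    # The LLM responded, but the output could not be parsed into the return type.
    print("validation error:", e)
except errors.BamlClientHttpError as e:
    # The underlying HTTP request failed with a non-200 status code.
    print("HTTP error:", e)
except errors.BamlError as e:
    # Catch-all for every other BAML-specific exception.
    print("BAML error:", e)
```

Because every exception derives from BamlError, ordering the except clauses from most to least specific lets you handle particular failures while still keeping a single catch-all.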
BamlError
Base class for all BAML exceptions.
- message: A human-readable error message.
BamlInvalidArgumentError
Subclass of BamlError.
Raised when one or more arguments to a function are invalid.
BamlClientError
Subclass of BamlError.
Raised when a client fails to return a valid response.
In the case of aggregate clients like fallback, or those with a retry_policy, only the last client’s error type is raised. However, the complete history of all failed attempts is preserved in the detailed_message field, allowing you to debug the entire fallback chain.
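A minimal sketch of inspecting that history, assuming SummarizeDocument is a generated function whose client is a fallback (or has a retry_policy):

```python
from baml_client import b
from baml_py import errors

try:
    # Placeholder generated function backed by a fallback client or retry_policy.
    summary = b.SummarizeDocument("...")
except errors.BamlClientError as e:
    # Only the last client's error type is raised, but detailed_message records
    # every failed attempt in the fallback/retry chain.
    print(e.detailed_message)
```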
BamlClientHttpError
Subclass of BamlClientError.
Raised when the HTTP request made by a client fails with a non-200 status code.
- status_code: The status code of the response.
Common status codes are:
- 1: Other
- 2: Other
- 400: Bad Request
- 401: Unauthorized
- 403: Forbidden
- 404: Not Found
- 429: Too Many Requests
- 500: Internal Server Error
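A sketch of acting on the status code, assuming it is exposed as the status_code attribute and ExtractResume is a placeholder generated function:

```python
import time

from baml_client import b
from baml_py import errors

def call_with_backoff(text: str, attempts: int = 3):
    for attempt in range(attempts):
        try:
            return b.ExtractResume(text)  # placeholder generated function
        except errors.BamlClientHttpError as e:
            # Back off and retry only on rate limiting; re-raise anything else.
            if e.status_code == 429 and attempt < attempts - 1:
                time.sleep(2 ** attempt)
                continue
            raise
```

A retry_policy on the client can often do this for you; handling the error in application code is mainly useful when you want custom behavior per status code.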
BamlClientFinishReasonError
Subclass of BamlClientError.
Raised when the finish reason of the LLM response is not allowed.
- finish_reason: The finish reason of the LLM response.
- message: An error message.
- prompt: The original prompt that was sent to the LLM, formatted as a plain string. Images sent as base64-encoded strings are not serialized into this field.
- raw_output: The raw text from the LLM that failed to parse into the expected return type of a function.
- detailed_message: Comprehensive error information that includes the complete history of all failed attempts when using fallback clients or retry policies. When multiple attempts are made, this field contains formatted details about each failed attempt, making it invaluable for debugging complex client configurations.
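A short sketch of inspecting these fields, assuming they are exposed as finish_reason and raw_output attributes and ExtractResume is a placeholder generated function:

```python
from baml_client import b
from baml_py import errors

try:
    resume = b.ExtractResume("...")  # placeholder generated function
except errors.BamlClientFinishReasonError as e:
    # For example, the response was truncated ("length") or stopped by a content
    # filter, and that finish reason is not allowed by the client configuration.
    print("disallowed finish reason:", e.finish_reason)
    print("partial output:", e.raw_output)
```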
BamlValidationError
Subclass of BamlError.
Raised when BAML fails to parse a string from the LLM into the specified object.
- raw_output: The raw text from the LLM that failed to parse into the expected return type of a function.
- message: The parsing-related error message.
- prompt: The original prompt that was sent to the LLM, formatted as a plain string. Images sent as base64-encoded strings are not serialized into this field.
- detailed_message: Comprehensive error information that includes the complete history of all failed attempts when using fallback clients or retry policies. When multiple attempts are made, this field contains formatted details about each failed attempt, making it invaluable for debugging complex client configurations.
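A small logging sketch, assuming the message, prompt, and raw_output attributes described above and a placeholder ExtractInvoice function:

```python
import logging

from baml_client import b
from baml_py import errors

log = logging.getLogger(__name__)

try:
    invoice = b.ExtractInvoice("...")  # placeholder generated function
except errors.BamlValidationError as e:
    # Record both sides of the failure: what was asked and what came back.
    log.error("parse failure: %s", e.message)
    log.debug("prompt sent: %s", e.prompt)
    log.debug("raw LLM output: %s", e.raw_output)
    raise
```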
BamlAbortError
Subclass of BamlError.
Raised when a BAML operation is cancelled via an abort controller. It carries a message describing why the operation was aborted, plus optional additional context about the cancellation; the context can be any value provided when calling the abort() method.
Handling Cancellation
When operations are cancelled via abort controllers, specific errors are raised:
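A minimal sketch in Python; the abort-controller setup itself is out of scope here (see the guide below), so assume some other part of your code cancels the call:

```python
from baml_client import b
from baml_py import errors

try:
    # Placeholder generated function that another code path may cancel
    # through an abort controller.
    result = b.ExtractResume("...")
except errors.BamlAbortError as e:
    # The operation was cancelled before the LLM call completed.
    print("cancelled:", e)
```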
For more information on using abort controllers, see the Abort Controllers guide.
LLM Fixup: Dealing with Validation Errors
Our parser is very forgiving, allowing for structured data parsing even in the presence of minor errors and thought tokens in the LLM response. However, certain types of errors are too ambiguous to handle without the help of an LLM.
In cases where your LLM has trouble producing data that conforms to the output schema, you can use this ‘fixup’ recipe to get valid data:
- Write a Fixup Function. For example, if your original function is called Foo and it returns MyClass:
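A sketch of such a function in BAML, assuming an OpenAI-style shorthand client reference; the function name, client, and prompt wording are illustrative only:

```baml
// Hypothetical fixup function for Foo's return type.
function FixFoo(raw_output: string) -> MyClass {
  // A smaller model is often enough here; see "Choosing a Model" below.
  client "openai/gpt-4o-mini"
  prompt #"
    The following text was supposed to match a schema but failed to parse.
    Rewrite it so that it conforms exactly to the schema.

    {{ raw_output }}

    {{ ctx.output_format }}
  "#
}
```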
- Then call the fixup function from your client code in response to validation errors:
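A Python sketch, assuming the generated client exposes both Foo and the FixFoo function above, and the raw_output attribute on BamlValidationError:

```python
from baml_client import b
from baml_client.types import MyClass  # generated types module (path may vary)
from baml_py import errors

def get_my_class(arg: str) -> MyClass:
    try:
        return b.Foo(arg)
    except errors.BamlValidationError as e:
        # Hand the unparseable text to the cheaper fixup function.
        return b.FixFoo(e.raw_output)
```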
Choosing a Model
LLMs are good at reconstituting data, so it is often possible to use a less powerful model for your fixup function than the model you used to produce the original data. The difficulty of producing valid JSON data depends on the complexity of the schema and the details of your data payload, so be sure to test your fixup function on realistic data payloads before moving to a smaller model.