Prerequisites

Ensure you have read the previous levels before starting this one!

Overview

A common prompting strategy for improving results is Self-Generated Chain of Thought, which uses natural-language cues like “Let’s think step by step” to encourage the model to produce reasoning steps before answering.

In this section, we will add a “reasoning step” to our AI function so the LLM can generate its chain of thought before classifying the message.
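To make the idea concrete before the BAML version below, here is a minimal Python sketch of the same technique: a classification prompt that asks for reasoning first and the answer on the last line. The wording and the `build_cot_prompt` helper are illustrative assumptions, not BAML's exact template.

```python
def build_cot_prompt(message: str, categories: list[str]) -> str:
    """Build a classification prompt that asks the model to reason
    before answering (Self-Generated Chain of Thought)."""
    category_list = "\n".join(f"- {c}" for c in categories)
    return (
        "Classify the following INPUT into the best Category.\n\n"
        f"{category_list}\n\n"
        f"INPUT: {message}\n\n"
        "Before you choose the answer, please provide some reasoning "
        "steps as to why it matches.\n\n"
        "Write the answer in the last line."
    )

prompt = build_cot_prompt(
    "I can't log in to my account",
    ["Refund", "TechnicalSupport", "AccountIssue"],
)
print(prompt)
```

The only change from a plain classification prompt is the extra instruction asking for reasoning first; everything else stays the same.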

Adding a reasoning step

Simply instruct the LLM to explain its reasoning before it gives the final answer.

Example prompt:


impl<llm, ClassifyMessage> version3 {
  client GPT4

  prompt #"
    Classify the following INPUT into the best Category.

    {#print_enum(Category)}

    INPUT: {#input}

    Before you choose the answer, please provide some reasoning steps as to why it matches.

    Write the answer in the last line.
  "#
}

Why this works

BAML’s flexible parser detects enum values in LLM outputs even when they are interleaved with other text, such as the reasoning steps we just asked for. There is no need to add extra logic or write your own parser.