AWS Bedrock provider for BAML

The aws-bedrock provider supports all text-output models available via the Converse API.

Quick Start

BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    model "anthropic.claude-3-sonnet-20240229-v1:0"
    inference_configuration {
      max_tokens 100
      temperature 0.7
    }
  }
}

Authentication

AWS Bedrock uses standard AWS authentication methods. Choose the one that best fits your environment.

The simplest way to authenticate is to set these environment variables:

export AWS_ACCESS_KEY_ID="your_key"
export AWS_SECRET_ACCESS_KEY="your_secret"
export AWS_REGION="us-east-1"
BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    // No need to specify credentials - they'll be picked up from environment
    model "anthropic.claude-3-sonnet-20240229-v1:0"
  }
}

Credential Resolution

BAML follows a specific order when resolving AWS credentials:

  1. Explicit BAML Configuration

    BAML
    client<llm> MyClient {
      provider aws-bedrock
      options {
        access_key_id env.MY_ACCESS_KEY // Highest precedence
        secret_access_key env.MY_SECRET_KEY
        region "us-east-1"
      }
    }
  2. Environment Variables

    AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY
    AWS_SESSION_TOKEN  # Optional
    AWS_REGION
    AWS_PROFILE
  3. AWS Configuration Files

    # ~/.aws/credentials
    [default]
    aws_access_key_id = ...
    aws_secret_access_key = ...

    # ~/.aws/config
    [default]
    region = us-east-1
  4. Instance Metadata (EC2/ECS only)

    • IAM Role credentials
    • Instance profile credentials

Important Rules

  1. All or Nothing

    • If you provide any credential explicitly, you must provide all required credentials
    • This won’t work:
      BAML
      client<llm> MyClient {
        provider aws-bedrock
        options {
          access_key_id env.AWS_ACCESS_KEY_ID
          // Error: secret_access_key is required when access_key_id is provided
          model "anthropic.claude-3-sonnet-20240229-v1:0"
        }
      }
  2. Session Token Requirements

    • When using session_token, you must provide all three:
      • access_key_id
      • secret_access_key
      • session_token
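    • For example, a minimal sketch (model ID reused from the Quick Start; assumes the three environment variables hold temporary STS credentials):
      BAML
      client<llm> MyClient {
        provider aws-bedrock
        options {
          access_key_id env.AWS_ACCESS_KEY_ID
          secret_access_key env.AWS_SECRET_ACCESS_KEY
          session_token env.AWS_SESSION_TOKEN
          model "anthropic.claude-3-sonnet-20240229-v1:0"
        }
      }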
  3. Profile Exclusivity

    • When using profile, you cannot specify other credentials:
      BAML
      client<llm> MyClient {
        provider aws-bedrock
        options {
          profile "my-profile"
          access_key_id env.AWS_ACCESS_KEY_ID // Error: Cannot mix profile with explicit credentials
          model "anthropic.claude-3-sonnet-20240229-v1:0"
        }
      }
  4. Environment Variable Override

    • Explicit values in BAML always override environment variables:
      BAML
      client<llm> MyClient {
        provider aws-bedrock
        options {
          access_key_id "AKIAXXXXXXXX" // This will be used even if AWS_ACCESS_KEY_ID exists
          secret_access_key env.AWS_SECRET_ACCESS_KEY
          model "anthropic.claude-3-sonnet-20240229-v1:0"
        }
      }
  5. AWS Lambda/ECS/EC2

    • In AWS services, credentials are automatically provided by the runtime
    • Explicitly provided credentials will override the automatic ones
    • Best practice: Don’t specify credentials in AWS environments, use IAM roles instead

Using Custom Environment Variables

You can map your own environment variable names:

BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    access_key_id env.MY_CUSTOM_AWS_KEY_ID
    secret_access_key env.MY_CUSTOM_AWS_SECRET
    session_token env.MY_CUSTOM_AWS_SESSION // Optional
    region env.MY_CUSTOM_AWS_REGION
    model "anthropic.claude-3-sonnet-20240229-v1:0"
  }
}

Cross-Account Access

To use Bedrock from a different AWS account:

  1. Set up the target account role (where Bedrock is):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::SOURCE_ACCOUNT_ID:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "YOUR_EXTERNAL_ID"
        }
      }
    }
  ]
}
  2. Configure the source account (where your application runs):
# ~/.aws/config
[profile target-role]
role_arn = arn:aws:iam::TARGET_ACCOUNT_ID:role/ROLE_NAME
source_profile = default
region = us-east-1
BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    profile "target-role"
    model "anthropic.claude-3-sonnet-20240229-v1:0"
  }
}

IAM Permissions

Basic Permissions

The following IAM permissions are required for basic Bedrock access:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    }
  ]
}

Additional Permissions

Depending on your setup, you might need additional permissions:

See Cross-Account Access section for the required trust relationships and permissions.

Best Practices

  • Follow the principle of least privilege
  • Use resource-based policies when possible
  • Consider using AWS Organizations SCPs for enterprise-wide controls
  • Regularly audit IAM permissions using AWS IAM Access Analyzer

Configuration Options

BAML-specific request options

These BAML-specific parameters (aka options) modify the API request sent to the provider.

You can use them to set the region, access_key_id, secret_access_key, and session_token used for the request.

region
string

The AWS region to use. Default: AWS_REGION environment variable
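
For example, a minimal sketch pinning the region explicitly (the region value is illustrative; model ID reused from the Quick Start):

BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    region "us-west-2"
    model "anthropic.claude-3-sonnet-20240229-v1:0"
  }
}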

access_key_id
string

AWS access key ID. Default: AWS_ACCESS_KEY_ID environment variable

secret_access_key
string

AWS secret access key. Default: AWS_SECRET_ACCESS_KEY environment variable

session_token
string

Temporary session token. Required if using temporary credentials. Default: AWS_SESSION_TOKEN environment variable

profile
string

AWS profile name from credentials file. Default: AWS_PROFILE environment variable

default_role
string

The role to use for a message whose role is not in allowed_roles. Default: "user" usually, but some models like OpenAI's gpt-4o will use "system".

If unset, BAML picks "user" when it is in allowed_roles, otherwise the first role in allowed_roles.
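
A minimal sketch making the default explicit (model ID reused from the Quick Start):

BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    model "anthropic.claude-3-sonnet-20240229-v1:0"
    default_role "user"
  }
}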

allowed_roles
string[]

Which roles should we forward to the API? Default: ["system", "user", "assistant"] usually, but some models like OpenAI’s o1-mini will use ["user", "assistant"]

When building prompts, any role not in this list will be set to the default_role.
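
For example, a sketch for a model that only accepts user and assistant messages (model ID reused from the Quick Start; system messages would be remapped to default_role):

BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    model "anthropic.claude-3-sonnet-20240229-v1:0"
    allowed_roles ["user", "assistant"]
  }
}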

allowed_role_metadata
string[]

Which role metadata should we forward to the API? Default: []

For example you can set this to ["foo", "bar"] to forward the cache policy to the API.

If you do not set allowed_role_metadata, we will not forward any role metadata to the API even if it is set in the prompt.

Then in your prompt you can use something like:

client<llm> Foo {
  provider openai
  options {
    allowed_role_metadata: ["foo", "bar"]
  }
}

client<llm> FooWithout {
  provider openai
  options {
  }
}

template_string Foo() #"
  {{ _.role('user', foo={"type": "ephemeral"}, bar="1", cat=True) }}
  This will have foo and bar, but not cat metadata. But only for Foo, not FooWithout.
  {{ _.role('user') }}
  This will have none of the role metadata for Foo or FooWithout.
"#

You can use the playground to inspect the raw curl request and see exactly what is sent to the API.

supports_streaming
boolean

Whether the internal LLM client should use the streaming API. Default: true

Then you can define a client and function like this:

client<llm> MyClientWithoutStreaming {
  provider anthropic
  options {
    model claude-3-haiku-20240307
    api_key env.ANTHROPIC_API_KEY
    max_tokens 1000
    supports_streaming false
  }
}

function MyFunction() -> string {
  client MyClientWithoutStreaming
  prompt #"Write a short story"#
}
# This will be streamed from your python code perspective,
# but under the hood it will call the non-streaming HTTP API
# and then return a streamable response with a single event
b.stream.MyFunction()

# This will work exactly the same as before
b.MyFunction()
finish_reason_allow_list
string[]

Which finish reasons are allowed? Default: null

version 0.73.0 onwards: This is case insensitive.

Will raise a BamlClientFinishReasonError if the finish reason is not in the allow list. See Exceptions for more details.

Note, only one of finish_reason_allow_list or finish_reason_deny_list can be set.

For example, you can set this to ["stop"] to allow only the stop finish reason; all other finish reasons (e.g. length) will be treated as failures that PREVENT fallbacks and retries (similar to parsing errors).

Then in your code you can use something like:

client<llm> MyClient {
  provider "openai"
  options {
    model "gpt-4o-mini"
    api_key env.OPENAI_API_KEY
    // Finish reason allow list will only allow the stop finish reason
    finish_reason_allow_list ["stop"]
  }
}
finish_reason_deny_list
string[]

Which finish reasons are denied? Default: null

version 0.73.0 onwards: This is case insensitive.

Will raise a BamlClientFinishReasonError if the finish reason is in the deny list. See Exceptions for more details.

Note, only one of finish_reason_allow_list or finish_reason_deny_list can be set.

For example, you can set this to ["length"] to stop the function from continuing if the finish reason is length (e.g. the LLM output was cut off because it was too long).

Then in your code you can use something like:

client<llm> MyClient {
  provider "openai"
  options {
    model "gpt-4o-mini"
    api_key env.OPENAI_API_KEY
    // Finish reason deny list will allow all finish reasons except length
    finish_reason_deny_list ["length"]
  }
}

Provider request parameters

These options are passed through to the provider without modification by BAML. For example, if the provider's request accepts a temperature field, you can set it on the client here so every call uses that value.

Consult the specific provider’s documentation for more information.

model (or model_id)
string (required)

The model to use.

Model                                      | Description
-------------------------------------------|-------------------
anthropic.claude-3-5-sonnet-20240620-v1:0  | Smartest
anthropic.claude-3-haiku-20240307-v1:0     | Fastest + Cheapest
meta.llama3-8b-instruct-v1:0               |
meta.llama3-70b-instruct-v1:0              |
mistral.mistral-7b-instruct-v0:2           |
mistral.mixtral-8x7b-instruct-v0:1         |

Run aws bedrock list-foundation-models | jq '.modelSummaries[].modelId' to see available models.

Note: You must request model access before use.
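
For example, a sketch using the model_id alias (per the heading above, equivalent to model; model ID taken from the table):

BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    model_id "meta.llama3-8b-instruct-v1:0"
  }
}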

inference_configuration
object

Model-specific inference parameters. See AWS Bedrock documentation.

BAML
client<llm> MyClient {
  provider aws-bedrock
  options {
    inference_configuration {
      max_tokens 1000
      temperature 1.0
      top_p 0.8
    }
  }
}

Troubleshooting

Common Errors

{
  "Error": "AccessDeniedException",
  "Message": "User is not authorized to perform: bedrock:InvokeModel"
}

Solution:

  • Check IAM permissions
  • Verify execution role permissions in Lambda/ECS
  • Ensure credentials have Bedrock access
{
  "Error": "UnrecognizedClientException",
  "Message": "The security token included in the request is invalid"
}

Solution:

  • Verify credentials are set correctly
  • Check if session token is required and provided
  • Ensure credentials haven’t expired
{
  "Error": "ValidationException",
  "Message": "Model is not supported in this Region"
}

Solution:

  • Check model availability in your region
  • Request model access if needed
  • Consider using a different region
{
  "Error": "ValidationException",
  "Message": "Account is not authorized to use model"
}

Solution:

  • Request model access through AWS Console
  • Wait for approval (1-2 business days)
  • Verify model ID is correct

Environment-Specific Setup

AWS Lambda:

  • Set appropriate memory and timeout
  • Configure the execution role with Bedrock permissions
  • Consider VPC endpoints for private subnets

ECS / EC2:

  • Use task roles (ECS) or instance profiles (EC2)
  • Configure VPC endpoints if needed
  • Check security group outbound rules

Local development:

  • Set AWS credentials in environment variables or config files
  • Use AWS_PROFILE to manage multiple profiles
  • Run aws configure list to verify configuration