Amazon Bedrock provides OpenAI-compatible API endpoints for model inference, powered by Mantle, a distributed inference engine for serving large-scale machine learning models. Mantle is designed for Zero Operator Access: there is no technical way for an AWS operator to access the systems that power Mantle, and everything is managed via automation. See the blog post for details: https://aws.amazon.com/blogs/machine-learning/exploring-the-zero-operator-access-design-of-mantle/.
User guide and API info here https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html
Supported regions - ensure you call the correct endpoint for the region you want: https://docs.aws.amazon.com/bedrock/latest/userguide/bedrock-mantle.html#bedrock-mantle-supported
Below I use glow to pretty-print the markdown the LLM returns; install it for your platform: https://github.com/charmbracelet/glow
I also use uv for painless Python scripting (no virtualenv to manage).
Not all models support Mantle; the /v1/models call below shows what is supported in your region. Not all OpenAI APIs are supported for every model either. This example uses the OpenAI Chat Completions API: https://platform.openai.com/docs/api-reference/chat.
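Because the endpoint is OpenAI compatible, you can also drive it from Python with the official openai client instead of curl. The snippet below is my own minimal sketch, not official AWS sample code: it assumes the openai and aws-bedrock-token-generator packages are available, that your AWS credentials are already set up, and that the client's api_key is accepted as the bearer token (the SDK sends it as an Authorization: Bearer header, matching the curl calls below). The equivalent shell walkthrough follows.

# list_models.py - a sketch; assumes the openai and aws-bedrock-token-generator
# packages plus working AWS credentials in your environment
from aws_bedrock_token_generator import provide_token
from openai import OpenAI

region = "us-west-2"  # pick a region from the supported list above

client = OpenAI(
    base_url=f"https://bedrock-mantle.{region}.api.aws/v1",
    api_key=provide_token(),  # short-term token derived from your AWS creds
)

# List the models Mantle exposes in this region
for model in client.models.list():
    print(model.id)

Run it with uv so you don't have to manage dependencies: uv run --with openai --with aws-bedrock-token-generator list_models.py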
# Ensure you have AWS credentials set up in your environment, e.g. aws sso login; personally I use granted.dev
brew install glow # pretty print the markdown directly in your terminal
# Generate a short-term bearer token from your logged-in AWS permissions; it expires when your AWS credentials expire
export AWS_REGION=us-west-2 # or choose a region you want to use from the list in the link above
export AWS_BEARER_TOKEN_BEDROCK=$(uv run --with aws-bedrock-token-generator python -c "from aws_bedrock_token_generator import provide_token; print(provide_token())")
# See supported models in the region you have configured
curl -X GET https://bedrock-mantle.us-west-2.api.aws/v1/models \
  -H "Authorization: Bearer $AWS_BEARER_TOKEN_BEDROCK" | jq
# Make inference request to a selected model
curl -X POST https://bedrock-mantle.us-west-2.api.aws/v1/chat/completions \
  -H "Authorization: Bearer $AWS_BEARER_TOKEN_BEDROCK" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen.qwen3-coder-30b-a3b-instruct",
    "messages": [
      {"role": "system", "content": "You are a helpful coding assistant."},
      {"role": "user", "content": "create me a small hello world python script with a twist"}
    ]
  }' \
  | jq -r '.choices[0].message.content' | glow

Here's a fun "Hello World" Python script with a twist - it creates an interactive greeting that changes based on user input:
#!/usr/bin/env python3
import random

def main():
    # Different greeting styles
    greetings = [
        "Hello, {}!",
        "Hey there, {}!",
        "Greetings, {}!",
        "Hiya, {}!",
        "Welcome, {}!"
    ]

    # Different responses for different inputs
    responses = {
        'python': "Ah, a fellow Python enthusiast! 🐍",
        'programming': "Programming is fun! What are you building?",
        'hello': "Hello yourself! 👋",
        'world': "The world is your oyster! 🌍",
        'help': "I'm here to help with greetings and more! 🤝"
    }

    print("🎉 Welcome to the Interactive Hello World! 🎉")
    print("=" * 50)

    # Get user's name
    name = input("What's your name? ").strip()
    if not name:
        name = "World"

    # Get user's interest
    interest = input(f"Hello, {name}! What are you interested in? ").strip().lower()

    # Pick a random greeting
    greeting = random.choice(greetings).format(name)

    # Display personalized message
    print("\n" + "=" * 50)
    print(greeting)

    # Check for specific interests
    if interest in responses:
        print(responses[interest])
    else:
        print("Nice to meet you! How can I assist you today?")

    # Add a little surprise
    if name.lower() == 'python':
        print("\n🐍 Python! That's my name too! Let's code together!")

    print("=" * 50)

if __name__ == "__main__":
    main()

This script has several twists:
- It asks for your name and uses it in the greeting
- It learns about your interests and responds accordingly
- It randomly selects from multiple greeting styles
- It has special responses for certain keywords like "python", "programming", etc.
- It includes a surprise when someone names themselves "Python"
- It uses emojis for a friendly touch
- The entire flow feels interactive rather than just printing static text
To run it, simply save it as hello_world.py and execute: python hello_world.py
The twist? Instead of boring old "Hello World!", you get an interactive experience that adapts to what you tell it!
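For completeness, here is the same chat request from Python rather than curl, again as a sketch rather than anything official. It reuses the client setup from the sketch near the top (openai and aws-bedrock-token-generator packages, token passed as the API key) and asks for a streaming response via the standard stream parameter, which I'm assuming Mantle honours since the API is OpenAI compatible; the model, region and prompt mirror the curl example above.

# chat_stream.py - a sketch; model, region and prompt match the curl example,
# and streaming is assumed to work because the endpoint is OpenAI compatible
from aws_bedrock_token_generator import provide_token
from openai import OpenAI

client = OpenAI(
    base_url="https://bedrock-mantle.us-west-2.api.aws/v1",
    api_key=provide_token(),  # expires when your underlying AWS creds expire
)

stream = client.chat.completions.create(
    model="qwen.qwen3-coder-30b-a3b-instruct",
    messages=[
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": "create me a small hello world python script with a twist"},
    ],
    stream=True,  # print tokens as they arrive instead of waiting for the whole reply
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()

Run it the same way as before, and pipe through glow if you want the markdown rendered: uv run --with openai --with aws-bedrock-token-generator chat_stream.py | glow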