This example shows how to instrument Amazon Bedrock calls with zero code changes, using ``opentelemetry-instrument``.
When the examples are run, they export traces and logs to an OTLP-compatible endpoint. Traces include details such as the model used and the duration of the chat request. Logs capture the chat request and the generated response, providing a comprehensive view of the performance and behavior of your Bedrock requests.
Note: the .env file configures additional environment variables:
- OTEL_LOGS_EXPORTER=otlp to specify the logs exporter type.
- converse.py uses the bedrock-runtime `Converse API <https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html>`_.
- converse_stream.py uses the bedrock-runtime `ConverseStream API <https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_ConverseStream.html>`_.
- invoke_model.py uses the bedrock-runtime `InvokeModel API <https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html>`_.
- invoke_model_stream.py uses the bedrock-runtime `InvokeModelWithResponseStream API <https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModelWithResponseStream.html>`_.
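As a rough sketch of what the Converse-style scripts send (the helper, model ID, and prompt below are illustrative placeholders, not the repo's exact code), the request is built as plain keyword arguments and passed to the bedrock-runtime client:

```python
def build_converse_request(model_id: str, prompt: str) -> dict:
    """Build the keyword arguments the Converse API expects.

    Hypothetical helper for illustration; the actual example scripts
    may inline this structure directly.
    """
    return {
        "modelId": model_id,
        "messages": [
            # Converse messages carry a role and a list of content blocks.
            {"role": "user", "content": [{"text": prompt}]},
        ],
    }


# The scripts then send the request with boto3; the instrumentation
# wraps this botocore call, so it emits spans without code changes:
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**build_converse_request(...))
request = build_converse_request(
    "amazon.titan-text-lite-v1",  # assumption: any model enabled in your account
    "Write a short poem about observability.",
)
```

The streaming variants differ only in the API called (``converse_stream`` instead of ``converse``); the request shape is the same.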
Minimally, update the .env file with your "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_DEFAULT_REGION" and, if you are using temporary credentials, "AWS_SESSION_TOKEN". An OTLP-compatible endpoint should be listening for traces and logs on http://localhost:4317. If not, update "OTEL_EXPORTER_OTLP_ENDPOINT" as well.
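For reference, a minimal .env might look like the sketch below; the credential values are placeholders, and the region and endpoint are assumptions matching the defaults mentioned above:

```ini
AWS_ACCESS_KEY_ID=<your-access-key-id>
AWS_SECRET_ACCESS_KEY=<your-secret-access-key>
AWS_DEFAULT_REGION=us-east-1
# Only needed if you are using temporary credentials:
# AWS_SESSION_TOKEN=<your-session-token>
OTEL_LOGS_EXPORTER=otlp
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
```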
Next, set up a virtual environment like this:

::

    python3 -m venv .venv
    source .venv/bin/activate
    pip install "python-dotenv[cli]"
    pip install -r requirements.txt
Run the example like this:

::

    dotenv run -- opentelemetry-instrument python converse.py
You should see a poem generated by Bedrock, while traces and logs are exported to your configured observability tool.