Caution
The computer use feature is made available to you as a ‘Beta Service’ as defined in the AWS Service Terms. It is subject to your Agreement with AWS, the AWS Service Terms, and the applicable model EULA. Computer use poses unique risks that are distinct from standard API features or chat interfaces. These risks are heightened when computer use is used to interact with the internet. To minimize risks, consider taking precautions such as:
- Operate computer use functionality in a dedicated virtual machine or container with minimal privileges to prevent direct system attacks or accidents.
- To prevent information theft, avoid giving the computer use API access to sensitive accounts or data.
- Limit the computer use API's internet access to required domains to reduce exposure to malicious content.
- To ensure proper oversight, keep a human in the loop for sensitive tasks (such as making decisions that could have meaningful real-world consequences) and for anything requiring affirmative consent (such as accepting cookies, executing financial transactions, or agreeing to terms of service).

Any content that you enable Claude to see or access can potentially override instructions or cause Claude to make mistakes or perform unintended actions. Taking proper precautions, such as isolating Claude from sensitive surfaces, is essential, including to avoid risks related to prompt injection. Before enabling or requesting permissions necessary to enable computer use features in your own products, please inform end users of any relevant risks and obtain their consent as appropriate.
This project contains the AWS Cloud Development Kit (CDK) infrastructure code for deploying the computer use agent using Amazon Bedrock Agent in the us-west-2 (Oregon) region.
- Amazon VPC: Configured with public and private subnets across 2 AZs in us-west-2. This configuration also has VPC flow logs enabled.
- Amazon ECR Repository: Single repository for both environment and orchestration images.
- Amazon ECS: ECS cluster for running containers.
- Amazon ECS Task Definitions: Task definitions for the environment and orchestration containers with appropriate port mappings.
- Amazon Security Groups:
  - Environment container: Accepts traffic only from the orchestration container
  - Orchestration container: Accepts public traffic on port 8501
- AWS IAM Roles: Task execution role with minimal permissions
- Amazon CloudWatch Logs: Configured for container logging
- AWS KMS: Encryption key for secure storage
- Environment Container:
  - 8443: Amazon DCV
  - 5000: Quart RESTful API
- Orchestration Container:
  - 8501: Streamlit interface
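The security-group and port rules above can be modeled as a small predicate. This is an illustrative stdlib sketch of the intended traffic flow, not the project's CDK code:

```python
# Illustrative model of the security groups described above:
# the environment container accepts traffic only from the orchestration
# container, while the orchestration container accepts external traffic
# on the Streamlit port only (subject to the deployed IP allowlist).
ENV_PORTS = {8443, 5000}   # Amazon DCV, Quart RESTful API
ORCH_PORTS = {8501}        # Streamlit interface

def is_allowed(target: str, source: str, port: int) -> bool:
    """Return True if the described rules would admit the connection."""
    if target == "environment":
        # Only the orchestration container may reach the environment container.
        return source == "orchestration" and port in ENV_PORTS
    if target == "orchestration":
        # Public (allowlisted) traffic is admitted on port 8501 only.
        return port in ORCH_PORTS
    return False
```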
├── src/amazon_bedrock_agent_app/
├── src/sandbox_environment/
├── scripts/
│ ├── create_amazon_bedrock_agent.py
│ ├── delete_amazon_bedrock_agent.py
│   └── ....
├── app.py
├── computer_use_aws_stack.py
└── ....
- AWS Command Line Interface (CLI); follow the installation instructions here. Make sure to set up credentials; follow the instructions here.
- Python 3.11 or later.
- Node.js 14.15.0 or later.
- AWS CDK CLI; follow the installation instructions here.
- Model access enabled for Anthropic's Claude Sonnet 3.5 V2 and Claude Sonnet 3.7 V1.
- Boto3 version >= 1.37.10.
- Clone the repository:
git clone https://github.com/awslabs/amazon-bedrock-agent-samples/
cd amazon-bedrock-agent-samples/examples/agents/computer_use
chmod +x scripts/get_urls.sh
- Configure the AWS CLI for us-west-2 (if not already configured):
Important
- You can skip this step if you already have a profile created. Make sure to use the correct profile name in subsequent commands.
- Make sure to select us-west-2 region.
make configure PROFILE_NAME=computeruse
- Set up the environment:
make setup PROFILE_NAME=computeruse
- Activate the created virtual environment and install dependencies:
source .venv/bin/activate
pip install -r requirements.txt
pip install -U -r requirements.txt
- Deploy the solution in Fail-Secure Mode (the default). This is the recommended setting for the sandbox environment.
Important
In Fail-Secure Mode, if no IP address is provided, the security groups default to a highly restrictive setting (255.255.255.255/32) that effectively blocks all access.
Important
Make sure you review all the resources created by the project before deploying.
Important
Make sure to run only one of the following commands.
# Deploy with your current IP (Fail-Secure)
make deploy-infrastructure PROFILE_NAME=computeruse
# Deploy with manual IP (Fail-Secure)
make deploy IP=203.0.113.1 PROFILE_NAME=computeruse
# This will automatically be converted to 203.0.113.1/32
# Deploy with IP address range (Fail-Secure)
make deploy IP=203.0.113.0/24 PROFILE_NAME=computeruse
# Allows 203.0.113.0 through 203.0.113.255
Note
This stack takes ~10-15 minutes to deploy. After the deployment it may take a few additional minutes for the Environment/Virtual Machine to come online.
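If you script the deployment, a simple readiness poll can cover the extra minutes the environment takes to come online. This is illustrative only; `wait_until_online` and its probe are not part of this project:

```python
import time

def wait_until_online(probe, timeout: float = 900, interval: float = 30) -> bool:
    """Poll `probe` (a callable returning True once the environment is up,
    e.g. an HTTP check against the Environment Service URL) until it
    succeeds or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False
```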
- After the deployment has completed, you can get the URLs of the services:
make get-urls PROFILE_NAME=computeruse
- Create the Amazon Bedrock Agent (with Computer, Text Editor, and Bash Action Groups) and an Amazon Bedrock Guardrail:
Important
Make sure to enable model access before creating Amazon Bedrock Agent.
Important
Make sure computer use is supported by the provided model and by this project.
make bedrock PROFILE_NAME=computeruse MODEL_ID=us.anthropic.claude-3-7-sonnet-20250219-v1:0 # defaults to us.anthropic.claude-3-7-sonnet-20250219-v1:0
- List the Amazon Bedrock Agent Id and Alias Id, and store these values for later use:
make list-agent PROFILE_NAME=computeruse
Navigate to the links provided in the output of the make get-urls script to access the services: the Orchestration Service URL and the Environment Service URL.
Amazon DCV is used to connect to the environment container for remote desktop access, for activities such as resetting the state of the environment or eliciting its state prior to a new task.
Important
Log in to the sandbox environment using Amazon DCV with username computeruse and password admin.
The Streamlit interface (streamlit.py) is used to configure the Amazon Bedrock Agent Id and Alias Id, and ultimately to instruct the Amazon Bedrock Agent via a chat interface to perform tasks.
Quart is used to write RESTful JSON APIs and is responsible for tool execution. You can run app.py as follows:
quart run
import asyncio

import aiohttp

URL = "http://127.0.0.1:5000/execute"  # local host

async def main():
    async with aiohttp.ClientSession() as session:
        payload = {
            "tool": "computer",
            "input": {"action": "screenshot"},
        }
        async with session.post(URL, json=payload) as api_response:
            api_result = await api_response.json()
        print(api_result)

asyncio.run(main())
After navigating to the Orchestration Service URL (Streamlit interface), you'll need to provide the Agent Id and Agent Alias Id.
Select the foundation model that you used to create the Amazon Bedrock Agent. Each foundation model has its own set of tool implementations.
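The per-model tool sets can be pictured as a lookup table. The tool version strings below follow Anthropic's computer-use tool naming; treat the exact model-to-tool pairing as an assumption and verify it against the project sources:

```python
# Illustrative mapping from Bedrock model ID to the computer-use tool
# versions each model expects. The pairing is an assumption for
# illustration; check the project's scripts for the authoritative list.
TOOLS_BY_MODEL = {
    "us.anthropic.claude-3-5-sonnet-20241022-v2:0": [
        "computer_20241022", "text_editor_20241022", "bash_20241022",
    ],
    "us.anthropic.claude-3-7-sonnet-20250219-v1:0": [
        "computer_20250124", "text_editor_20250124", "bash_20250124",
    ],
}

def tools_for(model_id: str) -> list[str]:
    """Return the tool versions for a model, or fail for unsupported models."""
    try:
        return TOOLS_BY_MODEL[model_id]
    except KeyError:
        raise ValueError(f"computer use is not supported for {model_id}") from None
```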
The stack includes a Route 53 Resolver DNS Firewall that controls domain access. By default, it operates on an allowlist basis - only explicitly allowed domains can be accessed while all others are blocked.
The firewall allows access to:
- AWS services (*.amazonaws.com, *.aws.dev, etc.)
- Amazon domains (amazon.com, a2z.com)
- Anthropic domains (anthropic.com, claude.ai)
- GitHub domains (github.com, *.githubassets.com)
- Google domains (google.com, *.googleapis.com)
- Python package repositories (pypi.org, pythonhosted.org)
- Internal service discovery domains (*.computer-use.local)
To modify the allowed domains:
- Edit the cfn_firewall_domain_list in computer_use_aws_stack.py
- Add or remove domains using the following format:
domains=[
"example.com", # Allow exact domain
"*.example.com", # Allow all subdomains
]
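The two entry formats behave like shell-style patterns: an exact entry matches only itself, while "*.example.com" matches any subdomain but not the bare domain. A minimal allowlist check makes the difference concrete (illustrative only; the real enforcement is done by the Route 53 Resolver DNS Firewall):

```python
from fnmatch import fnmatch

# Both entries are needed to cover the bare domain and its subdomains,
# mirroring the two formats shown above.
ALLOWED = ["example.com", "*.example.com"]

def domain_allowed(domain: str, allowed: list[str] = ALLOWED) -> bool:
    """Return True if the domain matches any allowlist pattern."""
    return any(fnmatch(domain, pattern) for pattern in allowed)
```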
- Redeploy the stack:
cdk deploy
View current rules:
aws route53resolver get-firewall-rule-group --firewall-rule-group-id <ID>
Get the rule group ID from stack outputs:
aws cloudformation describe-stacks --stack-name ComputerUseAwsStack --query 'Stacks[0].Outputs[?OutputKey==`DnsFirewallRuleGroupId`].OutputValue' --output text
Delete the Amazon Bedrock Agent and Guardrail:
make clean-bedrock PROFILE_NAME=computeruse
Destroy the infrastructure:
make destroy-infrastructure PROFILE_NAME=computeruse
This project is licensed under the Apache-2.0 License.
Important
Examples in this repository are for demonstration purposes. Ensure proper security and testing when deploying to production environments.