diff --git a/mlops-multi-account-tf/README.md b/mlops-multi-account-tf/README.md
index 8645d9b..2c1ce33 100644
--- a/mlops-multi-account-tf/README.md
+++ b/mlops-multi-account-tf/README.md
@@ -6,11 +6,11 @@ As enterprise businesses embrace Machine Learning (ML) across their organisation
 
 ## High level architecture
 
-In this repository, we show how to use **Terraform** with **GitHub and GitHub Actions** to build a baseline infratsturcture for secure MLOps. The solution can be broken down into three parts:
+In this repository, we show how to use **Terraform** with **GitHub and GitHub Actions** to build a baseline infrastructure for secure MLOps. The solution can be broken down into three parts:
 
 **Base Infrastructure**
 
-The necessary infrastructure componenets for your accounts including SageMaker Studio, Networking, Permissions and SSM Parameters.
+The necessary infrastructure components for your accounts including SageMaker Studio, Networking, Permissions and SSM Parameters.
 
 <img src="…" alt="drawing">
 
@@ -26,7 +26,7 @@ This is how the end-users (Data Scientists or ML Engineers) use SageMaker projec
 Typically, when a SageMaker project is deployed:
 
 - GitHub private repos are created from templates that Data Scientists need to customize as per their use-case.
-- These tempalates show best practices such as testing, approvals, and dashboards. They can be fully customized once deployed.
+- These templates show best practices such as testing, approvals, and dashboards. They can be fully customized once deployed.
 - Depending on the chosen SageMaker project, other project specific resources might also be created such as a dedicated S3 bucket for the project and automation to trigger ML deployment from model registry.
 
 An architecture for the `Building, training, and deployment` project is shown below.
@@ -37,7 +37,7 @@ Currently, three example project template are available.
 
 1. **MLOps Template for Model Building, Training, and Deployment**: ML Ops pattern to train models using SageMaker pipelines and to deploy the trained model into preproduction and production accounts. This template supports Real-time inference, Batch Inference Pipeline, and BYOC containers.
 
-2. **MLOps Template for promoting the full ML pipeline across environments**: ML Ops pattern to shows how to take the same SageMaker pipeline across environements from dev to prod.
+2. **MLOps Template for promoting the full ML pipeline across environments**: ML Ops pattern to show how to take the same SageMaker pipeline across environments from dev to prod.
 
 3. **MLOps Template for Model Building and Training**: MLOps pattern that shows a simple one-account SageMaker Pipeline setup.
 
@@ -121,8 +121,8 @@ This one-time deployment create the following resources in your AWS account:
 
 - For Terrafrom Backend:
   - S3 Bucket to store state files.
-  - DynamoDB table to store state locking. 
+  - DynamoDB table to store state locking.
-- AWS Idenitity provider for GitHub actions using OIDC (as explained above)
+- AWS Identity provider for GitHub actions using OIDC (as explained above)
 - IAM Role to assume from GitHub Actions using the identity provider.
 
 Once this is deployed, you're ready to move on to the next step.
@@ -131,7 +131,7 @@ Once this is deployed, you're ready to move on to the next step.
 
 We will move the code from this example to your GitHub Organization.
 
-1. [base-infrastructure](./base-infrastructure/): An internal reposotry for Base Infrastructure which wil contain all code from `./sagemaker-mlops-terraform` folder.
+1. [base-infrastructure](./base-infrastructure/): An internal repository for Base Infrastructure which will contain all code from `./sagemaker-mlops-terraform` folder.
 2. [template-repos](./template-repos/): GitHub [template repositories](https://docs.github.com/en/repositories/creating-and-managing-repositories/creating-a-template-repository) with code from `./template-repos/**`. Make sure to use the same name as the folder name.
 
 > **_Note_:** This is an important step to be able to deploy infrastructure. All further steps should be performed directly in your GitHub Organization.
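The hunk that touches the one-time bootstrap deployment describes four resources: an S3 bucket for Terraform state, a DynamoDB table for state locking, a GitHub OIDC identity provider, and an IAM role that GitHub Actions assumes through it. A minimal Terraform sketch of that set follows for orientation; the bucket, table, and role names, the region, and the `example-org` subject filter are illustrative assumptions, not values taken from this repository.

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = ">= 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1" # illustrative region
}

# S3 bucket that holds the Terraform state files.
resource "aws_s3_bucket" "tf_state" {
  bucket = "example-terraform-state-bucket" # placeholder name
}

# DynamoDB table used for state locking.
resource "aws_dynamodb_table" "tf_lock" {
  name         = "example-terraform-lock-table" # placeholder name
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "LockID"

  attribute {
    name = "LockID"
    type = "S"
  }
}

# OIDC identity provider so GitHub Actions can authenticate without long-lived keys.
resource "aws_iam_openid_connect_provider" "github" {
  url             = "https://token.actions.githubusercontent.com"
  client_id_list  = ["sts.amazonaws.com"]
  thumbprint_list = ["6938fd4d98bab03faadb97b34396831e3780aea1"] # GitHub's published thumbprint; verify the current value
}

# Trust policy letting workflows from your GitHub organization assume the role via OIDC.
data "aws_iam_policy_document" "github_assume" {
  statement {
    actions = ["sts:AssumeRoleWithWebIdentity"]

    principals {
      type        = "Federated"
      identifiers = [aws_iam_openid_connect_provider.github.arn]
    }

    condition {
      test     = "StringEquals"
      variable = "token.actions.githubusercontent.com:aud"
      values   = ["sts.amazonaws.com"]
    }

    condition {
      test     = "StringLike"
      variable = "token.actions.githubusercontent.com:sub"
      values   = ["repo:example-org/*"] # placeholder; restrict to your organization and repositories
    }
  }
}

# IAM role assumed from GitHub Actions through the identity provider.
resource "aws_iam_role" "github_actions" {
  name               = "example-github-actions-role" # placeholder name
  assume_role_policy = data.aws_iam_policy_document.github_assume.json
}
```

Downstream Terraform configurations would then point a `backend "s3"` block at the bucket and DynamoDB table above.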
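The repository-migration steps at the end of the patch (an internal base-infrastructure repository plus GitHub template repositories named after the folders under `./template-repos/**`) are performed in your own GitHub Organization. If you prefer to script that step, one option, assumed here rather than shipped by this repository, is the Terraform GitHub provider; the organization and repository names below are placeholders.

```hcl
terraform {
  required_providers {
    github = {
      source  = "integrations/github"
      version = ">= 6.0"
    }
  }
}

provider "github" {
  owner = "example-org" # placeholder GitHub organization
}

# One template repository per folder under ./template-repos/**.
# Keep the repository name identical to the folder name, as the README instructs.
resource "github_repository" "mlops_template" {
  name        = "example-template-repo" # placeholder; use the folder name
  visibility  = "private"
  is_template = true # marks it as a GitHub template repository
}
```

Creating a repository this way only provisions an empty template; the contents of the matching folder still have to be pushed into it.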