
Commit 402f65e

Initial push for tensorflow-train-in-sagemaker-deploy-with-lambda

1 parent 167a7b9 commit 402f65e

File tree

7 files changed: +163 -0 lines changed


.gitignore (+2)

@@ -306,3 +306,5 @@ xgboost-built-in-algo-train-in-sagemaker-deploy-with-lambda/model/xgboost-model
 xgboost-built-in-algo-train-in-sagemaker-deploy-with-lambda/model/model.tar.gz
 xgboost-built-in-algo-train-in-sagemaker-deploy-with-lambda/.aws-sam/build.toml
 xgboost-built-in-algo-train-in-sagemaker-deploy-with-lambda/samconfig.toml
+tensorflow-train-in-sagemaker-deploy-with-lambda/container/model/model.tar.gz
+tensorflow-train-in-sagemaker-deploy-with-lambda/.aws-sam/build.toml
README.md (+83)

## Train a TensorFlow algorithm in SageMaker, run inference with AWS Lambda

This example illustrates how to use a TensorFlow Python script to train a classification model on the MNIST dataset. You train the model in SageMaker and run inference with AWS Lambda.
This project contains source code and supporting files for a serverless application that you can deploy with the SAM CLI. It includes the following files and folders:

- container - All the components you need to package the sample Lambda function.
- events - Invocation events that you can use to invoke the function.
- template.yaml - A template that defines the application's AWS resources.

The application uses several AWS resources, including Lambda functions. These resources are defined in the `template.yaml` file in this project. You can update the template to add AWS resources through the same deployment process that updates your application code.
## Deploy the sample application

The Serverless Application Model Command Line Interface (SAM CLI) is an extension of the AWS CLI that adds functionality for building and testing Lambda applications. It uses Docker to run your functions in an Amazon Linux environment that matches Lambda. It can also emulate your application's build environment and API.

To use the SAM CLI, you need the following tools:

* SAM CLI - [Install the SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html)
* Docker - [Install Docker community edition](https://hub.docker.com/search/?type=edition&offering=community)

You may also need the following for local testing:

* [Python 3 installed](https://www.python.org/downloads/)
To build and deploy your application for the first time, run the following in your shell:

```bash
sam build
sam deploy --guided
```
The first command builds a Docker image from a Dockerfile and copies the source of your application into the image. The second command packages and deploys your application to AWS, with a series of prompts:

* **Stack Name**: The name of the stack to deploy to CloudFormation. This should be unique to your account and region; a good starting point is something matching your project name.
* **AWS Region**: The AWS Region you want to deploy your app to.
* **Confirm changes before deploy**: If set to yes, any change sets will be shown to you before execution for manual review. If set to no, the AWS SAM CLI will automatically deploy application changes.
* **Allow SAM CLI IAM role creation**: Many AWS SAM templates, including this example, create the AWS IAM roles required for the included AWS Lambda function(s) to access AWS services. By default, these are scoped down to the minimum required permissions. To deploy an AWS CloudFormation stack that creates or modifies IAM roles, the `CAPABILITY_IAM` value for `capabilities` must be provided. If permission isn't provided through this prompt, you must explicitly pass `--capabilities CAPABILITY_IAM` to the `sam deploy` command to deploy this example.
* **Save arguments to samconfig.toml**: If set to yes, your choices will be saved to a configuration file inside the project, so that in the future you can just re-run `sam deploy` without parameters to deploy changes to your application.
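For reference, a `samconfig.toml` written by a guided deploy might look like the following minimal sketch; the stack name, region, and flag values are illustrative, not taken from this commit:

```toml
version = 0.1
[default.deploy.parameters]
stack_name = "tensorflow-train-in-sagemaker-deploy-with-lambda"
region = "us-east-1"
confirm_changeset = true
capabilities = "CAPABILITY_IAM"
```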
## Use the SAM CLI to build and test locally

Build your application with the `sam build` command.

```bash
tensorflow-train-in-sagemaker-deploy-with-lambda$ sam build
```
47+
48+
The SAM CLI builds a docker image from a Dockerfile and then installs dependencies defined in `requirements.txt` inside the docker image. The processed template file is saved in the `.aws-sam/build` folder.
49+
50+
Test a single function by invoking it directly with a test event. An event is a JSON document that represents the input that the function receives from the event source. Test events are included in the `events` folder in this project.
51+
52+
Run functions locally and invoke them with the `sam local invoke` command.
53+
54+
```bash
55+
tensorflow-train-in-sagemaker-deploy-with-lambda$ sam local invoke TensorFlowMnistInferenceFunction --event events/event.json
56+
```
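The test event is a plain JSON document. A minimal sketch of generating one with the Python standard library; the field values mirror `events/event.json`, and in your own setup the bucket, prefix, and file would point at whatever `.npy` batch you want classified:

```python
import json

# The handler reads exactly three fields: the S3 bucket, a key prefix,
# and the file name of a .npy batch to download and classify.
event = {
    "bucket": "sagemaker-sample-data-us-east-1",
    "prefix": "tensorflow/mnist/",
    "file": "train_data.npy",
}

# Write the event where `sam local invoke --event` can pick it up.
with open("event.json", "w") as f:
    json.dump(event, f, indent=4)

print(open("event.json").read())
```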
## Fetch, tail, and filter Lambda function logs

To simplify troubleshooting, the SAM CLI has a command called `sam logs`. `sam logs` lets you fetch logs generated by your deployed Lambda function from the command line. In addition to printing the logs on the terminal, this command has several nifty features to help you quickly find the bug.

`NOTE`: This command works for all AWS Lambda functions, not just the ones you deploy using SAM.

```bash
tensorflow-train-in-sagemaker-deploy-with-lambda$ sam logs -n TensorFlowMnistInferenceFunction --stack-name tensorflow-train-in-sagemaker-deploy-with-lambda --tail
```
You can find more information and examples about filtering Lambda function logs in the [SAM CLI Documentation](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-logging.html).

## Cleanup

To delete the sample application that you created, use the AWS CLI. Assuming you used your project name for the stack name, you can run the following:

```bash
aws cloudformation delete-stack --stack-name tensorflow-train-in-sagemaker-deploy-with-lambda
```

## Resources

See the [AWS SAM developer guide](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/what-is-sam.html) for an introduction to the SAM specification, the SAM CLI, and serverless application concepts.
container/Dockerfile (+11)

```dockerfile
FROM public.ecr.aws/lambda/python:3.7

COPY requirements.txt ./requirements.txt
RUN pip install -r requirements.txt

COPY ./model/model.tar.gz .
RUN tar -xzf model.tar.gz

COPY ./app/app.py ./

CMD ["app.handler"]
```
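The `tar -xzf` step expects the archive to unpack into the versioned SavedModel directory (`000000001/`) that the handler loads at startup. A hedged stdlib sketch of packing and verifying such an archive locally; the placeholder file content is illustrative, a real archive would hold the trained SavedModel protobuf and its variables:

```python
import os
import tarfile
import tempfile

workdir = tempfile.mkdtemp()

# A SageMaker TensorFlow model.tar.gz typically contains a numbered
# SavedModel export directory; 000000001/ matches model_path in app.py.
export_dir = os.path.join(workdir, "000000001")
os.makedirs(export_dir)
with open(os.path.join(export_dir, "saved_model.pb"), "wb") as f:
    f.write(b"placeholder")  # stand-in for the real protobuf

archive = os.path.join(workdir, "model.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    tar.add(export_dir, arcname="000000001")

# Mirror the Dockerfile's `RUN tar -xzf model.tar.gz` step.
extract_dir = tempfile.mkdtemp()
with tarfile.open(archive) as tar:
    tar.extractall(extract_dir)

print(os.listdir(extract_dir))  # the 000000001/ directory the handler loads
```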
container/app/app.py (+34)

```python
import json

import boto3
import numpy as np
import tensorflow as tf

s3 = boto3.client('s3')

# Load the model once at import time so warm invocations reuse it
model_path = './000000001/'
loaded_model = tf.saved_model.load(model_path)
infer = loaded_model.signatures['serving_default']


def handler(event, context):
    print('Received event: ' + json.dumps(event, indent=2))

    # Download the input batch (a .npy file) from S3 to Lambda's writable /tmp
    destination = '/tmp/' + event["file"]
    s3.download_file(event["bucket"], event["prefix"] + event["file"], destination)
    data = np.load(destination)

    # Run the serving signature; 'dense_1' is this model's output tensor
    predictions = infer(tf.constant(data))['dense_1']
    print('predictions: {}'.format(predictions))

    # Convert each row of per-class scores to a predicted class label
    result = []
    for element in predictions:
        prediction = np.argmax(element)
        result.append(int(prediction))

    print('Returning result: {}'.format(result))

    return {
        'statusCode': 200,
        'body': json.dumps(result)
    }
```
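The argmax loop at the end of the handler is the only model-specific post-processing: each row of `predictions` holds one score per class, and the returned label is the index of the largest score. A pure-Python sketch of that step, with made-up scores standing in for the `'dense_1'` output:

```python
import json

# Two fake rows of per-class scores, standing in for the model output.
predictions = [
    [0.01, 0.02, 0.90, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01],
    [0.05, 0.70, 0.05, 0.02, 0.02, 0.04, 0.04, 0.04, 0.02, 0.02],
]

# Same logic as the handler's loop, with max/index in place of np.argmax.
result = [max(range(len(row)), key=row.__getitem__) for row in predictions]

body = json.dumps(result)
print(body)  # -> [2, 1]
```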
container/requirements.txt (+2)

```
boto3
tensorflow==2.4.0
```
events/event.json (+5)

```json
{
    "bucket": "sagemaker-sample-data-us-east-1",
    "prefix": "tensorflow/mnist/",
    "file": "train_data.npy"
}
```
template.yaml (+26)

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  tensorflow-mnist-inference-docker-lambda

  SAM Template for tensorflow-mnist-inference-docker-lambda

Resources:
  TensorFlowMnistInferenceFunction:
    Type: AWS::Serverless::Function # More info about Function Resource: https://github.com/awslabs/serverless-application-model/blob/master/versions/2016-10-31.md#awsserverlessfunction
    Properties:
      PackageType: Image
      MemorySize: 1536
      Timeout: 120
    Metadata:
      DockerTag: python3.7-v1
      DockerContext: ./container/
      Dockerfile: Dockerfile

Outputs:
  TensorFlowMnistInferenceFunction:
    Description: "TensorFlowMnistInference Lambda Function ARN"
    Value: !GetAtt TensorFlowMnistInferenceFunction.Arn
  TensorFlowInferenceFunctionIamRole:
    Description: "Implicit IAM Role created for TensorFlowMnistInference function"
    Value: !GetAtt TensorFlowMnistInferenceFunctionRole.Arn
```
