
Commit 3e48bb5

Add example and guide for setting up an AWS Lambda function, as well as a list of pre-built dependency packages (ARN layers) to simplify Lambda function deployment
1 parent 49b79da commit 3e48bb5

File tree

7 files changed: +219 -5 lines changed

.gitignore

+3 -1

@@ -1,3 +1,5 @@
 __pycache__
 *.whl
-lambda_handler.py
+python
+*.zip
+build.bat

README.md

+12 -2

@@ -38,19 +38,29 @@ To use the script, install dependencies via the `requirements.txt`:
 4. In `last_run.txt`, specify when you wish to load log files from (e.g. `2020-01-13 00:00:00`)
 5. Optionally modify the signal filters or resampling frequency

+---
+
+## Automation
+There are multiple ways to automate the script execution.
+
+### 3A: Enable dynamic start time
+One approach is periodic execution, triggered e.g. by Windows Task Scheduler. In this case, the 'dynamic' start time can be enabled, which ensures the script only processes log files uploaded since the last execution by updating `last_run.txt` on each run (a minimal sketch of the idea is shown below).
+

-### 3: Enable dynamic start time
 1. In `inputs.py` set `dynamic = True`
 2. Follow the CANedge Intro guide for setting up e.g. Windows Task Scheduler

+### 3B: Set up AWS Lambda function
+Another approach is to use event-based triggers, e.g. via AWS Lambda functions. We provide a detailed description of setting up AWS Lambda functions in the `aws_lambda_example/` subfolder.
+
 ---
 ## Other practical information

 ### Change verbosity
 By default, summary information is printed as part of the processing. You can pass `verbose=False` as an input argument in `list_log_files`, `SetupInflux` and `DataWriter` to avoid this.

 ### Delete data from InfluxDB
-If you need to delete data in InfluxDB that you e.g. uploaded as part of a test, you can use the `delete_influx(name)` function from the `SetupInflux` class. Call it by parsing the name of the 'measurement' to delete (i.e. the device serial number):
+If you need to delete data in InfluxDB that you e.g. uploaded as part of a test, you can use the `delete_influx(name)` function from the `SetupInflux` class. Call it by passing the name of the 'measurement' to delete (i.e. the device ID):

 ``influx.delete_influx("958D2219")``
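As a rough illustration of the dynamic start time enabled in section 3A (not the script's actual implementation), assuming `last_run.txt` holds a single timestamp like `2020-01-13 00:00:00`:

```
# Illustrative sketch only - the real logic lives in the dashboard-writer utilities
from datetime import datetime

def get_dynamic_start_time(path="last_run.txt"):
    """Read the previous run time, then record the current time for the next run."""
    with open(path) as f:
        start = datetime.strptime(f.read().strip(), "%Y-%m-%d %H:%M:%S")

    with open(path, "w") as f:
        f.write(datetime.utcnow().strftime("%Y-%m-%d %H:%M:%S"))

    # only log files uploaded after 'start' are processed in this run
    return start
```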

build.py

+61

@@ -0,0 +1,61 @@
```
import os, json, sys
import subprocess

# specify base details and region list
layer_name = "css-dashboard-writer"
layer_description = "CSS Electronics dashboard-writer script dependencies for use in AWS Lambda functions"
csv_path = "dashboard-writer/aws_lambda_example/lambda_layer_arns.csv"
run_req_build = False

regions = [
    "ap-northeast-1",
    "ap-northeast-2",
    "ap-south-1",
    "ap-southeast-1",
    "ap-southeast-2",
    "ca-central-1",
    "eu-central-1",
    "eu-north-1",
    "eu-west-1",
    "eu-west-2",
    "eu-west-3",
    "sa-east-1",
    "us-east-1",
    "us-east-2",
    "us-west-1",
    "us-west-2",
]

# create zip with requirements.txt dependencies (Windows commands)
if run_req_build:
    os.system("rmdir /S/Q python")
    os.system("mkdir python\\lib\\python3.7\\site-packages")
    os.system(
        'docker run -v "%cd%":/var/task "lambci/lambda:build-python3.7" /bin/sh -c "pip install -r requirements.txt -t python/lib/python3.7/site-packages/; exit"'
    )
    os.system("rmdir /S/Q python\\lib\\python3.7\\site-packages\\botocore")
    os.system("zip -r dashboard-writer.zip python")

# for each region, publish an AWS layer with the built zip
region_arn_list = []

for region in regions:
    arn_output = subprocess.check_output(
        f'aws lambda publish-layer-version --region {region} --layer-name {layer_name} --description "{layer_description}" --cli-connect-timeout 6000 --license-info "MIT" --zip-file "fileb://dashboard-writer.zip" --compatible-runtimes python3.7',
        shell=True,
    ).decode("utf-8")

    print(arn_output)

    arn = str(json.loads(arn_output)["LayerVersionArn"])
    region_arn = f"{region},{arn}\n"
    region_arn_list.append(region_arn)

# write the region/ARN pairs to CSV
output_file = open(csv_path, "w")
for region_arn in region_arn_list:
    output_file.write(region_arn)

output_file.close()

print(f"Completed writing {len(region_arn_list)} out of {len(regions)} to CSV {csv_path}")
```

aws_lambda_example/README.md

+99

@@ -0,0 +1,99 @@
# AWS Lambda Automation

AWS Lambda functions are a smart way to auto-execute code on every log file upload.

Below we describe how you can set up an AWS Lambda function to automatically run the dashboard-writer script every time a new log file is uploaded.

---

## Before you get started

This is an advanced topic and we recommend that you get the basics in place first:
- Test that your InfluxDB/Grafana setup works with the sample data
- Test that your setup works with your own data/server when manually running the script from your PC
- Make sure your log file split size, `inputs.py` and InfluxDB account settings are set up as needed (see below)

### Regarding dependencies
The dashboard-writer script relies on a number of dependencies that need to be provided to the AWS Lambda function. To make this easy, we have pre-built 'layers' for the major AWS S3 regions. You can find the latest layer list on our Releases page. See below for details. By providing your AWS Lambda function with an 'ARN identifier' for a pre-built layer, you ensure that all relevant dependencies are in place. The ARN should match the region that your S3 bucket is set up in.
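The layer list is a plain CSV of `region,ARN` rows (see `lambda_layer_arns.csv` in this folder). As a minimal sketch, you could look up the ARN matching your S3 bucket's region like this (the local file path is an assumption - point it at wherever you saved the list):

```
# Sketch: look up the pre-built layer ARN for a given AWS region
import csv

def layer_arn_for_region(region, csv_path="lambda_layer_arns.csv"):
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if row and row[0] == region:
                return row[1]
    raise ValueError(f"No pre-built layer ARN found for region '{region}'")

print(layer_arn_for_region("eu-central-1"))
```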
### Regarding log file size and InfluxDB account type
If you're initially testing your setup with a free InfluxDB Cloud starter account, keep in mind that there is a 'write restriction' of 5 MB per 5 minutes. This means that if you try to write e.g. 30 MB of data, it will take > 30 minutes, which exceeds the AWS Lambda max timeout. If you're using AWS Lambda, we recommend that you ensure your log file split size is 2-5 MB and that the data you extract is optimized (i.e. only push relevant signals at a relevant resampling frequency). Depending on your use case, this will let you set up a basic PoC.

For 'production setups', we recommend that you either self-host InfluxDB or use a paid InfluxDB Cloud account if you intend to utilize AWS Lambda for automation.
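As a back-of-the-envelope check (assuming the 5 MB per 5 minutes free-tier limit and the 15 minute Lambda maximum timeout), you can estimate how long a single invocation would need to write a given amount of data:

```
# Rough estimate only - the limits used here are assumptions based on the text above
data_mb = 30
write_minutes = data_mb / 5 * 5      # 30 MB -> ~30 minutes at 5 MB per 5 minutes
lambda_max_minutes = 15              # AWS Lambda maximum timeout

print(f"~{write_minutes:.0f} min to write {data_mb} MB (Lambda max: {lambda_max_minutes} min)")
```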
### Use a test bucket before you start
Before you deploy the AWS Lambda function on your 'main bucket', we recommend setting up an empty test bucket for testing. You can then change the trigger bucket once you're ready.

### Regarding AWS and Lambda costs
We recommend tracking your AWS billing costs when working with Lambda functions to ensure everything is set up correctly. We do not take responsibility for incorrect implementations or unexpected costs related to implementing the below. Note also that the below is intended as a 'getting started' guide - not a fully cost-optimized setup.

---

## Deploy your AWS Lambda function

1. Create an [IAM execution role](https://docs.aws.amazon.com/lambda/latest/dg/lambda-intro-execution-role.html) with the permissions `AWSLambdaBasicExecutionRole` + `AmazonS3FullAccess`
2. Go to 'Services/Lambda', then select your S3 bucket region (upper right corner)
3. Add a new Lambda function with a name, a Python 3.7 environment and your execution role
4. Add a 'Trigger': Select S3, your test bucket, `All object create events` and suffix `.MF4`
5. Create a zip containing `lambda_handler.py`, `utils.py` and `inputs.py` (ensure your inputs are updated - see the sketch after this list)
6. Upload the zip via the Lambda 'Actions' button and confirm that your code shows in the online code editor
7. Find the pre-built layer ARN for your AWS S3 region in the `aws_lambda_layers.txt` file on our Releases page
8. In the 'Designer' tab, click Layers/Add a layer/Specify an ARN and paste your ARN
9. Scroll to 'Basic settings' and set the 'Timeout' to `3 min` and memory to `1024 MB` (you can test/tweak these later)
10. Save the script and click 'Deploy'
11. Click 'Test' in the upper right corner and add the test JSON event content below
12. Run the test and verify that it succeeds
13. When you're ready, click 'Actions/Publish' to save a new version
14. In AWS Services, go to CloudWatch/Logs/Log groups and click your Lambda function to monitor events
15. Download a log file via CANcloud from your main bucket and upload it to your test bucket via CANcloud (from the Home tab)
16. Verify that the Lambda function is triggered within a minute and check from the log output that it processes the data
17. Verify that the data is written correctly to InfluxDB
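One way to create the zip from step 5 is via Python's standard library - a sketch, assuming the three files sit in your current folder (the zip file name is your choice):

```
# Sketch: zip the three files needed by the Lambda function (see step 5)
import zipfile

with zipfile.ZipFile("lambda_function.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for name in ["lambda_handler.py", "utils.py", "inputs.py"]:
        zf.write(name)
```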
Once everything is tested, you can change the 'Trigger' S3 bucket to your main bucket. You should of course monitor that it works as intended over a period.


#### Lambda function test event data

```
{
  "Records": [
    {
      "s3": {
        "bucket": {
          "name": "arbitrary-bucket-name",
          "arn": "arn:aws:s3:::arbitrary-bucket-name"
        },
        "object": {
          "key": "3F78A21D/00000086/00000001-5F4E8ABC.MF4"
        }
      }
    }
  ]
}
```
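To sanity check the handler outside AWS, you can call `lambda_handler` directly with this test event - a minimal sketch, assuming your local AWS credentials and `inputs.py` are valid and the object key exists in the bucket:

```
# Minimal local sanity check of the Lambda handler (sketch)
from lambda_handler import lambda_handler

test_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "arbitrary-bucket-name"},
                "object": {"key": "3F78A21D/00000086/00000001-5F4E8ABC.MF4"},
            }
        }
    ]
}

lambda_handler(test_event)
```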

---

## Build custom ARN layer package
If you need to create your own AWS Lambda layer, you can use the steps below as a starting point (Windows):

1. Add a new build folder for the build process, e.g. `aws-lambda-layer/`
2. Install [Docker Desktop for Windows](https://hub.docker.com/editions/community/docker-ce-desktop-windows)
3. Open your command prompt and run `docker pull lambci/lambda`
4. Install the [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html)
5. Open your command prompt, run `aws configure` and provide your credentials
6. Open Docker and go to 'Settings/Resources/File Sharing', then add your new folder
7. Copy the dashboard-writer `requirements.txt` file into your build folder
8. In the build folder, create a `build.bat` file with the below content (update the layer name and region)
9. Open your command line in the folder and run `build.bat` - this will take a few minutes
10. Once done, you can use the `LayerVersionArn` value from `APN.txt` - e.g. as below:
`arn:aws:lambda:us-east-2:319723967016:layer:css-electronics-dashboard-writer:10`

```
rmdir /S/Q python
mkdir python\lib\python3.7\site-packages
docker run -v "%cd%":/var/task "lambci/lambda:build-python3.7" /bin/sh -c "pip install -r requirements.txt -t python/lib/python3.7/site-packages/; exit"
rmdir /S/Q python\lib\python3.7\site-packages\botocore
zip -r dashboard-writer.zip python
aws lambda publish-layer-version --region us-east-2 --layer-name my-layer --description "Dashboard Writer Script Dependencies" --zip-file "fileb://dashboard-writer.zip" --compatible-runtimes python3.7 > APN.txt
```
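Since `APN.txt` holds the JSON response from the AWS CLI call, the `LayerVersionArn` can also be extracted programmatically - a small sketch:

```
# Sketch: read the published layer ARN from the AWS CLI response
# (assumes APN.txt contains only the JSON output of publish-layer-version)
import json

with open("APN.txt") as f:
    arn = json.load(f)["LayerVersionArn"]

print(arn)
```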

lambda_handler.py

+21

@@ -0,0 +1,21 @@
```
import s3fs
from utils import setup_fs, load_dbc_files, list_log_files, SetupInflux, DataWriter
import inputs


def lambda_handler(event, context=None):
    bucket = event["Records"][0]["s3"]["bucket"]["name"]
    key = event["Records"][0]["s3"]["object"]["key"]
    log_files = [bucket + "/" + key]

    fs = s3fs.S3FileSystem(anon=False)
    db_list = load_dbc_files(inputs.dbc_paths)

    # initialize connection to InfluxDB
    influx = SetupInflux(
        influx_url=inputs.influx_url, token=inputs.token, org_id=inputs.org_id, influx_bucket=inputs.influx_bucket
    )

    # process the log files and write extracted signals to InfluxDB
    writer = DataWriter(fs=fs, db_list=db_list, signals=inputs.signals, res=inputs.res, db_func=influx.write_influx)
    writer.decode_log_files(log_files)
```

aws_lambda_example/lambda_layer_arns.csv

+16

@@ -0,0 +1,16 @@
```
ap-northeast-1,arn:aws:lambda:ap-northeast-1:319723967016:layer:css-dashboard-writer:3
ap-northeast-2,arn:aws:lambda:ap-northeast-2:319723967016:layer:css-dashboard-writer:1
ap-south-1,arn:aws:lambda:ap-south-1:319723967016:layer:css-dashboard-writer:1
ap-southeast-1,arn:aws:lambda:ap-southeast-1:319723967016:layer:css-dashboard-writer:1
ap-southeast-2,arn:aws:lambda:ap-southeast-2:319723967016:layer:css-dashboard-writer:1
ca-central-1,arn:aws:lambda:ca-central-1:319723967016:layer:css-dashboard-writer:1
eu-central-1,arn:aws:lambda:eu-central-1:319723967016:layer:css-dashboard-writer:1
eu-north-1,arn:aws:lambda:eu-north-1:319723967016:layer:css-dashboard-writer:1
eu-west-1,arn:aws:lambda:eu-west-1:319723967016:layer:css-dashboard-writer:1
eu-west-2,arn:aws:lambda:eu-west-2:319723967016:layer:css-dashboard-writer:1
eu-west-3,arn:aws:lambda:eu-west-3:319723967016:layer:css-dashboard-writer:1
sa-east-1,arn:aws:lambda:sa-east-1:319723967016:layer:css-dashboard-writer:1
us-east-1,arn:aws:lambda:us-east-1:319723967016:layer:css-dashboard-writer:1
us-east-2,arn:aws:lambda:us-east-2:319723967016:layer:css-dashboard-writer:1
us-west-1,arn:aws:lambda:us-west-1:319723967016:layer:css-dashboard-writer:4
us-west-2,arn:aws:lambda:us-west-2:319723967016:layer:css-dashboard-writer:2
```

requirements.txt

+7 -2

@@ -3,11 +3,11 @@ bitstruct==8.11.0
 botocore==1.17.43
 can-decoder
 canedge-browser
-canmatrix==0.9.1
+canmatrix==0.9.2
 certifi==2020.6.20
 click==7.1.2
 docutils==0.15.2
-fsspec==0.8.0
+fsspec==0.8.4
 future==0.18.2
 influxdb-client==1.10.0
 jmespath==0.10.0
@@ -21,3 +21,8 @@ Rx==3.1.1
 s3fs==0.4.2
 six==1.15.0
 urllib3==1.25.10
+lxml==4.6.1
+xlsxwriter==1.3.7
+xlwt==1.3.0
+xlrd==1.2.0
+pyyaml
