Commit 3115d22
Merge branch 'main' into m2
2 parents bf19fce + 897892b
19 files changed: +3479 -1 lines changed

Lines changed: 206 additions & 0 deletions
@@ -0,0 +1,206 @@
---
title: 'Getting Started with Serverless Architecture Using AWS Lambda'
sidebar_label: Serverless Architecture and AWS Lambda
authors: [nayanika-mukherjee]
tags: [serverless, AWS Lambda, cloud computing, Python, technology]
date: 2024-07-22
hide_table_of_contents: true
---

## Introduction

Serverless architecture is a cloud computing execution model in which the cloud provider dynamically manages the allocation and provisioning of servers. AWS Lambda, a key component of serverless architecture, allows you to run code without provisioning or managing servers. This guide introduces AWS Lambda and provides a step-by-step approach to getting started with serverless architecture.

## Key Concepts

### What is AWS Lambda?

AWS Lambda is a compute service that lets you run code in response to events and automatically manages the compute resources required by that code. You pay only for the compute time you consume.

### Serverless Benefits

- **No Server Management:** No need to provision or manage servers.
- **Scalability:** Automatically scales your application by running code in response to each trigger.
- **Cost Efficiency:** Pay only for the compute time you consume.

### Event Sources

Lambda can be triggered by various AWS services such as S3, DynamoDB, Kinesis, SNS, and more.

## Setting Up AWS Lambda

### Prerequisites

- An AWS account.
- AWS CLI installed and configured.
- Basic knowledge of Python (or the language you choose for your Lambda functions).

### Creating an IAM Role

Before creating a Lambda function, you need an IAM role that Lambda assumes when it executes your function.

```bash
aws iam create-role --role-name lambda-execution-role --assume-role-policy-document file://trust-policy.json
```

`trust-policy.json`:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

Attach the necessary policies to the role:

```bash
aws iam attach-role-policy --role-name lambda-execution-role --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
```

## Writing and Deploying Lambda Functions

### Basic Lambda Function

Here is a simple Python function that returns a greeting.

`lambda_function.py`:

```python
def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'body': 'Hello, World!'
    }
```
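
Before packaging anything, you can sanity-check the handler locally. Lambda passes an event dict and a context object; since this handler uses neither, an empty dict and `None` are enough for a quick smoke test:

```python
# Local smoke test: mirrors the handler in lambda_function.py above.
def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'body': 'Hello, World!'
    }

# Lambda supplies an event dict and a context object; neither is used
# here, so placeholders suffice for a local check.
response = lambda_handler({}, None)
print(response['statusCode'], response['body'])  # 200 Hello, World!
```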

### Creating and Deploying the Function

- Create a ZIP file containing your code:

```bash
zip function.zip lambda_function.py
```

- Deploy the Lambda function:

```bash
aws lambda create-function --function-name HelloWorldFunction \
  --zip-file fileb://function.zip --handler lambda_function.lambda_handler \
  --runtime python3.8 --role arn:aws:iam::123456789012:role/lambda-execution-role
```

## Lambda Execution Model

Lambda functions have an execution model that includes:

- Invocation: Functions can be invoked synchronously or asynchronously.
- Concurrency: Lambda automatically scales to handle incoming requests.
- Execution Duration: You can configure the timeout for your function (default is 3 seconds, maximum is 15 minutes).

## Managing Lambda Functions

### Updating a Function

To update the function code:

```bash
zip function.zip lambda_function.py
aws lambda update-function-code --function-name HelloWorldFunction --zip-file fileb://function.zip
```

### Monitoring and Logging

AWS Lambda integrates with Amazon CloudWatch to provide monitoring and logging. You can view logs by navigating to CloudWatch Logs in the AWS Management Console.

## Advanced Topics

### Environment Variables

You can use environment variables to pass configuration settings to your Lambda function.

```bash
aws lambda update-function-configuration --function-name HelloWorldFunction \
  --environment "Variables={ENV_VAR1=value1,ENV_VAR2=value2}"
```
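
Inside the function, these settings appear in the process environment and can be read with `os.environ`. A minimal sketch, using the variable names from the CLI example above:

```python
import os

def lambda_handler(event, context):
    # Variables set via update-function-configuration show up in
    # os.environ at runtime; .get() lets you supply a fallback.
    env_var1 = os.environ.get('ENV_VAR1', 'default1')
    return {
        'statusCode': 200,
        'body': f'ENV_VAR1 is {env_var1}'
    }

# Local check: set the variable the way Lambda would.
os.environ['ENV_VAR1'] = 'value1'
response = lambda_handler({}, None)
```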

### Layers

Lambda layers allow you to package libraries and other dependencies separately from your function code.

- Create a layer:

```bash
zip -r myLayer.zip python/
aws lambda publish-layer-version --layer-name myLayer --zip-file fileb://myLayer.zip
```

### VPC Integration

You can configure your Lambda function to access resources in a VPC.

```bash
aws lambda update-function-configuration --function-name HelloWorldFunction \
  --vpc-config SubnetIds=subnet-abc123,SecurityGroupIds=sg-abc123
```

## Performance and Scaling

### Cold Starts

A cold start occurs when a new instance of the function is invoked after being idle. To mitigate cold starts:

- Optimize initialization code.
- Use Provisioned Concurrency for predictable performance.
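
One common way to optimize initialization is to do expensive setup at module scope: it runs once when the execution environment cold-starts, and warm invocations reuse the result. A minimal sketch, where `expensive_setup` is a hypothetical placeholder for work like creating SDK clients or loading configuration:

```python
def expensive_setup():
    # Hypothetical placeholder for slow one-time work such as
    # creating SDK clients, loading config, or warming caches.
    return {'client': 'ready'}

# Module scope: executed once per execution environment (cold start).
RESOURCES = expensive_setup()

def lambda_handler(event, context):
    # Warm invocations reuse RESOURCES instead of rebuilding it.
    return {
        'statusCode': 200,
        'body': f"client state: {RESOURCES['client']}"
    }

response = lambda_handler({}, None)
```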

### Concurrency Limits

You can configure reserved concurrency to limit the number of concurrent executions:

```bash
aws lambda put-function-concurrency --function-name HelloWorldFunction --reserved-concurrent-executions 10
```

## Testing and Debugging

### Local Testing

Use the AWS SAM CLI to test Lambda functions locally:

```bash
sam local invoke HelloWorldFunction -e event.json
```

### Debugging

Utilize CloudWatch Logs to debug issues by adding log statements in your code:

```python
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    logger.info("Event: %s", event)
    return {
        'statusCode': 200,
        'body': 'Hello, World!'
    }
```

## Real-World Examples

### S3 Event Trigger

Trigger a Lambda function when an object is uploaded to an S3 bucket:

```python
import json

def lambda_handler(event, context):
    for record in event['Records']:
        s3 = record['s3']
        bucket = s3['bucket']['name']
        key = s3['object']['key']
        print(f'Received event. Bucket: {bucket}, Key: {key}')
    return {
        'statusCode': 200,
        'body': json.dumps('Processed S3 event')
    }
```
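
You can exercise this handler locally with a hand-built event that follows the shape of an S3 put notification. Only the fields the handler reads are included here; a real event carries many more, and the bucket and key names are made up:

```python
import json

# Mirrors the S3 handler above.
def lambda_handler(event, context):
    for record in event['Records']:
        s3 = record['s3']
        print(f"Received event. Bucket: {s3['bucket']['name']}, Key: {s3['object']['key']}")
    return {'statusCode': 200, 'body': json.dumps('Processed S3 event')}

# Minimal stand-in for an S3 put notification: only the fields
# the handler actually reads are populated.
sample_event = {
    'Records': [
        {'s3': {'bucket': {'name': 'my-bucket'},
                'object': {'key': 'uploads/photo.jpg'}}}
    ]
}
response = lambda_handler(sample_event, None)
```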

### DynamoDB Stream

Process DynamoDB stream events:

```python
import json

def lambda_handler(event, context):
    for record in event['Records']:
        if record['eventName'] == 'INSERT':
            new_image = record['dynamodb']['NewImage']
            print(f'New item added: {new_image}')
    return {
        'statusCode': 200,
        'body': json.dumps('Processed DynamoDB stream event')
    }
```
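
As with the S3 example, a minimal hand-built stream record is enough to check the logic locally. DynamoDB stream images use the attribute-value format (e.g. `{'S': ...}` for strings); the item shown is invented for illustration:

```python
import json

# Mirrors the DynamoDB stream handler above.
def lambda_handler(event, context):
    for record in event['Records']:
        if record['eventName'] == 'INSERT':
            print(f"New item added: {record['dynamodb']['NewImage']}")
    return {'statusCode': 200, 'body': json.dumps('Processed DynamoDB stream event')}

# One INSERT record in DynamoDB's attribute-value format.
sample_event = {
    'Records': [
        {'eventName': 'INSERT',
         'dynamodb': {'NewImage': {'id': {'S': '123'}, 'name': {'S': 'widget'}}}}
    ]
}
response = lambda_handler(sample_event, None)
```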

## Conclusion

AWS Lambda provides a powerful and flexible way to build serverless applications. By understanding its key concepts, setting up your environment, and leveraging advanced features, you can build scalable and cost-efficient solutions. This guide serves as a starting point for your journey into serverless architecture using AWS Lambda.
Lines changed: 120 additions & 0 deletions
@@ -0,0 +1,120 @@
# Backpropagation in Neural Networks

## Overview

Backpropagation is a fundamental algorithm used for training artificial neural networks. It computes the gradient of the loss function with respect to each weight by the chain rule, efficiently propagating errors backward through the network. This allows for the adjustment of weights to minimize the loss function, ultimately improving the performance of the neural network.

# How Backpropagation Works

## Forward Propagation

- Input Layer: The input data is fed into the network.
- Hidden Layers: Each layer performs computations using weights and biases to transform the input data.
- Output Layer: The final transformation produces the output, which is compared to the actual target to calculate the loss.

### Mathematical Formulation

$$
a_i^l = f\left(z_i^l\right) = f\left(\sum_j w_{ij}^l a_j^{l-1} + b_i^l\right)
$$

where $f$ is the activation function, $z_i^l$ is the net input of neuron $i$ in layer $l$, $w_{ij}^l$ is the connection weight between neuron $j$ in layer $l-1$ and neuron $i$ in layer $l$, and $b_i^l$ is the bias of neuron $i$ in layer $l$.
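
The layer equation translates directly into a matrix-vector product. A small NumPy sketch for one layer with a sigmoid activation (the weights, biases, and inputs are made-up numbers):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# One layer: a^l = f(W^l a^(l-1) + b^l)
a_prev = np.array([0.5, -0.2])   # activations a_j^(l-1) from the previous layer
W = np.array([[0.1, 0.4],
              [-0.3, 0.2]])      # W[i, j] = w_ij^l
b = np.array([0.05, -0.05])      # biases b_i^l

z = W @ a_prev + b               # net inputs z_i^l
a = sigmoid(z)                   # activations a_i^l
```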

## Backward Propagation

- Compute Loss: Calculate the error (loss) using a loss function (e.g., Mean Squared Error, Cross-Entropy Loss).
- Error Propagation: Propagate the error backward through the network, layer by layer.
- Gradient Calculation: Compute the gradient of the loss with respect to each weight using the chain rule.
- Weight Update: Adjust the weights by subtracting the gradient multiplied by the learning rate.

### Mathematical Formulation

The loss function measures how well the neural network's output matches the target values. Common loss functions include:

1. **Mean Squared Error (MSE):**

$$
L = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2
$$

2. **Cross-Entropy Loss:**

$$
L = -\frac{1}{n} \sum_{i=1}^{n} \left[ y_i \log(\hat{y}_i) + (1 - y_i) \log(1 - \hat{y}_i) \right]
$$
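
Both losses are one-liners in NumPy. A quick check on made-up targets and predictions:

```python
import numpy as np

y = np.array([1.0, 0.0, 1.0, 1.0])      # targets y_i
y_hat = np.array([0.9, 0.1, 0.8, 0.6])  # predictions y_hat_i

# Mean squared error: average of squared residuals.
mse = np.mean((y - y_hat) ** 2)

# Binary cross-entropy, averaged over the n samples.
cross_entropy = -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
```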

- For each weight $w$ in the network, the gradient of the loss $L$ with respect to $w$ is computed via the chain rule as:

$$
\frac{\partial L}{\partial w} = \frac{\partial L}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial w}
$$

- Weights are updated using the gradient descent algorithm, where $\eta$ is the learning rate:

$$
w \leftarrow w - \eta \frac{\partial L}{\partial w}
$$
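
For a single weight, the chain rule and the update can be traced by hand. Take a one-parameter model $\hat{y} = w x$ with squared error $L = (y - \hat{y})^2$; the numbers below are made up for illustration:

```python
# One gradient-descent step for y_hat = w * x with L = (y - y_hat)^2.
w, x, y, eta = 0.5, 2.0, 3.0, 0.1

y_hat = w * x                 # forward pass: 0.5 * 2.0 = 1.0
# Chain rule: dL/dw = dL/dy_hat * dy_hat/dw = -2 * (y - y_hat) * x
grad = -2 * (y - y_hat) * x   # -2 * 2.0 * 2.0 = -8.0
w = w - eta * grad            # 0.5 - 0.1 * (-8.0) = 1.3
```

The negative gradient pushes $w$ toward a value whose prediction is closer to the target, which is exactly what the full network update does weight by weight.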

# Backpropagation from Scratch

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Expects x to already be a sigmoid output: s'(z) = s(z) * (1 - s(z)).
    return x * (1 - x)

# Input data (the XOR problem)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

# Initialize weights and biases
np.random.seed(42)
weights_input_hidden = np.random.rand(2, 2)
weights_hidden_output = np.random.rand(2, 1)
bias_hidden = np.random.rand(1, 2)
bias_output = np.random.rand(1, 1)
learning_rate = 0.1

# Training
for epoch in range(10000):
    # Forward pass
    hidden_input = np.dot(X, weights_input_hidden) + bias_hidden
    hidden_output = sigmoid(hidden_input)
    final_input = np.dot(hidden_output, weights_hidden_output) + bias_output
    final_output = sigmoid(final_input)

    # Error
    error = y - final_output
    d_output = error * sigmoid_derivative(final_output)

    # Backward propagation (gradient descent)
    error_hidden = d_output.dot(weights_hidden_output.T)
    d_hidden = error_hidden * sigmoid_derivative(hidden_output)

    # Update weights and biases
    weights_hidden_output += hidden_output.T.dot(d_output) * learning_rate
    bias_output += np.sum(d_output, axis=0, keepdims=True) * learning_rate
    weights_input_hidden += X.T.dot(d_hidden) * learning_rate
    bias_hidden += np.sum(d_hidden, axis=0) * learning_rate

print("Training complete")
print("Output after training:")
print(final_output)
```

## Conclusion

Backpropagation is a powerful technique for training artificial neural networks (ANNs), enabling them to learn complex patterns and make accurate predictions. Understanding the mechanics and mathematics behind it is essential to understanding the inner workings of an ANN.
