Commit 950e881 (2 parents: 926777c + a11f71b)
Merge pull request #3884 from Shantnu-singh/main: Adding activation functions

2 files changed: +207, -0 lines changed
# Activation Functions in Deep Learning: LaTeX Equations and Python Implementation

## Overview

This project provides LaTeX equations, explanations, and Python implementations for various activation functions used in Artificial Neural Networks (ANNs) and Deep Learning. Our goal is to offer clear, visually appealing mathematical representations and practical implementations of these functions for educational and reference purposes.

## Contents

1. [Introduction to Activation Functions](#introduction-to-activation-functions)
2. [Activation Functions](#activation-functions)
3. [Mathematical Equations](#mathematical-equations)
4. [Python Implementations](#python-implementations)
5. [Jupyter Notebook](#jupyter-notebook)
6. [Comparison of Activation Functions](#comparison-of-activation-functions)
7. [How to Use This Repository](#how-to-use-this-repository)

## Introduction to Activation Functions

Activation functions are crucial components of neural networks: they introduce non-linearity into the model, allowing it to learn complex patterns. An activation function determines the output of a network node given an input or set of inputs, as sketched below.
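For a concrete picture, here is a minimal NumPy sketch (with made-up inputs, weights, and bias, not taken from this project) that applies a sigmoid activation to a single node's weighted sum:

```python
import numpy as np

# Hypothetical inputs, weights, and bias for a single node.
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.4, 0.7, -0.2])
b = 0.1

z = np.dot(w, x) + b          # pre-activation: weighted sum of the inputs
a = 1 / (1 + np.exp(-z))      # sigmoid activation gives the node's output

print(f"pre-activation z = {z:.3f}, activated output a = {a:.3f}")
```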
## Activation Functions

This project covers the following activation functions:

### Non-Linear Activation Functions

Non-linear activation functions introduce non-linearity into the model, enabling the network to learn and represent complex patterns.

- Essential for deep learning models, as they introduce the non-linearity needed to capture complex patterns and relationships in the data (see the sketch after this list).
- Here are some common non-linear activation functions:
  1. Sigmoid
  2. Hyperbolic Tangent (tanh)
  3. Rectified Linear Unit (ReLU)
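To see why this matters, here is an illustrative sketch (toy random weights, not part of the original project): two stacked layers with no activation collapse into a single linear map, while inserting a ReLU between them does not:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weights for two stacked fully connected layers.
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)
x = rng.normal(size=3)

# Without an activation, the two layers reduce to one linear transformation.
two_linear = W2 @ (W1 @ x + b1) + b2
collapsed = (W2 @ W1) @ x + (W2 @ b1 + b2)
print(np.allclose(two_linear, collapsed))   # True

# A ReLU between the layers breaks the collapse, so depth adds expressiveness.
with_relu = W2 @ np.maximum(W1 @ x + b1, 0) + b2
print(np.allclose(with_relu, collapsed))    # almost always False for random weights
```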
### Linear Activation Functions

A linear activation function is a function whose output is directly proportional to the input.

- **Linearity:** The function does not introduce any non-linearity; the output is just a scaled version of the input.
- **Derivative:** The derivative of the function is constant, which means it does not vary with the input (see the numerical check after this list).
- Here are some common linear activation functions:
  1. Identity
  2. Step Function
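As a quick numerical check (illustrative only, not from the original file), the derivative of the identity function is constant everywhere, whereas a non-linear function such as the sigmoid has a derivative that changes with the input:

```python
import numpy as np

x = np.linspace(-3, 3, 7)

# Numerical derivatives via finite differences (np.gradient).
identity_grad = np.gradient(x, x)              # constant: all ones
sigmoid_vals = 1 / (1 + np.exp(-x))
sigmoid_grad = np.gradient(sigmoid_vals, x)    # varies with x

print("identity derivative:", np.round(identity_grad, 3))
print("sigmoid derivative :", np.round(sigmoid_grad, 3))
```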
## Mathematical Equations

We provide LaTeX equations for each activation function. For example:

1. Sigmoid: $\sigma(x) = \frac{1}{1 + e^{-x}}$
2. Hyperbolic Tangent: $\tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}$
3. ReLU: $f(x) = \max(0, x)$
4. Linear: $f(x) = x$
5. Step:

$$
f(x) =
\begin{cases}
0 & \text{if } x < \text{threshold} \\
1 & \text{if } x \geq \text{threshold}
\end{cases}
$$
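As a quick worked example of these formulas at a few sample points:

$$
\sigma(0) = \frac{1}{1 + e^{0}} = 0.5, \qquad \tanh(0) = 0, \qquad \max(0, -2) = 0, \qquad \max(0, 3) = 3
$$

and, with a threshold of $0$, the step function gives $f(-1) = 0$ and $f(2) = 1$.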
## Python Implementations

Here are the Python implementations of the activation functions:

```python
import numpy as np

# Non-linear activation functions
def sigmoid(x):
    # Squashes any real input into the range (0, 1).
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Squashes any real input into the range (-1, 1).
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

def reLu(x):
    # Passes positive inputs through unchanged; clamps negatives to 0.
    return np.maximum(x, 0)

# Linear activation functions
def identity(x):
    # Returns the input unchanged.
    return x

def step(x, thres):
    # Outputs 1 where x >= thres and 0 elsewhere.
    return np.where(x >= thres, 1, 0)
```
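For instance, evaluating the functions defined above on a small sample of inputs (an illustrative snippet that assumes the block above has been run):

```python
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

print("sigmoid :", np.round(sigmoid(x), 3))
print("tanh    :", np.round(tanh(x), 3))
print("ReLU    :", reLu(x))
print("identity:", identity(x))
print("step    :", step(x, thres=0.0))
```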
## How to Use This Repository

- Clone the repository to your local machine and change into this project's folder:

```bash
git clone https://github.com/CodeHarborHub/codeharborhub.github.io.git
cd "codeharborhub.github.io/docs/Deep Learning/Activation function"
```

- For Python implementations and visualizations:

1. Ensure you have Jupyter Notebook installed:

   ```bash
   pip install jupyter
   ```

2. Navigate to the project directory in your terminal.
3. Open `activation_functions.ipynb`.
