Implementation-of-a-2-layer-neural-network

This is an implementation of a 2-layer neural network (one hidden layer) trained with forward and backward propagation, using gradient descent to update the weights and biases. We use the sigmoid activation function together with the cross-entropy cost function, and evaluate the algorithm on the scikit-learn breast cancer dataset.
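The training loop described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the hidden-layer size (`n_h = 10`), learning rate, epoch count, and random seed are arbitrary choices made for the example.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Load and standardize the breast cancer dataset; columns are samples
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train).T   # shape (n_features, m)
X_test = scaler.transform(X_test).T
y_train = y_train.reshape(1, -1)
y_test = y_test.reshape(1, -1)

n_x, m = X_train.shape
n_h = 10  # hidden units (arbitrary choice for this sketch)

rng = np.random.default_rng(0)
W1 = rng.standard_normal((n_h, n_x)) * 0.01
b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((1, n_h)) * 0.01
b2 = np.zeros((1, 1))

lr = 0.1
for epoch in range(2000):
    # Forward propagation through the single hidden layer
    Z1 = W1 @ X_train + b1
    A1 = sigmoid(Z1)
    Z2 = W2 @ A1 + b2
    A2 = sigmoid(Z2)

    # Cross-entropy cost (epsilon avoids log(0))
    eps = 1e-8
    cost = -np.mean(y_train * np.log(A2 + eps)
                    + (1 - y_train) * np.log(1 - A2 + eps))

    # Backward propagation
    dZ2 = A2 - y_train                       # sigmoid + cross-entropy simplification
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.mean(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * A1 * (1 - A1)       # sigmoid derivative is a(1 - a)
    dW1 = dZ1 @ X_train.T / m
    db1 = dZ1.mean(axis=1, keepdims=True)

    # Gradient descent update of weights and biases
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2

# Evaluate on the held-out split
A2_test = sigmoid(W2 @ sigmoid(W1 @ X_test + b1) + b2)
acc = ((A2_test > 0.5) == y_test).mean()
print(f"test accuracy: {acc:.3f}")
```

Note that because sigmoid outputs pair with the cross-entropy cost, the output-layer gradient collapses to the simple difference `A2 - y`, which keeps the backward pass short.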