
Commit 076556a

authored Feb 16, 2020
Update Optimizer and Backpropagation.ipynb (#1168)
1 parent 7e5c1d6 commit 076556a

File tree

1 file changed: +1 −1 lines changed

 

notebooks/chapter19/Optimizer and Backpropagation.ipynb (+1 −1)
@@ -10,7 +10,7 @@
     "\n",
     "## Stochastic Gradient Descent\n",
     "\n",
-    "The goal of an optimization algorithm is to nd the value of the parameter to make loss function very low. For some types of models, an optimization algorithm might find the global minimum value of loss function, but for neural network, the most efficient way to converge loss function to a local minimum is to minimize loss function according to each example.\n",
+    "The goal of an optimization algorithm is to find the value of the parameter to make loss function very low. For some types of models, an optimization algorithm might find the global minimum value of loss function, but for neural network, the most efficient way to converge loss function to a local minimum is to minimize loss function according to each example.\n",
     "\n",
     "Gradient descent uses the following update rule to minimize loss function:"
 ]
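The changed cell ends by introducing the gradient-descent update rule, θ ← θ − α ∇θ L(θ). For context, here is a minimal runnable sketch of that update; the quadratic loss, learning rate, and step count below are illustrative assumptions, not code from the notebook:

```python
import numpy as np

def sgd_update(theta, grad, lr=0.1):
    # One gradient-descent step: theta <- theta - lr * grad
    return theta - lr * grad

# Illustrative loss L(theta) = (theta - 3)^2, whose gradient is 2 * (theta - 3).
theta = np.array([0.0])
for _ in range(100):
    grad = 2 * (theta - 3)          # gradient of the loss at the current theta
    theta = sgd_update(theta, grad)

print(theta)  # approaches the minimizer 3.0
```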
