Great code!
https://github.com/wiqaaas/youtube/blob/master/Machine_Learning_from_Scratch/Ridge_Regression/Ridge_Regression_using_Gradient_Descent.ipynb
and video:
https://www.youtube.com/watch?v=1j6DClKQjZY&ab_channel=wiqaaas

Just one question: why do you calculate the intercept's derivative differently? In your notebook it is:

```python
if feature_is_constant == True:
    derivative = 2 * np.dot(errors, feature)
```

whereas in
https://github.com/chasinginfinity/ml-from-scratch/blob/master/02%20Linear%20Regression%20using%20Gradient%20Descent/Linear%20Regression%20using%20Gradient%20Descent.ipynb
and
https://www.youtube.com/watch?v=4PHI11lX11I&t=536s&ab_channel=AdarshMenon
it is calculated as:

```python
D_c = (-2/n) * sum(Y - Y_pred)  # Derivative wrt c
```
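For what it's worth, the two expressions appear to be the same gradient up to a constant factor. With the intercept's "feature" being a column of ones and `errors = Y_pred - Y`, `2 * np.dot(errors, feature)` equals `2 * sum(Y_pred - Y)`, which is `n` times `(-2/n) * sum(Y - Y_pred)`: one notebook differentiates the *sum* of squared errors, the other the *mean*, and the `1/n` just gets absorbed into the learning rate. A quick numerical sketch (with made-up data, not taken from either notebook):

```python
import numpy as np

# Hypothetical small dataset to compare the two intercept-gradient formulas.
rng = np.random.default_rng(0)
n = 50
Y = rng.normal(size=n)       # targets
Y_pred = rng.normal(size=n)  # stand-in predictions

# wiqaaas-style: errors = predictions - targets, intercept "feature" = all ones.
# This is the gradient of the SUM of squared errors wrt the intercept.
errors = Y_pred - Y
feature = np.ones(n)
d_sum = 2 * np.dot(errors, feature)

# Adarsh Menon-style: gradient of the MEAN squared error wrt the intercept c.
D_c = (-2 / n) * np.sum(Y - Y_pred)

# Identical up to the 1/n factor, which the learning rate absorbs.
print(np.isclose(d_sum / n, D_c))  # True
```

So both notebooks descend the same direction; only the effective step size differs by the factor `n`.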