
Commit e6b52f1

solves "Advice for Applying Machine Learning" quiz
1 parent 7ff35b0 commit e6b52f1

14 files changed: +59 −14

README.md

Lines changed: 5 additions & 0 deletions
@@ -74,3 +74,8 @@ __Instructor__: Andrew Ng.
 - [Neural Network Gradient (Backpropagation)](week5/ex4/checkNNGradients.m)
 - [Neural Network Cost Function](week5/ex4/nnCostFunction.m)
 - [Regularized Gradient](week5/ex4/checkNNGradients.m)
+
+## Week 6
+### Quizzes
+
+### Programming Exercises

test.m

Lines changed: 43 additions & 13 deletions
@@ -1,16 +1,46 @@
 clc;
 clear;
 
-y = [1 1 ; 1 2 ; 1 3];
-y = y == 2;
-disp(y);
-
-function row = labelVector(len, label)
-  row = zeros(len, 1);
-  row(label) = 1;
-endfunction
-
-disp(labelVector(4, 2)');
-disp(labelVector(3, 1)');
-disp(labelVector(10, 6)');
-
+function error = logisticMisclassificationError(hypothesis, y, threshold)
+  prediction = hypothesis >= threshold;
+  error = sum(prediction != y) / length(y);
+endfunction
+
+function J = regressionCost(hypothesis, data, result)
+  trainingSampleSize = size(data, 1);
+  J = (1 / (2 * trainingSampleSize)) * sum((data * hypothesis - result) .^ 2);
+endfunction
+
+function grad = regressionGradient(hypothesis, X, y)
+  trainingSampleSize = size(X, 1);
+  grad = (1 / trainingSampleSize) * (((X * hypothesis) - y)' * X)';
+endfunction
+
+function [minCost, hypothesis, costMemory] = gradientDescent(hypothesis, X, y, iterations, learningRate)
+  costMemory = [];
+  for i = 1:iterations
+    hypothesis = hypothesis - learningRate * regressionGradient(hypothesis, X, y);
+    minCost = regressionCost(hypothesis, X, y);
+    costMemory = [costMemory minCost];
+  endfor
+endfunction
+
+hypothesis = [0.3 ; 0.5 ; 0.3 ; 0.9];
+result = [1 ; 0 ; 1 ; 1];
+error = logisticMisclassificationError(hypothesis, result, 0.5);
+disp(error);
+
+hypothesis = [0 ; 3];
+data = [1 1 ; 1 2 ; 1 3];
+y = [1 ; 2 ; 3];
+disp(regressionCost(hypothesis, data, y));
+
+initialTheta = [0 ; 0];
+[minCost, hypothesis, costMemory] = gradientDescent(initialTheta, data, y, 100, 0.09);
+disp('min cost');
+disp(minCost);
+
+disp('hypothesis');
+disp(hypothesis);
+
+plot(costMemory);
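For this toy dataset the least-squares problem has a closed-form solution, which makes a handy sanity check for the `gradientDescent` routine added above. The sketch below is not part of the commit; it assumes the functions from `test.m` are in scope, and compares the iterative result against the normal equation, whose solution here is exactly `[0 ; 1]` with zero cost:

```octave
% Sanity-check sketch: compare gradientDescent against the closed-form
% normal equation for the same X and y used in test.m.
X = [1 1 ; 1 2 ; 1 3];
y = [1 ; 2 ; 3];

thetaExact = pinv(X' * X) * X' * y;    % normal equation; [0 ; 1] for this data
[minCost, thetaGD] = gradientDescent([0 ; 0], X, y, 100, 0.09);

disp(thetaExact');    % exact fit
disp(thetaGD');       % should approach [0 1]
disp(minCost);        % should approach 0
```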

week5/ex4/nnCostFunction.m

Lines changed: 0 additions & 1 deletion
@@ -92,7 +92,6 @@
 Theta2_grad /= m;
 Theta1_grad /= m;
 
-
 % Part 3: Implement regularization with the cost function and gradients.
 
 regularizationMaskTheta1 = (lambda / m) * ones(size(Theta1));
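For context, a sketch of how a mask like `regularizationMaskTheta1` is typically applied in this exercise; this is an assumption about the surrounding code, which the diff does not show. In ex4 the first column of each weight matrix holds the bias terms, which are conventionally excluded from regularization:

```octave
% Sketch of the usual ex4 pattern (assumes bias terms live in column 1).
regularizationMaskTheta1 = (lambda / m) * ones(size(Theta1));
regularizationMaskTheta1(:, 1) = 0;    % don't regularize bias weights
Theta1_grad = Theta1_grad + regularizationMaskTheta1 .* Theta1;
```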
Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
+# Advice for Applying Machine Learning Quiz
+
+![Question 1](assets/quiz-1.PNG)
+![Question 1](assets/question-1-2.PNG)
+![Question 2](assets/question-2-2.PNG)
+![Question 3](assets/question-3.PNG)
+![Question 3](assets/question-3-2.PNG)
+![Question 4](assets/question-4.PNG)
+![Question 4](assets/question-4-2.PNG)
+![Question 5](assets/question-5.PNG)
+![Question 5](assets/question-5-2.PNG)

week6/assets/question-1-2.PNG (30.4 KB)

week6/assets/question-2-2.PNG (36.9 KB)

week6/assets/question-2-incorrect.PNG (40.1 KB)

week6/assets/question-3-2.PNG (36.4 KB)

week6/assets/question-3.PNG (37.8 KB)

week6/assets/question-4-2.PNG (53.5 KB)

week6/assets/question-4.PNG (41.9 KB)

week6/assets/question-5-2.PNG (31.8 KB)

week6/assets/question-5.PNG (28.3 KB)

week6/assets/quiz-1.PNG (29.3 KB)
