annotate tvii/logistic_regression.py @ 70:351fc97bb996

add error computation + test functions
author Jeff Hammel <k0scist@gmail.com>
date Sun, 17 Dec 2017 13:22:44 -0800
parents 0908b6cd3217
children
"""
z = w'x + b
a = sigmoid(z)
L(a,y) = -(y*log(a) + (1-y)*log(1-a))

    [|  |  | ]
X = [x1 x2 x3]
    [|  |  | ]

[z1 z2 z3 ... zm] = w'*X + [b b ... b] = [w'*x1+b  w'*x2+b  ...  w'*xm+b]
"""


import numpy as np
import sklearn
import sklearn.linear_model  # `import sklearn` alone does not expose `linear_model`

from .sigmoid import sigmoid

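
# Illustrative sketch (added; not part of the original changeset): the module
# docstring's vectorized form z = w'X + b relies on numpy broadcasting.  The
# array values below are made up.
def _vectorization_sketch():
    w = np.zeros((3, 1))                # (n, 1) weight column vector
    b = 0.5                             # scalar bias, broadcast across examples
    X = np.ones((3, 5))                 # n = 3 features, m = 5 examples
    Z = np.dot(w.T, X) + b              # shape (1, 5): one z per example
    A = sigmoid(Z)                      # elementwise activation
    return Z, A

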
def logistic_regression(X, Y):
    """train a logistic regression classifier"""
    clf = sklearn.linear_model.LogisticRegressionCV()
    clf.fit(X.T, Y.T)
    return clf


def loss(a, y):
    # UNTESTED!
    # derivative = -(y/a) + (1-y)/(1-a)
    return -y*np.log(a) - (1-y)*np.log(1-a)


def propagate(w, b, X, Y):
    """
    Implement the cost function and its gradient for the propagation:
    Forward Propagation:
    - You get X
    - You compute $A = \sigma(w^T X + b) = (a^{(1)}, a^{(2)}, ..., a^{(m)})$
    - You calculate the cost function:
      $J = -\frac{1}{m}\sum_{i=1}^{m}\left(y^{(i)}\log(a^{(i)}) + (1-y^{(i)})\log(1-a^{(i)})\right)$

    Here are the two formulas you will be using:

    $$ \frac{\partial J}{\partial w} = \frac{1}{m}X(A-Y)^T\tag{7}$$
    $$ \frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^m (a^{(i)}-y^{(i)})\tag{8}$$

    Arguments:
    w -- weights, a numpy array of size (num_px * num_px * 3, 1)
    b -- bias, a scalar
    X -- data of size (num_px * num_px * 3, number of examples)
    Y -- true "label" vector (containing 0 if non-cat, 1 if cat) of size (1, number of examples)

    Return:
    cost -- negative log-likelihood cost for logistic regression
    dw -- gradient of the loss with respect to w, thus same shape as w
    db -- gradient of the loss with respect to b, thus same shape as b

    Tips:
    - Write your code step by step for the propagation: np.log(), np.dot()
    """

    # FORWARD PROPAGATION (FROM X TO COST)
    cost = cost_function(w, b, X, Y)  # compute cost

    # BACKWARD PROPAGATION (TO FIND GRADIENT)
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)  # compute activation
    dw = (1./m)*np.dot(X, (A - Y).T)
    db = (1./m)*np.sum(A - Y)

    # sanity check
    assert A.shape[1] == m
    assert dw.shape == w.shape, "dw.shape is {}; w.shape is {}".format(dw.shape, w.shape)
    assert db.dtype == float
    cost = np.squeeze(cost)
    assert cost.shape == ()

    # return gradients
    grads = {"dw": dw,
             "db": db}
    return grads, cost
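

# Example sketch (added for illustration; not part of the original changeset):
# exercise propagate() on tiny made-up arrays and check the shapes promised by
# the docstring above.
def _propagate_sketch():
    w = np.array([[1.], [2.]])          # (n, 1) with n = 2
    b = 2.
    X = np.array([[1., 2.], [3., 4.]])  # (n, m) with m = 2 examples
    Y = np.array([[1., 0.]])            # (1, m)
    grads, cost = propagate(w, b, X, Y)
    assert grads["dw"].shape == w.shape
    assert grads["db"].shape == ()      # np.sum returns a scalar
    return grads, cost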


def cost_function(w, b, X, Y):
    """
    Cost function for binary classification:
    yhat = sigmoid(w.T*x + b)
    interpret yhat as the probability that y=1

    Loss function:
    -(y log(yhat) + (1 - y) log(1 - yhat))
    """

    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)
    cost = np.sum(Y*np.log(A) + (1 - Y)*np.log(1 - A))
    return (-1./m)*cost

def compute_costs(Yhat, Y):
    """
    Computes the cross-entropy cost given:

    J = -(1/m)*sum_{i=1..m} (y(i)log(yhat(i)) + (1 - y(i))log(1 - yhat(i)))

    Yhat -- The sigmoid output of the network
    Y -- "true" label vector
    """

    m = Y.shape[1]  # number of examples

    # compute the cross-entropy cost
    logprobs = np.multiply(np.log(Yhat), Y) + np.multiply(np.log(1 - Yhat), (1 - Y))
    cost = -np.sum(logprobs)/m

    cost = float(np.squeeze(cost))  # make sure cost is the dimension we expect
    assert isinstance(cost, float)
    return cost


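# Example sketch (added for illustration; not part of the original changeset):
# the cross-entropy cost above on made-up sigmoid outputs and labels.
def _compute_costs_sketch():
    Yhat = np.array([[0.9, 0.2, 0.7]])  # hypothetical sigmoid outputs
    Y = np.array([[1, 0, 1]])           # "true" labels
    return compute_costs(Yhat, Y)       # a single float

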
def optimize(w, b, X, Y, num_iterations, learning_rate, print_cost=False):
    """
    This function optimizes w and b by running a gradient descent algorithm

    Arguments:
    w -- weights, a numpy array of size (num_px * num_px * 3, 1)
    b -- bias, a scalar
    X -- data of shape (num_px * num_px * 3, number of examples)
    Y -- true "label" vector (containing 0 if non-cat, 1 if cat), of shape (1, number of examples)
    num_iterations -- number of iterations of the optimization loop
    learning_rate -- learning rate of the gradient descent update rule
    print_cost -- True to print the loss every 100 steps

    Returns:
    params -- dictionary containing the weights w and bias b
    grads -- dictionary containing the gradients of the weights and bias with respect to the cost function
    costs -- list of all the costs computed during the optimization; this will be used to plot the learning curve

    Tips:
    You basically need to write down two steps and iterate through them:
    1) Calculate the cost and the gradient for the current parameters. Use propagate().
    2) Update the parameters using the gradient descent rule for w and b.
    """

    costs = []

    for i in range(num_iterations):

        # Cost and gradient calculation
        grads, cost = propagate(w, b, X, Y)

        # Retrieve derivatives from grads
        dw = grads["dw"]
        db = grads["db"]

        # gradient descent
        w = w - learning_rate*dw
        b = b - learning_rate*db

        # Record the costs
        if i % 100 == 0:
            costs.append(cost)

        # Print the cost every 100 iterations
        if print_cost and not (i % 100):
            print("Cost after iteration %i: %f" % (i, cost))

    # package data for return
    params = {"w": w,
              "b": b}
    grads = {"dw": dw,
             "db": db}
    return params, grads, costs
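

# Example usage sketch (added, hedged): a few gradient-descent steps on the
# same toy data as the propagate sketch; the learning rate and iteration count
# here are arbitrary choices.
def _optimize_sketch():
    w = np.zeros((2, 1))
    b = 0.
    X = np.array([[1., 2.], [3., 4.]])
    Y = np.array([[1., 0.]])
    params, grads, costs = optimize(w, b, X, Y,
                                    num_iterations=200,
                                    learning_rate=0.1,
                                    print_cost=False)
    return params, costs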


def predict(w, b, X):
    '''
    Predict whether the label is 0 or 1 using learned logistic regression parameters (w, b)

    Arguments:
    w -- weights, a numpy array of size (num_px * num_px * 3, 1)
    b -- bias, a scalar
    X -- data of size (num_px * num_px * 3, number of examples)

    Returns:
    Y_prediction -- a numpy array (vector) containing all predictions (0/1) for the examples in X
    '''

    m = X.shape[1]
    Y_prediction = np.zeros((1, m))
    w = w.reshape(X.shape[0], 1)

    # Compute vector "A" predicting the probabilities
    A = sigmoid(np.dot(w.T, X) + b)

    for i in range(A.shape[1]):
        # Convert probabilities A[0,i] to actual predictions p[0,i]
        Y_prediction[0][i] = 0 if A[0][i] <= 0.5 else 1

    assert Y_prediction.shape == (1, m)

    return Y_prediction


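# Example sketch (added for illustration; not part of the original changeset):
# predictions are thresholded at 0.5 as described above; the weights and data
# here are made up.
def _predict_sketch():
    w = np.array([[1.], [0.]])
    b = 0.
    X = np.array([[2., -2.], [0., 0.]])
    return predict(w, b, X)             # array of 0/1 labels, shape (1, 2)

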
def model(X_train, Y_train, X_test, Y_test, num_iterations=2000, learning_rate=0.5, print_cost=False):
    """
    Builds the logistic regression model by calling the functions implemented previously

    Arguments:
    X_train -- training set represented by a numpy array of shape (num_px * num_px * 3, m_train)
    Y_train -- training labels represented by a numpy array (vector) of shape (1, m_train)
    X_test -- test set represented by a numpy array of shape (num_px * num_px * 3, m_test)
    Y_test -- test labels represented by a numpy array (vector) of shape (1, m_test)
    num_iterations -- hyperparameter representing the number of iterations to optimize the parameters
    learning_rate -- hyperparameter representing the learning rate used in the update rule of optimize()
    print_cost -- Set to True to print the cost every 100 iterations

    Returns:
    d -- dictionary containing information about the model.
    """

    # initialize parameters with zeros
    w = np.zeros((X_train.shape[0], 1))
    b = 0

    # Gradient descent
    parameters, grads, costs = optimize(w, b, X_train, Y_train, num_iterations, learning_rate, print_cost=print_cost)

    # Retrieve parameters w and b from dictionary "parameters"
    w = parameters["w"]
    b = parameters["b"]

    # Predict test/train set examples
    Y_prediction_test = predict(w, b, X_test)
    Y_prediction_train = predict(w, b, X_train)

    # Print train/test errors
    print("train accuracy: {} %".format(100 - np.mean(np.abs(Y_prediction_train - Y_train)) * 100))
    print("test accuracy: {} %".format(100 - np.mean(np.abs(Y_prediction_test - Y_test)) * 100))

    d = {"costs": costs,
         "Y_prediction_test": Y_prediction_test,
         "Y_prediction_train": Y_prediction_train,
         "w": w,
         "b": b,
         "learning_rate": learning_rate,
         "num_iterations": num_iterations}
    return d
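

# End-to-end usage sketch (added; hedged, not part of the original changeset):
# train and evaluate on a tiny synthetic "dataset" so the module can be run
# directly.  Shapes follow the docstrings: X is (features, examples) and Y is
# (1, examples); all values and hyperparameters below are made up.
if __name__ == "__main__":
    np.random.seed(0)
    X_train = np.random.randn(4, 20)
    Y_train = (X_train[0:1, :] > 0).astype(float)   # label depends on feature 0
    X_test = np.random.randn(4, 10)
    Y_test = (X_test[0:1, :] > 0).astype(float)
    d = model(X_train, Y_train, X_test, Y_test,
              num_iterations=500, learning_rate=0.05, print_cost=True)
    print("costs:", d["costs"])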