annotate tvii/logistic_regression.py @ 50:4b20694b8a16

add module + test for uniqueness
author Jeff Hammel <k0scist@gmail.com>
date Sun, 17 Sep 2017 14:28:36 -0700
parents 4d173452377e
children 0807ac8992ba

"""
z = w'x + b
a = sigmoid(z)
L(a,y) = -(y*log(a) + (1-y)*log(1-a))

    [|  |  | ]
X = [x1 x2 x3]
    [|  |  | ]

[z1 z2 z3 ... zm] = w'*X + [b b ... b] = [w'*x1+b  w'*x2+b  ...  w'*xm+b]
"""
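
# Note (illustrative): with w of shape (n, 1) and X of shape (n, m),
# `np.dot(w.T, X) + b` broadcasts the scalar b across all m columns,
# producing the (1, m) row of z values described above.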

import numpy as np
from .sigmoid import sigmoid
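
# `sigmoid` (from the sibling tvii.sigmoid module, not shown here) is assumed
# to be the standard elementwise logistic function 1/(1 + np.exp(-z)),
# so e.g. sigmoid(0.) == 0.5.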

def loss(a, y):
    """Cross-entropy loss of a single activation `a` against its label `y`."""
    # UNTESTED!
    # dL/da = -(y/a) + (1-y)/(1-a)
    return -y*np.log(a) - (1-y)*np.log(1-a)
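
# Sanity check (sketch): loss(0.9, 1) == -np.log(0.9) ≈ 0.105, and
# loss(0.5, y) == np.log(2) ≈ 0.693 for either label value.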

def propagate(w, b, X, Y):
    """
    Implement the cost function and its gradient for the propagation:

    Forward Propagation:
    - You get X
    - You compute $A = \sigma(w^T X + b) = (a^{(1)}, a^{(2)}, ..., a^{(m)})$
    - You calculate the cost function:
      $J = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log(a^{(i)})+(1-y^{(i)})\log(1-a^{(i)})\right]$

    Here are the two formulas you will be using:

    $$ \frac{\partial J}{\partial w} = \frac{1}{m}X(A-Y)^T\tag{7}$$
    $$ \frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^m (a^{(i)}-y^{(i)})\tag{8}$$

    Arguments:
    w -- weights, a numpy array of size (num_px * num_px * 3, 1)
    b -- bias, a scalar
    X -- data of size (num_px * num_px * 3, number of examples)
    Y -- true "label" vector (containing 0 if non-cat, 1 if cat) of size (1, number of examples)

    Return:
    cost -- negative log-likelihood cost for logistic regression
    dw -- gradient of the loss with respect to w, thus same shape as w
    db -- gradient of the loss with respect to b, thus same shape as b

    Tips:
    - Write your code step by step for the propagation: np.log(), np.dot()
    """

    # FORWARD PROPAGATION (FROM X TO COST)
    cost = cost_function(w, b, X, Y)  # compute cost

    # BACKWARD PROPAGATION (TO FIND GRADIENT)
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)  # compute activation
    dw = (1./m)*np.dot(X, (A - Y).T)
    db = (1./m)*np.sum(A - Y)

    # sanity check
    assert A.shape[1] == m
    assert dw.shape == w.shape, "dw.shape is {}; w.shape is {}".format(dw.shape, w.shape)
    assert db.dtype == float
    cost = np.squeeze(cost)
    assert cost.shape == ()

    # return gradients
    grads = {"dw": dw,
             "db": db}
    return grads, cost
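
# Usage sketch (hypothetical shapes): for w of shape (n, 1), b a float,
# X of shape (n, m), and Y of shape (1, m),
#     grads, cost = propagate(w, b, X, Y)
# gives grads["dw"] with w's shape, a float grads["db"], and a scalar cost.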


def cost_function(w, b, X, Y):
    """
    Cost function for binary classification:
    yhat = sigmoid(w.T*x + b)
    interpret yhat as the probability that y = 1

    Loss function:
    L = -(y log(yhat) + (1 - y) log(1 - yhat))
    """

    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)
    cost = np.sum(Y*np.log(A) + (1 - Y)*np.log(1 - A))
    return (-1./m)*cost
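
# Quick check (sketch): with w and b identically zero, every activation is
# sigmoid(0) = 0.5, so cost_function returns np.log(2) ≈ 0.693 regardless of Y.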


def optimize(w, b, X, Y, num_iterations, learning_rate, print_cost=False):
    """
    This function optimizes w and b by running a gradient descent algorithm

    Arguments:
    w -- weights, a numpy array of size (num_px * num_px * 3, 1)
    b -- bias, a scalar
    X -- data of shape (num_px * num_px * 3, number of examples)
    Y -- true "label" vector (containing 0 if non-cat, 1 if cat), of shape (1, number of examples)
    num_iterations -- number of iterations of the optimization loop
    learning_rate -- learning rate of the gradient descent update rule
    print_cost -- True to print the loss every 100 steps

    Returns:
    params -- dictionary containing the weights w and bias b
    grads -- dictionary containing the gradients of the weights and bias with respect to the cost function
    costs -- list of all the costs computed during the optimization; this will be used to plot the learning curve

    Tips:
    You basically need to write down two steps and iterate through them:
    1) Calculate the cost and the gradient for the current parameters. Use propagate().
    2) Update the parameters using the gradient descent rule for w and b.
    """

    costs = []

    for i in range(num_iterations):

        # Cost and gradient calculation
        grads, cost = propagate(w, b, X, Y)

        # Retrieve derivatives from grads
        dw = grads["dw"]
        db = grads["db"]

        # gradient descent update
        w = w - learning_rate*dw
        b = b - learning_rate*db

        # Record the costs
        if i % 100 == 0:
            costs.append(cost)

        # Print the cost every 100 iterations
        if print_cost and not (i % 100):
            print("Cost after iteration %i: %f" % (i, cost))

    # package data for return
    params = {"w": w,
              "b": b}
    grads = {"dw": dw,
             "db": db}
    return params, grads, costs
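
# Usage sketch (illustrative hyperparameters):
#     params, grads, costs = optimize(w, b, X, Y, num_iterations=500, learning_rate=0.1)
# For a small enough learning rate the recorded costs should decrease, since
# the cross-entropy cost is convex in (w, b).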


def predict(w, b, X):
    """
    Predict whether the label is 0 or 1 using learned logistic regression parameters (w, b)

    Arguments:
    w -- weights, a numpy array of size (num_px * num_px * 3, 1)
    b -- bias, a scalar
    X -- data of size (num_px * num_px * 3, number of examples)

    Returns:
    Y_prediction -- a numpy array (vector) containing all predictions (0/1) for the examples in X
    """

    m = X.shape[1]
    Y_prediction = np.zeros((1, m))
    w = w.reshape(X.shape[0], 1)

    # Compute vector "A" predicting the probabilities
    A = sigmoid(np.dot(w.T, X) + b)

    for i in range(A.shape[1]):
        # Convert probabilities A[0,i] to actual predictions Y_prediction[0,i]
        Y_prediction[0, i] = 0 if A[0, i] <= 0.5 else 1

    assert Y_prediction.shape == (1, m)

    return Y_prediction
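
# The thresholding loop above can equivalently be vectorized (a sketch):
#     Y_prediction = (A > 0.5).astype(float)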


def model(X_train, Y_train, X_test, Y_test, num_iterations=2000, learning_rate=0.5, print_cost=False):
    """
    Builds the logistic regression model by calling the functions implemented above

    Arguments:
    X_train -- training set represented by a numpy array of shape (num_px * num_px * 3, m_train)
    Y_train -- training labels represented by a numpy array (vector) of shape (1, m_train)
    X_test -- test set represented by a numpy array of shape (num_px * num_px * 3, m_test)
    Y_test -- test labels represented by a numpy array (vector) of shape (1, m_test)
    num_iterations -- hyperparameter representing the number of iterations to optimize the parameters
    learning_rate -- hyperparameter representing the learning rate used in the update rule of optimize()
    print_cost -- set to True to print the cost every 100 iterations

    Returns:
    d -- dictionary containing information about the model
    """

    # initialize parameters with zeros
    w = np.zeros((X_train.shape[0], 1))
    b = 0

    # Gradient descent
    parameters, grads, costs = optimize(w, b, X_train, Y_train, num_iterations, learning_rate, print_cost=print_cost)

    # Retrieve parameters w and b from dictionary "parameters"
    w = parameters["w"]
    b = parameters["b"]

    # Predict test/train set examples
    Y_prediction_test = predict(w, b, X_test)
    Y_prediction_train = predict(w, b, X_train)

    # Print train/test errors
    print("train accuracy: {} %".format(100 - np.mean(np.abs(Y_prediction_train - Y_train)) * 100))
    print("test accuracy: {} %".format(100 - np.mean(np.abs(Y_prediction_test - Y_test)) * 100))

    d = {"costs": costs,
         "Y_prediction_test": Y_prediction_test,
         "Y_prediction_train": Y_prediction_train,
         "w": w,
         "b": b,
         "learning_rate": learning_rate,
         "num_iterations": num_iterations}
    return d
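

# Minimal smoke test (a sketch; not part of the original module). The data is
# synthetic and every value here is illustrative. Because of the relative
# `.sigmoid` import, run it as a module: `python -m tvii.logistic_regression`.
if __name__ == "__main__":
    np.random.seed(0)
    X_train = np.random.randn(4, 200)
    Y_train = (X_train[0:1, :] > 0).astype(float)  # label depends only on feature 0
    X_test = np.random.randn(4, 50)
    Y_test = (X_test[0:1, :] > 0).astype(float)
    model(X_train, Y_train, X_test, Y_test,
          num_iterations=1000, learning_rate=0.1, print_cost=True)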