tvii/activation.py @ 44:857a606783e1
[documentation] notes + stubs on gradient descent

author:  Jeff Hammel <k0scist@gmail.com>
date:    Mon, 04 Sep 2017 15:06:38 -0700
parents: 2f0caec46e26
"""activation functions"""


# tanh:
# g(z) = tanh(z) = (exp(z) - exp(-z))/(exp(z) + exp(-z))
# g'(z) = 1 - (tanh(z))**2


def ReLU(z):
    """rectified linear unit: max(0, z)"""
    return max(0., z)


def ReLUprime(z):
    """derivative of ReLU: 1 for z > 0, else 0"""
    return 1. if z > 0 else 0.
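The tanh activation above is left as comment-only stubs. A minimal sketch of how those stubs might be filled in, using `math.tanh` from the standard library rather than the raw `exp` form to avoid overflow for large `|z|` (the function names `tanh`/`tanhprime` are assumptions, chosen to mirror `ReLU`/`ReLUprime`):

```python
import math


def tanh(z):
    # g(z) = (exp(z) - exp(-z)) / (exp(z) + exp(-z))
    # math.tanh computes the same value with better numerical behavior
    return math.tanh(z)


def tanhprime(z):
    # g'(z) = 1 - (tanh(z))**2
    return 1. - math.tanh(z) ** 2


# spot-check against the closed-form definition in the comments
z = 0.5
expected = (math.exp(z) - math.exp(-z)) / (math.exp(z) + math.exp(-z))
assert abs(tanh(z) - expected) < 1e-12
assert abs(tanhprime(0.) - 1.) < 1e-12
```

Unlike ReLU's piecewise derivative, `tanhprime` can be computed directly from the activation value itself (`1 - g(z)**2`), which is why backpropagation implementations often cache the forward-pass output.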