tvii/activation.py @ 80:3c7927f59b05
notes to self re deep neural networks

author:   Jeff Hammel <k0scist@gmail.com>
date:     Sun, 17 Dec 2017 13:43:42 -0800
parents:  857a606783e1
children: (none)
""" activation functions """ # tanh: # g(z) = tanh(z) = (exp(z) - exp(-z))/(exp(z) + exp(-z)) # g'(z) = 1 - (tanh(z))**2 # ReLU def ReLU(z): return max((0, z)) def ReLUprime(z): return 1. if z > 0 else 0.