Activation function

Speed up training and improve performance in deep neural networks

A discussion of popular techniques for speeding up training in deep neural networks (initialization, activation functions, batch normalization, and gradient clipping), using TensorFlow.
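To make two of these techniques concrete, here is a minimal, framework-free sketch of He initialization and the ELU activation in plain Python. The function names (`he_init`, `elu`) are my own for illustration; in TensorFlow the equivalents are along the lines of `tf.keras.initializers.HeNormal` and `tf.keras.activations.elu`.

```python
import math
import random

def he_init(fan_in, fan_out, seed=0):
    """He initialization: draw weights from N(0, sqrt(2 / fan_in)).
    Commonly paired with ReLU-family activations to keep the variance
    of activations stable across layers, which speeds up training."""
    rng = random.Random(seed)
    std = math.sqrt(2.0 / fan_in)
    return [[rng.gauss(0.0, std) for _ in range(fan_out)]
            for _ in range(fan_in)]

def elu(x, alpha=1.0):
    """ELU activation: x for x > 0, alpha * (exp(x) - 1) otherwise.
    Unlike ReLU it takes negative values, keeping mean activations
    closer to zero, and it has a nonzero gradient for x < 0."""
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

# Build a small weight matrix and apply the activation to a pre-activation value.
W = he_init(fan_in=100, fan_out=10)
print(len(W), len(W[0]))   # matrix shape: 100 rows, 10 columns
print(elu(2.0), elu(0.0))  # positive inputs pass through unchanged
```

This is a sketch of the underlying math, not the library implementation; in practice you would let TensorFlow handle both via layer arguments such as `kernel_initializer` and `activation`.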