Layers for deep learning

cosmos 4th November 2016 at 2:43pm

Local functions

Apply some nonlinear function σ to each element of the input vector.

Linear layer. Applies a linear (affine) function to the input: y = Wx + b.
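A minimal NumPy sketch of a linear layer (the names `linear`, `W`, `b` are illustrative, not from any particular library):

```python
import numpy as np

def linear(x, W, b):
    # Affine map: y = W x + b
    return W @ x + b

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))  # weight matrix: 4 inputs -> 3 outputs
b = np.zeros(3)                  # bias vector
x = rng.standard_normal(4)
y = linear(x, W, b)              # output has shape (3,)
```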

ReLU layer. Rectified linear unit, σ(x) = max(0, x). Very popular. Not differentiable at x = 0, where a subderivative (any value in [0, 1]) may be used.
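An elementwise ReLU is one line in NumPy (a sketch; the function name is illustrative):

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x)
    return np.maximum(x, 0.0)

out = relu(np.array([-2.0, 0.0, 3.0]))  # → [0., 0., 3.]
```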

Maxout unit. Outputs the elementwise maximum over several linear functions of the input.
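A maxout unit with k linear pieces can be sketched as follows (names are illustrative; each pair (W, b) defines one linear piece):

```python
import numpy as np

def maxout(x, Ws, bs):
    # Stack the k linear outputs, shape (k, out_dim),
    # then take the elementwise max across the k pieces.
    z = np.stack([W @ x + b for W, b in zip(Ws, bs)])
    return z.max(axis=0)

x = np.array([1.0, -2.0])
Ws = [np.eye(2), -np.eye(2)]        # two linear pieces
bs = [np.zeros(2), np.zeros(2)]
m = maxout(x, Ws, bs)               # → [1., 2.]
```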

Max-pooling

Compute the maximum over the elements of the input vector (in practice, usually over each local pooling window).
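A sketch of 1-D max-pooling over non-overlapping windows (assumes the window size divides the input length; names are illustrative):

```python
import numpy as np

def max_pool_1d(x, size):
    # Split x into non-overlapping windows of length `size`
    # and keep the maximum of each window.
    return x.reshape(-1, size).max(axis=1)

pooled = max_pool_1d(np.array([1.0, 5.0, 2.0, 4.0, 9.0, 3.0]), 2)  # → [5., 4., 9.]
```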

Softmax

Exponentiate all vector elements and normalize them so they sum to unity.
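A numerically stable softmax sketch (subtracting the max before exponentiating does not change the result but avoids overflow):

```python
import numpy as np

def softmax(x):
    # Shift by the max for numerical stability, then normalize.
    e = np.exp(x - x.max())
    return e / e.sum()

p = softmax(np.array([1.0, 2.0, 3.0]))
# p is a probability vector: non-negative entries summing to 1,
# with the largest probability on the largest input.
```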