ReLU: f(x) = max(0, x).

Variants that keep a small gradient for negative inputs (avoiding "dead" units):

  • Leaky ReLU: f(x) = max(αx, x) with a small fixed α (e.g. 0.01)
  • Parametric ReLU (PReLU): same form, but α is learned during training
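
A minimal NumPy sketch of the three activations; the alpha default is illustrative, and in PReLU alpha would be a parameter updated by backprop:

    import numpy as np

    def relu(x):
        # Standard ReLU: pass positives through, zero out negatives.
        return np.maximum(0, x)

    def leaky_relu(x, alpha=0.01):
        # Small fixed negative slope keeps a nonzero gradient for x < 0.
        return np.where(x > 0, x, alpha * x)

    def prelu(x, alpha):
        # Same form as Leaky ReLU, but alpha is a learned parameter.
        return np.where(x > 0, x, alpha * x)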

Maxout:

Each maxout unit outputs the max over k affine functions of its input: f(x) = max_i (w_i·x + b_i).

ReLU is a special case of Maxout (two pieces, one of them fixed at zero).

Maxout is thus a learnable, piecewise-linear activation function.
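
A sketch of one maxout layer, assuming the k affine pieces are stacked in W and b (the shapes here are illustrative conventions, not from the notes):

    import numpy as np

    def maxout(x, W, b):
        # W: (k, d_out, d_in), b: (k, d_out) -- k affine maps per output unit.
        z = np.einsum('kod,d->ko', W, x) + b
        # Each output unit takes the max over its k pieces.
        return z.max(axis=0)

With k = 2 and the second piece fixed at W[1] = 0, b[1] = 0, the max reduces to max(W[0]·x + b[0], 0), i.e. ReLU applied to a linear layer.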

RMSProp: divides each parameter's update by a running (exponentially decayed) average of its recent squared gradients, adapting the learning rate per parameter.
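
One RMSProp update as a sketch (the hyperparameter defaults are common choices, not from the notes):

    import numpy as np

    def rmsprop_step(w, grad, cache, lr=1e-3, decay=0.9, eps=1e-8):
        # Running average of squared gradients, per parameter.
        cache = decay * cache + (1 - decay) * grad**2
        # Divide the step by the root of that average.
        w = w - lr * grad / (np.sqrt(cache) + eps)
        return w, cache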

Momentum: accumulates an exponentially decaying average of past gradients and steps along it, speeding up consistent directions and damping oscillations.
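
A corresponding sketch of one momentum update (beta = 0.9 is a typical choice):

    def momentum_step(w, grad, v, lr=1e-2, beta=0.9):
        # Velocity: decaying accumulation of past gradients.
        v = beta * v - lr * grad
        # Step along the velocity, not the raw gradient.
        return w + v, v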

RMSProp + Momentum ==> Adam: adaptive per-parameter learning rates combined with a moving average of the gradient, plus bias correction for the zero-initialized averages.
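
Putting the two together gives Adam; a sketch of one step with the usual defaults (t is the 1-based step count, used for bias correction):

    import numpy as np

    def adam_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        # Momentum-style first moment of the gradient.
        m = b1 * m + (1 - b1) * grad
        # RMSProp-style second moment (squared gradients).
        v = b2 * v + (1 - b2) * grad**2
        # Bias correction for the zero-initialized averages.
        m_hat = m / (1 - b1**t)
        v_hat = v / (1 - b2**t)
        return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v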

Regularization:

Dropout: during training, randomly zero each unit with probability p so the network cannot rely on any single unit; at test time the full network is used.
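
A sketch of the "inverted dropout" formulation, assuming p is the drop probability:

    import numpy as np

    def dropout(a, p=0.5, training=True):
        if not training:
            # Test time: use the full network unchanged.
            return a
        # Zero each unit with probability p and rescale survivors by
        # 1/(1-p) so expected activations match test time.
        mask = (np.random.rand(*a.shape) >= p) / (1 - p)
        return a * mask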

 
