ReLU:
- Leaky ReLU: a small fixed slope (e.g., 0.01) on the negative side instead of zero, so gradients never fully die.
- Parametric ReLU: same shape, but the negative-side slope is a parameter learned during training (both sketched below).
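A minimal NumPy sketch of the three variants; the 0.01 negative slope for Leaky ReLU is the common default and assumed here, and in a real Parametric ReLU the slope `a` would be trained by backpropagation rather than passed in:

```python
import numpy as np

def relu(x):
    # Standard ReLU: zero out negative inputs.
    return np.maximum(0.0, x)

def leaky_relu(x, slope=0.01):
    # Leaky ReLU: small fixed slope on the negative side (0.01 is a common default).
    return np.where(x > 0, x, slope * x)

def prelu(x, a):
    # Parametric ReLU: same form as Leaky ReLU, but `a` is a learned parameter.
    return np.where(x > 0, x, a * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))          # [0.    0.    0.   1.5]
print(leaky_relu(x))    # [-0.02 -0.005 0.   1.5]
print(prelu(x, a=0.2))  # [-0.4  -0.1   0.   1.5]
```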
Maxout:
- Learnable activation function: the unit outputs the max over several learned linear pieces, so the shape of the activation itself is trained.
- ReLU is a special case of Maxout: fix one piece's weights and bias to zero, and max(wx + b, 0) is exactly ReLU.
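A sketch of a single maxout unit, assuming k = 2 linear pieces with placeholder random weights just to show the computation; fixing the second piece to zero recovers ReLU:

```python
import numpy as np

def maxout(x, W, b):
    # x: (d,) input; W: (k, d) weights; b: (k,) biases for k linear pieces.
    # The unit outputs the max over k linear functions of x, so the
    # activation's shape is learned through W and b.
    return np.max(W @ x + b)

rng = np.random.default_rng(0)
x = rng.normal(size=4)
w = rng.normal(size=4)

# Two pieces, one fixed to zero: maxout reduces to max(w.x, 0), i.e., ReLU.
W = np.stack([w, np.zeros(4)])
b = np.zeros(2)
print(maxout(x, W, b))  # equals ReLU applied to w @ x
print(max(w @ x, 0.0))  # same value
```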
RMSProp: scales each parameter's step by a moving average of its squared gradients (per-parameter adaptive learning rate).
Momentum: accumulates an exponential moving average of past gradients to smooth the update direction and damp oscillations.
RMSProp + Momentum (+ bias correction) ==> Adam
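A sketch of one Adam step showing how the two ingredients combine: `m` is the momentum-style first-moment average, `v` is the RMSProp-style second-moment average, and both are bias-corrected because they start at zero. The hyperparameter values are the usual defaults, assumed here:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Momentum part: exponential moving average of gradients (first moment).
    m = beta1 * m + (1 - beta1) * grad
    # RMSProp part: exponential moving average of squared gradients (second moment).
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction for the zero-initialized averages (t counts from 1).
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    # Per-parameter adaptive update.
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = theta^2 for a few steps.
theta, m, v = np.array([2.0]), np.zeros(1), np.zeros(1)
for t in range(1, 101):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
print(theta)  # moves steadily toward 0
```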
Regularization:
- Dropout: randomly drop each unit with probability p during training; at test time keep all units and rescale so expected activations match (sketch below).
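A sketch of inverted dropout, one common way to implement this: units are zeroed with probability `p` during training and the survivors are scaled by 1/(1-p), so the test-time forward pass needs no extra scaling. The value p = 0.5 is an illustrative choice:

```python
import numpy as np

def dropout(x, p=0.5, train=True, rng=None):
    # Inverted dropout: zero each unit with probability p during training,
    # scaling survivors by 1/(1-p) so expected activations match test time.
    if not train:
        return x  # test time: full network, no rescaling needed
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

x = np.ones(8)
print(dropout(x, p=0.5, train=True))  # roughly half zeroed, the rest scaled to 2.0
print(dropout(x, train=False))        # unchanged at test time
```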