Source: collected and edited by the Machine Learning Research Group
Binary step
Logistic, sigmoid, or soft step
ElliotSig or Softsign
Hyperbolic tangent (tanh)
Arctangent / Arctan / atan
Softplus
Rectified linear unit (ReLU) (ReLU6)
Exponential linear unit (ELU)
Gaussian Error Linear Unit (GELU)
Scaled exponential linear unit (SELU)
Mish
Leaky rectified linear unit (Leaky ReLU)
Parametric rectified linear unit (PReLU)
Parametric Exponential Linear Unit (PELU)
S-shaped rectified linear activation unit (SReLU)
Bipolar rectified linear unit (BReLU)
Randomized leaky rectified linear unit (RReLU)
Sigmoid linear unit (SiLU) or Swish
Gaussian
Growing Cosine Unit (GCU)
Shifted Quadratic Unit (SQU)
Non-Monotonic Cubic Unit (NCU)
Shifted Sinc Unit (SSU)
https://arxiv.org/pdf/2111.04020.pdf
Decaying Sine Unit (DSU)
https://arxiv.org/pdf/2111.04020.pdf
Phish
https://www.techrxiv.org/ndownloader/files/33227273/2
SQ-RBF
Inverse square root unit (ISRU)
Square nonlinearity (SQNL)
Sigmoid shrinkage
“Squashing functions”
Maxout
Bent Identity
Sinusoid
Sinc (taming the waves)
ArSinH
Soft Clipping (goldilocks)
Piecewise Linear Unit (PLU)
Adaptive piecewise linear (APL)
Inverse Cubic
Soft Exponential
LeCun hyperbolic tangent
https://en.wikipedia.org/wiki/Activation_function
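Most of the names above correspond to one-line formulas. As a minimal NumPy sketch (not an authoritative implementation), here is a representative subset, using the formulas from the Wikipedia list linked above; the GELU shown is the common tanh approximation rather than the exact Gaussian-CDF form, and GCU follows the x·cos(x) definition from the arXiv paper cited above:

```python
import numpy as np

# Minimal sketches of a few activations from the list above.

def binary_step(x):
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Logistic / sigmoid / soft step
    return 1.0 / (1.0 + np.exp(-x))

def softsign(x):
    # ElliotSig / Softsign
    return x / (1.0 + np.abs(x))

def softplus(x):
    return np.log1p(np.exp(x))

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    return np.where(x >= 0, x, alpha * x)

def elu(x, alpha=1.0):
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def gelu(x):
    # tanh approximation of GELU (the exact form uses the Gaussian CDF)
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def silu(x):
    # SiLU / Swish: x * sigmoid(x)
    return x * sigmoid(x)

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * np.tanh(softplus(x))

def gcu(x):
    # Growing Cosine Unit: x * cos(x)
    return x * np.cos(x)
```

Each function is a NumPy ufunc composition, so it evaluates elementwise on scalars or arrays alike, e.g. `silu(np.linspace(-3.0, 3.0, 7))`.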
Reposted from: DeepHub IMBA