Tehnički vjesnik, Vol. 30 No. 5, 2023.
Original scientific paper
https://doi.org/10.17559/TV-20230614000735
The Adaptive Quadratic Linear Unit (AQuLU): Adaptive Non-Monotonic Piecewise Activation Function
Zhandong Wu
College of Biological and Agricultural Engineering, Jilin University, Changchun, 130022, China
Haiye Yu
College of Biological and Agricultural Engineering, Jilin University, Changchun, 130022, China
Lei Zhang
College of Biological and Agricultural Engineering, Jilin University, Changchun, 130022, China
Yuanyuan Sui
College of Biological and Agricultural Engineering, Jilin University, Changchun, 130022, China
Abstract
The activation function plays a key role in the performance and training dynamics of neural networks. Hundreds of activation functions are in wide use, such as the rectified linear unit (ReLU), but most are applied to complex and large neural networks, which often suffer from exploding and vanishing gradients. By studying a variety of non-monotonic activation functions, we propose a method for constructing a non-monotonic activation function, x·Φ(x), with Φ(x) ∈ [0, 1]. By hardening Φ(x), we propose an adaptive non-monotonic piecewise activation function, called the adaptive quadratic linear unit (AQuLU), which preserves the sparsity of the input data and improves training efficiency. In image classification with different state-of-the-art neural network architectures, AQuLU shows significant advantages over various activation functions, particularly in more complex and deeper architectures. An ablation study further validates the compatibility and stability of AQuLU across different depths, complexities, optimizers, learning rates, and batch sizes. We thus demonstrate the efficiency, robustness, and simplicity of AQuLU.
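The construction described above (multiplying the identity by a gate Φ(x) ∈ [0, 1] and then hardening the gate into a piecewise form with learnable breakpoints) can be illustrated with a minimal PyTorch sketch. The module below is a hypothetical illustration under assumed choices: a piecewise-linear gate with learnable breakpoints a and b, which yields a zero branch, a quadratic middle segment, and an identity branch. It is not the published AQuLU formula, which is defined in the full paper.

import torch
import torch.nn as nn

class AQuLUSketch(nn.Module):
    # Hypothetical piecewise quadratic-linear unit f(x) = x * phi(x),
    # where phi is a hardened gate in [0, 1] with learnable breakpoints.
    # Illustrative only; not the authors' published AQuLU definition.
    def __init__(self, a_init=-2.0, b_init=2.0):
        super().__init__()
        # Learnable breakpoints: phi = 0 for x <= a, phi = 1 for x >= b,
        # with a linear ramp in between.
        self.a = nn.Parameter(torch.tensor(a_init))
        self.b = nn.Parameter(torch.tensor(b_init))

    def forward(self, x):
        # Hard (piecewise-linear) gate phi(x) in [0, 1].
        phi = torch.clamp((x - self.a) / (self.b - self.a), min=0.0, max=1.0)
        # Output: zero for x <= a (sparsity), quadratic on (a, b)
        # (hence non-monotonic, dipping below zero), identity for x >= b.
        return x * phi

# Usage example:
# act = AQuLUSketch()
# y = act(torch.linspace(-4.0, 4.0, steps=9))

With the default breakpoints, the middle segment is f(x) = x(x + 2)/4, which attains a small negative minimum at x = -1 before rising to the identity branch, reproducing the non-monotonic shape the abstract attributes to this class of functions.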
Keywords
activation function; AQuLU; CaLU; deep learning; ExpExpish; LogLogish; LaLU
Hrčak ID:
307710
Publication date:
31.8.2023.