
On the investigation of activation functions in gradient neural network for online solving linear matrix equation

Tan, Zhiguo*; Hu, Yueming; Chen, Ke
Science Citation Index Expanded

Abstract

In this paper, we investigate the effects of different activation functions (AFs) on the convergence performance of a gradient-based neural network (GNN) for solving the linear matrix equation AXB + X = C. It is observed that, by employing different AFs, i.e., the linear, power-sigmoid, sign-power, and general sign-bi-power functions, the presented GNN model achieves different convergence performance. More specifically, if the linear function is employed, the GNN model achieves exponential convergence; if the power-sigmoid function is employed, superior convergence is achieved compared to the linear case; while if the sign-power and general sign-bi-power functions are employed, the GNN model achieves finite-time and fixed-time convergence, respectively. Detailed theoretical proofs are offered to demonstrate these facts. Besides, the exponential convergence rate and the upper bounds of the finite and fixed convergence times are also theoretically estimated. Finally, two illustrative examples are presented to further substantiate the aforementioned theoretical results and the effectiveness of the presented GNN model for solving the linear matrix equation.
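To make the construction concrete, below is a minimal NumPy sketch of a gradient neural network of the kind the abstract describes: the residual E = AXB + X − C defines an energy ‖E‖²_F/2, and the state X evolves along the negative (activated) gradient, here discretized with forward Euler. The update form, the tanh activation standing in for a power-sigmoid-type AF, and all parameter values (`gamma`, `dt`, `steps`) are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def gnn_solve(A, B, C, phi=np.tanh, gamma=10.0, dt=1e-3, steps=20000):
    """Sketch of a gradient neural network for AXB + X = C.

    The gradient of ||E||_F^2 / 2 with respect to X is A^T E B^T + E;
    the activation phi is applied elementwise to the residual E
    (an assumed placement, mirroring common GNN designs).
    """
    X = np.zeros_like(C)
    for _ in range(steps):
        E = A @ X @ B + X - C              # residual of AXB + X = C
        G = A.T @ phi(E) @ B.T + phi(E)    # activated gradient direction
        X = X - dt * gamma * G             # forward-Euler descent step
    return X

# Small self-check on a randomly generated consistent equation.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) * 0.3
B = rng.standard_normal((3, 3)) * 0.3
X_star = rng.standard_normal((3, 3))
C = A @ X_star @ B + X_star                # C built so a solution exists
X = gnn_solve(A, B, C)
print(np.linalg.norm(A @ X @ B + X - C))   # residual norm after iteration
```

Note that A and B are scaled down so that the map X ↦ AXB + X is well conditioned; choosing a finite-time AF such as a sign-power function in place of `phi` changes the convergence behavior, which is exactly the comparison the paper carries out.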

Keywords

Gradient neural network; Linear matrix equation; Activation function; Fixed-time convergence