Abstract

As a novel neural network with efficient learning capacity, the broad learning system (BLS) has achieved remarkable success in various regression and classification problems. Owing to its broad expansion of nodes, however, BLS contains many redundant parameters and nodes, which increase memory and computation costs and hinder its deployment on equipment with limited resources. To optimize the number of neurons and parameters of BLS and thereby find the optimal sparse model under a given resource budget, in this paper we propose training BLS with L0 regularization. The regularization term of the BLS objective function is replaced by an L0 constraint, and the normalized iterative hard thresholding method is used to optimize the output weights. More concretely, the model size is fixed by bounding the number of nonzero output weights according to the given resource budget, and the parameters and nodes of the network are then evaluated and selected from the node set during training, yielding a BLS with controllable sparsity (CSBLS). Experiments on various data sets demonstrate the effectiveness of the proposed method.
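The core idea described above — fixing the model size by keeping only a budgeted number of nonzero output weights, optimized by normalized iterative hard thresholding — can be sketched as follows. This is a minimal illustration of generic normalized iterative hard thresholding on a least-squares problem of the form min ||Aw − y||² subject to ||w||₀ ≤ k (with `A` standing in for the concatenated feature/enhancement node outputs of BLS and `w` for the output weights); the function names, iteration count, and step-size rule are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def hard_threshold(w, k):
    """Keep the k largest-magnitude entries of w, zero out the rest."""
    w = w.copy()
    if k < w.size:
        # indices of the (size - k) smallest-magnitude entries
        drop = np.argsort(np.abs(w))[:-k]
        w[drop] = 0.0
    return w

def niht(A, y, k, n_iters=200):
    """Normalized iterative hard thresholding sketch:
    min ||A w - y||^2  subject to  ||w||_0 <= k.

    A : (n_samples, n_nodes) node-output matrix (hypothetical stand-in
        for the BLS feature/enhancement layer outputs)
    y : (n_samples,) target vector
    k : sparsity budget (number of output weights retained)
    """
    w = np.zeros(A.shape[1])
    for _ in range(n_iters):
        g = A.T @ (y - A @ w)                     # negative gradient
        # Normalize the step size using the current support
        # (or the top-k gradient entries before any support exists).
        support = np.flatnonzero(w) if np.any(w) else np.argsort(np.abs(g))[-k:]
        gs = g[support]
        denom = np.sum((A[:, support] @ gs) ** 2)
        mu = (gs @ gs) / denom if denom > 0 else 1.0
        # Gradient step followed by projection onto the sparsity budget.
        w = hard_threshold(w + mu * g, k)
    return w
```

Controlling `k` directly is what makes the sparsity "controllable": the memory footprint of the trained output layer is fixed in advance, rather than emerging indirectly from an L1/L2 penalty weight.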

Full text