ScholarMate

Universal Consistency of Deep Convolutional Neural Networks

Lin, Shao-Bo; Wang, Kaidong; Wang, Yao*; Zhou, Ding-Xuan
Science Citation Index Expanded
Xi'an Jiaotong University

Abstract

Compared with the avid research activity surrounding deep convolutional neural networks (DCNNs) in practice, the study of their theoretical behavior lags far behind. In particular, the universal consistency of DCNNs remains open. In this paper, we prove that implementing empirical risk minimization on DCNNs with expansive convolution (with zero-padding) is strongly universally consistent. Motivated by this universal consistency, we conduct a series of experiments showing that, without any fully connected layers, DCNNs with expansive convolution perform no worse than the widely used deep neural networks with a hybrid structure that combines contracting (without zero-padding) convolutional layers and several fully connected layers.
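The distinction the abstract draws between expansive convolution (with zero-padding) and contracting convolution (without zero-padding) can be seen directly in the output dimensions of a 1-D convolution. A minimal NumPy sketch (the signal and filter values here are arbitrary illustrations, not from the paper):

```python
import numpy as np

x = np.arange(8.0)               # input signal of length d = 8
w = np.array([1.0, -2.0, 1.0])   # filter of size s = 3

# Expansive convolution (zero-padded): output length d + s - 1 = 10
expansive = np.convolve(x, w, mode='full')

# Contracting convolution (no zero-padding): output length d - s + 1 = 6
contracting = np.convolve(x, w, mode='valid')

print(len(expansive), len(contracting))  # 10 6
```

Stacking expansive layers thus grows the representation's width by s - 1 per layer, which is the structural property the paper's consistency analysis relies on, whereas contracting layers shrink it and are typically followed by fully connected layers.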

Keywords

Convolution; Risk management; Feature extraction; Convolutional neural networks; Sparse matrices; Deep learning; Urban areas; universal consistency