Abstract
To realize a transistor-less circuit, which offers remarkable potential for high-speed, high-density, low-power applications, this work adopts a 1-selector-1-resistor (1S-1R) structure to replace the prevalent 1-transistor-1-resistor (1T-1R) array in memristor synaptic architectures. The 1S-1R cell consists of a four-terminal memristor acting as a selector and a two-terminal memristor storing information in the form of device resistance. The high resistance state (HRS) and low resistance state (LRS) of the two-terminal memristor jointly represent binary weights in a binary neural network (BNN), so the multiply-accumulate operation of convolution can be replaced by current accumulation in the 1S-1R circuit. To map binary weights onto the 1S-1R array and implement forward propagation at the circuit-simulation level, a LeNet model is modified so that two full-precision convolutional layers are replaced by two binary convolutional layers. Compared with the full-precision LeNet model, the storage consumption of the binarized network is reduced by 44.28%, while its accuracy reaches 98% within 10 epochs, very close to that of the full-precision network. The 1S-1R array is then used to store the binary weights of a convolutional layer in the trained LeNet, and 100 images of digits 0 to 9 are randomly selected from MNIST for testing. As expected, the circuit-simulation results are consistent with the convolution operation in forward propagation.
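The weight-to-resistance mapping and current-accumulation MAC described above can be sketched as follows. This is a minimal illustrative model, not the paper's simulation setup: the conductance values `G_LRS` and `G_HRS` and the read voltage encoding are assumed for illustration only.

```python
# Hypothetical sketch of a 1S-1R multiply-accumulate (MAC):
# each binary weight is stored as a device resistance state, and the
# dot product is computed by Kirchhoff current summation on a bit line.
G_LRS = 1e-4   # assumed low-resistance-state conductance (S), encodes weight +1
G_HRS = 1e-7   # assumed high-resistance-state conductance (S), encodes weight -1

def weight_to_conductance(w):
    """Map a binary weight to the memristor conductance storing it."""
    return G_LRS if w > 0 else G_HRS

def mac_as_current(read_voltages, weights):
    """Current accumulation replacing the MAC: I = sum_i V_i * G(w_i)."""
    return sum(v * weight_to_conductance(w)
               for v, w in zip(read_voltages, weights))

# Usage: input pixels encoded as read voltages applied to the word lines.
voltages = [0.2, 0.0, 0.2, 0.2]
weights = [+1, -1, +1, -1]
current = mac_as_current(voltages, weights)
```

Because HRS conductance is orders of magnitude below LRS conductance, terms with weight -1 contribute negligibly, so the summed bit-line current approximates the binary convolution's partial sum.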
Affiliation: Shanghai Jiao Tong University