Abstract
Deep learning models have been widely studied and applied in fault diagnosis. However, they suffer from two common drawbacks: 1) they usually require a large amount of storage, making them difficult to run on embedded devices, and 2) sufficient reliable training data are often unavailable for training a comprehensive diagnosis model. In this study, a fusion approach based on knowledge distillation and generative adversarial networks (GANs), named small-sample dense teacher assistant knowledge distillation (SS-DTAKD), is proposed to enable bearing fault diagnosis with small samples and limited on-board storage. First, the proposed self-attention GAN (SGAN) is used to expand the training data for the diagnostic model; embedding self-attention modules in both the generator and the discriminator helps improve the quality of the generated data. Then, the DTAKD method is proposed to compress the model parameters, where dense distillation across multiple teacher-assistant networks helps the student network learn correct knowledge without requiring additional data or storage. Additionally, a dual-type data hierarchical training (DDHT) method is applied to train the student network, exploiting real data to improve the student's performance. Extensive experiments on two bearing fault datasets demonstrate that the data generated by the SGAN are highly similar to the real data and robust. Furthermore, compared with existing knowledge distillation methods, the proposed SS-DTAKD achieves higher fault diagnosis accuracy under small samples and limited on-board storage.
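To make the dense teacher-assistant distillation idea concrete, the following is a minimal PyTorch sketch of one distillation step: the compact student is trained against the softened outputs of the teacher and every assistant (not only its immediate predecessor), combined with the hard-label loss. The network shapes, temperature T, and weight alpha here are illustrative assumptions, not the configuration used in this paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_mlp(hidden: int, in_dim: int = 1024, num_classes: int = 10) -> nn.Sequential:
    # Stand-in classifier; the paper's actual teacher/assistant/student
    # architectures are not specified in the abstract.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, num_classes))

teacher = make_mlp(512)                       # large pretrained network (assumed frozen)
assistants = [make_mlp(256), make_mlp(128)]   # intermediate teacher-assistant networks
student = make_mlp(32)                        # compact network for the embedded device

def dense_kd_loss(student_logits, target_logits_list, labels, T=4.0, alpha=0.7):
    """Hard-label CE plus soft-target KD against every larger network.

    The 'dense' aspect: the student matches the softened outputs of the
    teacher AND all assistants. T and alpha are illustrative hyperparameters.
    """
    hard = F.cross_entropy(student_logits, labels)
    soft = sum(
        F.kl_div(F.log_softmax(student_logits / T, dim=1),
                 F.softmax(t_logits / T, dim=1),
                 reduction="batchmean") * (T * T)
        for t_logits in target_logits_list
    ) / len(target_logits_list)
    return alpha * soft + (1 - alpha) * hard

# One illustrative training step on a dummy batch.
x, y = torch.randn(8, 1024), torch.randint(0, 10, (8,))
with torch.no_grad():                         # larger networks provide targets only
    targets = [teacher(x)] + [a(x) for a in assistants]
loss = dense_kd_loss(student(x), targets, y)
loss.backward()
```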