Abstract:
For the prognostics and health management of industrial systems, predicting machine remaining useful life (RUL) is an essential task. While deep learning-based methods have achieved great success in RUL prediction, large-scale neural networks remain difficult to deploy on edge devices owing to constraints on memory capacity and computing power. In this article, we propose a lightweight and adaptive knowledge distillation (KD) framework to alleviate this problem. First, multiple teacher models are compressed into a student model through KD to improve prediction accuracy. Second, a dynamic exiting method is studied to enable adaptive inference on the distilled student model. Finally, we develop a reparameterization scheme to further compress the student network. Experiments on two turbofan engine degradation datasets and a bearing degradation dataset demonstrate that our method significantly outperforms state-of-the-art KD methods and endows the distilled model with adaptive inference ability.
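The first step described in the abstract, compressing multiple teachers into one student via KD, can be illustrated with a minimal sketch. The network architectures, the equal weighting of teachers, the MSE-based distillation loss, and all hyperparameters below are illustrative assumptions for an RUL regression setting, not the paper's actual method.

```python
# Minimal multi-teacher knowledge distillation sketch for RUL regression.
# All architectures, weights, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class SmallRULNet(nn.Module):
    """A hypothetical lightweight regressor; not the paper's architecture."""
    def __init__(self, in_features: int, hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # scalar RUL estimate
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def multi_teacher_kd_loss(student_pred, teacher_preds, target, alpha=0.5):
    """Blend the ground-truth loss with a loss against the mean teacher output.

    alpha trades off label supervision vs. the teacher ensemble; averaging
    the teachers equally is an assumption, not the paper's weighting scheme.
    """
    mse = nn.functional.mse_loss
    teacher_avg = torch.stack(teacher_preds).mean(dim=0)
    return alpha * mse(student_pred, target) + (1 - alpha) * mse(student_pred, teacher_avg)

if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(32, 24)        # batch of 32 windows, 24 sensor features
    y = torch.rand(32, 1) * 125.0  # synthetic RUL targets

    teachers = [SmallRULNet(24, 128) for _ in range(3)]  # stand-ins for trained teachers
    student = SmallRULNet(24, 16)

    with torch.no_grad():  # teachers stay frozen during distillation
        t_preds = [t(x) for t in teachers]

    loss = multi_teacher_kd_loss(student(x), t_preds, y, alpha=0.5)
    loss.backward()
    print(f"distillation loss: {loss.item():.4f}")
```

In practice one distillation update per batch follows this pattern: frozen teacher forward passes, a blended loss, and a gradient step on the student only.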
Source: IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS
ISSN: 1551-3203
Year: 2023
Issue: 8
Volume: 19
Page: 9060-9070
Impact Factor: 12.300 (JCR@2022)
ESI Discipline: ENGINEERING
ESI HC Threshold: 19
SCOPUS Cited Count: 30
ESI Highly Cited Papers on the List: 3