Indexed in:
Abstract:
For prognostics and health management of industrial systems, machine remaining useful life (RUL) prediction is an essential task. While deep learning-based methods have achieved great success in RUL prediction tasks, large-scale neural networks remain difficult to deploy on edge devices owing to constraints on memory capacity and computing power. In this article, we propose a lightweight and adaptive knowledge distillation (KD) framework to alleviate this problem. First, multiple teacher models are compressed into a student model through KD to improve prediction accuracy. Second, a dynamic exiting method is studied to enable adaptive inference with the distilled student model. Finally, we develop a reparameterization scheme to further compress the student network. Experiments on two turbofan engine degradation datasets and a bearing degradation dataset demonstrate that our method significantly outperforms state-of-the-art KD methods and equips the distilled model with adaptive inference capability.
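A minimal sketch of the multi-teacher distillation idea described in the abstract, assuming a PyTorch setting and a regression formulation of RUL prediction; the function name, the averaging of teacher outputs, and the loss weighting are illustrative assumptions, not the paper's exact method.

import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_pred, teacher_preds, target, alpha=0.5):
    # Task loss: fit the student to the ground-truth RUL labels.
    task_loss = F.mse_loss(student_pred, target)
    # Soft target: average of the frozen teachers' predictions, detached
    # so no gradients flow into the teachers. Plain averaging is an
    # assumption here; the paper may weight teachers differently.
    soft_target = torch.stack(teacher_preds).mean(dim=0).detach()
    distill_loss = F.mse_loss(student_pred, soft_target)
    # alpha trades off imitating the teachers against fitting the labels.
    return (1 - alpha) * task_loss + alpha * distill_loss

In use, teacher_preds would be computed once per batch under torch.no_grad(), e.g. teacher_preds = [t(x) for t in teachers], so only the student is updated.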
Keywords:
Corresponding Author:
Email Address:
Source:
IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS
ISSN: 1551-3203
Year: 2023
Issue: 8
Volume: 19
Pages: 9060-9070
Impact Factor: 12.300 (JCR@2022)
ESI Discipline: ENGINEERING
ESI Highly Cited Threshold: 19
Affiliated Department: