
Author:

Du, Yongping (杜永萍) | Niu, Jinyu | Wang, Yuxin | Jin, Xingnan

Indexed by:

EI Scopus SCIE

Abstract:

Sequential models based on deep learning are widely used in sequential recommendation tasks, but the increase in model parameters results in higher latency at the inference stage, which limits the model's real-time performance. To strike a balance between efficiency and effectiveness, knowledge distillation is adopted to transfer pre-trained knowledge from a large teacher model to a small student model. We propose a multi-stage knowledge distillation method based on interest knowledge, including interest representation knowledge and interest drift knowledge. During knowledge transfer, expert distillation is designed to transform the knowledge dimension of the student model, alleviating the loss of the original knowledge information. Specifically, curriculum learning is introduced for multi-stage knowledge learning, which further enables the teacher model to transfer knowledge effectively to a student model of limited capacity. The proposed method is evaluated on three real-world datasets: MovieLens-1M, Amazon Game, and Steam. The experimental results demonstrate that our method significantly outperforms the other compared distillation methods, and that multi-stage learning allows the student model to acquire knowledge step by step for improvement.
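The paper's interest-knowledge and expert-distillation components are not reproduced here; as a minimal, illustrative sketch of the vanilla teacher-to-student logit distillation this line of work builds on (temperature-scaled KL divergence; all function names and values below are assumptions for illustration, not the authors' code):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target items.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence from the teacher's softened distribution to the
    # student's, scaled by T^2 as in standard knowledge distillation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# A student whose logits match the teacher's incurs zero loss;
# any mismatch produces a positive penalty to minimize.
teacher = [2.0, 0.5, -1.0]
print(round(distillation_loss(teacher, teacher), 6))       # 0.0
print(distillation_loss(teacher, [0.0, 0.0, 0.0]) > 0.0)   # True
```

In a multi-stage or curriculum setup such as the one described above, a schedule would control when each kind of knowledge term enters the total training loss; that scheduling logic is specific to the paper and is not sketched here.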

Keyword:

Model compression; Interest drift; Knowledge distillation; Multi-stage learning; Sequential recommendation

Author Community:

  • [ 1 ] [Du, Yongping]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 2 ] [Niu, Jinyu]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 3 ] [Wang, Yuxin]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 4 ] [Jin, Xingnan]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China

Reprint Author's Address:

  • [Du, Yongping]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China;;


Source :

INFORMATION SCIENCES

ISSN: 0020-0255

Year: 2024

Volume: 654

Impact Factor: 8.100 (JCR@2022)

Cited Count:

SCOPUS Cited Count: 6

ESI Highly Cited Papers on the List: 0
