
Authors:

Xu, Kai | Wang, Lichun | Zhang, Huiyong | Yin, Baocai

Indexed in:

EI, Scopus

Abstract:

Self-knowledge distillation does not require a pre-trained teacher network like traditional knowledge distillation. However, existing methods either require additional parameters or incur additional memory consumption. To alleviate this problem, this paper proposes a more efficient self-knowledge distillation method, named LRMS (learning from role-model samples). In every mini-batch, LRMS selects a role-model sample for each sampled category and takes its prediction as the proxy semantic for the corresponding category. Predictions of the other samples are then constrained to be consistent with the proxy semantics, which makes the distribution of predictions for samples within the same category more compact. Meanwhile, the regularization targets corresponding to the proxy semantics are set with a higher distillation temperature to better utilize the classificatory information about the categories. Experimental results show that diverse architectures achieve improvements on four image classification datasets when using LRMS. Code is available at: https://github.com/KAI1179/LRMS © 2024 IEEE.
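The abstract describes the mechanism only at a high level; the sketch below illustrates one way the per-category proxy-semantic constraint could be implemented in PyTorch. The role-model selection rule (most confident sample of its ground-truth class), the KL-divergence consistency term, and the temperatures t_student / t_proxy are assumptions made for illustration, not the authors' exact formulation; see the linked repository for the official code.

import torch
import torch.nn.functional as F

def lrms_loss(logits, labels, t_student=4.0, t_proxy=8.0):
    """Minimal sketch of 'learning from role-model samples' (assumptions noted above).

    For each category present in the mini-batch, one role-model sample is picked,
    its softened prediction (higher temperature t_proxy) serves as the proxy
    semantic, and the remaining samples of that category are pulled toward it.
    """
    probs = F.softmax(logits, dim=1)                              # [B, C]
    true_conf = probs.gather(1, labels.unsqueeze(1)).squeeze(1)   # confidence on the GT class

    loss, n_groups = logits.new_zeros(()), 0
    for c in labels.unique():
        idx = (labels == c).nonzero(as_tuple=True)[0]
        if idx.numel() < 2:                                       # need a role model plus at least one peer
            continue
        role = idx[true_conf[idx].argmax()]                       # assumed rule: most confident sample of class c
        peers = idx[idx != role]

        # Proxy semantic: role-model prediction softened with the higher temperature.
        proxy = F.softmax(logits[role].detach() / t_proxy, dim=0)
        log_peer = F.log_softmax(logits[peers] / t_student, dim=1)

        # Pull peer predictions toward the proxy semantic of their category.
        loss = loss + F.kl_div(log_peer, proxy.expand_as(log_peer),
                               reduction="batchmean") * (t_student ** 2)
        n_groups += 1

    return loss / max(n_groups, 1)

In training, this term would typically be added to the usual cross-entropy loss, e.g. loss = F.cross_entropy(logits, labels) + lam * lrms_loss(logits, labels), where lam is a weighting hyperparameter.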

Keywords:

Self-supervised learning; Image compression; Teaching; Adversarial machine learning; Image enhancement; Neural networks

Author affiliations:

  • [1] Xu, Kai, Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [2] Wang, Lichun, Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [3] Zhang, Huiyong, Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [4] Yin, Baocai, Beijing University of Technology, Faculty of Information Technology, Beijing, China

Source:

ISSN: 1520-6149

Year: 2024

Pages: 5185-5189

Language: English

Times cited:

Scopus citations: 2

ESI Highly Cited Paper listings: 0
