
Authors:

Liu, Jiabin | Li, Biao | Lei, Minglong | Shi, Yong

Indexed in:

EI, Scopus, SCIE

Abstract:

In this paper, we tackle a new learning paradigm called learning from complementary labels, where the training data specifies classes that instances do not belong to, rather than the true labels. In general, complementary labels are more efficient to collect than ordinary supervised labels, since there is no need to select the correct class from a number of candidates. While current state-of-the-art methods design various loss functions to train competitive models from this limited supervised information, they overlook learning from the data and the model themselves, which often contain rich information that can improve the performance of complementary-label learning. In this paper, we propose a novel learning framework that seamlessly integrates self-supervision and self-distillation into complementary-label learning. Based on the general complementary learning framework, we employ an entropy regularization term to guarantee that the network outputs exhibit a sharper state. Then, to learn intensively from the data, we leverage self-supervised learning based on rotation and transformation operations as a plug-in auxiliary task to learn better transferable representations. Finally, knowledge distillation is introduced to further extract the "dark knowledge" from a network to guide the training of a student network. In extensive experiments, our method demonstrates surprisingly compelling accuracy compared with several state-of-the-art approaches. (c) 2022 Elsevier Ltd. All rights reserved.
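The loss design described in the abstract can be sketched in a few lines. This is a minimal illustration only: it assumes a common complementary-label surrogate loss of the form -log(1 - p_c̄), where c̄ is the class the instance does not belong to (the paper's exact formulation may differ), combined with the entropy regularization term that pushes the network toward sharper outputs. The function name `complementary_loss` and the weight `lam` are hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def complementary_loss(logits, comp_labels, lam=0.1):
    """Complementary-label loss with entropy regularization (sketch).

    comp_labels[i] is a class the i-th instance does NOT belong to.
    The surrogate -log(1 - p_cbar) penalizes probability mass placed on
    the complementary class (an assumption; the paper's exact loss may
    differ), while lam * H(p) penalizes high-entropy, i.e. unsharp,
    output distributions.
    """
    p = softmax(logits)
    n = logits.shape[0]
    p_cbar = p[np.arange(n), comp_labels]
    comp_term = -np.log(np.clip(1.0 - p_cbar, 1e-12, None)).mean()
    entropy = -(p * np.log(np.clip(p, 1e-12, None))).sum(axis=1).mean()
    return comp_term + lam * entropy

# Toy check: confident predictions that avoid the complementary class
# should yield a small loss.
logits = np.array([[5.0, 0.0, 0.0], [0.0, 5.0, 0.0]])
comp = np.array([1, 0])  # a class each instance does not belong to
print(complementary_loss(logits, comp))
```

A real implementation would add the rotation-prediction auxiliary head and a distillation term (e.g. a KL divergence between teacher and student softened outputs) on top of this objective.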

Keywords:

Knowledge distillation; Self-supervised learning; Complementary-label learning

Author affiliations:

  • [1] [Liu, Jiabin] Beijing Inst Technol, Sch Informat & Elect, Beijing 100081, Peoples R China
  • [2] [Li, Biao] Southwestern Univ Finance & Econ, Fac Business Adm, Sch Business Adm, Chengdu 611130, Peoples R China
  • [3] [Lei, Minglong] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [4] [Shi, Yong] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China


Source:

NEURAL NETWORKS

ISSN: 0893-6080

Year: 2022

Volume: 155

Pages: 318-327

Impact factor: 7.8 (JCR@2022)

ESI subject: COMPUTER SCIENCE

ESI highly cited threshold: 46

JCR quartile: Q1

CAS (Chinese Academy of Sciences) journal tier: 2

Citations:

Web of Science Core Collection citations: 11

Scopus citations: 11

ESI highly cited paper listings: 0

