Indexed in:
Abstract:
Few-shot learning aims to recognize novel categories relying on only a few labeled samples, and existing few-shot methods primarily focus on categories sampled from the same distribution. Nevertheless, this assumption does not always hold in practice, and the resulting domain shift significantly reduces the performance of few-shot learning. To remedy this problem, we investigate an interesting and challenging cross-domain few-shot learning task, where the training and testing tasks are drawn from different domains. Specifically, we propose a Meta-Memory scheme to bridge the domain gap between source and target domains, leveraging style-memory and content-memory components. The former stores intra-domain style information from source domain instances and provides a richer feature distribution. The latter stores semantic information by exploring knowledge across different categories. Under a contrastive learning strategy, our model effectively alleviates the cross-domain problem in few-shot learning. Extensive experiments demonstrate that our proposed method achieves state-of-the-art performance on cross-domain few-shot semantic segmentation tasks on the COCO-20^i, PASCAL-5^i, FSS-1000, and SUIM datasets and also benefits few-shot classification tasks on Meta-Dataset.
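The abstract describes two memory banks (a style memory and a content memory) trained under a contrastive objective. The following is only a minimal sketch of that general idea in PyTorch, not the authors' implementation: the class name StyleContentMemory, the ring-buffer update, the InfoNCE-style loss, and all sizes are hypothetical assumptions introduced here purely for illustration.

# Minimal, hypothetical sketch: two memory banks plus an InfoNCE-style
# contrastive loss. Names and design choices are assumptions, not the
# paper's actual method.
import torch
import torch.nn.functional as F

class StyleContentMemory:
    def __init__(self, num_slots: int, dim: int):
        # Style memory: instance-level style features (assumption).
        self.style = torch.zeros(num_slots, dim)
        # Content memory: category-level semantic features (assumption).
        self.content = torch.zeros(num_slots, dim)
        self.ptr = 0
        self.num_slots = num_slots

    @torch.no_grad()
    def update(self, style_feat: torch.Tensor, content_feat: torch.Tensor):
        # Ring-buffer update of both memories with the latest batch.
        n = style_feat.size(0)
        idx = torch.arange(self.ptr, self.ptr + n) % self.num_slots
        self.style[idx] = style_feat
        self.content[idx] = content_feat
        self.ptr = (self.ptr + n) % self.num_slots

def contrastive_loss(query: torch.Tensor, positive: torch.Tensor,
                     memory: torch.Tensor, tau: float = 0.07) -> torch.Tensor:
    # InfoNCE: pull each query toward its positive, push it away from memory slots.
    q = F.normalize(query, dim=-1)
    pos = F.normalize(positive, dim=-1)
    neg = F.normalize(memory, dim=-1)
    l_pos = (q * pos).sum(-1, keepdim=True)            # (B, 1)
    l_neg = q @ neg.t()                                 # (B, K)
    logits = torch.cat([l_pos, l_neg], dim=1) / tau
    labels = torch.zeros(q.size(0), dtype=torch.long)  # positive sits at index 0
    return F.cross_entropy(logits, labels)

# Usage with random features standing in for backbone outputs.
mem = StyleContentMemory(num_slots=256, dim=128)
feat_a, feat_b = torch.randn(8, 128), torch.randn(8, 128)
mem.update(feat_a, feat_b)
loss = contrastive_loss(feat_a, feat_b, mem.content)
print(float(loss))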
Keywords:
Corresponding author:
Email address:
Source:
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
ISSN: 0162-8828
Year: 2023
Issue: 12
Volume: 45
Pages: 15018-15035
Impact factor: 23.600 (JCR@2022)
Affiliated department: