
Authors:

Liang, Yu | Zhang, Chenlong | An, Shan | Wang, Zaitian | Shi, Kaize | Peng, Tianhao | Ma, Yuqing | Xie, Xiaoyang | He, Jian | Zheng, Kun

Indexed in:

EI, Scopus, SCIE

Abstract:

Objective. Electroencephalogram (EEG) analysis has always been an important tool in neural engineering, and the recognition and classification of human emotions is one of its important tasks. EEG data, obtained from electrodes placed on the scalp, are a valuable source of information for brain activity analysis and emotion recognition. Feature extraction methods have shown promising results, but recent trends have shifted toward end-to-end methods based on deep learning. However, these approaches often overlook channel representations, and their complex structures pose certain challenges to model fitting. Approach. To address these challenges, this paper proposes a hybrid approach named FetchEEG that combines feature extraction and temporal-channel joint attention. Leveraging the advantages of both traditional feature extraction and deep learning, FetchEEG adopts a multi-head self-attention mechanism to extract representations across different time moments and channels simultaneously. The joint representations are then concatenated and classified using fully-connected layers for emotion recognition. The performance of FetchEEG is verified by comparison experiments on a self-developed dataset and two public datasets. Main results. In both subject-dependent and subject-independent experiments, FetchEEG demonstrates better performance and stronger generalization ability than the state-of-the-art methods on all datasets. Moreover, the performance of FetchEEG is analyzed for different sliding window sizes and overlap rates in the feature extraction module, and the sensitivity of emotion recognition is investigated for three- and five-frequency-band scenarios. Significance. FetchEEG is a novel hybrid method for EEG-based emotion classification that combines EEG feature extraction with Transformer neural networks. It achieves state-of-the-art performance on both the self-developed dataset and multiple public datasets, with significantly higher training efficiency than end-to-end methods, demonstrating its effectiveness and feasibility.
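
As a rough illustration of the pipeline the abstract describes, the PyTorch sketch below passes per-window band-power (PSD-style) features through multi-head self-attention applied once across time windows and once across channels, concatenates the two pooled representations, and classifies them with fully-connected layers. All module names, dimensions (32 channels, 5 frequency bands, 3 classes), and pooling choices are illustrative assumptions, not the authors' FetchEEG implementation.

import torch
import torch.nn as nn


class TemporalChannelAttention(nn.Module):
    """Hypothetical sketch: joint attention over time windows and EEG channels."""

    def __init__(self, n_features=5, d_model=64, n_heads=4, n_classes=3):
        super().__init__()
        # project per-window band-power features into a shared embedding
        self.embed = nn.Linear(n_features, d_model)
        # one attention block over time steps, one over EEG channels
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.channel_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # fully-connected classifier on the concatenated joint representation
        self.classifier = nn.Sequential(
            nn.Linear(2 * d_model, d_model),
            nn.ReLU(),
            nn.Linear(d_model, n_classes),
        )

    def forward(self, x):
        # x: (batch, time_windows, channels, band_features)
        h = self.embed(x)                             # (b, t, c, d)

        # temporal branch: average over channels, attend across windows
        h_t = h.mean(dim=2)
        h_t, _ = self.temporal_attn(h_t, h_t, h_t)    # (b, t, d)

        # channel branch: average over windows, attend across channels
        h_c = h.mean(dim=1)
        h_c, _ = self.channel_attn(h_c, h_c, h_c)     # (b, c, d)

        # pool each branch, concatenate, and classify
        joint = torch.cat([h_t.mean(dim=1), h_c.mean(dim=1)], dim=-1)
        return self.classifier(joint)                 # (b, n_classes)


# toy usage: 8 trials, 10 sliding windows, 32 channels, 5 frequency bands
logits = TemporalChannelAttention()(torch.randn(8, 10, 32, 5))
print(logits.shape)  # torch.Size([8, 3])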

Keywords:

self-attention mechanism; electroencephalographic; deep learning; power spectral density; emotion recognition

Author affiliations:

  • [ 1 ] [Liang, Yu]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 2 ] [Zhang, Chenlong]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 3 ] [Wang, Zaitian]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 4 ] [Xie, Xiaoyang]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 5 ] [He, Jian]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 6 ] [Zheng, Kun]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 7 ] [An, Shan]JD Hlth Int Inc, Beijing, Peoples R China
  • [ 8 ] [Shi, Kaize]Univ Technol Sydney, Sydney, Australia
  • [ 9 ] [Peng, Tianhao]Beihang Univ, Beijing, Peoples R China
  • [ 10 ] [Ma, Yuqing]Beihang Univ, Beijing, Peoples R China

Corresponding author information:



Source:

JOURNAL OF NEURAL ENGINEERING

ISSN: 1741-2560

Year: 2024

Issue: 3

Volume: 21

Impact factor: 4.000 (JCR@2022)

Citation counts:

WoS Core Collection citations:

Scopus citations:

ESI Highly Cited Paper listed: 0

Wanfang citations:

Chinese citations:

Views in the last 30 days: 0

Affiliated department:
