Authors:

Fu, Pengbin | Liu, Daxing | Yang, Huirong

Indexed in:

EI, Scopus

Abstract:

Recently, Transformer-based models have shown promising results in automatic speech recognition (ASR), outperforming models based on recurrent neural networks (RNNs) and convolutional neural networks (CNNs). However, directly applying a Transformer to ASR does not effectively exploit the correlation among speech frames, leaving the model trapped in a sub-optimal solution. To this end, we propose a local attention Transformer model for speech recognition that exploits the high correlation among neighboring speech frames. First, we use relative positional embedding rather than absolute positional embedding to improve the generalization of the Transformer to speech sequences of different lengths. Second, we add local attention based on parametric positional relations to the self-attention module, explicitly incorporating prior knowledge so that training is insensitive to hyperparameters, thus improving performance. Experiments on the LibriSpeech dataset show that our approach achieves word error rates of 2.3%/5.5% with language model fusion and without any external data, reducing the word error rate by 17.8%/9.8% relative to the baseline. These results are close to, or better than, those of other state-of-the-art end-to-end models. © 2022 by the authors. Licensee MDPI, Basel, Switzerland.
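The record does not reproduce the paper's equations, so the following is only a minimal sketch of the abstract's central idea: adding a parametric locality bias to self-attention so that nearby speech frames attend to each other more strongly. A Gaussian penalty over frame distance with a learnable width is assumed here as the parametric positional relation; the class name `LocalAttention`, the single-head layout, and the omission of relative positional embedding are all simplifications for illustration, not the authors' implementation.

```python
import math
import torch
import torch.nn as nn

class LocalAttention(nn.Module):
    """Single-head self-attention with a learnable Gaussian locality bias.

    Illustrative sketch only: the paper's exact parametric positional
    relation is not given in this record, so a Gaussian prior over the
    frame distance |i - j| stands in for it here.
    """

    def __init__(self, d_model):
        super().__init__()
        self.d_model = d_model
        self.qkv = nn.Linear(d_model, 3 * d_model)
        # Learnable width of the locality window; it is trained with the
        # rest of the model instead of being hand-picked.
        self.log_sigma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        T = x.size(1)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_model)

        # Distance matrix |i - j| between query and key positions.
        pos = torch.arange(T, device=x.device)
        dist = (pos[:, None] - pos[None, :]).abs().float()

        # Gaussian locality bias: nearby frames get a smaller penalty,
        # encoding the prior that adjacent speech frames are correlated.
        sigma = self.log_sigma.exp()
        scores = scores - dist.pow(2) / (2 * sigma.pow(2))

        attn = scores.softmax(dim=-1)
        return attn @ v


x = torch.randn(2, 50, 256)   # batch of 2, 50 frames, 256-dim features
out = LocalAttention(256)(x)
print(out.shape)              # torch.Size([2, 50, 256])
```

Because the width `sigma` is learned along with the rest of the model rather than fixed by hand, the locality prior adapts during training, which is one plausible reading of the abstract's claim that the approach is insensitive to hyperparameters.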

Keywords:

Recurrent neural networks; Convolutional neural networks; Speech recognition; Speech

Author Affiliations:

  • [ 1 ] [Fu, Pengbin] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 2 ] [Liu, Daxing] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 3 ] [Yang, Huirong] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China

Corresponding Author:

Email Address:

Source:

Information (Switzerland)

Year: 2022

Issue: 5

Volume: 13

Citation Counts:

WoS Core Collection Citations:

Scopus Citations: 5

ESI Highly Cited Paper Listed: 0

Wanfang Citations:

Chinese Citations:

Views in Last 30 Days: 0

Affiliated Department:
