Indexed by:
Abstract:
Deep neural networks (DNNs) with skip connections differ from the standard feed-forward architecture in that shortcut connections are added between layers. Adding skip connections to all layers of the network alleviates the vanishing-gradient problem, which benefits the training of deep networks and yields faster convergence. In addition, the skip connections pass more speech-signal details to later layers, which helps the network recover the speech signal more accurately. In this paper, we first choose the ideal Wiener filter as the training target of the DNN with skip connections (Skip-DNN), with the cepstral features of the noisy speech signal as its input. We then investigate the performance of enhanced speech that combines DNN-based phase estimation in the complex domain with the clean-speech magnitude estimated using the ideal Wiener filter and the Skip-DNN. Experiments are conducted on the TIMIT corpus with 102 noise types at four signal-to-noise ratio (SNR) levels. The results show that the proposed methods achieve higher speech quality and intelligibility than the reference approaches. © 2018 IEEE.
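The skip-connection idea described in the abstract — an identity shortcut added around each hidden layer so that gradients and low-level signal detail bypass the nonlinearity — can be sketched as a forward pass in NumPy. This is a minimal illustration under assumed details (square hidden layers, ReLU activations, a sigmoid output bounding the estimated Wiener-style mask in [0, 1]); the paper's actual architecture, layer sizes, and training setup are not specified here.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def skip_dnn_forward(x, weights, biases):
    """Forward pass of a feed-forward DNN with identity skip connections.

    Each hidden layer computes relu(h @ W + b) + h, i.e. the layer input
    is added back to the activation, so information (and, in training,
    gradients) can flow around the nonlinearity. Hidden weight matrices
    are assumed square so the addition is well-defined.
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b) + h  # identity skip connection
    # Output layer: sigmoid keeps values in [0, 1], like a Wiener gain.
    z = h @ weights[-1] + biases[-1]
    return 1.0 / (1.0 + np.exp(-z))

# Toy usage with random parameters (purely illustrative):
rng = np.random.default_rng(0)
d, out_dim = 8, 4
weights = [rng.normal(scale=0.1, size=(d, d)) for _ in range(3)]
weights.append(rng.normal(scale=0.1, size=(d, out_dim)))
biases = [np.zeros(d)] * 3 + [np.zeros(out_dim)]
mask = skip_dnn_forward(rng.normal(size=d), weights, biases)
```

In an enhancement pipeline of this kind, the network's bounded output would be applied as a gain to the noisy magnitude spectrum, while the phase is handled separately, as the abstract's complex-domain phase estimation suggests.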
Keywords:
Corresponding author information:
Email address: