Search results for: local attention

Number of results: 824569

Journal: Applied Sciences 2022

Transformer models are now widely used for speech processing tasks due to their powerful sequence modeling capabilities. Previous work established an efficient way to model speaker embeddings by combining transformers with convolutional networks. However, traditional global self-attention mechanisms lack the ability to capture local information. To alleviate these problems, we proposed a novel g...

Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence 2023

Attention mechanisms, such as local and non-local attention, play a fundamental role in recent deep-learning-based speech enhancement (SE) systems. However, natural speech contains many fast-changing and relatively brief acoustic events; therefore, capturing the most informative features by applying attention indiscriminately is challenging. We observe that the noise type and features vary within a sequence and can res...

Journal: IOP Conference Series: Materials Science and Engineering 2021

Journal: Information 2022

Since the Transformer architecture was introduced in 2017, there have been many attempts to bring the self-attention paradigm to the field of computer vision. In this paper, we propose LHC: Local multi-Head Channel self-attention, a novel module that can be easily integrated into virtually every convolutional neural network. It is specifically designed for vision, with a particular focus on facial expression ...

Chart: number of search results per year
