Search results for: local attention

Number of results: 824,569

Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence 2022

Non-Local Attention (NLA) brings significant improvement for Single Image Super-Resolution (SISR) by leveraging intrinsic feature correlation in natural images. However, NLA gives noisy information large weights and consumes quadratic computation resources with respect to the input size, limiting its performance and application. In this paper, we propose a novel Efficient Non-Local Contrastive Attention (ENLCA) to perfor...
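
The quadratic cost noted above is easy to see in a minimal sketch of plain non-local attention (a generic illustration, not the paper's ENLCA): every spatial position attends to every other, so the similarity matrix scales with the square of the pixel count. The function name and shapes below are illustrative assumptions.

```python
import numpy as np

def non_local_attention(x):
    """Single-head non-local (self-)attention over a feature map.

    x: (H, W, C) array. Flattening to N = H*W positions makes the
    similarity matrix N x N -- the quadratic cost in the input size
    that the abstract refers to.
    """
    h, w, c = x.shape
    q = k = v = x.reshape(-1, c)                   # (N, C) queries/keys/values
    scores = q @ k.T / np.sqrt(c)                  # (N, N) similarity matrix
    scores -= scores.max(axis=1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    out = weights @ v                              # aggregate over all positions
    return out.reshape(h, w, c)

feat = np.random.randn(16, 16, 32)
print(non_local_attention(feat).shape)             # (16, 16, 32)
```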

Journal: IEEE Access 2023

Exploiting global factors and embedding them directly into local graphs in point clouds are challenging due to dense points and irregular structure. To accomplish this goal, we propose a novel end-to-end trainable graph attention network that extracts features in terms of graphs. Our method presents a general graph, which obtains the most fundamental features based on the order of positions in different neighborhoods. Central is...
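
As a rough illustration of attention over local graphs in a point cloud (a generic sketch under assumed names, not this paper's network), each point can attend only over its k nearest neighbours, so the attention step costs O(N*k) rather than being quadratic in N.

```python
import numpy as np

def knn_graph_attention(points, feats, k=8):
    """One round of single-head attention over k-NN neighbourhoods.

    points: (N, 3) coordinates; feats: (N, C) per-point features.
    Brute-force k-NN is used here purely for simplicity.
    """
    n, c = feats.shape
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(d2, axis=1)[:, 1:k + 1]      # (N, k), skip self
    out = np.empty_like(feats)
    for i in range(n):
        nf = feats[nbrs[i]]                        # (k, C) neighbour features
        scores = nf @ feats[i] / np.sqrt(c)        # (k,) attention logits
        w = np.exp(scores - scores.max())
        w /= w.sum()                               # softmax over neighbours
        out[i] = w @ nf                            # weighted aggregation
    return out

pts = np.random.randn(128, 3)
f = np.random.randn(128, 16)
print(knn_graph_attention(pts, f).shape)           # (128, 16)
```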

Journal: IEEE Transactions on Computational Social Systems 2022

In this study, we present a novel clustering-based collaborative filtering (CF) method for recommender systems. Clustering-based CF methods can effectively deal with data sparsity and scalability problems. However, most of them are applied to a single representation space, which might not characterize complex user–item interactions well. We argue that user–item interactions should be observed from multiple views ch...
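
The single-representation-space baseline that this snippet argues against can be sketched as a basic k-means clustering of user rating vectors followed by cluster-mean prediction; the function and toy data below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def cluster_cf_predict(ratings, n_clusters=2, iters=20, seed=0):
    """k-means over user rating vectors, then fill each user's missing
    ratings (encoded as 0) with the mean rating of their cluster.

    ratings: (n_users, n_items) array.
    """
    rng = np.random.default_rng(seed)
    n_users = ratings.shape[0]
    centers = ratings[rng.choice(n_users, n_clusters, replace=False)].astype(float)
    for _ in range(iters):                                   # plain k-means
        d = ((ratings[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(axis=1)
        for c in range(n_clusters):
            members = ratings[assign == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    preds = ratings.astype(float)
    missing = ratings == 0
    preds[missing] = centers[assign][missing]                # cluster-mean fill
    return preds

r = np.array([[5, 0, 3, 0],
              [4, 0, 3, 1],
              [0, 5, 0, 4],
              [1, 4, 0, 5]])
print(np.round(cluster_cf_predict(r), 2))
```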

Journal: Symmetry 2023

Meaningful representation of large-scale non-Euclidean structured data, especially in complex domains like network security and IoT systems, is one of the critical problems in contemporary machine learning and deep learning. Many successful cases of graph-based models and algorithms deal with such data. However, it is often undesirable to derive node representations by walking through the complete topology of a system or (grap...
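
One common way to avoid walking the complete topology is to build node representations from sampled local neighbourhoods, GraphSAGE-style; the sketch below is a generic illustration with made-up names and data, not the approach proposed in this article.

```python
import random

def local_node_embedding(adj, feats, node, fanout=5, seed=0):
    """Represent a node from a sampled local neighbourhood instead of
    traversing the full graph.

    adj: dict node -> list of neighbours; feats: dict node -> list of floats.
    """
    random.seed(seed)
    nbrs = adj.get(node, [])
    sampled = random.sample(nbrs, min(fanout, len(nbrs)))
    dim = len(feats[node])
    agg = [0.0] * dim
    for v in sampled:                      # mean-aggregate neighbour features
        for j in range(dim):
            agg[j] += feats[v][j] / len(sampled)
    return feats[node] + agg               # concat self with aggregation

adj = {"a": ["b", "c", "d"], "b": ["a"], "c": ["a"], "d": ["a"]}
feats = {n: [float(i), 1.0] for i, n in enumerate(adj)}
print(local_node_embedding(adj, feats, "a", fanout=2))
```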

Journal: IEEE Signal Processing Letters 2022

Automated audio captioning aims to describe audio data with captions using natural language. Existing methods often employ an encoder-decoder structure, where the attention-based decoder (e.g., Transformer decoder) is widely used and achieves state-of-the-art performance. Although this method effectively captures global information within audio data via the self-attention mechanism, it may ignore the event with short time d...
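
A bare-bones version of the attention-based decoder setup described here can be written with PyTorch's stock Transformer decoder; all dimensions, the causal mask, and the toy inputs below are placeholder assumptions rather than the paper's configuration.

```python
import torch
import torch.nn as nn

# Placeholder sizes; the real model's dimensions are not given in the snippet.
d_model, nhead, vocab = 128, 4, 1000
decoder = nn.TransformerDecoder(
    nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
    num_layers=2,
)
embed = nn.Embedding(vocab, d_model)
to_logits = nn.Linear(d_model, vocab)

audio_feats = torch.randn(1, 50, d_model)      # encoder output: 50 audio frames
tokens = torch.randint(0, vocab, (1, 12))      # caption prefix of 12 tokens
L = tokens.size(1)
# causal mask: each position self-attends only to earlier caption tokens,
# while cross-attention sees the full encoded audio (the "global" view).
causal = torch.triu(torch.full((L, L), float("-inf")), diagonal=1)
hidden = decoder(embed(tokens), audio_feats, tgt_mask=causal)
print(to_logits(hidden).shape)                 # torch.Size([1, 12, 1000])
```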

[Chart: number of search results per year]
