Reciprocal Normalization for Domain Adaptation
Authors
Abstract
Batch normalization (BN) is widely used in modern deep neural networks, but it has been shown to encode domain-related knowledge and is therefore ineffective for cross-domain tasks such as unsupervised domain adaptation (UDA). Existing BN variants aggregate source and target knowledge within the same channel of the normalization module. However, misalignment between the features of corresponding channels across domains often leads to sub-optimal transferability. In this paper, we exploit the cross-domain relation between channels and propose a novel normalization method, Reciprocal Normalization (RN). Specifically, RN first presents a Reciprocal Compensation (RC) module that acquires a compensatory component for each channel in both domains based on cross-domain channel-wise correlation. RN then develops a Reciprocal Aggregation (RA) module that adaptively aggregates each feature with its compensatory components. As an alternative to BN, RN is more suitable for UDA problems and can be easily integrated into popular domain adaptation methods. Experiments show that the proposed RN outperforms existing normalization counterparts by a large margin and helps state-of-the-art adaptation approaches achieve better results. The code is available at https://github.com/Openning07/reciprocal-normalization-for-DA.
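The abstract's two-stage idea (Reciprocal Compensation followed by Reciprocal Aggregation) can be illustrated with a minimal NumPy sketch. Everything concrete below is an illustrative assumption, not the paper's exact formulation: the cross-domain channel affinity is taken as cosine similarity between per-channel (mean, std) descriptors, and the aggregation is a simple convex mix of native and compensatory statistics controlled by a hypothetical weight `lam`.

```python
import numpy as np

def _softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def reciprocal_norm(src, tgt, lam=0.5, eps=1e-5):
    """Hedged sketch of the RC + RA idea on (N, C) feature batches.

    RC: for each channel, build compensatory statistics from the *other*
    domain's channels, weighted by a cross-domain channel affinity.
    RA: mix native and compensatory statistics, then standardize per
    channel as BN would. `lam` and the affinity choice are assumptions.
    """
    mu_s, var_s = src.mean(0), src.var(0)
    mu_t, var_t = tgt.mean(0), tgt.var(0)

    # Cross-domain channel-wise affinity from per-channel statistics.
    d_s = np.stack([mu_s, np.sqrt(var_s + eps)], axis=1)   # (C, 2)
    d_t = np.stack([mu_t, np.sqrt(var_t + eps)], axis=1)
    d_s = d_s / (np.linalg.norm(d_s, axis=1, keepdims=True) + eps)
    d_t = d_t / (np.linalg.norm(d_t, axis=1, keepdims=True) + eps)
    corr = d_s @ d_t.T                                      # (C, C)

    # RC: attention-like weights over the other domain's channels.
    w_st = _softmax(corr, axis=1)      # source channel -> target channels
    w_ts = _softmax(corr.T, axis=1)    # target channel -> source channels
    comp_mu_s, comp_var_s = w_st @ mu_t, w_st @ var_t
    comp_mu_t, comp_var_t = w_ts @ mu_s, w_ts @ var_s

    # RA: aggregate statistics, then standardize each channel.
    mix = lambda a, b: (1 - lam) * a + lam * b
    out_s = (src - mix(mu_s, comp_mu_s)) / np.sqrt(mix(var_s, comp_var_s) + eps)
    out_t = (tgt - mix(mu_t, comp_mu_t)) / np.sqrt(mix(var_t, comp_var_t) + eps)
    return out_s, out_t
```

Note that with `lam=0` the sketch degenerates to plain per-channel standardization of each domain separately, which makes the BN-compatible baseline behaviour easy to check.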
Similar resources
Revisiting Batch Normalization For Practical Domain Adaptation
Deep neural networks (DNNs) have shown unprecedented success in various computer vision applications such as image classification and object detection. However, it remains a common annoyance that, during the training phase, one has to prepare at least thousands of labeled images to fine-tune a network to a specific domain. A recent study (Tommasi et al., 2015) shows that a DNN has strong dependenc...
Sample-oriented Domain Adaptation for Image Classification
Image processing is a method of performing operations on an image in order to obtain an enhanced image or to extract useful information from it. Conventional image processing algorithms cannot perform well in scenarios where the training images (source domain) used to learn the model have a different distribution from the test images (target domain). Also, many real world applicat...
Bias Adaptation for Vocal Tract Length Normalization
Vocal tract length normalisation (VTLN) is a well-known rapid adaptation technique. VTLN, as a linear transformation in the cepstral domain, yields scaling and translation factors. The warping factor represents the spectral scaling parameter, while the translation factor, represented by a bias term, captures more speaker characteristics, especially in a rapid adaptation framework without havi...
Connectionist speaker normalization and adaptation
In speaker-independent, large-vocabulary continuous speech recognition systems, recognition accuracy varies considerably from speaker to speaker, and performance may be significantly degraded for outlier speakers such as nonnative talkers. In this paper, we explore supervised speaker adaptation and normalization in the MLP component of a hybrid hidden Markov model / multilayer perceptron versi...
Pattern Adaptation and Normalization Reweighting
Adaptation to an oriented stimulus changes both the gain and preferred orientation of neural responses in V1. Neurons tuned near the adapted orientation are suppressed, and their preferred orientations shift away from the adapter. We propose a model in which weights of divisive normalization are dynamically adjusted to homeostatically maintain response products between pairs of neuro...
Journal
Journal title: Pattern Recognition
Year: 2023
ISSN: 0031-3203, 1873-5142
DOI: https://doi.org/10.1016/j.patcog.2023.109533