Deep regression adaptation networks with model-based transfer learning for dynamic load identification in the frequency domain

Authors

Abstract

Frequency-domain dynamic load identification methods based on neural network (NN) models construct an independent model at each frequency, which makes them inaccurate and inefficient to train. To address these problems, a deep regression adaptation network (DRAN) with model-based transfer learning is proposed for identifying loads in the frequency domain. The aim is to exploit the similarity of uncorrelated multi-source loads and multi-point vibration responses at adjacent frequencies. First, a DRAN model is established using historical data at a specific frequency. Second, the trained parameters are transferred to the target frequency as initial parameter values. Next, the transferred model is fine-tuned to obtain the model at the target frequency. Finally, the model at the current frequency initializes the model at the next frequency, and this process is iterated until models at all frequencies are established. Because the frequency response function is continuous, the relationships between responses and loads at adjacent frequencies are similar, so the DRAN can adapt a model trained at one frequency to different frequencies and extract common feature information to improve model accuracy. Moreover, instead of setting the weights randomly and training them from scratch for each model, the transferred weights are used as better initial values. The method was evaluated on an experimental cylindrical shell structure under acoustic-vibration joint excitation. The results show that, compared with a network whose weights are initialized randomly, the proposed method achieves higher accuracy, better noise robustness, and shorter training time.
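The frequency-sweep transfer procedure described above can be pictured with a short sketch. The PyTorch code below is a minimal illustration under assumed settings: the network size, optimizer, epoch counts, and the helper names make_regressor, train, and identify_over_frequencies are all hypothetical and do not reproduce the authors' implementation. It only shows the loop structure: train from scratch at the first frequency, then copy the previous frequency's weights and fine-tune at each adjacent frequency.

```python
# Hypothetical sketch of the frequency-by-frequency model-transfer loop from the
# abstract. Layer sizes, learning rates, and epoch counts are illustrative only.
import copy
import torch
import torch.nn as nn


def make_regressor(n_responses: int, n_loads: int) -> nn.Module:
    """Map multi-point vibration responses to multi-source loads at one frequency."""
    return nn.Sequential(
        nn.Linear(n_responses, 64), nn.ReLU(),
        nn.Linear(64, 64), nn.ReLU(),
        nn.Linear(64, n_loads),
    )


def train(model: nn.Module, x: torch.Tensor, y: torch.Tensor,
          epochs: int, lr: float) -> nn.Module:
    """Plain MSE regression training; fewer epochs and a smaller lr act as fine-tuning."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return model


def identify_over_frequencies(data_per_freq):
    """data_per_freq: list of (responses, loads) tensor pairs, ordered by frequency."""
    models, prev = [], None
    for x, y in data_per_freq:
        if prev is None:
            # First frequency: train from random initialization.
            model = train(make_regressor(x.shape[1], y.shape[1]), x, y,
                          epochs=500, lr=1e-3)
        else:
            # Adjacent frequency: start from the previous model's weights and
            # only fine-tune, since the response-load relationship is similar.
            model = train(copy.deepcopy(prev), x, y, epochs=50, lr=1e-4)
        models.append(model)
        prev = model
    return models
```

In this sketch the fine-tuning stage reuses the transferred weights instead of random initialization, which is the mechanism the abstract credits for the shorter training time.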

Similar articles

Deep Transfer Learning with Joint Adaptation Networks

Deep networks rely on massive amounts of labeled data to learn powerful models. For a target task short of labeled data, transfer learning enables model adaptation from a different source domain. This paper addresses deep transfer learning under a more general scenario that the joint distributions of features and labels may change substantially across domains. Based on the theory of Hilbert spa...

Deep Unsupervised Domain Adaptation for Image Classification via Low Rank Representation Learning

Domain adaptation is a powerful technique when a large amount of labeled data with similar attributes is available in a different domain. In real-world applications there is a huge amount of data, but most of it is unlabeled. Domain adaptation is effective for image classification, where obtaining adequately labeled data is expensive and time-consuming. We propose a novel method named DALRRL, which consists of deep ...

Residual Parameter Transfer for Deep Domain Adaptation

The goal of Deep Domain Adaptation is to make it possible to use Deep Nets trained in one domain where there is enough annotated training data in another where there is little or none. Most current approaches have focused on learning feature representations that are invariant to the changes that occur when going from one domain to the other, which means using the same network parameters in both...

Transfer deep convolutional activations-based features for domain adaptation in sensor networks

In this paper, we propose a novel method named transfer deep convolutional activation-based features (TDCAF) for domain adaptation in sensor networks. Specifically, we first train a siamese network with weight sharing to map the images from different domains into the same feature space, which can learn domain-invariant information. Since various feature maps in one convolutional layer of the si...
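The weight-sharing idea mentioned in this summary can be made concrete with a generic sketch. This is an illustration of a siamese encoder with shared parameters, not the TDCAF code; the class name SiameseEncoder, the layer sizes, and the input shapes are assumptions.

```python
# Generic siamese pair with weight sharing: one encoder instance processes images
# from both domains, so both branches share parameters and map into one feature space.
import torch
import torch.nn as nn


class SiameseEncoder(nn.Module):
    def __init__(self, feat_dim: int = 128):
        super().__init__()
        # A single encoder; reusing it for both inputs is the weight sharing.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(16 * 4 * 4, feat_dim),
        )

    def forward(self, x_source: torch.Tensor, x_target: torch.Tensor):
        # Both domains pass through identical weights.
        return self.encoder(x_source), self.encoder(x_target)


# Usage: features from the two domains land in the same 128-dimensional space.
model = SiameseEncoder()
f_src, f_tgt = model(torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32))
```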

Unsupervised Domain Adaptation with Residual Transfer Networks

The recent success of deep neural networks relies on massive amounts of labeled data. For a target task where labeled data is unavailable, domain adaptation can transfer a learner from a different source domain. In this paper, we propose a new approach to domain adaptation in deep networks that can jointly learn adaptive classifiers and transferable features from labeled data in the source doma...

Journal

Journal title: Engineering Applications of Artificial Intelligence

Year: 2021

ISSN: 1873-6769, 0952-1976

DOI: https://doi.org/10.1016/j.engappai.2021.104244