Distilling Knowledge From Object Classification to Aesthetics Assessment
Authors
Abstract
In this work, we point out that the major dilemma of image aesthetics assessment (IAA) comes from the abstract nature of aesthetic labels. That is, a vast variety of distinct contents can correspond to the same aesthetic label. On the one hand, during inference, an IAA model is required to relate various distinct contents to the same aesthetic label. On the other hand, during training, it is hard for the model to learn to distinguish different contents merely with the supervision of aesthetic labels, since such labels are not directly related to any specific content. To deal with this dilemma, we propose to distill knowledge on semantic patterns from multiple pre-trained object classification (POC) models to an IAA model. Expecting that the combination of POC models provides sufficient knowledge on various contents, the IAA model can more easily learn to relate distinct contents to a limited number of aesthetic labels. By supervising an end-to-end single-backbone IAA model with the distilled knowledge, its performance is significantly improved, by 4.8% in SRCC, compared with the version trained only with ground-truth aesthetic labels. On specific categories of images, the improvement brought by the proposed method reaches up to 7.2%. Peer comparison also shows that our method outperforms 10 previous IAA methods.
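To make the described setup concrete, below is a minimal sketch (PyTorch-style Python) of how multiple pre-trained object classification (POC) teachers could supervise a single-backbone IAA student alongside ground-truth aesthetic labels. The specific backbones, projection heads, loss weights, and temperature are illustrative assumptions rather than the authors' released implementation; SRCC is computed with scipy as the evaluation metric mentioned in the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models
from scipy.stats import spearmanr

# Frozen POC teachers (illustrative choice: two ImageNet-pretrained backbones).
teachers = [models.resnet50(weights="IMAGENET1K_V2"),
            models.vgg16(weights="IMAGENET1K_V1")]
for t in teachers:
    t.eval()
    for p in t.parameters():
        p.requires_grad_(False)

class IAAStudent(nn.Module):
    """Single-backbone student: one aesthetic-score head plus one projection
    head per teacher that learns to mimic that teacher's class logits."""
    def __init__(self, teacher_dims=(1000, 1000)):
        super().__init__()
        self.backbone = models.resnet18(weights=None)
        feat_dim = self.backbone.fc.in_features
        self.backbone.fc = nn.Identity()
        self.score_head = nn.Linear(feat_dim, 1)
        self.mimic_heads = nn.ModuleList([nn.Linear(feat_dim, d) for d in teacher_dims])

    def forward(self, x):
        f = self.backbone(x)
        return self.score_head(f).squeeze(1), [h(f) for h in self.mimic_heads]

student = IAAStudent()
opt = torch.optim.Adam(student.parameters(), lr=1e-4)

def train_step(images, gt_scores, T=4.0, alpha=0.5):
    # Ground-truth supervision on aesthetic scores...
    pred, mimic_logits = student(images)
    loss = F.mse_loss(pred, gt_scores)
    # ...plus distilled supervision from each frozen POC teacher (soft targets).
    with torch.no_grad():
        teacher_logits = [t(images) for t in teachers]
    for s_logit, t_logit in zip(mimic_logits, teacher_logits):
        loss = loss + alpha * (T * T) * F.kl_div(
            F.log_softmax(s_logit / T, dim=1),
            F.softmax(t_logit / T, dim=1),
            reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

def srcc(pred_scores, gt_scores):
    # Spearman rank correlation, the metric behind the reported 4.8% / 7.2% gains.
    return spearmanr(pred_scores, gt_scores).correlation

In this sketch the mimic heads and KL temperature stand in for whatever transfer mechanism the paper actually uses; only the overall structure, POC teachers and aesthetic ground truth jointly supervising one student backbone, follows the abstract.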
Similar resources
Distilling Task Knowledge from How-To Communities
Knowledge graphs have become a fundamental asset for search engines. A fair amount of user queries seek information on problem-solving tasks such as building a fence or repairing a bicycle. However, knowledge graphs completely lack this kind of how-to knowledge. This paper presents a method for automatically constructing a formal knowledge base on tasks and task-solving steps, by tapping the co...
Distilling Knowledge from Deep Networks with Applications to Healthcare Domain
Exponential growth in Electronic Healthcare Records (EHR) has resulted in new opportunities and urgent needs for discovery of meaningful data-driven representations and patterns of diseases in Computational Phenotyping research. Deep Learning models have shown superior performance for robust prediction in computational phenotyping tasks, but suffer from the issue of model interpretability which...
Distilling Model Knowledge
Top-performing machine learning systems, such as deep neural networks, large ensembles and complex probabilistic graphical models, can be expensive to store, slow to evaluate and hard to integrate into larger systems. Ideally, we would like to replace such cumbersome models with simpler models that perform equally well. In this thesis, we study knowledge distillation, the idea of extracting the...
Face Model Compression by Distilling Knowledge from Neurons
Recent advanced face recognition systems were built on large Deep Neural Networks (DNNs) or their ensembles, which have millions of parameters. However, the expensive computation of DNNs makes their deployment difficult on mobile and embedded devices. This work addresses model compression for face recognition, where the learned knowledge of a large teacher network or its ensemble is utilized...
Distilling Knowledge from an Ensemble of Models for Punctuation Prediction
This paper proposes an approach to distill knowledge from an ensemble of models to a single deep neural network (DNN) student model for punctuation prediction. This approach makes the DNN student model mimic the behavior of the ensemble. The ensemble consists of three single models. Kullback-Leibler (KL) divergence is used to minimize the difference between the output distribution of the DNN st...
Journal
Journal title: IEEE Transactions on Circuits and Systems for Video Technology
Year: 2022
ISSN: 1051-8215, 1558-2205
DOI: https://doi.org/10.1109/tcsvt.2022.3186307