Global Local Fusion Neural Network for Multimodal Sentiment Analysis
Authors
Abstract
With the popularity of social networking services, people are increasingly inclined to share their opinions and feelings on social networks, leading to a rapid increase in multimodal posts on various platforms. Therefore, multimodal sentiment analysis has become a crucial research field for exploring users’ emotions. The complex and complementary interactions between images and text greatly heighten the difficulty of sentiment analysis. Previous works conducted rough fusion operations and ignored the study of fine-grained features of the task, and therefore did not obtain sufficient interactive information. This paper proposes a global local fusion neural network (GLFN), which comprehensively considers global and local fusion features, aggregating these features to analyze user sentiment. The model first extracts overall fusion features by attention modules as modality-based features. Then, coarse-to-fine fusion learning is applied to build interaction features effectively. Specifically, a cross-modal module is used for coarse fusion, and fine-grained fusion captures the interaction between objects and words. Finally, we integrate all features to achieve a more reliable prediction. Extensive experimental results, comparisons, and visualization on public datasets demonstrate the effectiveness of the proposed model for sentiment classification.
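To make the coarse-to-fine fusion described in the abstract concrete, the following is a minimal PyTorch sketch of such a pipeline. The module name, feature dimensions, attention configuration, and pooling choices are illustrative assumptions, not the authors' released implementation.

# Minimal sketch of a coarse-to-fine image-text fusion head (assumed layout).
import torch
import torch.nn as nn

class CoarseToFineFusion(nn.Module):
    def __init__(self, d_model=256, n_heads=4, n_classes=3):
        super().__init__()
        # Coarse stage: cross-modal attention over whole-sequence features.
        self.coarse_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Fine stage: attention between detected objects and individual words.
        self.fine_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(3 * d_model, n_classes)

    def forward(self, word_feats, object_feats):
        # word_feats:   (B, n_words, d_model)   text features, e.g., from a text encoder
        # object_feats: (B, n_objects, d_model) image region features, e.g., from a detector
        # Coarse fusion: words attend to the full set of image regions.
        coarse, _ = self.coarse_attn(word_feats, object_feats, object_feats)
        # Fine fusion: image objects attend back to individual words.
        fine, _ = self.fine_attn(object_feats, word_feats, word_feats)
        # Aggregate modality-based and fused summaries for the final prediction.
        pooled = torch.cat([word_feats.mean(1), coarse.mean(1), fine.mean(1)], dim=-1)
        return self.classifier(pooled)

# Usage with random stand-in features.
model = CoarseToFineFusion()
words = torch.randn(2, 20, 256)    # batch of 2 sentences, 20 tokens each
objects = torch.randn(2, 36, 256)  # 36 detected regions per image
logits = model(words, objects)     # (2, 3) sentiment logits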
Similar resources
Tensor Fusion Network for Multimodal Sentiment Analysis
Multimodal sentiment analysis is an increasingly popular research area, which extends the conventional language-based definition of sentiment analysis to a multimodal setup where other relevant modalities accompany language. In this paper, we pose the problem of multimodal sentiment analysis as modeling intra-modality and inter-modality dynamics. We introduce a novel model, termed Tensor Fusion...
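The fusion step summarized in this abstract can be sketched as follows: append a constant 1 to each modality embedding and take the outer product, so that unimodal, bimodal, and trimodal interaction terms all appear in the fused representation. The dimensions below are placeholders.

# Sketch of the tensor-fusion idea: outer product of modality embeddings
# augmented with a constant 1. Dimensions are illustrative.
import torch

def tensor_fusion(z_text, z_visual, z_acoustic):
    # z_*: (batch, d_*) per-modality embeddings
    one = torch.ones(z_text.size(0), 1)
    t = torch.cat([z_text, one], dim=1)      # (B, d_t + 1)
    v = torch.cat([z_visual, one], dim=1)    # (B, d_v + 1)
    a = torch.cat([z_acoustic, one], dim=1)  # (B, d_a + 1)
    # Three-way outer product, flattened into a single fusion vector.
    fused = torch.einsum('bi,bj,bk->bijk', t, v, a)
    return fused.flatten(start_dim=1)

fused = tensor_fusion(torch.randn(4, 32), torch.randn(4, 16), torch.randn(4, 8))
print(fused.shape)  # torch.Size([4, 33 * 17 * 9])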
Recursive Nested Neural Network for Sentiment Analysis
Early sentiment prediction systems use semantic vector representations of words to express longer phrases and sentences. These methods proved to have poor performance, since they do not consider the compositionality in language. Recently, many richer models have been proposed to understand the compositionality in natural language for better sentiment predictions. Most of these algorithms are...
Deep Convolutional Neural Network Textual Features and Multiple Kernel Learning for Utterance-level Multimodal Sentiment Analysis
We present a novel way of extracting features from short texts, based on the activation values of an inner layer of a deep convolutional neural network. We use the extracted features in multimodal sentiment analysis of short video clips representing one sentence each. We use the combined feature vectors of textual, visual, and audio modalities to train a classifier based on multiple kernel lear...
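A rough sketch of this pipeline, under stated simplifications: per-utterance features from each modality are concatenated and fed to a kernel classifier. A single-kernel SVM stands in here for the multiple kernel learner, and the feature matrices are random stand-ins for real CNN activations.

# Sketch: fuse per-modality feature vectors and train a kernel classifier.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
text_feats = rng.normal(size=(n, 128))    # stand-in for inner-layer CNN activations
visual_feats = rng.normal(size=(n, 64))   # stand-in visual features per utterance
audio_feats = rng.normal(size=(n, 32))    # stand-in audio features per utterance
labels = rng.integers(0, 2, size=n)

X = np.concatenate([text_feats, visual_feats, audio_feats], axis=1)
clf = SVC(kernel='rbf').fit(X, labels)    # kernel classifier on the fused vector
print(clf.score(X, labels))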
Multimodal Sentiment Analysis
With more than 10,000 new videos posted online every day on social websites such as YouTube and Facebook, the internet is becoming an almost infinite source of information. One important challenge for the coming decade is to be able to harvest relevant information from this constant flow of multimodal data. In this talk, I will introduce the task of multimodal sentiment analysis, and present a ...
Benchmarking Multimodal Sentiment Analysis
We propose a framework for multimodal sentiment analysis and emotion recognition using convolutional neural network-based feature extraction from text and visual modalities. We obtain a performance improvement of 10% over the state of the art by combining visual, text and audio features. We also discuss some major issues frequently ignored in multimodal sentiment analysis research: the role of ...
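A minimal sketch of CNN-based text feature extraction of the kind described above: a 1-D convolution over word embeddings with max-pooling yields a sentence feature vector that can then be concatenated with visual or audio features. Layer sizes and the encoder name are assumptions for illustration.

# Assumed TextCNN-style encoder; output is concatenated with other modalities.
import torch
import torch.nn as nn

class TextCNNEncoder(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, n_filters=64, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size, padding=1)

    def forward(self, token_ids):                    # (B, seq_len)
        x = self.embed(token_ids).transpose(1, 2)    # (B, emb_dim, seq_len)
        x = torch.relu(self.conv(x))                 # (B, n_filters, seq_len)
        return x.max(dim=2).values                   # (B, n_filters) sentence feature

enc = TextCNNEncoder()
text_feat = enc(torch.randint(0, 10000, (2, 30)))    # (2, 64)
visual_feat = torch.randn(2, 128)                    # stand-in visual features
fused = torch.cat([text_feat, visual_feat], dim=1)   # combined multimodal vector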
Journal
Journal title: Applied Sciences
Year: 2022
ISSN: 2076-3417
DOI: https://doi.org/10.3390/app12178453