Domain Heuristic Fusion of Multi-Word Embeddings for Nutrient Value Prediction

Authors

Abstract

Being both a poison and a cure for many lifestyle and non-communicable diseases, food is inscribing itself into the prime focus of precision medicine. Because monitoring a few groups of nutrients is crucial for some patients, methods easing their calculation are emerging. Our proposed machine learning pipeline deals with nutrient prediction based on learned vector representations of short text — recipe names. In this study, we explored how the results change when, instead of using the recipe description, we use embeddings of the list of ingredients. The nutrient content of a recipe depends on its ingredients; therefore, the text of the ingredients contains more relevant information. We define a domain-specific heuristic for merging the ingredient embeddings, which combines the quantities of each ingredient in order to use them as features in machine learning models for nutrient prediction. The results from the experiments indicate that predictions improve when using this heuristic. The prediction models for protein were highly effective, with accuracies of up to 97.98%. Implementing the domain-specific heuristic for combining multi-word embeddings yields better results than conventional merging heuristics, with improvements of up to 60% accuracy in some cases.
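The merging heuristic described above — combining ingredient embeddings weighted by ingredient quantities into one fixed-length feature vector per recipe — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the embedding values, ingredient names, and the `merge_ingredients` helper are all hypothetical, and the pre-trained vectors would in practice come from a model such as word2vec or GloVe.

```python
import numpy as np

# Hypothetical pre-trained ingredient embeddings (toy 3-dimensional vectors;
# real ones would come from a model such as word2vec trained on food text).
EMBEDDINGS = {
    "flour": np.array([0.2, 0.7, 0.1]),
    "sugar": np.array([0.9, 0.1, 0.3]),
    "butter": np.array([0.4, 0.5, 0.8]),
}

def merge_ingredients(ingredients):
    """Quantity-weighted average of ingredient embeddings.

    `ingredients` is a list of (name, quantity) pairs; weighting each
    ingredient's vector by its share of the total quantity is the kind of
    domain-specific heuristic the abstract describes.
    """
    total = sum(quantity for _, quantity in ingredients)
    return sum(
        (quantity / total) * EMBEDDINGS[name]
        for name, quantity in ingredients
    )

# A toy recipe: 500 g flour, 200 g sugar, 100 g butter.
features = merge_ingredients([("flour", 500), ("sugar", 200), ("butter", 100)])
print(features)  # one fixed-length feature vector for the whole recipe
```

The resulting vector can then be fed to any regression model for nutrient value prediction; the key point of the heuristic is that a dominant ingredient (here flour) dominates the merged representation.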


Similar resources

Word Embeddings for the Construction Domain

We introduce word vectors for the construction domain. Our vectors were obtained by running word2vec on an 11M-word corpus that we created from scratch by leveraging freely-accessible online sources of construction-related text. We first explore the embedding space and show that our vectors capture meaningful construction-specific concepts. We then evaluate the performance of our vectors against...

Full text

Symmetric Pattern Based Word Embeddings for Improved Word Similarity Prediction

We present a novel word level vector representation based on symmetric patterns (SPs). For this aim we automatically acquire SPs (e.g., “X and Y”) from a large corpus of plain text, and generate vectors where each coordinate represents the cooccurrence in SPs of the represented word with another word of the vocabulary. Our representation has three advantages over existing alternatives: First, b...
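The symmetric-pattern representation summarized above — a vector whose coordinates count how often a word co-occurs with each vocabulary word inside a pattern such as "X and Y" — can be sketched on a toy corpus. The corpus, vocabulary, and `sp_vector` helper here are illustrative assumptions, not the paper's actual setup.

```python
from collections import Counter

# Toy vocabulary and corpus; "X and Y" is one example of a symmetric pattern.
VOCAB = ["salt", "pepper", "sugar", "spice"]
CORPUS = "salt and pepper sugar and spice salt and pepper pepper and salt"

def sp_vector(word, tokens, vocab):
    """Count co-occurrences of `word` with each vocab word in 'X and Y'.

    Because the pattern is symmetric, a match is counted whether `word`
    appears as X or as Y.
    """
    counts = Counter()
    for i in range(len(tokens) - 2):
        x, mid, y = tokens[i], tokens[i + 1], tokens[i + 2]
        if mid == "and":
            if x == word and y in vocab:
                counts[y] += 1
            if y == word and x in vocab:
                counts[x] += 1
    return [counts[v] for v in vocab]

print(sp_vector("salt", CORPUS.split(), VOCAB))  # [0, 3, 0, 0]
```

Each coordinate of the resulting vector is interpretable — the value 3 above says "salt" appeared in an "and" pattern with "pepper" three times — which is one of the advantages the abstract claims over dense embeddings.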

Full text

Word Embeddings for Multi-label Document Classification

In this paper, we analyze and evaluate word embeddings for representation of longer texts in the multi-label document classification scenario. The embeddings are used in three convolutional neural network topologies. The experiments are realized on the Czech ČTK and English Reuters-21578 standard corpora. We compare the results of word2vec static and trainable embeddings with randomly initializ...

Full text

Enhancing Backchannel Prediction Using Word Embeddings

Backchannel responses like “uh-huh”, “yeah”, “right” are used by the listener in a social dialog as a way to provide feedback to the speaker. In the context of human-computer interaction, these responses can be used by an artificial agent to build rapport in conversations with users. In the past, multiple approaches have been proposed to detect backchannel cues and to predict the most natural t...

Full text

Multi-view Recurrent Neural Acoustic Word Embeddings

Recent work has begun exploring neural acoustic word embeddings—fixed-dimensional vector representations of arbitrary-length speech segments corresponding to words. Such embeddings are applicable to speech retrieval and recognition tasks, where reasoning about whole words may make it possible to avoid ambiguous sub-word representations. The main idea is to map acoustic sequences to fixed-dimensi...

Full text


Journal

Journal title: Mathematics

Year: 2021

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math9161941