Embedding Imputation with Self-Supervised Graph Neural Networks

Authors

Abstract

Embedding learning is essential in various research areas, especially natural language processing (NLP). However, given the nature of unstructured data and word frequency distributions, general pre-trained embeddings such as word2vec and GloVe are often inferior on domain-specific tasks because of missing or unreliable embeddings. In many domain-specific tasks, pre-existing side information can be converted into a graph that depicts pair-wise relationships between words. Previous methods use kernel tricks to pre-compute a fixed similarity graph for propagating information across words and imputing the missing representations. They require the optimal graph construction strategy to be defined before any model training, resulting in an inflexible two-step process. In this paper, we leverage recent advances in graph neural networks and self-supervision to simultaneously learn the similarity and impute the embeddings in an end-to-end fashion, with the overall time complexity well controlled. We undertake extensive experiments to show that the integrated approach performs better than several baseline methods.
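
The abstract gives no implementation details, so the following is only a minimal sketch of the general idea in a PyTorch setting: missing word embeddings are imputed by propagating observed embeddings over a side-information graph whose pair-wise similarity is learned jointly, with a self-supervised reconstruction loss on the words whose embeddings are observed. The class name, the bilinear similarity form, and the single propagation step are illustrative assumptions, not the authors' architecture; the dense N-by-N attention here is O(N^2) and a sparse implementation would be needed to keep the time complexity controlled at scale.

```python
# Minimal sketch (not the paper's exact model): impute missing word embeddings
# by propagating observed ones over a word graph built from side information,
# while the pair-wise similarity used for propagation is learned jointly.
# Self-supervised signal: reconstruct the embeddings we do observe.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphImputer(nn.Module):
    def __init__(self, num_words, dim, adj):
        super().__init__()
        # add self-loops so every row has at least one neighbour to attend to
        self.register_buffer("adj", adj + torch.eye(adj.size(0)))
        self.state = nn.Embedding(num_words, dim)            # free node states
        self.sim = nn.Parameter(torch.randn(dim, dim) / dim ** 0.5)
        self.proj = nn.Linear(dim, dim)

    def forward(self):
        h = self.state.weight                                 # (N, d)
        scores = h @ self.sim @ h.t()                         # learned similarity
        scores = scores.masked_fill(self.adj == 0, float("-inf"))
        attn = torch.softmax(scores, dim=1)                   # weights over graph neighbours
        return self.proj(attn @ h)                            # one propagation step

def train_step(model, pretrained, observed, optimizer):
    """Self-supervised step: reconstruct the rows of `pretrained` that exist
    (boolean mask `observed`); the same forward pass then yields imputed
    vectors for the remaining words."""
    optimizer.zero_grad()
    out = model()
    loss = F.mse_loss(out[observed], pretrained[observed])
    loss.backward()
    optimizer.step()
    return loss.item()
```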

Similar articles

Supervised Learning for Self-Generating Neural Networks

In this paper, supervised learning for the Self-Generating Neural Network (SGNN) method, which was originally developed for unsupervised learning, is discussed. An information analytical method is proposed to assign weights to attributes in the training examples when class information is available. This significantly improves the learning speed and the accuracy of the SGNN classifier. ...
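
The snippet does not state the weighting formula, so as a hedged illustration only, the sketch below uses one common information-analytical choice: weighting each discrete attribute by its information gain with respect to the class labels, which is one way class information could sharpen an SGNN-style classifier. All function names are placeholders.

```python
# Hedged illustration only: weight attributes by information gain w.r.t. the
# class labels (one common "information analytical" choice; the paper's exact
# formula is not given in the snippet above).
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(column, labels):
    """Information gain of one discrete attribute column w.r.t. the class."""
    gain = entropy(labels)
    for value in np.unique(column):
        mask = column == value
        gain -= mask.mean() * entropy(labels[mask])
    return gain

def attribute_weights(X, y):
    """One non-negative weight per column of the discrete attribute matrix X."""
    gains = np.array([information_gain(X[:, j], y) for j in range(X.shape[1])])
    total = gains.sum()
    return gains / total if total > 0 else np.full(X.shape[1], 1.0 / X.shape[1])
```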

GEMSEC: Graph Embedding with Self Clustering

Modern graph embedding procedures can efficiently extract features of nodes from graphs with millions of nodes. The features are later used as inputs for downstream predictive tasks. In this paper we propose GEMSEC, a graph embedding algorithm which learns a clustering of the nodes simultaneously with computing their features. The procedure places nodes in an abstract feature space where the vertex...
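
The core idea, embeddings and a clustering learned together, can be summarised as a joint objective: a skip-gram-style loss over random-walk co-occurrences plus a term pulling each node toward its nearest cluster centre. The sketch below is an assumed, simplified form of such an objective, not the paper's exact loss; tensor names and the negative-sampling scheme are placeholders.

```python
# Simplified joint objective in the spirit of GEMSEC: skip-gram with negative
# sampling over random-walk pairs, plus a clustering cost that pulls each node
# embedding toward its nearest (learned) cluster centre.
import torch
import torch.nn.functional as F

def joint_loss(emb, ctx, pairs, negatives, centres, gamma=0.1):
    """emb, ctx: (N, d) node / context embeddings; pairs: (B, 2) co-occurring
    node indices from random walks; negatives: (B, K) sampled node indices;
    centres: (C, d) cluster centres; gamma weights the clustering term."""
    u, v = pairs[:, 0], pairs[:, 1]
    pos = F.logsigmoid((emb[u] * ctx[v]).sum(-1))                       # true pairs
    neg = F.logsigmoid(-(emb[u].unsqueeze(1) * ctx[negatives]).sum(-1)).sum(-1)
    skipgram = -(pos + neg).mean()
    cluster = torch.cdist(emb, centres).min(dim=1).values.mean()        # nearest centre
    return skipgram + gamma * cluster
```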

Semi-supervised Orthogonal Graph Embedding with Recursive Projections

Many graph-based semi-supervised dimensionality reduction algorithms utilize a projection matrix to linearly map the data matrix from the original feature space to a lower-dimensional representation. But the dimensionality after reduction is inevitably restricted to the number of classes, and the learned non-orthogonal projection matrix usually fails to preserve distances well and balance the...

Graph Partition Neural Networks for Semi-Supervised Classification

We present graph partition neural networks (GPNN), an extension of graph neural networks (GNNs) able to handle extremely large graphs. GPNNs alternate between locally propagating information between nodes in small subgraphs and globally propagating information between the subgraphs. To efficiently partition graphs, we experiment with several partitioning algorithms and also propose a novel vari...
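
The alternation the snippet describes can be written as a simple schedule: several message-passing steps inside each partition, then one step across the edges that cut between partitions. The sketch below shows only that scheduling pattern with a generic linear message function; the actual GPNN update rule and partitioning follow the paper, so the names and update here are assumptions.

```python
# Scheduling sketch only: alternate local propagation inside each partition
# with global propagation over the cut edges between partitions. The message
# function is a placeholder, not GPNN's actual update rule.
import torch

def propagate(h, edges, weight):
    """One message-passing step over an edge list; only message recipients change."""
    src, dst = edges                                   # LongTensors of equal length
    msgs = torch.zeros_like(h).index_add_(0, dst, torch.tanh(h[src] @ weight))
    return h + msgs

def gpnn_style_schedule(h, intra_edges, cut_edges, weight, rounds=3, local_steps=5):
    """intra_edges: one (src, dst) pair per partition; cut_edges: edges whose
    endpoints lie in different partitions."""
    for _ in range(rounds):
        for edges in intra_edges:                      # local phase, per subgraph
            for _ in range(local_steps):
                h = propagate(h, edges, weight)
        h = propagate(h, cut_edges, weight)            # global phase, between subgraphs
    return h
```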

Hybed: Hyperbolic Neural Graph Embedding

Neural embeddings have been used with great success in Natural Language Processing (NLP). They provide compact representations that encapsulate word similarity and attain state-of-the-art performance in a range of linguistic tasks. The success of neural embeddings has prompted significant amounts of research into applications in domains other than language. One such domain is graph-structured d...


Journal

Journal title: IEEE Access

Year: 2023

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2023.3292314