Inducing Interpretability in Knowledge Graph Embeddings

Authors

  • Chandrahas
  • Tathagata Sengupta
  • Cibi Pragadeesh
  • Partha Pratim Talukdar
Abstract

We study the problem of inducing interpretability in KG embeddings. Specifically, we explore the Universal Schema (Riedel et al., 2013) and propose a method to induce interpretability. Many vector space models have been proposed for this problem; however, most of them do not address the interpretability (semantics) of individual dimensions. In this work, we study this problem and propose a method for inducing interpretability in KG embeddings using entity co-occurrence statistics. The proposed method significantly improves interpretability while maintaining comparable performance on other KG tasks.
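
The abstract does not spell out how the co-occurrence statistics enter training, so the sketch below is only a rough illustration of the general idea: an entity co-occurrence matrix (assumed here to be a symmetric, PPMI-like statistic C) regularizes the embeddings by pulling their similarity matrix toward it. The penalty form, the plain gradient loop, and all names are assumptions for illustration, not the authors' formulation.

    # Hypothetical sketch: nudging KG entity embeddings toward an entity
    # co-occurrence statistic. This is NOT the paper's algorithm; the penalty
    # and every name below are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n_entities, dim = 100, 16
    E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings

    # Assume C is a symmetric, non-negative entity co-occurrence statistic
    # (e.g., PPMI) computed from an external corpus; random here.
    C = rng.random((n_entities, n_entities))
    C = (C + C.T) / 2

    def cooccurrence_penalty(E, C):
        """Small when the embedding similarity E @ E.T agrees with C."""
        return np.mean((E @ E.T - C) ** 2)

    def penalty_gradient(E, C):
        """Gradient of the penalty with respect to E."""
        S = E @ E.T
        return 4.0 / (E.shape[0] ** 2) * (S - C) @ E

    lam, lr = 0.1, 0.01          # regularization weight and learning rate
    for step in range(200):
        # In a full model this term would be added to the gradient of the
        # usual KG scoring loss (e.g., a margin loss over triples).
        E -= lr * lam * penalty_gradient(E, C)

    print("final co-occurrence penalty:", cooccurrence_penalty(E, C))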

Similar resources

Labeling Subgraph Embeddings and Cordiality of Graphs

Let $G$ be a graph with vertex set $V(G)$ and edge set $E(G)$. A vertex labeling $f : V(G) \rightarrow \mathbb{Z}_2$ induces an edge labeling $f^{+} : E(G) \rightarrow \mathbb{Z}_2$ defined by $f^{+}(xy) = f(x) + f(y)$ for each edge $xy \in E(G)$. For each $i \in \mathbb{Z}_2$, let $v_{f}(i) = |\{u \in V(G) : f(u) = i\}|$ and $e_{f^+}(i) = |\{xy \in E(G) : f^{+}(xy) = i\}|$. A vertex labeling $f$ of a graph $G$...
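
The snippet is cut off before saying what these counts are used for; in the standard cordiality setting they are compared via $|v_f(0) - v_f(1)| \le 1$ and $|e_{f^+}(0) - e_{f^+}(1)| \le 1$. The small worked example below computes both counts for a path on four vertices and applies that standard check, which is not necessarily the exact condition studied in the truncated paper.

    # Worked example of v_f(i) and e_{f+}(i) for the path P4 with a labeling f.
    # The cordiality check uses the standard definition, assumed here since the
    # abstract above is truncated.
    V = [0, 1, 2, 3]
    E = [(0, 1), (1, 2), (2, 3)]          # path P4
    f = {0: 0, 1: 0, 2: 1, 3: 1}          # a vertex labeling f : V(G) -> Z_2

    f_plus = {(x, y): (f[x] + f[y]) % 2 for (x, y) in E}   # induced edge labeling

    v = {i: sum(1 for u in V if f[u] == i) for i in (0, 1)}        # v_f(i)
    e = {i: sum(1 for xy in E if f_plus[xy] == i) for i in (0, 1)} # e_{f+}(i)

    cordial = abs(v[0] - v[1]) <= 1 and abs(e[0] - e[1]) <= 1
    print(v, e, cordial)   # {0: 2, 1: 2} {0: 2, 1: 1} True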

Best of Both Worlds: Making Word Sense Embeddings Interpretable

Word sense embeddings represent a word sense as a low-dimensional numeric vector. While this representation is potentially useful for NLP applications, its interpretability is inherently limited. We propose a simple technique that improves interpretability of sense vectors by mapping them to synsets of a lexical resource. Our experiments with AdaGram sense embeddings and BabelNet synsets show t...
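
The snippet does not show how the mapping is computed, so the following is only a rough sketch of the generic approach such a linking step usually takes: score each sense vector against candidate synset vectors with cosine similarity and keep the best match. The synset vectors and IDs below are random placeholders rather than BabelNet data, and all names are invented for illustration.

    # Hypothetical sketch: map each sense vector to its most similar synset
    # vector by cosine similarity. Placeholder data, not the paper's pipeline.
    import numpy as np

    rng = np.random.default_rng(1)
    sense_vecs = rng.normal(size=(5, 50))      # e.g., AdaGram-style sense vectors
    synset_vecs = rng.normal(size=(20, 50))    # one vector per candidate synset
    synset_ids = [f"bn:{i:08d}n" for i in range(20)]   # placeholder IDs

    def normalize(M):
        return M / np.linalg.norm(M, axis=1, keepdims=True)

    sims = normalize(sense_vecs) @ normalize(synset_vecs).T   # cosine similarities
    mapping = {s: synset_ids[int(j)] for s, j in enumerate(sims.argmax(axis=1))}
    print(mapping)   # sense index -> ID of the most similar synset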

Expeditious Generation of Knowledge Graph Embeddings

Knowledge Graph Embedding methods aim at representing entities and relations in a knowledge base as points or vectors in a continuous vector space. Several approaches using embeddings have shown promising results on tasks such as link prediction, entity recommendation, question answering, and triplet classification. However, only a few methods can compute low-dimensional embeddings of very larg...

Learning Knowledge Graph Embeddings for Natural Language Processing

Knowledge graph embeddings provide powerful latent semantic representation for the structured knowledge in knowledge graphs, which have been introduced recently. Being different from the already widely-used word embeddings that are conceived from plain text, knowledge graph embeddings enable direct explicit relational inferences among entities via simple calculation of embedding vectors. In par...
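
As a concrete instance of "relational inferences among entities via simple calculation of embedding vectors", translation-based models (TransE-style) score a triple (h, r, t) by how close h + r is to t. The toy example below uses hand-picked vectors purely for illustration; it is not claimed to be the model used in this particular paper.

    # Illustration of relational inference as vector arithmetic (TransE-style):
    # a triple (h, r, t) is plausible when h + r is close to t. Toy vectors only.
    import numpy as np

    emb = {
        "Paris":      np.array([1.0, 0.0, 0.2]),
        "France":     np.array([1.1, 1.0, 0.1]),
        "Berlin":     np.array([0.0, 0.1, 1.0]),
        "Germany":    np.array([0.1, 1.1, 0.9]),
        "capital_of": np.array([0.1, 1.0, -0.1]),
    }

    def score(h, r, t):
        """Lower is better: distance between h + r and t."""
        return float(np.linalg.norm(emb[h] + emb[r] - emb[t]))

    # Rank candidate tails for the query (Paris, capital_of, ?)
    candidates = ["France", "Germany", "Berlin"]
    ranked = sorted(candidates, key=lambda t: score("Paris", "capital_of", t))
    print(ranked)   # 'France' comes first with these toy vectors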

Topic-Based Embeddings for Learning from Large Knowledge Graphs

We present a scalable probabilistic framework for learning from multi-relational data, given in form of entity-relation-entity triplets, with a potentially massive number of entities and relations (e.g., in multirelational networks, knowledge bases, etc.). We define each triplet via a relation-specific bilinear function of the embeddings of entities associated with it (these embeddings correspo...
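
The "relation-specific bilinear function" mentioned above has the generic form f(s, r, o) = e_s^T W_r e_o. The sketch below just evaluates that form with random placeholder parameters; how the paper constructs W_r from topics is not shown in the snippet, so that part is omitted.

    # Generic relation-specific bilinear score for a triplet (s, r, o).
    # Random placeholders; the paper's topic-based construction of W_r is omitted.
    import numpy as np

    rng = np.random.default_rng(2)
    dim = 8
    e_s = rng.normal(size=dim)         # subject entity embedding
    e_o = rng.normal(size=dim)         # object entity embedding
    W_r = rng.normal(size=(dim, dim))  # relation-specific matrix

    score = e_s @ W_r @ e_o            # bilinear compatibility of the triplet
    print(float(score))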

Journal title:
  • CoRR

Volume: abs/1712.03547  Issue:

Pages: -

Publication date: 2017