Capturing Dependency Syntax with "Deep" Sequential Models

Author

  • Yoav Goldberg
Abstract

Neural network (“deep learning”) models are taking machine learning approaches for language by storm. In particular, recurrent neural networks (RNNs), which are flexible non-Markovian models of sequential data, have been shown to be effective for a variety of language processing tasks. Somewhat surprisingly, these seemingly purely sequential models are very capable of modeling syntactic phenomena, and using them results in very strong dependency parsers for a variety of languages.
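To make the connection between a sequential model and syntax concrete, here is a minimal sketch of the recipe behind many RNN-based dependency parsers, in the spirit of BiLSTM feature extractors such as Kiperwasser and Goldberg (2016): a bidirectional LSTM encodes the sentence, and a small MLP scores every candidate head-dependent arc. The model and names below are an illustrative toy, not the exact architecture behind the abstract.

```python
# Toy BiLSTM arc scorer: a sketch of RNN-based dependency parsing,
# not the abstract's actual system.
import torch
import torch.nn as nn

class BiLSTMArcScorer(nn.Module):
    """Scores every (head, dependent) word pair from BiLSTM states."""

    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # The BiLSTM reads the whole sentence, so each word's state is
        # conditioned on unbounded (non-Markovian) context.
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        self.arc_mlp = nn.Sequential(
            nn.Linear(4 * hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, word_ids):
        # word_ids: (batch, seq_len) token ids
        states, _ = self.bilstm(self.embed(word_ids))   # (B, n, 2H)
        n = states.size(1)
        # Concatenate head and dependent states for every word pair.
        heads = states.unsqueeze(2).expand(-1, n, n, -1)
        deps = states.unsqueeze(1).expand(-1, n, n, -1)
        pairs = torch.cat([heads, deps], dim=-1)        # (B, n, n, 4H)
        return self.arc_mlp(pairs).squeeze(-1)          # (B, n, n) arc scores

scores = BiLSTMArcScorer(vocab_size=10_000)(torch.randint(0, 10_000, (1, 6)))
print(scores.shape)  # torch.Size([1, 6, 6]): score of head i over dependent j
```

A maximum spanning tree over the resulting score matrix (or a greedy transition system driven by the same features) then yields the parse; the key point from the abstract is that the scorer sees the entire sentence through the BiLSTM states rather than a fixed Markov window.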


Similar resources

Towards a Syntax-Semantics Interface for Topological Dependency Grammar

We present the first step towards a constraint-based syntax-semantics interface for Topological Dependency Grammar (TDG) (Duchier and Debusmann, 2001). We extend TDG with a new level of representation called semantic dependency dag to capture the deep semantic dependencies and clearly separate this level from the syntactic dependency tree. We stipulate an emancipation mechanism between these le...


A General Probabilistic Model for Dependency Parsing

We address the question of what it takes to define a correct probabilistic model for syntactic natural language processing. We focus on one particular theory of syntax, called dependency syntax, and develop a framework for constructing probabilistic models for that linguistic theory. Subsequently, we review existing models of probabilistic dependency syntax and show some problematic aspects of these ...
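To see what is at stake, consider the simplest edge-factored formulation, where the probability of a dependency tree decomposes over its head-dependent arcs. The factorization and the toy counts below are illustrative assumptions, one instance of the family of models such a paper would examine:

```python
# Illustrative edge-factored probability of a dependency tree:
# P(tree | sentence) decomposes over head -> dependent arcs.
# The count table below is made-up toy data.
import math

def tree_log_prob(arcs, arc_logprob):
    """arcs: list of (head, dependent) pairs forming a dependency tree."""
    return sum(arc_logprob(h, d) for h, d in arcs)

# Toy maximum-likelihood estimates from hypothetical corpus counts.
counts = {("ROOT", "saw"): 50, ("saw", "She"): 30, ("saw", "dog"): 20,
          ("dog", "the"): 40}
head_totals = {}
for (h, _), c in counts.items():
    head_totals[h] = head_totals.get(h, 0) + c

def arc_logprob(head, dep):
    return math.log(counts[(head, dep)] / head_totals[head])

# "She saw the dog": every word attaches to exactly one head.
tree = [("ROOT", "saw"), ("saw", "She"), ("saw", "dog"), ("dog", "the")]
print(math.exp(tree_log_prob(tree, arc_logprob)))  # 0.24
```

A known subtlety with such per-head factorizations is that the resulting scores need not sum to one over the set of well-formed trees; whether a model is a proper distribution is the kind of correctness question the abstract raises.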


Extending Hidden Markov (tree) models for word representations

There is ample research in natural language processing (NLP) on obtaining word representations, including vector space modeling, clustering and techniques derived from language models. Good word representations are vital for overcoming the lexical sparseness inherent to many NLP problems. Much less studied are approaches capturing wider or global context (see e.g. Nepal and Yates (2014)). We ar...
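As a concrete toy illustration of the HMM side of this line of work: run forward-backward over a sentence and use each token's posterior distribution over hidden states as its representation. The parameters below are random stand-ins rather than values trained with EM on a corpus, and the token-level-posterior reading is one plausible instantiation, not necessarily the paper's exact method.

```python
# Sketch: an HMM's posterior state distribution at each token as a
# contextual word representation. Parameters are random toy values.
import numpy as np

rng = np.random.default_rng(0)
n_states, vocab = 4, 6
A = rng.dirichlet(np.ones(n_states), size=n_states)   # transition matrix
B = rng.dirichlet(np.ones(vocab), size=n_states)      # emission matrix
pi = rng.dirichlet(np.ones(n_states))                 # initial distribution

def posteriors(obs):
    """Forward-backward: P(state_t | whole sentence) for each token."""
    T = len(obs)
    alpha = np.zeros((T, n_states))
    beta = np.ones((T, n_states))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):                 # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    for t in range(T - 2, -1, -1):        # backward pass
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

sentence = [2, 0, 5, 1]             # token ids
reps = posteriors(sentence)         # one n_states-dim vector per token
print(reps.round(3))                # each row sums to 1
```

Because the posterior at position t conditions on the whole sequence, these vectors capture the wider context the abstract refers to; roughly speaking, the tree-model extension replaces the left-to-right transition chain with transitions along a tree structure.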


Synchronous Dependency Insertion Grammars: A Grammar Formalism For Syntax Based Statistical MT

This paper introduces a grammar formalism specifically designed for syntax-based statistical machine translation. The synchronous grammar formalism we propose in this paper takes into consideration the pervasive structure divergence between languages, which many other synchronous grammars are unable to model. A Dependency Insertion Grammar (DIG) is a generative grammar formalism that captures ...
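As a rough illustration of the general shape of such a formalism, the sketch below encodes a synchronous rule as a pair of lexically headed dependency treelets with aligned open argument slots; the class names, fields, and the head-swapping example are hypothetical choices for exposition, not the paper's exact definition of DIG:

```python
# Hypothetical encoding: a synchronous rule pairs two lexically headed
# dependency treelets and aligns their open argument slots.
from dataclasses import dataclass, field

@dataclass
class Treelet:
    head: str                                  # lexical head of the treelet
    fixed: list = field(default_factory=list)  # lexicalized dependents
    slots: list = field(default_factory=list)  # open argument slots

@dataclass
class SynchronousRule:
    source: Treelet
    target: Treelet
    alignment: dict                            # source slot -> target slot

# Head swapping, a classic structure divergence (cf. English "swim
# across" vs. French "traverser ... à la nage"): the two treelets have
# different internal structure, so a mapping restricted to isomorphic
# trees could not express this rule.
rule = SynchronousRule(
    source=Treelet("swims", fixed=["across"], slots=["X:subj", "Y:goal"]),
    target=Treelet("traverse", fixed=["à la nage"], slots=["X:subj", "Y:obj"]),
    alignment={0: 0, 1: 1},
)
print(rule.source.head, "<->", rule.target.head)
```

Pairing treelets whose internal structure differs on the two sides is what lets a formalism of this kind represent the structure divergence that the abstract highlights.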


Syntax-Semantics Interface: A Plea for a Deep Dependency Sentence Structure

The aim of the contribution is to bring arguments for a description of natural language that (i) includes a representation of a deep (underlying) sentence structure and (ii) is based on the relation of dependency. Our argumentation rests on linguistic considerations and stems from the Praguian linguistic background, both with respect to the Praguian structuralist tradition as well as to th...




Publication date: 2017