Parsing Without (Much) Phrase Structure

Authors

  • Michael B. Kac
  • Alexis Manaster-Ramer
Abstract

Approaches to NL syntax conform in varying degrees to the older relational/dependency model (essentially that assumed in traditional grammar), which treats a sentence as a group of words united by various relations, and to the newer constituent model. Some modern approaches have involved shifts away from essentially constituent-based models of the sort associated with Bloomfield and Chomsky to more relation-based ones (e.g. case grammar, relational grammar, daughter-dependency and word grammar, corepresentational grammar), while some others, notably lexical-functional grammar, have nonetheless continued to rely crucially on certain techniques inherited from constituency-based grammar, particularly context-free grammar. In computational linguistics there is a strong (if not universal) reliance on phrase structure as the medium via which to represent syntactic structure; call this the CONSENSUS VIEW. A significant amount of effort has accordingly been invested in techniques by which to build such a representation efficiently, which has in turn led to considerable work on the formal and computational properties of context-free grammars (or natural extensions of them) and of the associated languages. In its strongest form, the consensus view says that the recovery of a fully specified parse tree is an essential step in computational language processing, and would, if correct, provide important support for the constituent model. In this paper, we shall critically examine the rationale for this view and sketch (informally) an alternative view which we find more defensible. The actual position we shall take for this discussion, however, is conservative in that we will not argue that there is no place whatever for constituent analysis in parsing or in syntactic analysis generally. What we WILL argue is that phrase structure is at least partly redundant, in that a direct leap to the composition of some semantic units is possible from a relatively underspecified syntactic representation (as opposed to a complete parse tree). However, see Rindflesch (forthcoming) for an approach to parsing which entails a much stronger denial of the consensus view.
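To make the final claim more concrete, here is a minimal sketch (ours, not the authors'; the flat encoding, the field names, and the toy sentence are all illustrative assumptions) of how predicate-argument units might be composed directly from word-level categories and grammatical relations, with no phrase-structure tree ever being built:

```python
# Illustrative sketch only: a flat, underspecified syntactic representation in
# which each word carries a category and (where known) a grammatical relation
# to a governing predicate. Semantic frames are read off these relations
# directly, without constructing any NP/VP constituents.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Word:
    index: int
    form: str
    category: str                   # e.g. "N", "V", "Det"
    relation: Optional[str] = None  # grammatical relation, e.g. "subj", "obj"
    governor: Optional[int] = None  # index of the predicate the word relates to

def semantic_units(words):
    """Compose predicate-argument frames directly from word-level relations."""
    frames = {w.index: {"predicate": w.form, "args": {}}
              for w in words if w.category == "V"}
    for w in words:
        if w.relation is not None and w.governor in frames:
            frames[w.governor]["args"][w.relation] = w.form
    return list(frames.values())

# "The dog chased the cat", annotated only with categories and relations.
sentence = [
    Word(0, "the", "Det"),
    Word(1, "dog", "N", relation="subj", governor=2),
    Word(2, "chased", "V"),
    Word(3, "the", "Det"),
    Word(4, "cat", "N", relation="obj", governor=2),
]

print(semantic_units(sentence))
# [{'predicate': 'chased', 'args': {'subj': 'dog', 'obj': 'cat'}}]
```

The point of the sketch is only that the semantic frame is assembled from relational annotations alone; whatever constituent structure the sentence has plays no role in the computation.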


Similar resources

An improved joint model: POS tagging and dependency parsing

Dependency parsing is a form of syntactic parsing of natural language that automatically analyzes the dependency structure of sentences, producing a dependency graph for each input sentence. Part-Of-Speech (POS) tagging is a prerequisite for dependency parsing. Generally, dependency parsers do the POS tagging task along with dependency parsing in a pipeline mode. Unfortunately, in pipel...
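As a rough illustration of the pipeline mode mentioned above (hypothetical code, not drawn from the cited paper; the toy tagger and parser are stand-ins), the sketch below passes the tagger's single best output to the parser, which is precisely where tagging errors propagate into the dependency graph:

```python
# Hypothetical pipeline sketch: a POS-tagging stage whose output feeds a
# dependency-parsing stage. The parser never sees alternative taggings, so
# tagging errors carry through -- the weakness that joint models aim to avoid.

from typing import Callable

Token = str
TaggedToken = tuple[str, str]          # (word, POS tag)
DependencyArc = tuple[int, int, str]   # (head index, dependent index, label)

def pipeline(
    tokens: list[Token],
    tagger: Callable[[list[Token]], list[TaggedToken]],
    parser: Callable[[list[TaggedToken]], list[DependencyArc]],
) -> list[DependencyArc]:
    """Run tagging, then parsing, with no feedback between the stages."""
    tagged = tagger(tokens)
    return parser(tagged)

# Toy stand-ins for real components.
def toy_tagger(tokens):
    lexicon = {"dogs": "NOUN", "bark": "VERB"}
    return [(t, lexicon.get(t, "X")) for t in tokens]

def toy_parser(tagged):
    # Attach every non-verb to the first verb; a real parser scores all heads.
    head = next((i for i, (_, tag) in enumerate(tagged) if tag == "VERB"), 0)
    return [(head, i, "dep") for i, _ in enumerate(tagged) if i != head]

print(pipeline(["dogs", "bark"], toy_tagger, toy_parser))
# [(1, 0, 'dep')]
```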

A Comparative Study of the Effect of Part-of-Speech Tagging on Parsing in Automatic Processing of the Persian Language

In this paper, the role of Part-of-Speech (POS) tagging for parsing in automatic processing of the Persian language is studied. To this end, the impact of the quality of POS tagging, as well as of the quantity of information available in the POS tags, on parsing is examined. To reach these goals, three parsing scenarios are proposed and compared. In the first scenario, the parser assigns...

Fast and Robust Multilingual Dependency Parsing with a Generative Latent Variable Model

We use a generative history-based model to predict the most likely derivation of a dependency parse. Our probabilistic model is based on Incremental Sigmoid Belief Networks, a recently proposed class of latent variable models for structure prediction. Their ability to automatically induce features results in multilingual parsing which is robust enough to achieve accuracy well above the average ...

Non-Projective Dependency Parsing via Latent Heads Representation (LHR)

In this paper we introduce a novel approach based on a bidirectional recurrent autoencoder to perform globally optimized non-projective dependency parsing via semisupervised learning. The syntactic analysis is completed at the end of the neural process that generates a Latent Heads Representation (LHR), without any algorithmic constraint and with a linear complexity. The resulting “latent synta...

Enhancing Automatic Acquisition of Thematic Structure in a Large-Scale Lexicon for Mandarin Chinese

This paper describes a refinement to our procedure for porting lexical conceptual structure (LCS) into new languages. Specifically, we describe a two-step process for creating candidate thematic grids for Mandarin Chinese verbs, using the English verb heading the VP in the subdefinitions to separate senses, and roughly parsing the verb complement structure to match thematic structure templates. We...


Journal title:

Volume   Issue

Pages  -

Publication date: 1986