Search results for: syntactic dependency parsing
Number of results: 79,646
This paper investigates and analyzes the effect of dependency information on predicate-argument structure analysis (PASA) and zero anaphora resolution (ZAR) for Japanese, and shows that a straightforward approach to PASA and ZAR works effectively even when dependency information is not available. We constructed an analyzer that directly predicts relationships of predicates and arguments with thei...
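The snippet above describes predicting predicate-argument relations directly from token representations, without dependency features. Below is a minimal sketch of that general idea, assuming pre-computed token embeddings and a simple linear scorer; all names and dimensions are illustrative and not the authors' analyzer.

```python
import numpy as np

def score_arguments(pred_idx, token_vecs, case_weights):
    """Score every token as a candidate argument of the predicate at pred_idx
    for one case slot, using only token representations and no dependency
    features. Purely illustrative; not the paper's actual analyzer."""
    pred_vec = token_vecs[pred_idx]
    scores = []
    for vec in token_vecs:
        pair = np.concatenate([pred_vec, vec])     # predicate/candidate pair features
        scores.append(float(case_weights @ pair))  # simple linear scorer
    return scores

# toy usage: 5 tokens with 4-dimensional embeddings and random scorer weights
rng = np.random.default_rng(0)
vecs = rng.normal(size=(5, 4))
weights = rng.normal(size=8)
print(score_arguments(pred_idx=2, token_vecs=vecs, case_weights=weights))
```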
We investigate active learning methods for Japanese dependency parsing. We propose active learning methods that use partial dependency relations in a given sentence for parsing and evaluate their effectiveness empirically. Furthermore, we utilize syntactic constraints of Japanese to obtain more labeled examples from the precious labeled ones that annotators provide. Experimental results show that our ...
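Active learning over partial dependency relations means querying annotators about individual arcs rather than whole sentences. The abstract does not state the selection criterion, so the sketch below uses a common margin-based strategy as an assumed stand-in: query the dependents whose head choice the current model is least sure about.

```python
import numpy as np

def select_uncertain_arcs(arc_scores, k=5):
    """Pick the k dependents whose head choice the model is least confident
    about, measured by the margin between the best and second-best head score.
    arc_scores[h, d] = score of head h for dependent d (illustrative sketch)."""
    n = arc_scores.shape[1]
    margins = []
    for d in range(1, n):                       # skip the artificial root at index 0
        col = np.sort(arc_scores[:, d])[::-1]
        margins.append((col[0] - col[1], d))    # small margin = high uncertainty
    margins.sort()
    return [d for _, d in margins[:k]]          # ask annotators only about these arcs

rng = np.random.default_rng(1)
scores = rng.normal(size=(6, 6))                # toy 6-token sentence including the root
print(select_uncertain_arcs(scores, k=2))
```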
In this paper we introduce a novel approach based on a bidirectional recurrent autoencoder to perform globally optimized non-projective dependency parsing via semi-supervised learning. The syntactic analysis is completed at the end of the neural process that generates a Latent Heads Representation (LHR), without any algorithmic constraint and with linear complexity. The resulting “latent synta...
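A toy sketch of the general “latent head” idea: a bidirectional LSTM encodes the sentence and, for each token, the network regresses a vector that should lie close to the encoding of its syntactic head. The nearest-neighbour decoding shown here is a simplification for illustration and is not the linear-time procedure or architecture described in the abstract.

```python
import torch
import torch.nn as nn

class LatentHeadSketch(nn.Module):
    """Toy sketch: encode tokens with a biLSTM and regress a 'latent head'
    vector per token; attach each token to the position whose encoding is
    nearest to that vector. Illustrative only, not the authors' model."""
    def __init__(self, emb_dim=32, hid=64):
        super().__init__()
        self.enc = nn.LSTM(emb_dim, hid, bidirectional=True, batch_first=True)
        self.to_head = nn.Linear(2 * hid, 2 * hid)   # token encoding -> latent head vector

    def forward(self, emb):                           # emb: (batch, seq, emb_dim)
        enc, _ = self.enc(emb)                        # (batch, seq, 2*hid)
        latent = self.to_head(enc)                    # predicted head representation per token
        dists = torch.cdist(latent, enc)              # (batch, seq, seq) pairwise distances
        heads = dists.argmin(dim=-1)                  # nearest encoding = predicted head index
        return latent, heads

model = LatentHeadSketch()
toy = torch.randn(1, 7, 32)                           # one 7-token sentence
_, heads = model(toy)
print(heads)                                          # predicted head index per token
```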
Incremental parsing with a context-free grammar produces partial syntactic structures for an initial fragment on a word-by-word basis. Owing to syntactic ambiguity, however, too many structures are produced, and parsing therefore becomes very slow. This paper describes a technique for efficient incremental parsing using lexical information. The probability concerning dependencie...
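One way lexical information can prune the space of partial structures is to rank them by the probability of the lexical dependencies they contain and keep only a beam of the best candidates. The sketch below shows that ranking step under assumed names; the actual technique in the paper is not specified beyond the truncated abstract.

```python
from math import log

def rank_partial_parses(partial_parses, dep_logprob, beam=5):
    """Rank partial structures by the total log probability of their lexical
    dependencies and keep the top 'beam'. dep_logprob(head_word, dep_word) is
    assumed to be estimated from a treebank; all names are illustrative."""
    def score(parse):                                  # parse: list of (head, dependent) word pairs
        return sum(dep_logprob(h, d) for h, d in parse)
    return sorted(partial_parses, key=score, reverse=True)[:beam]

# toy relative-frequency model with add-one smoothing, for demonstration only
counts = {("eat", "sushi"): 30, ("eat", "quickly"): 10, ("sushi", "quickly"): 1}
total = sum(counts.values())
lp = lambda h, d: log((counts.get((h, d), 0) + 1) / (total + 1))

parses = [[("eat", "sushi"), ("eat", "quickly")],
          [("eat", "sushi"), ("sushi", "quickly")]]
print(rank_partial_parses(parses, lp, beam=1))
```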
The interest in dependency grammar has clearly been on the rise in the last 10–15 years. The parsing community has seen a number of benefits in using and parsing with dependency-based representations of syntactic phenomena. On the one hand, dependency trees are much simpler than phrase-structure trees – they contain exactly the same number of nodes as there are tokens in the sentence. The...
In this paper we introduce a joint arc-factored model for syntactic and semantic dependency parsing. The semantic role labeler predicts the full syntactic paths that connect predicates with their arguments. This process is framed as a linear assignment task, which makes it possible to enforce some well-formedness constraints. For the syntactic part, we define a standard arc-factored dependency model that ...
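Framing role labelling as a linear assignment task means matching each semantic role of a predicate to at most one candidate argument so that the total score is maximal. A sketch of that formulation using SciPy's Hungarian-algorithm solver; the cost matrix and role/candidate sets here are toy assumptions, not the paper's exact model.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_roles(score_matrix):
    """Match semantic roles (rows) to candidate argument tokens (columns) so
    that the summed score is maximal, one argument per role at most.
    Illustrative sketch of the linear assignment formulation."""
    rows, cols = linear_sum_assignment(-score_matrix)   # negate: the solver minimizes cost
    return list(zip(rows, cols))

# toy example: 2 roles (ARG0, ARG1) x 3 candidate tokens
scores = np.array([[2.0, 0.1, 0.3],
                   [0.2, 1.5, 0.4]])
print(assign_roles(scores))                              # e.g. [(0, 0), (1, 1)]
```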
Transforming syntactic representations in order to improve parsing accuracy has been exploited successfully in statistical parsing systems using constituency-based representations. In this paper, we show that similar transformations can also give substantial improvements in data-driven dependency parsing. Experiments on the Prague Dependency Treebank show that systematic transformations of coor...
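A typical coordination transformation on Prague-style trees moves the headship from the conjunction to the first conjunct before training and reverses it afterwards. The sketch below shows the forward direction on a simple head array; the relation labels and indexing scheme are assumptions for illustration, not the paper's exact transformation.

```python
def coord_to_first_conjunct(heads, deprels):
    """Rewrite Prague-style coordination, where the conjunction heads its
    conjuncts, so that the first conjunct heads both the conjunction and the
    remaining conjuncts. Tokens are 1-indexed, heads[i] is the head of token i,
    0 means the artificial root; a simplified, illustrative sketch."""
    new_heads = list(heads)
    for conj, rel in enumerate(deprels):
        if rel != "Coord":
            continue
        members = [i for i, h in enumerate(heads) if h == conj and deprels[i] == "member"]
        if not members:
            continue
        first = members[0]
        new_heads[first] = heads[conj]      # first conjunct takes the conjunction's head
        new_heads[conj] = first             # conjunction attaches below it
        for other in members[1:]:
            new_heads[other] = first        # so do the remaining conjuncts
    return new_heads

# toy sentence "apples and pears": "and" (token 2) heads both conjuncts, Prague style
heads   = [None, 2, 0, 2]                   # index 0 is a placeholder for the root
deprels = [None, "member", "Coord", "member"]
print(coord_to_first_conjunct(heads, deprels))   # -> [None, 0, 1, 1]
```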
Efficiency is a prime concern in syntactic MT decoding, yet significant developments in statistical parsing with respect to asymptotic efficiency have not been explored in MT. Recently, McDonald et al. (2005b) formalized dependency parsing as a maximum spanning tree (MST) problem, which can be solved in quadratic time relative to the length of the sentence. They show that MST parsing is almo...
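MST parsing decodes a dependency tree as a maximum spanning arborescence over a dense graph of arc scores, rooted at an artificial root node. A sketch of that decoding step using networkx's Chu-Liu/Edmonds implementation; the score matrix and helper names are illustrative, not McDonald et al.'s code.

```python
import numpy as np
import networkx as nx

def mst_parse(arc_scores):
    """Decode a dependency tree as a maximum spanning arborescence over a
    complete graph of arc scores. arc_scores[h, d] = score of attaching
    dependent d to head h; index 0 is the artificial root. Illustrative sketch."""
    n = arc_scores.shape[0]
    g = nx.DiGraph()
    for h in range(n):
        for d in range(1, n):                       # the root (0) never gets a head
            if h != d:
                g.add_edge(h, d, weight=float(arc_scores[h, d]))
    tree = nx.maximum_spanning_arborescence(g, attr="weight")
    heads = {d: h for h, d in tree.edges()}
    return [heads[d] for d in range(1, n)]          # head of each real token

rng = np.random.default_rng(2)
print(mst_parse(rng.normal(size=(5, 5))))           # toy 4-token sentence plus root
```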
Natural Language Processing is the multidisciplinary area of Artificial Intelligence, Machine Learning, and Computational Linguistics concerned with processing human language automatically. It involves understanding and processing human language. The way in which we share our content or feelings has always been of great importance in understanding and processing language. Parsing is the most suited ap...