Combining syntactic and semantic bidirectionalization
Authors
Abstract
Similar papers
Combining Independent Syntactic and Semantic Annotation Schemes
We present MAIS, a UIMA-based environment for combining information from various annotated resources. Each resource contains one mode of linguistic annotation and remains independent from the other resources. Interactions between annotations are defined based on use cases.
Semantic Role Chunking Combining Complementary Syntactic Views
This paper describes a semantic role labeling system that uses features derived from different syntactic views, and combines them within a phrase-based chunking paradigm. For an input sentence, syntactic constituent structure parses are generated by a Charniak parser and a Collins parser. Semantic role labels are assigned to the constituents of each parse using Support Vector Machine classifier...
Formalizing Semantic Bidirectionalization with Dependent Types
Bidirectionalization is the task of automatically inferring one of two transformations that as a pair realize the forward and backward relationship between two domains, subject to certain consistency conditions. A specific technique, semantic bidirectionalization, has been developed that takes a getfunction (mapping forwards from sources to views) as input — but does not inspect its syntactic d...
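The consistency conditions referred to here are the usual round-tripping laws relating the forward function (get) and the backward function (put). The following is a minimal hand-written Haskell sketch for one concrete get; it only illustrates the laws, whereas the technique described above derives put automatically without inspecting get's definition. The names get, put, src, and view are illustrative assumptions, not taken from the paper.

```haskell
-- A minimal sketch of a forward/backward pair and its consistency laws,
-- written by hand for one concrete 'get'.

-- Forward function: expose the first two elements of the source as the view.
get :: [a] -> [a]
get = take 2

-- Hand-written backward function: overwrite the exposed prefix with the
-- updated view, keeping the hidden tail of the source unchanged.
put :: [a] -> [a] -> [a]
put src view = view ++ drop (length view) src

-- Consistency conditions (checked here on examples only):
--   GetPut: put src (get src) == src      (an unchanged view changes nothing)
--   PutGet: get (put src view) == view    (the view update is fully reflected)
main :: IO ()
main = do
  let src  = "abcde"
      view = "XY"
  print (put src (get src) == src)    -- True  (GetPut)
  print (get (put src view) == view)  -- True  (PutGet)
  print (put src view)                -- "XYcde"
```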
Combining Syntactic Frames and Semantic Roles to Acquire Verbs
For any given utterance of a verb, the referential scene offers a wide array of potential interpretations. The syntactic bootstrapping hypothesis (Landau & Gleitman, 1985) maintains that children could constrain these interpretations by exploiting systematic links between syntactic structure and verb meaning. A number of studies have provided support for this hypothesis. Children interpret a no...
Combining semantic and syntactic structure for language modeling
Structured language models for speech recognition have been shown to remedy the weaknesses of n -gram models. All current structured language models, however, are limited in that they do not take into account dependencies between non-headwords. We show that non-headword dependencies contribute significantly to improved word error rate, and that a data-oriented parsing model trained on semantica...
Journal
Journal title: ACM SIGPLAN Notices
Year: 2010
ISSN: 0362-1340, 1558-1160
DOI: 10.1145/1932681.1863571