Search results for: lexical coverage
Number of results: 115790
The present work is about automatic parsing of written texts using lexicalized grammars and large-coverage language resources. More specifically, we concentrated our work on three domains: algorithmics, easy development of NLP applications useful in an industrial context, and deep syntactic parsing. Concerning the first point, we implemented new algorithms for the optimisation of local grammars...
Deep unification- (constraint-)based grammars are usually hand-crafted. Scaling such grammars from fragments to unrestricted text is time-consuming and expensive. This problem can be exacerbated in multilingual broad-coverage grammar development scenarios. Cahill et al. (2002, 2004) and O’Donovan et al. (2004) present an automatic f-structure annotation-based methodology to acquire broad-coverage...
Lexical resources are basic components of many text processing systems devoted to information extraction, question answering or dialogue. In past years many resources have been developed, such as FrameNet and WordNet. FrameNet describes prototypical situations (i.e. Frames) while WordNet defines lexical meanings (senses) for the majority of English nouns, verbs, adjectives and adverbs. A major di...
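For a concrete sense of how such a resource is queried in practice, here is a minimal sketch assuming NLTK with its WordNet data installed (via nltk.download("wordnet")); the lemma "coverage" is just an arbitrary example.

```python
# Minimal sketch, assuming NLTK and its WordNet corpus are available.
# It lists the senses WordNet records for one example lemma, i.e. the
# "lexical meanings (senses)" mentioned in the abstract above.
from nltk.corpus import wordnet as wn

for synset in wn.synsets("coverage"):
    # Each synset bundles one sense: a name, a POS tag, and a gloss.
    print(synset.name(), synset.pos(), synset.definition())
```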
The variables affecting the nature of reading comprehension can be classified into two general categories: reader variables and text variables (Alderson, 2000). Despite the wave of research on vocabulary knowledge as a reader variable, the role of this knowledge in the C-test as a text-dependent test, and its interaction with the lexical cohesion of the test as a text feature, has remained a...
This paper proposes a method for integrating intonation and information structure into the Lexicalized Tree Adjoining Grammar (LTAG) formalism. The method works fully within LTAG and requires no changes or additions to the basic formalism. Following the existing CCG analysis, we represent boundary tones as lexical items and pitch accents as features of lexical items. We then show how prosodically marke...
Textbooks are an important source of knowledge input on which the transmission of academic knowledge often relies, especially in the early stages of learning. Adopting a corpus-based approach, this study evaluates the text difficulty of science textbooks used in secondary English-medium instruction schools in Hong Kong, with a focus on their lexical coverage and readability. It compares them with English as a foreign language textbooks...
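As a rough illustration of the lexical-coverage measure such studies rely on, the sketch below computes the share of running words in a text that fall inside a frequency word list. The word list and sample sentence are invented placeholders, not data from the study.

```python
# Hedged sketch of a lexical-coverage calculation: the proportion of running
# words covered by a given (here invented) high-frequency word list.
import re

def lexical_coverage(text: str, wordlist: set) -> float:
    tokens = re.findall(r"[a-z]+", text.lower())
    if not tokens:
        return 0.0
    covered = sum(1 for token in tokens if token in wordlist)
    return covered / len(tokens)

high_freq = {"the", "of", "a", "is", "and", "cell", "energy"}  # stand-in list
sample = "The cell releases energy during respiration."
print(f"coverage = {lexical_coverage(sample, high_freq):.0%}")  # -> 50%
```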
The present paper aims to examine the role of the frequency lexicon in assessing the lexical proficiency of Persian learners. Addressing the frequency distribution pattern of Persian words, the research, conducted with a corpus-based method, studies the relationship between words and lexical proficiency and derives the frequency lexicon from a Persian corpus. To do so, formal and colloquial Per...
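A bare-bones sketch of the frequency-lexicon step described above follows; the two-line toy corpus and whitespace tokenisation are simplifying assumptions, and real Persian text would need proper normalisation, for instance with a library such as Hazm.

```python
# Illustrative only: derive a ranked frequency lexicon from a toy corpus.
from collections import Counter

def frequency_lexicon(corpus_lines):
    # Count whitespace-separated tokens and return them ranked by frequency.
    counts = Counter()
    for line in corpus_lines:
        counts.update(line.split())
    return counts.most_common()

toy_corpus = ["این کتاب جدید است", "کتاب خوب است"]  # stand-in corpus, not real data
for word, freq in frequency_lexicon(toy_corpus):
    print(word, freq)
```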
Whereas the former has been regarded as a topical issue for quite some time, the latter is only now receiving its due attention. This workshop will concentrate on lexical rules as a regulator of the breadth and depth of lexicons. Lexical rules are known under a variety of names, e.g., Leech's (1981) "semantic transfer rules," the "lexical implication rules" of Ostler and Atkins (1991), and others. T...
The recent trend towards developing the lexical component of NLP systems has focussed attention on two potentially valuable sources of lexical data: printed dictionaries for humans and large text corpora. This presentation considers the types of information that might be required by MT researchers and the extent to which this information can be derived from these two sources. This raises a numb...
The category system in Wikipedia can be taken as a conceptual network. We label the semantic relations between categories using methods based on connectivity in the network and lexico-syntactic matching. The result is a large scale taxonomy. For evaluation we propose a method which (1) manually determines the quality of our taxonomy, and (2) automatically c...
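To make the lexico-syntactic side of such a pipeline concrete, here is an illustrative toy heuristic (not the paper's actual method; the head-matching rule and category names are invented): if a subcategory's head noun matches the parent category name, the link is labelled is-a, otherwise merely related.

```python
# Toy head-matching heuristic for labelling category links; purely illustrative.
def label_relation(parent: str, child: str) -> str:
    # Treat the last word of a multi-word category name as its lexical head.
    head = child.lower().split()[-1]
    return "isa" if head == parent.lower() else "related"

# Invented category pairs for demonstration.
pairs = [("Musicians", "Jazz musicians"), ("Music", "Music festivals")]
for parent, child in pairs:
    print(f"{parent} -> {child}: {label_relation(parent, child)}")
```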