Search results for: contextualized memorization
Number of results: 4029
We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pretrained on a large text corpus. We show ...
The spatial expression “near” describes proximity and is frequently used in web search queries such as “gas stations near the old market”. What “near” means depends on the context, and I investigate how a context-dependent model for “near” can be formulated. To do so, I investigate the following questions: (i) what is the relevant contextual information for “near”? (ii) how does the identified informa...
RDF is a key enabling technology for linked data. However, RDF currently lacks a mechanism to connect data from different documents as well as to address the contextual differences between these documents. We propose to introduce rdf:imports for context-aware integration of RDF documents.
Memorizing the Qur'an is not just a matter of adding to one's memorization, but of maintaining what has already been memorized. Isy Karima Karanganyar High School of Al-Qur'an Science (STIQ), one of the Islamic tertiary institutions, includes tahfihz as content in its curriculum, and all students are required to memorize it as their trademark. The research objective was to determine the role of Musyrif Tahfiz in Strengthening Student...
If an instance of conscious experience of the seemingly objective world around us could be regarded as a newly formed event memory, much as an instance of mental imagery has the content of a retrieved event memory, and if, therefore, the stream of conscious experience could be seen as evidence for ongoing formation of event memories that are linked into episodic memory sequences, then unitary c...
In this work we leverage recent advances in context-sensitive language models to improve the task of query expansion. Contextualized word representation models, such as ELMo and BERT, are rapidly replacing static embedding models. We propose a new model, Contextualized Embeddings for Query Expansion (CEQE), that utilizes query-focused contextualized vectors. We study the behavior of contextual representations generated...
[Chart: number of search results per year; click the chart to filter results by publication year]