Search results for: improving knowledge
Number of results: 789156
The increasing complexity of deep learning models has led to the development of Knowledge Distillation (KD) approaches that enable us to transfer knowledge between a very large network, called the teacher, and a smaller, faster one, called the student. However, as recent evidence suggests, using powerful teachers often negatively impacts the effectiveness of the distillation process. In this paper, the reasons behind this apparent limitation ...
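As a rough illustration of the teacher-student transfer this snippet refers to (not the specific method studied in that paper), a standard distillation objective blends a temperature-softened teacher-matching term with the usual cross-entropy on hard labels. The temperature T and mixing weight alpha below are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style KD sketch: match the teacher's softened outputs while
    still fitting the ground-truth labels. T and alpha are assumed values."""
    # Soft targets from the teacher, softened by temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence between softened distributions, scaled by T^2 so the
    # gradient magnitude stays comparable across temperatures.
    kd_term = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Ordinary supervised loss on the hard labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```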
Training scene graph classification models requires a large amount of annotated image data. Meanwhile, scene graphs represent relational knowledge that can be modeled with symbolic data from texts or graphs. While image annotation demands extensive labor, collecting textual descriptions of natural scenes requires less effort. In this work, we investigate whether textual descriptions can substitute for annotated images. To this end, we employ a framework that is trained not o...
Both intensional and extensional background knowledge have previously been used in inductive problems to complement the training set used for a task. In this research, we propose to explore the usefulness, for inductive learning, of a new kind of intensional background knowledge: the inter-relationships or conditional probability distributions between subsets of attributes. Such information cou...
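To make the idea of inter-relationships between attribute subsets concrete, a conditional distribution P(target | given) can be estimated as relative frequencies over a training set. This is a sketch with made-up toy attributes, not the authors' implementation.

```python
from collections import Counter, defaultdict

# Toy training set: each example is a dict of attribute values.
# The attributes "outlook", "windy", and "play" are hypothetical.
examples = [
    {"outlook": "sunny",    "windy": False, "play": "no"},
    {"outlook": "sunny",    "windy": True,  "play": "no"},
    {"outlook": "rainy",    "windy": False, "play": "yes"},
    {"outlook": "rainy",    "windy": True,  "play": "no"},
    {"outlook": "overcast", "windy": False, "play": "yes"},
]

def conditional_distribution(examples, given, target):
    """Estimate P(target | given) by relative frequency: a simple form of
    intensional background knowledge relating one attribute subset to another."""
    joint = defaultdict(Counter)
    for ex in examples:
        key = tuple(ex[a] for a in given)
        val = tuple(ex[a] for a in target)
        joint[key][val] += 1
    return {
        key: {val: n / sum(counts.values()) for val, n in counts.items()}
        for key, counts in joint.items()
    }

print(conditional_distribution(examples, given=["outlook"], target=["play"]))
```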
Transfer of learned knowledge from one task to another offers an opportunity to reduce the development cost of knowledge-based systems by reusing existing knowledge in novel situations. However, minor differences between the initial and target environments can substantially reduce the effectiveness of the system. In previous work, we presented a system that acquired procedural knowledge of American footb...