What Can You Do with a Rock? Affordance Extraction via Word Embeddings
Authors
Abstract
Autonomous agents must often detect affordances: the set of behaviors enabled by a situation. Affordance detection is particularly helpful in domains with large action spaces, allowing the agent to prune its search space by avoiding futile behaviors. This paper presents a method for affordance extraction via word embeddings trained on a tagged Wikipedia corpus. The resulting word vectors are treated as a common knowledge database which can be queried using linear algebra. We apply this method to a reinforcement learning agent in a text-only environment and show that affordance-based action selection improves performance in most cases. Our method increases the computational complexity of each learning step but significantly reduces the total number of steps needed. In addition, the agent’s action selections begin to resemble those a human would choose.
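To make the abstract's "queried using linear algebra" step concrete, the sketch below shows one way such an affordance query could be phrased as a vector analogy over pretrained word embeddings. It is an illustration under assumptions, not the paper's implementation: the gensim API, the vector file wiki_vectors.bin, the canonical pair ("song", "sing"), and the query noun "rock" are all choices made for demonstration.

```python
# A minimal sketch of the kind of analogy query described above, assuming a
# pretrained word2vec-format model loaded with gensim. The file path, the
# canonical pair ("song", "sing"), and the query noun are illustrative choices,
# not details taken from the paper.
from gensim.models import KeyedVectors

# Hypothetical path; substitute any word2vec-format vectors available locally.
vectors = KeyedVectors.load_word2vec_format("wiki_vectors.bin", binary=True)

def candidate_affordances(noun, example_noun="song", example_verb="sing", topn=10):
    """Return verbs v ranked so that example_noun : example_verb :: noun : v,
    i.e. nearest neighbours of (example_verb - example_noun + noun)."""
    return vectors.most_similar(
        positive=[example_verb, noun],
        negative=[example_noun],
        topn=topn,
    )

print(candidate_affordances("rock"))  # verbs such as "throw" or "carry" may rank highly
```

In practice one would likely average the results over several canonical noun-verb pairs so that the candidate verb list is less sensitive to any single analogy.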
Similar resources
Semantic Information Extraction for Improved Word Embeddings
Word embeddings have recently proven useful in a number of different applications that deal with natural language. Such embeddings succinctly reflect semantic similarities between words based on their sentence-internal contexts in large corpora. In this paper, we show that information extraction techniques provide valuable additional evidence of semantic relationships that can be exploited when...
It Ain’t What You Do (But the Way That You Do It): Will Safety II Transform the Way We Do Patient Safety; Comment on “False Dawns and New Horizons in Patient Safety Research and Practice”
Mannion and Braithwaite outline a new paradigm for studying and improving patient safety – Safety II. In this response, I argue that Safety I should not be dismissed simply because the safety management strategies that are developed and enacted in the name of Safety I are not always true to the original philosophy of ‘systems thinking.’
CS261: A Second Course in Algorithms Lecture #10: The Minimax Theorem and Algorithms for Linear Programming
Recall rock-paper-scissors (or roshambo). Two players simultaneously choose one of rock, paper, or scissors, with rock beating scissors, scissors beating paper, and paper beating rock. Here’s an idea: what if I made you go first? That’s obviously unfair — whatever you do, I can respond with the winning move. But what if I only forced you to commit to a probability distribution over rock, paper,...
Bilingual Word Embeddings for Bilingual Terminology Extraction from Specialized Comparable Corpora
Bilingual lexicon extraction from comparable corpora is constrained by the small amount of available data when dealing with specialized domains. This aspect penalizes the performance of distributional-based approaches, which is closely related to the reliability of word co-occurrence counts extracted from comparable corpora. A solution to avoid this limitation is to associate external resources...
Combining Word Embeddings and Feature Embeddings for Fine-grained Relation Extraction
Compositional embedding models build a representation for a linguistic structure based on its component word embeddings. While recent work has combined these word embeddings with hand-crafted features for improved performance, it was restricted to a small number of features due to model complexity, thus limiting its applicability. We propose a new model that conjoins features and word embedding...