Search results for: facial gestures

Number of results: 68477

2014
Samantha Rowbotham April J. Wardy Donna M. Lloyd Alison Wearden Judith Holler Yeur-Hur Lai

Effective pain communication is essential if adequate treatment and support are to be provided. Pain communication is often multimodal, with sufferers utilising speech, nonverbal behaviours (such as facial expressions), and co-speech gestures (bodily movements, primarily of the hands and arms that accompany speech and can convey semantic information) to communicate their experience. Research su...

2013
Elizabeth A. Simpson Annika Paukner Valentina Sclafani Stephen J. Suomi Pier F. Ferrari

Newborn rhesus macaques imitate facial gestures even after a delay, revealing the flexible nature of their early communicative exchanges. In the present study we examined whether newborn macaques are also sensitive to the identities of the social partners with whom they are interacting. We measured infant monkeys' (n = 90) lipsmacking and tongue protrusion gestures in a face-to-face interaction...

2016
Yang Xiao Hui Liang Junsong Yuan Daniel Thalmann

In this chapter, a nonverbal way of communication for human–robot interaction by understanding human upper body gestures will be addressed. The human–robot interaction system based on a novel combination of sensors is proposed. It allows one person to interact with a humanoid social robot with natural body language. The robot can understand the meaning of human upper body gestures and express ...

2016
Linda Scheider Bridget M. Waller Leonardo Oña Anne M. Burrows Katja Liebal

Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we stu...

2006
Dimitris N. Metaxas Gavriil Tsechpenakis Zhiguo Li Yuichi Huang Atul Kanaujia

We present a dynamic data-driven framework for tracking gestures and facial expressions from monocular sequences. Our system uses two cameras, one for the face and one for the body view for processing in different scales. Specifically, and for the gesture tracking module, we track the hands and the head, obtaining as output the blobs (ellipses) of the ROIs, and we detect the shoulder positions ...

Journal: :Gestalt theory 2021

Summary Methodological problems often arise when a special case is confused with the general principle. So you will find affordances only for ‚artifacts’ if you restrict the analysis to ‚artifacts’. The principle, however, is an ‚invitation character’, which triggers action. Consequently, an action-theoretical approach known as the ‚pragmatic turn’ in cognitive science is recommended. According to this approach, human...

Journal: :Applied sciences 2022

Automatic sign language recognition is a challenging task in machine learning and computer vision. Most works have focused on recognition using hand gestures only. However, body motion and facial expressions play an essential role in interaction. Taking this into account, we introduce an automatic sign language recognition system based on multiple gestures, including hands, body, and face. We used a depth camera (OAK-D) to obtain the 3D coordinates o...

2007
Elisabetta Bevacqua Maurizio Mancini Radoslaw Niewiadomski Catherine Pelachaud

Embodied Conversational Agents (ECAs) are a new paradigm of computer interface with a human-like aspect that allows users to interact with the machine through natural speech, gestures, facial expressions, and gaze. In this paper we present a head animation system for our ECA Greta and we focus on two of its aspects: the expressivity of movement and the computation of complex facial expressions....

2010
Nicholas Michael Carol Neidle Dimitris Metaxas

Most research in the field of sign language recognition has focused on the manual component of signing, despite the fact that there is critical grammatical information expressed through facial expressions and head gestures. We, therefore, propose a novel framework for robust tracking and analysis of nonmanual behaviors, with an application to sign language recognition. Our method uses computer ...

Chart: number of search results per year
