Search results for: facial gestures
Number of results: 68477
Face motion plays a special role in sustaining human gestures. Emerging media applications convey human expressions by means of virtual faces, which are typically controlled by a parameterized description. In this context, we developed a Facial Analysis and Synthesis Scheme able to extract a spatio-temporal semantic face description from an image sequence. This research p...
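As a rough illustration of what such a parameterized face description might look like (the snippet above does not give the actual parameter set, so the parameter names and helper below are hypothetical, loosely in the spirit of MPEG-4 FAP-style controls):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Hypothetical per-frame facial parameter vector; the real scheme's
# parameter set is not specified in the abstract above.
@dataclass
class FaceFrame:
    timestamp: float                                          # seconds from sequence start
    params: Dict[str, float] = field(default_factory=dict)    # e.g. {"jaw_open": 0.3}

def to_animation_track(frames: List[FaceFrame], name: str) -> List[Tuple[float, float]]:
    """Extract one parameter as a (time, value) track to drive a virtual face."""
    return [(f.timestamp, f.params.get(name, 0.0)) for f in frames]

# Usage: drive a synthetic face's "jaw_open" channel from analysed frames.
frames = [FaceFrame(0.00, {"jaw_open": 0.1}), FaceFrame(0.04, {"jaw_open": 0.4})]
print(to_animation_track(frames, "jaw_open"))
```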
Since social robots will cooperate with and assist humans in a variety of daily life situations, interacting with these kinds of robots must be easy and intuitive for all kinds of users, ranging from children to the elderly. Therefore, in the design of social robots, the human must be placed at the center, and thus social robots need to be equipped with human-like social and communicative skills. Human communic...
Speech perception involves multiple input modalities. Research has indicated that perceivers establish cross-modal associations between auditory and visuospatial events to aid perception. Such intermodal relations can be particularly beneficial for speech development and learning, where infants and non-native perceivers need additional resources to acquire and process new sounds. This study exa...
Audiovisual speech has a stereotypical rhythm that is between 2 and 7 Hz, and deviations from this frequency range in either modality reduce intelligibility. Understanding how audiovisual speech evolved requires investigating the origins of this rhythmic structure. One hypothesis is that the rhythm of speech evolved through the modification of some pre-existing cyclical jaw movements in a prima...
It is well established that the observation of emotional facial expressions induces facial mimicry responses in observers. However, how the interaction between emotional and motor components of facial expressions can modulate the motor behavior of the perceiver is still unknown. We have developed a kinematic experiment to evaluate the effect of different oro-facial expressions on the perceiver's...
The paper discusses two body-language human-computer interaction modalities, namely facial expressions and hand gestures, for healthcare and smart environments.
It is envisioned that autonomous software agents that can communicate using speech and gesture will soon be on everybody's computer screen. This paper describes an architecture that can be used to design and animate characters capable of lip-synchronised synthetic speech as well as body gestures, for use in, for example, spoken dialogue systems. A general scheme for computationally efficient para...
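A minimal sketch of the lip-synchronisation idea described above, assuming a timed phoneme sequence is already available and using a small hypothetical phoneme-to-viseme table (not the paper's actual scheme):

```python
from typing import List, Tuple, Dict

# Hypothetical phoneme-to-viseme mapping; real systems use much larger tables.
PHONEME_TO_VISEME = {"p": "closed", "b": "closed", "a": "open_wide",
                     "o": "rounded", "f": "lip_teeth", "sil": "rest"}

def visemes_from_phonemes(timed_phonemes: List[Tuple[str, float, float]]) -> List[Dict]:
    """Map (phoneme, start, end) tuples to viseme keyframes for an animated face."""
    track = []
    for phoneme, start, end in timed_phonemes:
        viseme = PHONEME_TO_VISEME.get(phoneme, "rest")   # fall back to a neutral mouth
        track.append({"viseme": viseme, "start": start, "end": end})
    return track

# Usage: synthetic-speech timing -> mouth-shape keyframes for the character.
print(visemes_from_phonemes([("sil", 0.0, 0.1), ("b", 0.1, 0.18), ("a", 0.18, 0.35)]))
```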
Automatic recognition of facial gestures is becoming increasingly important as real-world AI agents become a reality. In this paper, we present an automated system that recognizes facial gestures by capturing local changes and encoding the motion into a histogram of frequencies. We evaluate the proposed method by demonstrating its effectiveness on spontaneous face action benchmarks: the FEEDTUM...
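A rough numpy-only sketch of the general idea of encoding local motion as a histogram of temporal frequencies (the exact features, face regions, and benchmark protocol of the system above are not reproduced here):

```python
import numpy as np

def motion_frequency_histogram(frames: np.ndarray, grid: int = 4, n_bins: int = 8) -> np.ndarray:
    """frames: (T, H, W) grayscale sequence.
    Splits each frame into a grid of patches, measures per-patch intensity
    change over time, and histograms the dominant temporal frequency of
    each patch -- one crude way to summarise local facial motion."""
    T, H, W = frames.shape
    ph, pw = H // grid, W // grid
    dominant = []
    for i in range(grid):
        for j in range(grid):
            patch = frames[:, i*ph:(i+1)*ph, j*pw:(j+1)*pw]
            signal = np.abs(np.diff(patch.mean(axis=(1, 2))))   # per-frame motion energy
            spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
            dominant.append(int(np.argmax(spectrum)))           # dominant frequency bin
    max_bin = (T - 1) // 2 + 1                                  # rfft length of the diff signal
    hist, _ = np.histogram(dominant, bins=n_bins, range=(0, max_bin))
    return hist / max(hist.sum(), 1)                            # normalised descriptor

# Usage with a synthetic 30-frame, 64x64 sequence:
demo = np.random.rand(30, 64, 64)
print(motion_frequency_histogram(demo))
```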