Search results for: facial gestures

Number of results: 68477

Background. Four of the most relevant gestures in rugby (RU) are the pass, the tackle, the line out, and the scrum. RU is the third most common contact sport on the planet and, being a fast-paced collision game, carries a high risk of injury. Objectives. To describe and compare plantar dynamics during four sports gestures in rugby players through speed, strength, and balance. Methods. Twen...

2005
Mehmet Emre Sargin Ferda Ofli Yelena Yasinnik Oya Aran Alexey Karpov Stephen Wilson Yucel Yemez Engin Erzin Murat Tekalp

Multi-modal speech and speaker modelling and recognition are widely accepted as vital aspects of state-of-the-art human-machine interaction systems. While correlations between speech and lip motion as well as speech and facial expressions are widely studied, relatively little work has been done to investigate the correlations between speech and gesture. Detection and modelling of head, hand and...

2006
Karlo Smid Goranka Zoric Igor S. Pandzic

We introduce a universal architecture for statistically based HUman GEsturing (HUGE) system, for producing and using statistical models for facial gestures based on any kind of inducement. As inducement we consider any kind of signal that occurs in parallel to the production of gestures in human behaviour and that may have a statistical correlation with the occurrence of gestures, e.g. text tha...
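The HUGE abstract above describes learning statistical correlations between an inducement signal (e.g. co-occurring text) and the occurrence of facial gestures. As a minimal sketch of that idea only (not the HUGE architecture itself; the feature and gesture labels below are invented for illustration), a count-based conditional model could look like:

```python
from collections import Counter, defaultdict

def train(pairs):
    """Count gesture occurrences per inducement feature."""
    table = defaultdict(Counter)
    for feature, gesture in pairs:
        table[feature][gesture] += 1
    return table

def most_likely(table, feature):
    """Return the gesture most often seen with this feature, or None."""
    counts = table.get(feature)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

# Invented training pairs: (text feature observed in parallel with
# the gesture, facial gesture label).
MODEL = train([
    ("question",  "eyebrow_raise"),
    ("question",  "eyebrow_raise"),
    ("question",  "head_tilt"),
    ("statement", "nod"),
])
```

A real system would replace the raw counts with smoothed probabilities and richer inducement features, but the core statistical-correlation idea is the same.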

Journal: :NeuroImage 2009
Tobias Flaisch Harald T. Schupp Britta Renner Markus Junghöfer

Humans are the only species known to use symbolic gestures for communication. This affords a unique medium for nonverbal emotional communication with a distinct theoretical status compared to facial expressions and other biologically evolved nonverbal emotion signals. While a frown is a frown all around the world, the relation of emotional gestures to their referents is arbitrary and varies fro...

Journal: :IEEE Transactions on Multimedia 2022

Human emotion is expressed, perceived and captured through a variety of dynamic data modalities, such as speech (verbal), video (facial expressions) and motion sensors (body gestures). We propose a generalized approach to emotion recognition that can adapt across modalities by modeling them as structured graphs. The motivation behind the graph approach is to build compact models without compromising on performance. To alleviate pro...

1996
Carlos Hitoshi Morimoto Yaser Yacoob Larry S. Davis

This paper explores the use of Hidden Markov Models (HMMs) for the recognition of head gestures. A gesture corresponds to a particular pattern of head movement. The facial plane is tracked using a parameterized model, and the temporal sequence of three image rotation parameters is used to describe four gestures. A dynamic vector quantization scheme was implemented to transform the parameters in...
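The abstract describes quantizing head-rotation parameters into discrete symbols and scoring the symbol sequence against per-gesture HMMs. A minimal illustration of that classification step (the two-state toy models, symbol alphabet, and probabilities below are assumptions for the sketch, not the paper's actual parameters):

```python
import math

# Symbol alphabet from a hypothetical vector quantizer over the
# image-rotation parameters: 0=up, 1=down, 2=left, 3=right.

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    c = sum(alpha)
    logp = math.log(c)
    alpha = [a / c for a in alpha]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(n)) * B[t][o]
                 for t in range(n)]
        c = sum(alpha)
        logp += math.log(c)
        alpha = [a / c for a in alpha]
    return logp

# Toy two-state models: a "nod" alternates up/down, a "shake" left/right.
PI = [0.5, 0.5]
A = [[0.1, 0.9], [0.9, 0.1]]          # states tend to alternate
B_NOD   = [[0.85, 0.05, 0.05, 0.05], [0.05, 0.85, 0.05, 0.05]]
B_SHAKE = [[0.05, 0.05, 0.85, 0.05], [0.05, 0.05, 0.05, 0.85]]

def classify(obs):
    """Pick the gesture model that assigns the highest likelihood."""
    scores = {"nod":   forward_loglik(obs, PI, A, B_NOD),
              "shake": forward_loglik(obs, PI, A, B_SHAKE)}
    return max(scores, key=scores.get)
```

For example, `classify([0, 1, 0, 1, 0, 1])` selects the nod model, since the alternating up/down symbols are far more probable under its emission matrix.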

2006
Maarten Lambers

An overview is given of the different components that are needed when a system is built to understand a human conductor (in order to conduct a virtual orchestra or choir). Additionally, an investigation is done to describe the technique of conducting. It is outlined how input via different modalities used in conducting (hand gestures, facial expression, gaze behaviour) can be linked to various ...

1989
Justine Cassell

In this chapter I’m going to discuss the issues that arise when we design automatic spoken dialogue systems that can use not only voice, but also facial and head movements and hand gestures to communicate with humans. For the most part I will concentrate on the generation side of the problem—that is, building systems that can speak, move their faces and heads and make hand gestures. As with mos...

2002
O. Déniz M. Castrillón J. Lorenzo C. Guerra D. Hernández M. Hernández

The physical appearance and behavior of a robot is an important asset in terms of Human-Computer Interaction. Multimodality is also fundamental, as we humans usually expect to interact in a natural way with voice, gestures, etc. People approach complex interaction devices with stances similar to those used in their interaction with other people. In this paper we describe a robot head, currently...

Journal: :Journal of neurophysiology 1999
K Nakamura R Kawashima K Ito M Sugiura T Kato A Nakamura K Hatano S Nagumo K Kubota H Fukuda S Kojima

We measured regional cerebral blood flow (rCBF) using positron emission tomography (PET) to determine which brain regions are involved in the assessment of facial emotion. We asked right-handed normal subjects to assess the signalers' emotional state based on facial gestures and to assess the facial attractiveness, as well as to discriminate the background color of the facial stimuli, and compa...
