Search results for: facial bioelectric signals

Number of results: 249,761

Journal: The Journal of Experimental Biology, 2013
Vielka L. Salazar, Rüdiger Krahe, John E. Lewis

Gymnotiform weakly electric fish produce an electric signal to sense their environment and communicate with conspecifics. Although the generation of such relatively large electric signals over an entire lifetime is expected to be energetically costly, supporting evidence to date is equivocal. In this article, we first provide a theoretical analysis of the energy budget underlying signal product...

Journal: J. Vis. Lang. Comput., 2006
Fatma Nasoz, Christine L. Lisetti

In this paper we describe the Multimodal Affective User Interface (MAUI) we created to capture its users' emotional physiological signals via wearable computers and visualize the categorized signals in terms of recognized emotion. MAUI aims at 1) giving feedback to the users about their emotional states via various modalities (e.g. mirroring the user's facial expressions and describ...

2013
Stephan Tschechne, Georg Layher, Heiko Neumann

Non-verbal communication signals are largely conveyed by visual motion information of the user's facial components (intrinsic motion) and head (extrinsic motion). An observer perceives the visual flow as a superposition of both types of motion. However, when visual signals are used to train classifiers for non-articulated communication signals, a decomposition is advantageous. W...
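
Purely as an illustration of the kind of decomposition this abstract alludes to (not the authors' actual method), the Python sketch below fits a global affine motion model to observed optical-flow vectors and treats the residual as intrinsic facial motion; the function and variable names are ours.

```python
# Minimal sketch (not the authors' method): separate extrinsic head motion
# from intrinsic facial motion by fitting a global affine model to the
# observed optical flow and keeping the residual as the intrinsic part.
import numpy as np

def decompose_flow(points, flow):
    """points: (N, 2) pixel coordinates; flow: (N, 2) observed flow vectors."""
    ones = np.ones((points.shape[0], 1))
    X = np.hstack([points, ones])                 # affine design matrix (N, 3)
    # Least-squares fit of the affine parameters for each flow component
    A, *_ = np.linalg.lstsq(X, flow, rcond=None)  # (3, 2)
    extrinsic = X @ A                             # global (head) motion
    intrinsic = flow - extrinsic                  # residual facial motion
    return extrinsic, intrinsic

# Toy usage: a rigid translation plus a local deformation around the "mouth"
pts = np.random.rand(500, 2) * 100
head_motion = np.tile([2.0, -1.0], (500, 1))
local_motion = np.zeros_like(pts)
local_motion[:50] += [0.0, 1.5]                   # local region moves upward
ext, intr = decompose_flow(pts, head_motion + local_motion)
```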

1999
Isabella Poggi, Catherine Pelachaud

This paper shows that emotional information conveyed by facial expression is often contained not only in the expression of emotions per se, but also in other communicative signals, namely the performatives of communicative acts. An analysis is provided of the performatives of suggesting, warning, ordering, imploring, approving and praising, both on the side of their cognitive structure and on t...

Journal: NeuroImage, 2003
Clinton D. Kilts, Glenn Egan, Deborah A. Gideon, Timothy D. Ely, John M. Hoffman

Facial expressions of emotion powerfully influence social behavior. The distributed network of brain regions thought to decode these social signals has been empirically defined using static, usually photographic, displays of such expressions. Facial emotional expressions are, however, highly dynamic signals that encode the emotion message in facial action patterns. This study sought to determine ...

2002
Douglas DeCarlo, Corey Revilla, Matthew Stone, Jennifer J. Venditti

People highlight the intended interpretation of their utterances within a larger discourse by a diverse set of nonverbal signals. These signals represent a key challenge for animated conversational agents because they are pervasive, variable, and need to be coordinated judiciously in an effective contribution to conversation. In this paper, we describe a freely-available cross-platform real-tim...

Journal: Journal of Visualization and Computer Animation, 2004
Douglas DeCarlo, Matthew Stone, Corey Revilla, Jennifer J. Venditti

People highlight the intended interpretation of their utterances within a larger discourse by a diverse set of non-verbal signals. These signals represent a key challenge for animated conversational agents because they are pervasive, variable, and need to be coordinated judiciously in an effective contribution to conversation. In this paper, we describe a freely available cross-platform real-ti...

Journal: I. J. Social Robotics, 2013
Caixia Liu, Jaap Ham, Eric O. Postma, Cees J. H. Midden, Bart Joosten, Martijn Goudbeek

Affective robots and embodied conversational agents require convincing facial expressions to make them socially acceptable. To be able to virtually generate facial expressions, we need to investigate the relationship between technology and human perception of affective and social signals. Facial landmarks, the locations of the crucial parts of a face, are important for perception of the affecti...
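
As a rough illustration of how facial landmarks can be turned into expression-related measurements (the landmark names and features below are hypothetical, not taken from the paper), consider this short Python sketch:

```python
# Minimal sketch under an assumed landmark format: turn 2-D facial landmarks
# into a few geometric features that vary with expression, normalized by a
# crude face-size estimate.
import numpy as np

def expression_features(lm):
    """lm: dict of named 2-D landmark coordinates (hypothetical naming)."""
    face_size = np.linalg.norm(lm["chin"] - lm["nose_bridge"])  # scale factor
    mouth_open = np.linalg.norm(lm["lip_top"] - lm["lip_bottom"]) / face_size
    mouth_width = np.linalg.norm(lm["mouth_left"] - lm["mouth_right"]) / face_size
    brow_raise = np.linalg.norm(lm["brow_left"] - lm["eye_left"]) / face_size
    return np.array([mouth_open, mouth_width, brow_raise])

# Toy usage with made-up coordinates
landmarks = {k: np.array(v, dtype=float) for k, v in {
    "chin": (50, 95), "nose_bridge": (50, 40),
    "lip_top": (50, 70), "lip_bottom": (50, 78),
    "mouth_left": (40, 74), "mouth_right": (60, 74),
    "brow_left": (38, 30), "eye_left": (38, 38),
}.items()}
print(expression_features(landmarks))
```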

2017
Yongrui Huang, Jianhao Yang, Pengkai Liao, Jiahui Pan

This paper proposes two multimodal fusion methods between brain and peripheral signals for emotion recognition. The input signals are electroencephalogram and facial expression. The stimuli are based on a subset of movie clips that correspond to four specific areas of valence-arousal emotional space (happiness, neutral, sadness, and fear). For facial expression detection, four basic emotion sta...
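
The abstract does not spell out the two fusion methods; as a hedged sketch of one generic possibility, the snippet below fuses per-class probabilities from an EEG classifier and a facial-expression classifier with a weighted decision-level average over the four emotion classes named above. The weights and probability values are purely illustrative.

```python
# Minimal sketch of a generic decision-level fusion strategy (weighted average
# of per-modality class probabilities); the paper's two methods may differ.
import numpy as np

CLASSES = ["happiness", "neutral", "sadness", "fear"]

def fuse_decisions(p_eeg, p_face, w_eeg=0.5):
    """p_eeg, p_face: per-class probabilities from the two classifiers."""
    p_eeg, p_face = np.asarray(p_eeg), np.asarray(p_face)
    fused = w_eeg * p_eeg + (1.0 - w_eeg) * p_face  # convex combination
    fused /= fused.sum()                            # renormalize
    return CLASSES[int(np.argmax(fused))], fused

# Toy usage: EEG leans toward fear, facial expression toward sadness
label, probs = fuse_decisions([0.1, 0.2, 0.25, 0.45], [0.05, 0.15, 0.55, 0.25])
print(label, probs)
```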

Journal: Proceedings of the National Academy of Sciences, 1949
