Search results for: facial bioelectric signals
Number of results: 249761
Gymnotiform weakly electric fish produce an electric signal to sense their environment and communicate with conspecifics. Although the generation of such relatively large electric signals over an entire lifetime is expected to be energetically costly, supporting evidence to date is equivocal. In this article, we first provide a theoretical analysis of the energy budget underlying signal product...
Abstract— In this paper we describe the Multimodal Affective User Interface (MAUI) we created to capture its users' emotional physiological signals via wearable computers and visualize the categorized signals in terms of recognized emotion. MAUI aims at 1) giving feedback to the users about their emotional states via various modalities (e.g. mirroring the user's facial expressions and describ...
Non-verbal communication signals are largely conveyed by the visual motion information of the user's facial components (intrinsic motion) and head (extrinsic motion). An observer perceives the visual flow as a superposition of both types of motion. However, when visual signals are used to train classifiers for non-articulated communication signals, a decomposition is advantageous. W...
This paper shows that emotional information conveyed by facial expression is often contained not only in the expression of emotions per se, but also in other communicative signals, namely the performatives of communicative acts. An analysis is provided of the performatives of suggesting, warning, ordering, imploring, approving and praising, both on the side of their cognitive structure and on t...
Facial expressions of emotion powerfully influence social behavior. The distributed network of brain regions thought to decode these social signals has been empirically defined using static, usually photographic, displays of such expressions. Facial emotional expressions are however highly dynamic signals that encode the emotion message in facial action patterns. This study sought to determine ...
People highlight the intended interpretation of their utterances within a larger discourse by a diverse set of nonverbal signals. These signals represent a key challenge for animated conversational agents because they are pervasive, variable, and need to be coordinated judiciously in an effective contribution to conversation. In this paper, we describe a freely-available cross-platform real-tim...
Affective robots and embodied conversational agents require convincing facial expressions to make them socially acceptable. To be able to virtually generate facial expressions, we need to investigate the relationship between technology and human perception of affective and social signals. Facial landmarks, the locations of the crucial parts of a face, are important for perception of the affecti...
This paper proposes two multimodal fusion methods between brain and peripheral signals for emotion recognition. The input signals are electroencephalogram and facial expression. The stimuli are based on a subset of movie clips that correspond to four specific areas of valence-arousal emotional space (happiness, neutral, sadness, and fear). For facial expression detection, four basic emotion sta...