BNU-LSVED 2.0: Spontaneous multimodal student affect database with multi-dimensional labels

Authors

  • Qinglan Wei
  • Bo Sun
  • Jun He
  • Lejun Yu
Abstract

In college classrooms, large quantities of digital-media data showing students’ affective behaviors are continuously captured by cameras on a daily basis. To provide a benchmark for affect recognition using these big data collections, in this paper we propose the first large-scale spontaneous and multimodal student affect database. All videos in our database were selected from daily big data recordings. The recruited subjects extracted one-person image sequences of their own affective behaviors and then annotated them for affect under standard rules set beforehand. Ultimately, we have collected 2117 image sequences covering 11 types of students’ affective behaviors in a variety of classes. The Beijing Normal University Large-scale Spontaneous Visual Expression Database version 2.0 (BNU-LSVED2.0) extends our previous BNU-LSVED1.0 and has a number of new characteristics. The nonverbal behaviors and emotions in the new version are more spontaneous, since all image sequences come from videos recorded in actual classes rather than from behaviors stimulated by induction videos. Moreover, it includes a greater variety of affective behaviors from which students’ learning status during classes can be inferred; these behaviors include facial expressions, eye movements, head postures, body movements, and gestures. In addition, instead of providing only categorical emotion labels, the new version also provides affective behavior labels and multi-dimensional Pleasure–Arousal–Dominance (PAD) labels assigned to the image sequences. Both the detailed subjective descriptions and the statistical analyses of the self-annotation results demonstrate the reliability and effectiveness of the multi-dimensional labels in the database. © 2017 Elsevier B.V. All rights reserved.
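To make the multi-dimensional labeling concrete, the sketch below shows one possible way to represent a PAD-annotated image sequence in code. The record fields, value range, and behavior names are illustrative assumptions, not the database's actual schema; PAD dimensions are conventionally treated as continuous values, here assumed scaled to [-1, 1].

```python
from dataclasses import dataclass

# Hypothetical record for a PAD-labeled image sequence. Field names and
# the [-1, 1] scale are assumptions for illustration, not the actual
# BNU-LSVED2.0 annotation format.
@dataclass
class AffectAnnotation:
    sequence_id: str   # image-sequence identifier
    behavior: str      # one of the behavior types, e.g. "facial expression"
    pleasure: float    # P dimension
    arousal: float     # A dimension
    dominance: float   # D dimension

    def in_range(self) -> bool:
        """Check that each PAD value lies within the assumed [-1, 1] scale."""
        return all(
            -1.0 <= v <= 1.0
            for v in (self.pleasure, self.arousal, self.dominance)
        )

# Usage: a disengaged student might plausibly map to low pleasure and arousal.
ann = AffectAnnotation("seq_0001", "facial expression", -0.4, -0.6, -0.2)
print(ann.in_range())  # True
```

A dimensional record like this complements a single categorical label because two clips with the same behavior category can still differ in intensity along the three axes.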


Similar articles

Building a Multimodal Laughter Database for Emotion Recognition

Laughter is a significant paralinguistic cue that is largely ignored in multimodal affect analysis. In this work, we investigate how a multimodal laughter corpus can be constructed and annotated both with discrete and dimensional labels of emotions for acted and spontaneous laughter. Professional actors enacted emotions to produce acted clips, while spontaneous laughter was collected from volun...


CHEAVD: a Chinese natural emotional audio-visual database

This paper presents a recently collected natural, multimodal, rich-annotated emotion database, CASIA Chinese Natural Emotional Audio–Visual Database (CHEAVD), which aims to provide a basic resource for the research on multimodal multimedia interaction. This corpus contains 140 min emotional segments extracted from films, TV plays and talk shows. 238 speakers, aging from child to elderly, consti...


Multivariate Output-Associative RVM for Multi-Dimensional Affect Predictions

The current trends in affect recognition research are to consider continuous observations from spontaneous natural interactions in people using multiple feature modalities, and to represent affect in terms of continuous dimensions, incorporate spatio-temporal correlation among affect dimensions, and provide fast affect predictions. These research efforts have been propelled by a growing effort ...


Multimodal Spontaneous Expressive Speech Corpus for Hungarian

A Hungarian multimodal spontaneous expressive speech corpus was recorded following the methodology of a similar French corpus. The method relied on a Wizard of Oz scenario-based induction of varying affective states. The subjects were interacting with a supposedly voice-recognition driven computer application using simple command words. Audio and video signals were captured for the 7 recorded s...


Multi-level Annotations of Nonverbal Behaviors in French Spontaneous Conversation

This paper describes a multi-level scheme to annotate gesture, posture and gaze in a spontaneous interactional corpus in French. In nonverbal behaviour research there is a lack of available, accessible and spontaneous multimodal corpora. One of the challenges is to use structural and reliable coding schemes to study the relationships among different nonverbal modalities. We propose a method to ...



Journal:
  • Sig. Proc.: Image Comm.

Volume 59, Issue –

Pages –

Publication year: 2017