Gender Recognition of Human Behaviors Using Neural Ensembles

Authors

  • Jungwon Ryu
  • Sung-Bae Cho
Abstract

In this paper, we develop two ensembles of neural network classifiers to recognize actors' gender from their biological movements. The first is an ensemble of modular MLPs (experts); the second combines the modular MLP experts with an inductive decision tree that fuses their outputs. The human movement database consists of the movements of 13 males and 13 females, with 10 repetitions each of knocking, waving and lifting movements in both neutral and angry styles. Features were extracted in four different representations — 2D and 3D positions and velocities — recorded from 6 point lights attached to the body. We compare the ensembles against conventional classifiers such as the MLP, decision tree, self-organizing map and support vector machine. Furthermore, discriminability and efficiency were computed for comparison with human performance obtained in the same experiment. Our experimental results indicate that the ensemble models are superior to both the conventional classifiers and the human participants.
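The second ensemble described above — modular MLP experts, one per feature representation, whose outputs are fused by a decision tree — can be sketched as follows. This is a minimal illustration on synthetic data, assuming a stacking-style setup with scikit-learn; the data shapes, hyperparameters, and training scheme are assumptions, not the paper's actual settings.

```python
# Hedged sketch: one MLP "expert" per feature representation, with a
# decision tree combining the experts' outputs (a stacking-style ensemble).
# All data and hyperparameters here are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for the movement data: 4 representations
# (e.g. 2D/3D positions and velocities), 200 samples, binary gender label.
n_samples, n_features = 200, 12
y = rng.integers(0, 2, size=n_samples)
representations = [
    rng.normal(size=(n_samples, n_features)) + y[:, None] * 0.5
    for _ in range(4)
]

# Train one MLP expert per representation.
experts = []
for X in representations:
    mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    mlp.fit(X, y)
    experts.append(mlp)

# Stack each expert's class probabilities as meta-features.
meta = np.hstack([m.predict_proba(X) for m, X in zip(experts, representations)])

# The decision tree combines the experts' outputs into a final decision.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(meta, y)
print(f"ensemble training accuracy: {tree.score(meta, y):.2f}")
```

In practice the combiner would be fit and evaluated on held-out expert predictions rather than training outputs, to avoid an optimistic accuracy estimate.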


Similar Articles

Speech Emotion Recognition Using Scalogram Based Deep Structure

Speech Emotion Recognition (SER) is an important part of speech-based Human-Computer Interface (HCI) applications. Previous SER methods rely on the extraction of features and training an appropriate classifier. However, most of those features can be affected by emotionally irrelevant factors such as gender, speaking styles and environment. Here, an SER method has been proposed based on a concat...


A Comparative Study of Gender and Age Classification in Speech Signals

Accurate gender classification is useful in speech and speaker recognition as well as speech emotion classification, because a better performance has been reported when separate acoustic models are employed for males and females. Gender classification is also apparent in face recognition, video summarization, human-robot interaction, etc. Although gender classification is rather mature in a...


Effect of sound classification by neural networks in the recognition of human hearing

In this paper, we focus on two basic issues: (a) the classification of sound by neural networks based on frequency and sound intensity parameters, and (b) evaluating the health of different human ears as compared to those of a healthy person. Sound classification by a specific feed-forward neural network with two inputs, frequency and sound intensity, and two hidden layers is proposed. This process...


Mixture of experts for classification of gender, ethnic origin, and pose of human faces

In this paper we describe the application of mixtures of experts on gender and ethnic classification of human faces, and pose classification, and show their feasibility on the FERET database of facial images. The FERET database allows us to demonstrate performance on hundreds or thousands of images. The mixture of experts is implemented using the "divide and conquer" modularity principle with r...


Hand Gesture Recognition from RGB-D Data using 2D and 3D Convolutional Neural Networks: a comparative study

Despite considerable advances in recognizing hand gestures from still images, there are still many challenges in the classification of hand gestures in videos. The latter comes with more challenges, including higher computational complexity and the arduous task of representing temporal features. Hand movement dynamics, represented by temporal features, have to be extracted by analyzing the total fr...




Journal title:

Volume   Issue

Pages  -

Publication date: 2001