Search results for: visual sign

Number of results: 412359

2005
Jörg Zieren Karl-Friedrich Kraiss

Sign language recognition constitutes a challenging field of research in computer vision. Common problems like overlap, ambiguities, and minimal pairs occur frequently and require robust algorithms for feature extraction and processing. We present a system that performs person-dependent recognition of 232 isolated signs with an accuracy of 99.3% in a controlled environment. Person-independent r...

2011
Pamela Perniss Inge Zwitserlood Asli Özyürek

Spatial language in signed language is assumed to be shaped by affordances of the visual-spatial modality – where the use of the hands and space allow the mapping of spatial relationships in an iconic, analogue way – and thus to be similar across sign languages. In this study, we test assumptions regarding the modality-driven similarity of spatial language by comparing locative expressions (e.g...

2007
Eun-Jung Holden Robyn Owens Geoffrey G. Roy

The gesture recognition process, in general, may be divided into two stages: motion sensing, which extracts useful data from hand motion; and the classification process, which classifies the motion sensing data as gestures. We have developed the vision-based Hand Motion Understanding (HMU) system that recognises static and dynamic Australian Sign Language (Auslan) signs by extracting and classif...
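The two-stage split described in this abstract (motion sensing followed by classification) can be illustrated with a minimal sketch. The stage names follow the abstract, but the feature extractor and classifier below are hypothetical stand-ins, not the HMU system's actual components.

```python
# Minimal two-stage gesture recognition sketch: stage 1 turns raw frames into a
# feature vector, stage 2 maps that vector to a gesture label. Both callables
# are illustrative placeholders.
from dataclasses import dataclass
from typing import Any, Callable, List, Sequence


@dataclass
class TwoStageRecognizer:
    sense_motion: Callable[[Sequence[Any]], List[float]]  # stage 1: frames -> features
    classify: Callable[[List[float]], str]                # stage 2: features -> sign label

    def recognize(self, frames: Sequence[Any]) -> str:
        return self.classify(self.sense_motion(frames))


# Toy usage: the "feature" is just the clip length, and the classifier thresholds it.
recognizer = TwoStageRecognizer(
    sense_motion=lambda frames: [float(len(frames))],
    classify=lambda features: "wave" if features[0] > 10 else "point",
)
print(recognizer.recognize(list(range(30))))  # -> "wave"
```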

2003
Laura Muir Iain Richardson Steven Leaper

Sign language communication via videotelephone has demanding visual quality requirements. In order to optimise video coding for sign language it is necessary to quantify the importance of areas of the video scene. Eye movements of deaf users are tracked whilst watching a sign language video sequence. The results indicate that the gaze tends to concentrate on the face region with occasional excu...
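A minimal sketch of how such importance weights could steer the encoder, assuming the finding above that gaze concentrates on the face region: more important regions receive a lower quantization parameter (finer quality). The region names, weights, and QP mapping are illustrative assumptions, not values reported in the study.

```python
# Importance-weighted quantization sketch: importance in [0, 1] is mapped to a
# QP offset, so high-importance regions (e.g. the face) are coded at higher quality.
def qp_for_region(base_qp: int, importance: float, max_offset: int = 10) -> int:
    """Map an importance weight in [0, 1] to a quantization parameter."""
    return round(base_qp + max_offset * (1.0 - importance))


# Hypothetical region weights motivated by the eye-tracking result above.
region_importance = {"face": 1.0, "hands": 0.6, "background": 0.1}
qps = {name: qp_for_region(30, w) for name, w in region_importance.items()}
print(qps)  # {'face': 30, 'hands': 34, 'background': 39}
```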

Journal: :Neuropsychologia 1996
G Grossi C Semenza S Corazza V Volterra

Most studies on sign lateralization provide inconclusive results about the role of the two hemispheres in sign language processing, whereas the cases reported in the clinical literature show sign language impairment only following left hemisphere damage, suggesting a similar neural organization to spoken languages. By discriminating different levels of processing, a tachistoscopic study found t...

2015
Júlia Kučerová Jaroslav Polec

Visual information is very important in human communication. It is used in any type of sign language communication, and in the non-verbal communication of the entire population as well. Therefore, visual information is crucial for the communication of hearing-impaired people. Video is the most common way to capture this type of information, and it is very important to process it correctly. In this pape...

Journal: :ICST Trans. e-Education e-Learning 2016
Nicoletta Adamo-Villani Saikiran Anasingaraju

The paper discusses ongoing research on the effects of a signing avatar's modeling/rendering features on the perception of sign language animation. It reports a recent study that aimed to determine whether a character's visual style has an effect on how signing animated characters are perceived by viewers. The stimuli of the study were two polygonal characters presenting two different visual st...

Journal: :Interacting with Computers 2010
Rubén San-Segundo-Hernández José Manuel Pardo Javier Ferreiros Valentín Sama Rojo Roberto Barra-Chicote Juan Manuel Lucas D. Sánchez Antonio García

This paper describes the development of a Spoken Spanish generator from sign-writing. The sign language considered was the Spanish sign language (LSE: Lengua de Signos Española). This system consists of an advanced visual interface (where a deaf person can specify a sequence of signs in sign-writing), a language translator (for...
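The staged architecture described here (sign-writing input, language translation, spoken output) can be sketched as a simple composition of functions. The toy lexicon and the synthesize() stub are hypothetical placeholders for illustration; they are not the authors' translator or speech back end, and real LSE-to-Spanish translation involves reordering and grammar that this sketch ignores.

```python
# Pipeline sketch: sign identifiers (as entered through the visual interface)
# -> Spanish words -> speech synthesis. The lexicon and TTS stub are toys.
from typing import List

SIGN_TO_SPANISH = {"YO": "yo", "IR": "voy a", "CASA": "casa"}  # toy lexicon


def translate_signs(sign_sequence: List[str]) -> str:
    words = [SIGN_TO_SPANISH.get(sign, sign.lower()) for sign in sign_sequence]
    return " ".join(words)


def synthesize(text: str) -> None:
    print(f"[TTS] {text}")  # placeholder for a real text-to-speech back end


synthesize(translate_signs(["YO", "IR", "CASA"]))  # -> [TTS] yo voy a casa
```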

2008
Oya Aran Thomas Burger Lale Akarun Alice Caplier

Recent research in Human-Computer Interaction (HCI) has focused on equipping machines with means of communication that are used between humans, such as speech and accompanying gestures. For the hearing impaired, the visual components of speech, such as lip movements, or gestural languages such as sign language are available means of communication. This has led researchers to focus on lip read...

1999
Eun-Jung Holden Robyn Owens Geoffrey G. Roy

The Hand Motion Understanding (HMU) system is a vision-based Australian sign language recognition system that recognises static and dynamic hand signs. It uses a visual hand tracker to extract 3D hand configuration data from a visual motion sequence, and a classifier that recognises the changes of these 3D kinematic data as a sign. This paper presents the HMU classifier that uses an adaptive fu...
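The classification step (mapping a sequence of 3D hand-configuration vectors to a sign label) can be illustrated with a generic nearest-template matcher. This is an assumed stand-in for illustration only, not the adaptive classifier the HMU papers describe.

```python
# Nearest-template sketch: a sign trajectory is a list of 3D configuration
# vectors; the predicted sign is the template with the smallest mean per-frame
# Euclidean distance. The template data below is made up.
import math
from typing import Dict, List, Sequence

Vector = Sequence[float]


def trajectory_distance(a: List[Vector], b: List[Vector]) -> float:
    """Mean per-frame Euclidean distance over the overlapping frames."""
    n = min(len(a), len(b))
    return sum(math.dist(a[i], b[i]) for i in range(n)) / n


def classify(trajectory: List[Vector], templates: Dict[str, List[Vector]]) -> str:
    return min(templates, key=lambda sign: trajectory_distance(trajectory, templates[sign]))


templates = {
    "hello": [(0.0, 0.0, 0.0), (0.1, 0.2, 0.0)],
    "thanks": [(0.5, 0.5, 0.5), (0.6, 0.5, 0.4)],
}
print(classify([(0.05, 0.1, 0.0), (0.1, 0.25, 0.0)], templates))  # -> hello
```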
