Search results for: visual synchrony

Number of results: 367165

2017
Naomi Gotow, Tatsu Kobayakawa

Vision is a physical sense, whereas olfaction and gustation are chemical senses. Active sensing might function in vision, olfaction, and gustation, whereas passive sensing might function in vision and olfaction but not gustation. To investigate whether each sensory property affected synchrony perception, participants in this study performed simultaneity judgment (SJ) for three cross-modal combi...

2018
You Zhai, Jian Zhai

This paper uses a newly defined functional connectome and connectome values calculated in time domain of simulated neurotransmitter release (NTR) from an electrocorticogram (ECoG) to distinguish between conditioned and unconditioned stimuli. The NTR derived from multiple channels releasing one quantum at the same time suggests that one functional connectome occurs across those channels at that ...

2012
Yoshimori Sugano, Mirjam Keetels, Jean Vroomen

The timing relation between a motor action and the sensory consequences of that action can be adapted by exposing participants to artificially delayed feedback (temporal recalibration). Here, we demonstrate that a sensorimotor synchronization task (i.e., tapping the index finger in synchrony with a pacing signal) can be used as a measure of temporal recalibration. Participants were first expose...

2015
Etienne Marcheret, Gerasimos Potamianos, Josef Vopicka, Vaibhava Goel

In this paper, we address the problem of automatically detecting whether the audio and visual speech modalities in frontal pose videos are synchronous or not. This is of interest in a wide range of applications, for example spoof detection in biometrics, lip-syncing, speaker detection and diarization in multi-subject videos, and video data quality assurance. In our adopted approach, we investig...

2011
László Czap

The temporal synchrony of auditory and visual signals is known to affect the perception of audiovisual speech. Several papers have discussed the asymmetry of acoustic and visual timing cues. These results are usually based on subjective intelligibility tests, and the reason for the asymmetry remains obscure. It is not clear whether the observation is of perceptual or production origin. In this paper the effect of a...

2007
Michael Pilling, Sharon M. Thomas

Recent research has shown that concurrent visual speech modulates the cortical event-related potential N1/P2 to auditory speech. Audiovisually presented speech results in an N1/P2 that is reduced in peak amplitude, with shorter peak latencies, compared to unimodal auditory speech [11]. This effect on the N1/P2 is consistent with a model in which visual speech integrates with auditory speech at an ea...

2004
Brianna L. Conrey, David B. Pisoni, Luis Hernandez

Two experiments were conducted to examine the temporal limitations on the detection of asynchrony in auditory-visual (AV) signals. Each participant made asynchrony judgments about speech and nonspeech signals presented over an 800-ms range of AV onset asynchronies. Consistent with previous findings, all conditions revealed a wide window of several hundred milliseconds over which AV signals were...

2015
Kishore Reddy Konda, Roland Memisevic

We present an approach to predicting velocity and direction changes from visual information ("visual odometry") using an end-to-end, deep learning-based architecture. The architecture uses a single type of computational module and learning rule to extract visual motion, depth, and finally odometry information from the raw data. Representations of depth and motion are extracted by detecting sync...
