Search results for: word in noise training
Number of results: 17,060,157
The convolutional neural network is an effective method for classifying images, performing learning through convolutional, pooling, and fully-connected layers. All kinds of noise disrupt the operation of this network: noisy images reduce classification accuracy and increase convolutional neural network training time. Noise is an unwanted signal that corrupts the original signal. Noise chang...
Training the phrase table by force-aligning (FA) the training data with the reference translation has been shown to improve phrasal translation quality while significantly reducing the phrase table size on medium-sized tasks. We apply this procedure to several large-scale tasks, with the primary goal of reducing model sizes without sacrificing translation quality. To deal with the noise in ...
Speech misperceptions provide a window into the processes underlying spoken language comprehension. One approach shown to catalyse robust misperceptions is to embed words in noise. However, the use of masking noise makes it difficult to measure the relative contributions of low-level auditory processing and higher-level factors which involve the deployment of linguistic experience. The current ...
Access to large samples of listeners is an appealing prospect for speech perception researchers, but lack of control over key factors such as listeners’ linguistic backgrounds and quality of stimulus delivery is a formidable barrier to the application of crowdsourcing. We describe the outcome of a web-based listening experiment designed to discover consistent confusions amongst words presented ...
Listeners make mistakes when communicating under adverse conditions, with overall error rates reasonably well-predicted by existing speech intelligibility metrics. However, a detailed examination of confusions made by a majority of listeners is more likely to provide insights into processes of normal word recognition. The current study measured the rate at which robust misperceptions occurred f...
Word embeddings have been demonstrated to benefit NLP tasks impressively. Yet, there is room for improvement in the vector representations, because current word embeddings typically contain unnecessary information, i.e., noise. We propose two novel models to improve word embeddings by unsupervised learning, in order to yield word denoising embeddings. The word denoising embeddings are obtained ...
This paper presents an evaluation of the RWTH large vocabulary speech recognition system on the Aurora 4 noisy Wall Street Journal database. First, the influence of different root functions replacing the logarithm in the feature extraction is studied. Then quantile based histogram equalization is applied, a parametric method to increase the noise robustness by reducing the mismatch between the ...
OBJECTIVE The objective of this study was to evaluate the effectiveness of a training program for hearing-impaired listeners to improve their speech-recognition performance within a background noise when listening to amplified speech. Both noise-masked young normal-hearing listeners, used to model the performance of elderly hearing-impaired listeners, and a group of elderly hearing-impaired lis...
2 Method 7 · 2.1 Speaker Recognition as Binary Detection 7 · 2.2 Feature Extraction 8 · 2.3 Word Extraction 9 · 2.3.1 Word selection 9 · 2.3.2 Forced Alignment Word Identification 10 · 2.3.3 ASR Word Identification ...
This thesis deals with the well-known notion of the Baer-invariant of groups, which is a generalization of the Schur multiplier of groups. In Chapter Two, Section 2.1, we present an explicit formula for the Baer-invariant of a direct product of cyclic groups with respect to N_c, c>1. Also, in Section 2.2, we calculate the Baer-invariant of a nilpotent product of cyclic groups with resp...