Comparison of input and feature space nonlinear kernel nuisance attribute projections for speaker verification
Authors
Abstract
Nuisance attribute projection (NAP) is an effective method for reducing session variability in SVM-based speaker verification systems. Because the expanded feature space of a nonlinear kernel is usually high- or infinite-dimensional, it is difficult to find the nuisance directions via conventional eigenvalue analysis and to perform the projection directly in the feature space. In this paper, two different approaches to nonlinear kernel NAP are investigated and compared. In the first, the NAP projection is formulated in the expanded feature space and kernel PCA is employed for the kernel eigenvalue analysis. In the second, a gradient descent algorithm is proposed to find the projection over the input variables. Experimental results on the 2006 NIST SRE corpus show that both kinds of NAP reduce unwanted variability in nonlinear kernels and improve verification performance, and that NAP performed in the expanded feature space using kernel PCA obtains slightly better performance than NAP over the input variables.
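The core idea shared by both approaches can be illustrated with the plain linear form of NAP: estimate the dominant within-speaker (session) directions and project them out of each supervector. The sketch below uses synthetic data; the dimensions, the number of speakers, and the choice of two nuisance directions are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

# Hypothetical data: each row is a session supervector, grouped by speaker.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 10))        # 40 sessions, 10-dim supervectors
speakers = np.repeat(np.arange(8), 5)    # 8 speakers, 5 sessions each

# Within-speaker (session) scatter: the variability NAP tries to remove.
W = np.zeros((10, 10))
for s in np.unique(speakers):
    Xs = X[speakers == s]
    Xc = Xs - Xs.mean(axis=0)
    W += Xc.T @ Xc

# Nuisance directions = leading eigenvectors of the within-speaker scatter.
eigvals, eigvecs = np.linalg.eigh(W)     # eigh returns ascending eigenvalues
U = eigvecs[:, -2:]                      # keep the 2 largest as nuisance dims

# NAP projection P = I - U U^T, applied to every supervector.
P = np.eye(10) - U @ U.T
X_nap = X @ P.T                          # projected supervectors

# After projection, the data has no component along the nuisance directions.
```

The paper's contribution is doing this when the kernel is nonlinear, where the eigenvalue analysis above cannot be run directly in the (possibly infinite-dimensional) feature space.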
Similar papers
Linear and non linear kernel GMM supervector machines for speaker verification
This paper presents a comparison between Support Vector Machine (SVM) speaker verification systems based on linear and nonlinear kernels defined in the GMM supervector space. We describe how these kernel functions are related and show how the nuisance attribute projection (NAP) technique can be used with both kernels to deal with the session variability problem. We demonstrate the imp...
Full text
On the Use of Non-Linear Polynomial Kernel SVMs in Language Recognition
Reduced-dimensional supervector representations have been shown to outperform their supervector counterparts in a variety of speaker recognition tasks. They have been exploited in automatic language verification (ALV) tasks as well but, to the best of our knowledge, their performance there is only comparable with that of their supervector counterparts. This paper demonstrates that nonlinear polynomial kernel support ve...
Full text
Variability compensated support vector machines applied to speaker verification
Speaker verification using SVMs has proven successful, specifically using the GSV Kernel [1] with nuisance attribute projection (NAP) [2]. Also, the recent popularity and success of joint factor analysis [3] has led to promising attempts to use speaker factors directly as SVM features [4]. NAP projection and the use of speaker factors with SVMs are methods of handling variability in SVM speaker...
Full text
Analysis of subspace within-class covariance normalization for SVM-based speaker verification
Nuisance attribute projection (NAP) and within-class covariance normalization (WCCN) are two effective techniques for intersession variability compensation in SVM-based speaker verification systems. However, normalizing or removing the nuisance subspace that contains the session variability does not guarantee a larger distance between speakers. In this paper, we investigated the probabilit...
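For contrast with NAP, WCCN can be sketched as whitening the within-class covariance rather than discarding a nuisance subspace outright. The sketch below uses synthetic labeled vectors; the dimensions and class structure are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Hypothetical labeled vectors: 8 classes (speakers), 5 samples each.
rng = np.random.default_rng(2)
X = rng.standard_normal((40, 6))
labels = np.repeat(np.arange(8), 5)

# Within-class covariance, averaged over classes.
classes = np.unique(labels)
W = np.zeros((6, 6))
for c in classes:
    Xc = X[labels == c] - X[labels == c].mean(axis=0)
    W += Xc.T @ Xc / len(Xc)
W /= len(classes)

# WCCN transform: A with A A^T = W^{-1} (Cholesky factor of the inverse).
A = np.linalg.cholesky(np.linalg.inv(W))
X_wccn = X @ A   # transformed vectors; within-class covariance becomes identity
```

Applying `A` whitens the within-class scatter (A^T W A = I), which softens nuisance directions instead of removing them as NAP's hard projection does.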
Full text
Kernel Principal Components Are Maximum Entropy Projections
Principal Component Analysis (PCA) is a well-known statistical tool. Kernel PCA is a nonlinear extension of PCA based on the kernel paradigm. In this paper we characterize the projections found by Kernel PCA from an information-theoretic perspective. We prove that Kernel PCA provides optimum entropy projections in the input space when the Gaussian kernel is used for the mapping and a sample...
Full text
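The kernel PCA machinery that recurs in the abstracts above can be sketched in a few lines: build a Gaussian kernel matrix, center it in feature space, and take its top eigenvectors. This is a generic sketch on synthetic data; the kernel width, the data shape, and the choice of three retained components are illustrative assumptions.

```python
import numpy as np

# Synthetic data for the sketch.
rng = np.random.default_rng(1)
X = rng.standard_normal((30, 5))
sigma = 1.0

# Gaussian kernel matrix: K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / (2 * sigma ** 2))

# Center the kernel in feature space: Kc = (I - J/n) K (I - J/n).
n = K.shape[0]
one = np.full((n, n), 1.0 / n)
Kc = K - one @ K - K @ one + one @ K @ one

# Eigendecomposition of the centered kernel; keep the top 3 components.
vals, vecs = np.linalg.eigh(Kc)          # ascending order
vals, vecs = vals[::-1], vecs[:, ::-1]   # descending order
alphas = vecs[:, :3] / np.sqrt(np.maximum(vals[:3], 1e-12))

# Kernel-PCA projections of the training points onto the top 3 components.
Z = Kc @ alphas
```

This is the same eigenvalue analysis the main paper reuses to define NAP in the expanded feature space: the nuisance subspace is spanned by leading kernel principal components of the session variability.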