Kernel Mean Estimation via Spectral Filtering: Supplementary Material
Authors
Abstract
This note contains supplementary materials to Kernel Mean Estimation via Spectral Filtering.

1 Proof of Theorem 1

(i) Since $\check{\mu}_\lambda = \frac{\hat{\mu}_{\mathbb{P}}}{\lambda+1}$, we have
$$\|\check{\mu}_\lambda - \mu_{\mathbb{P}}\| = \left\|\frac{\hat{\mu}_{\mathbb{P}}}{\lambda+1} - \mu_{\mathbb{P}}\right\| \le \left\|\frac{\hat{\mu}_{\mathbb{P}}}{\lambda+1} - \frac{\mu_{\mathbb{P}}}{\lambda+1}\right\| + \left\|\frac{\mu_{\mathbb{P}}}{\lambda+1} - \mu_{\mathbb{P}}\right\| \le \|\hat{\mu}_{\mathbb{P}} - \mu_{\mathbb{P}}\| + \lambda\|\mu_{\mathbb{P}}\|.$$
From [1], we have that $\|\hat{\mu}_{\mathbb{P}} - \mu_{\mathbb{P}}\| = O_{\mathbb{P}}(n^{-1/2})$, and therefore the result follows.

(ii) Define $\Delta := \mathbb{E}_{\mathbb{P}}\|\hat{\mu}_{\mathbb{P}} - \mu_{\mathbb{P}}\|^2 = \frac{\int k(x,x)\,d\mathbb{P}(x) - \|\mu_{\mathbb{P}}\|^2}{n}$. With $\lambda = cn^{-\beta}$, consider
$$\mathbb{E}_{\mathbb{P}}\|\check{\mu}_\lambda - \mu_{\mathbb{P}}\|^2 - \Delta = \mathbb{E}_{\mathbb{P}}\left\|\frac{n^\beta}{n^\beta + c}\,(\hat{\mu}_{\mathbb{P}} - \mu_{\mathbb{P}}) - \frac{c}{n^\beta + c}\,\mu_{\mathbb{P}}\right\|^2 - \Delta = \left(\frac{n^\beta}{n^\beta + c}\right)^{\!2}\Delta + \frac{c^2}{(n^\beta + c)^2}\,\|\mu_{\mathbb{P}}\|^2 - \Delta = \frac{c^2\|\mu_{\mathbb{P}}\|^2 - (c^2 + 2cn^\beta)\Delta}{(n^\beta + c)^2},$$
where the cross term vanishes since $\mathbb{E}_{\mathbb{P}}[\hat{\mu}_{\mathbb{P}} - \mu_{\mathbb{P}}] = 0$. Substituting for $\Delta$ in the r.h.s. of the above equation, we have
$$\mathbb{E}_{\mathbb{P}}\|\check{\mu}_\lambda - \mu_{\mathbb{P}}\|^2 - \Delta = \frac{(nc^2 + c^2 + 2cn^\beta)\|\mu_{\mathbb{P}}\|^2 - (c^2 + 2cn^\beta)\int k(x,x)\,d\mathbb{P}(x)}{n(n^\beta + c)^2}.$$
It is easy to verify that $\mathbb{E}_{\mathbb{P}}\|\check{\mu}_\lambda - \mu_{\mathbb{P}}\|^2 - \Delta < 0$ if
$$\frac{\|\mu_{\mathbb{P}}\|^2}{\int k(x,x)\,d\mathbb{P}(x)} < \inf_n \frac{c^2 + 2cn^\beta}{nc^2 + c^2 + 2cn^\beta} = \frac{2^{1/\beta}\beta}{2^{1/\beta}\beta + c^{1/\beta}(\beta-1)^{(\beta-1)/\beta}} =: A.$$

Remark. If $k(x,y) = \langle x, y\rangle$, then it is easy to check that $\mathcal{P}_{c,\beta} = \left\{\mathbb{P} \in M^1_+(\mathbb{R}^d) : \frac{\|\theta\|_2^2}{\mathrm{trace}(\Sigma)} < \frac{A}{1-A}\right\}$, where $\theta$ and $\Sigma$ represent the mean vector and covariance matrix of $\mathbb{P}$. Note that this choice of kernel yields a setting similar to classical James–Stein estimation, wherein for all $n$ and all $\mathbb{P} \in \mathcal{P}_{c,\beta} \cap \mathcal{N}_{\theta,\sigma} = \{\mathbb{P} \in \mathcal{N}_{\theta,\sigma} : \|\theta\|_2 < \sigma\sqrt{dA/(1-A)}\}$, $\check{\mu}_\lambda$ improves upon the empirical estimator $\hat{\mu}_{\mathbb{P}}$ for any $d$, where $\mathcal{N}_{\theta,\sigma} := \{\mathbb{P} \in M^1_+(\mathbb{R}^d) : d\mathbb{P}(x) = (2\pi\sigma^2)^{-d/2} e^{-\frac{\|x-\theta\|_2^2}{2\sigma^2}}\,dx,\ \theta \in \mathbb{R}^d,\ \sigma > 0\}$. On the other hand, the James–Stein estimator improves upon the sample mean only for $d \ge 3$, but for any $\mathbb{P} \in \mathcal{N}_{\theta,\sigma}$.
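As a quick numerical illustration of part (ii), the following is a minimal Monte Carlo sketch (not part of the original supplementary) for the linear-kernel setting of the Remark above: it draws samples from a Gaussian $\mathcal{N}(\theta, \sigma^2 I_d)$, forms the empirical embedding $\hat{\mu}_{\mathbb{P}}$ (the sample mean) and the shrinkage estimator $\check{\mu}_\lambda = \hat{\mu}_{\mathbb{P}}/(1+\lambda)$ with $\lambda = cn^{-\beta}$, and compares Monte Carlo estimates of the two risks. All parameter values ($d$, $n$, $c$, $\beta$, $\theta$, $\sigma$, the number of trials) are illustrative choices, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (our own choices): P = N(theta, sigma^2 I_d) on R^d with the linear
# kernel k(x, y) = <x, y>, so mu_P = theta and the empirical embedding is the sample mean.
d, n = 5, 20
sigma = 1.0
c, beta = 1.0, 1.5                     # shrinkage schedule lambda_n = c * n**(-beta)
lam = c * n ** (-beta)

# Constant A from part (ii): 2^(1/beta)*beta / (2^(1/beta)*beta + c^(1/beta)*(beta-1)^((beta-1)/beta)).
A = (2 ** (1 / beta) * beta) / (
    2 ** (1 / beta) * beta + c ** (1 / beta) * (beta - 1) ** ((beta - 1) / beta)
)

# For the linear kernel, ||mu_P||^2 / int k(x,x) dP(x) < A is equivalent to
# ||theta||^2 / trace(Sigma) < A / (1 - A), with trace(Sigma) = d * sigma^2 here.
theta = np.full(d, 0.1)
assert theta @ theta / (d * sigma ** 2) < A / (1 - A), "condition of Theorem 1(ii) violated"

# Monte Carlo estimates of E||mu_hat - mu_P||^2 and E||mu_check - mu_P||^2.
n_trials = 50_000
X = rng.normal(loc=theta, scale=sigma, size=(n_trials, n, d))
mu_hat = X.mean(axis=1)                # empirical kernel mean (= sample mean for the linear kernel)
mu_check = mu_hat / (1.0 + lam)        # shrinkage estimator mu_check_lambda = mu_hat / (1 + lambda)

risk_hat = np.mean(np.sum((mu_hat - theta) ** 2, axis=1))
risk_check = np.mean(np.sum((mu_check - theta) ** 2, axis=1))
print(f"risk of mu_hat   ~ {risk_hat:.5f}  (Delta = trace(Sigma)/n = {d * sigma ** 2 / n:.5f})")
print(f"risk of mu_check ~ {risk_check:.5f}  (should be smaller when the condition holds)")

With these settings $A = 3/4$, so $\|\theta\|_2^2/\mathrm{trace}(\Sigma) = 0.01 < A/(1-A) = 3$ and the estimated risk of $\check{\mu}_\lambda$ should come out slightly below $\Delta = \mathrm{trace}(\Sigma)/n$, in line with the calculation above.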
Similar resources
Kernel Mean Estimation via Spectral Filtering
The problem of estimating the kernel mean in a reproducing kernel Hilbert space (RKHS) is central to kernel methods in that it is used by classical approaches (e.g., when centering a kernel PCA matrix), and it also forms the core inference step of modern kernel methods (e.g., kernel-based non-parametric tests) that rely on embedding probability distributions in RKHSs. Previous work [1] has show...
Optimal kernels for nonstationary spectral estimation
Current theories of a time-varying spectrum of a nonstationary process all involve, either by definition or by difficulties in estimation, an assumption that the signal statistics vary slowly over time. This restrictive quasi-stationarity assumption limits the use of existing estimation techniques to a small class of nonstationary processes. We overcome this limitation by deriving a statistically ...
Singing Voice Separation from Monaural Music Based on Kernel Back-Fitting Using Beta-Order Spectral Amplitude Estimation
Separating the leading singing voice from the musical background from a monaural recording is a challenging task that appears naturally in several music processing applications. Recently, kernel additive modeling with generalized spatial Wiener filtering (GW) was presented for music/voice separation. In this paper, an adaptive auditory filtering based on β-order minimum mean-square error spectr...
Mixture Proportion Estimation via Kernel Embeddings of Distributions
Mixture Proportion Estimation via Kernel Embeddings of Distributions: Supplementary Material. A. Proof of Propositions 1, 2, 3 and 4. Proposition. $d(\lambda) = 0,\ \forall \lambda \in [0, \lambda^*]$; $\hat{d}(\lambda) = 0,\ \forall \lambda \in [0, 1]$. Proof. The second equality is obvious and follows from convexity of $\mathcal{C}_S$ and the fact that both $\hat{F}$ and $\hat{H}$ are in $\mathcal{C}_S$. The first statement is due to the following. Let $\lambda \in [0, \lambda^*]$; then we have that $d(\lambda) = \inf_{w \in ...}$
Speech Enhancement Using Gaussian Mixture Models, Explicit Bayesian Estimation and Wiener Filtering
Gaussian Mixture Models (GMMs) of power spectral densities of speech and noise are used with explicit Bayesian estimations in Wiener filtering of noisy speech. No assumption is made on the nature or stationarity of the noise. No voice activity detection (VAD) or any other means is employed to estimate the input SNR. The GMM mean vectors are used to form sets of over-determined system of equatio...
Journal title:
Volume, issue:
Pages: -
Publication date: 2014