Estimating Mixture of Gaussian Processes by Kernel Smoothing


Related articles

Estimating Mixture of Gaussian Processes by Kernel Smoothing.

When the functional data are not homogeneous, e.g., there exist multiple classes of functional curves in the dataset, traditional estimation methods may fail. In this paper, we propose a new estimation procedure for the Mixture of Gaussian Processes, to incorporate both functional and inhomogeneous properties of the data. Our method can be viewed as a natural extension of high-dimensional norma...
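
As a rough sketch of the general idea only (not the authors' actual procedure), the code below fits a two-component mixture of Gaussian process mean curves with an EM-style loop in which each component mean is re-estimated by responsibility-weighted Nadaraya-Watson kernel smoothing. The bandwidth `h`, the component count, and the simulated curves are all illustrative assumptions.

```python
# EM-style sketch for a mixture of Gaussian processes: responsibilities from the
# E-step weight a Nadaraya-Watson kernel smoother that re-estimates each
# component's mean curve in the M-step. Data, bandwidth, and component count
# are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def nw_smooth(t_grid, t_obs, y_obs, weights, h):
    """Kernel-weighted Nadaraya-Watson estimate of a mean curve on t_grid."""
    K = np.exp(-0.5 * ((t_grid[:, None] - t_obs[None, :]) / h) ** 2)
    W = K * weights[None, :]
    return (W @ y_obs) / np.clip(W.sum(axis=1), 1e-12, None)

# Simulate two classes of curves observed on a common grid.
n_curves, n_t, n_comp = 40, 30, 2
t = np.linspace(0, 1, n_t)
labels = rng.integers(0, n_comp, n_curves)
true_means = np.vstack([np.sin(2 * np.pi * t), 1.5 * t])
Y = true_means[labels] + 0.2 * rng.standard_normal((n_curves, n_t))

props = np.full(n_comp, 1.0 / n_comp)                  # mixing proportions
mu = Y[rng.choice(n_curves, n_comp, replace=False)]    # initial mean curves
sigma2 = np.ones(n_comp)                               # per-component noise variance
h = 0.05                                               # smoothing bandwidth

for _ in range(50):
    # E-step: per-curve log-likelihood under each component (independent noise model).
    loglik = -0.5 * (((Y[:, None, :] - mu[None, :, :]) ** 2).sum(-1) / sigma2
                     + n_t * np.log(2 * np.pi * sigma2)) + np.log(props)
    loglik -= loglik.max(axis=1, keepdims=True)
    resp = np.exp(loglik)
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step: responsibilities weight both the smoothed means and the variances.
    props = resp.mean(axis=0)
    for k in range(n_comp):
        w = np.repeat(resp[:, k], n_t)                 # one weight per observation
        mu[k] = nw_smooth(t, np.tile(t, n_curves), Y.ravel(), w, h)
        sigma2[k] = (resp[:, k] @ ((Y - mu[k]) ** 2).mean(axis=1)) / resp[:, k].sum()

print("estimated mixing proportions:", np.round(props, 2))
```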


Gaussian Kernel Smoothing

where K is the kernel of the integral. Given the input signal X, Y represents the output signal. The smoothness of the output depends on the smoothness of the kernel. We assume the kernel to be unimodal and isotropic. When the kernel is isotropic, it has radial symmetry and should be invariant under rotation. So it has the form K(t, s) = f(‖t − s‖) for some smooth function f. Since the kernel ...
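
A minimal sketch of this operation, assuming an evenly spaced grid and a Gaussian choice of f: the discretized kernel matrix built from K(t, s) = f(‖t − s‖) is applied to the input signal X to produce the smoothed output Y. The bandwidth `h` and the test signal are illustrative.

```python
# Discrete Gaussian kernel smoothing: Y = K X with K(t, s) = f(|t - s|), f Gaussian.
import numpy as np

def gaussian_smooth(t, x, h):
    """Smooth the signal x(t) with an isotropic Gaussian kernel of bandwidth h."""
    K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    K /= K.sum(axis=1, keepdims=True)   # normalize rows so each output is a weighted average
    return K @ x                        # discretized version of the smoothing integral

t = np.linspace(0, 1, 200)
x = np.sin(2 * np.pi * t) + 0.3 * np.random.default_rng(1).standard_normal(t.size)
y = gaussian_smooth(t, x, h=0.03)       # y is a smoothed version of x
print(y.shape)                          # (200,)
```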


Multi-Kernel Gaussian Processes

Although Gaussian process inference is usually formulated for a single output, in many machine learning problems the objective is to infer multiple tasks jointly, possibly exploring the dependencies between them to improve results. Real world examples of this problem include ore mining where the objective is to infer the concentration of several chemical components to assess the ore quality. Si...
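
For intuition only (an assumed setup, not this paper's construction), the sketch below performs joint GP inference over two outputs with an intrinsic-coregionalization covariance B[t_i, t_j] * k(x_i, x_j), so the densely observed task helps predict the sparsely observed one. The task covariance B, the kernel length scale, and the data are illustrative.

```python
# Joint GP prediction for two correlated outputs via a coregionalization covariance.
import numpy as np

def rbf(a, b, ell=0.3):
    """Squared-exponential kernel between two sets of scalar inputs."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

# Task 1 is densely observed, task 2 sparsely; both are illustrative signals.
x1 = np.linspace(0, 1, 15); y1 = np.sin(2 * np.pi * x1)
x2 = np.linspace(0, 1, 5);  y2 = 0.8 * np.sin(2 * np.pi * x2) + 0.1
X = np.concatenate([x1, x2])
y = np.concatenate([y1, y2])
task = np.array([0] * x1.size + [1] * x2.size)

B = np.array([[1.0, 0.9], [0.9, 1.0]])   # assumed task (output) covariance
K = B[task[:, None], task[None, :]] * rbf(X, X) + 1e-4 * np.eye(X.size)

# Predict task 2 on a dense grid, borrowing strength from task 1's observations.
xs = np.linspace(0, 1, 100)
Ks = B[np.ones(xs.size, dtype=int)[:, None], task[None, :]] * rbf(xs, X)
mean_task2 = Ks @ np.linalg.solve(K, y)
print(mean_task2.shape)                  # (100,)
```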


Estimating Semiparametric ARCH(∞) Models by Kernel Smoothing Methods

We investigate a class of semiparametric ARCH(∞) models that includes as a special case the partially nonparametric (PNP) model introduced by Engle and Ng (1993) and which allows for both flexible dynamics and flexible functional form with regard to the 'news impact' function. We propose an estimation method that is based on kernel smoothing and profiled likelihood. We establish the distribution ...
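
As an illustration of the model structure only (not the paper's profiled-likelihood estimator), the sketch below evaluates a truncated ARCH(∞) volatility recursion in which the 'news impact' function m is estimated nonparametrically with a Nadaraya-Watson smoother. The geometric weights psi_j = theta**j, the truncation lag, and the simulated series are assumptions.

```python
# Semiparametric ARCH(inf) structure: sigma_t^2 = sum_j psi_j(theta) * m(y_{t-j}),
# with the news-impact function m left nonparametric (kernel-smoothed here).
import numpy as np

rng = np.random.default_rng(0)
y = rng.standard_normal(500)                 # placeholder return series

def news_impact(u, y_lag, y_sq, h=0.3):
    """Nadaraya-Watson estimate of m(u) = E[y_t^2 | y_{t-1} = u]."""
    w = np.exp(-0.5 * ((u - y_lag) / h) ** 2)
    return np.sum(w * y_sq) / np.sum(w)

def sigma2(t, theta, y, m, p=20):
    """Truncated ARCH(inf) recursion with assumed geometric weights psi_j = theta**j."""
    lags = y[t - np.arange(1, p + 1)]        # y_{t-1}, ..., y_{t-p}
    psi = theta ** np.arange(1, p + 1)
    return np.sum(psi * np.array([m(u) for u in lags]))

m = lambda u: news_impact(u, y[:-1], y[1:] ** 2)
print(sigma2(100, theta=0.7, y=y, m=m))      # conditional variance at t = 100
```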


MiDGaP: Mixture Density Gaussian Processes

Gaussian Processes (GPs) have become a core technique in machine learning over the last decade, with numerous extensions and applications. Although several approaches exist for warping the conditional Gaussian posterior distribution to other members of the exponential family, most tacitly assume a unimodal posterior. In this paper we present a mixture density model (MDM) allowing multi-modal po...
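
To make the contrast with a unimodal Gaussian posterior concrete, the following sketch (an assumption about the general idea, not MiDGaP itself) evaluates a predictive density written as a weighted mixture of Gaussian components, which can be multi-modal. The weights, means, and standard deviations are illustrative.

```python
# A multi-modal predictive density as a mixture of Gaussian components.
import numpy as np

def mixture_density(y, weights, means, stds):
    """Evaluate p(y) = sum_k w_k * N(y; mu_k, s_k^2) on a grid of y values."""
    y = np.asarray(y)[:, None]
    comp = np.exp(-0.5 * ((y - means) / stds) ** 2) / (np.sqrt(2 * np.pi) * stds)
    return comp @ weights

y_grid = np.linspace(-4, 4, 401)
p = mixture_density(y_grid, weights=np.array([0.6, 0.4]),
                    means=np.array([-1.0, 2.0]), stds=np.array([0.5, 0.8]))
print(p.sum() * (y_grid[1] - y_grid[0]))   # ~1.0: a valid, bimodal density
```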



Journal

Journal title: SSRN Electronic Journal

Year: 2013

ISSN: 1556-5068

DOI: 10.2139/ssrn.2356155