Search results for: kernel trick
Number of results: 52726
Analysis of system security has become a major task for researchers. Intrusion detection plays a vital role in the security domain these days: Internet usage has increased enormously and, with it, the threat to system resources has also grown. Anomaly-based intrusions change their behaviour dynamically, so novel approaches are required to detect these types of intrusions....
The allure of a molecular dynamics simulation is that, given a sufficiently accurate force field, it can provide an atomic-level view of many interesting phenomena in biology. However, the result of a simulation is a large, high-dimensional time series that is difficult to interpret. Recent work has introduced the time-structure based Independent Components Analysis (tICA) method for analyzing ...
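As a rough illustration of the tICA procedure this abstract refers to (not the authors' implementation), the sketch below estimates slow collective coordinates by solving a generalized eigenvalue problem between the time-lagged and instantaneous covariance matrices of a feature time series; the synthetic trajectory and the lag time tau are illustrative assumptions.

# Minimal tICA sketch: slow modes from a generalized eigenproblem between
# the time-lagged covariance and the instantaneous covariance.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
traj = rng.standard_normal((5000, 10))          # stand-in for a feature time series
tau = 10                                        # assumed lag time (in frames)

X = traj - traj.mean(axis=0)                    # mean-centre the features
C0 = X.T @ X / len(X)                           # instantaneous covariance
Ctau = X[:-tau].T @ X[tau:] / (len(X) - tau)    # time-lagged covariance
Ctau = 0.5 * (Ctau + Ctau.T)                    # symmetrise

# Generalized eigenproblem C_tau v = lambda C0 v; large eigenvalues ~ slow modes
eigvals, eigvecs = eigh(Ctau, C0)
order = np.argsort(eigvals)[::-1]
tica_components = eigvecs[:, order[:2]]         # two slowest collective coordinates
projection = X @ tica_components                # low-dimensional view of the trajectory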
Non-linear kernel methods can be approximated by fast linear ones using suitable explicit feature maps, allowing their application to large-scale problems. To this end, explicit feature maps of kernels for vectorial data have been extensively studied. Since much real-world data is structured, various kernels for complex data such as graphs have been proposed. Indeed, many of them directly compute feat...
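A minimal sketch of the general idea of approximating a non-linear kernel with an explicit feature map, here random Fourier features for the RBF kernel (one of several possible constructions; the feature dimension, gamma, and the helper random_fourier_features are illustrative assumptions, not from the paper):

# Explicit random feature map whose inner products approximate the RBF kernel,
# so a fast linear model can stand in for a non-linear kernel machine.
import numpy as np

def random_fourier_features(X, n_features=200, gamma=1.0, seed=0):
    """Map X (n_samples, d) to z(X) such that z(x) . z(y) approximates
    the RBF kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = np.random.default_rng(1).standard_normal((5, 3))
Z = random_fourier_features(X)
approx_K = Z @ Z.T                               # approximates the exact RBF Gram matrix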
We extend the direct approach to semiclassical Bergman kernel asymptotics, developed recently in Deleporte et al. (Ann Fac Sci Toulouse Math, 2020) for real analytic exponential weights, to the smooth case. Similarly to (2020), our approach avoids the use of the Kuranishi trick, and it allows us to construct the amplitude of the asymptotic Bergman projection by means of an inversion of an explicit Fourier integral operator.
We present a class of algorithms for learning the structure of graphical models from data. The algorithms are based on a measure known as the kernel generalized variance (KGV), which essentially allows us to treat all variables on an equal footing as Gaussians in a feature space obtained from Mercer kernels. Thus we are able to learn hybrid graphs involving discrete and continuous variables of ...
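As a loose, two-variable illustration of a KGV-style dependence score (the exact construction and normalisation in the paper may differ; the kernel, the regulariser kappa, and the data below are assumptions), one can compare the log-determinant of a regularised joint Gram block matrix with that of its marginal blocks, analogously to the Gaussian mutual information -1/2 log det of a correlation matrix:

# KGV-style score: larger values indicate stronger dependence between x and y.
import numpy as np

def centred_rbf_gram(x, gamma=1.0):
    x = x.reshape(-1, 1)
    K = np.exp(-gamma * (x - x.T) ** 2)
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H                              # Gram matrix centred in feature space

rng = np.random.default_rng(6)
n, kappa = 100, 1e-2
x = rng.standard_normal(n)
y = x + 0.5 * rng.standard_normal(n)              # a dependent pair; independence gives a score near 0

K1, K2 = centred_rbf_gram(x), centred_rbf_gram(y)
A = (K1 + n * kappa / 2 * np.eye(n)) @ (K1 + n * kappa / 2 * np.eye(n))
B = (K2 + n * kappa / 2 * np.eye(n)) @ (K2 + n * kappa / 2 * np.eye(n))
joint = np.block([[A, K1 @ K2], [K2 @ K1, B]])

score = -0.5 * (np.linalg.slogdet(joint)[1]
                - np.linalg.slogdet(A)[1]
                - np.linalg.slogdet(B)[1])
print("KGV-style dependence score:", score)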
In this paper, a novel algorithm for feature extraction, named supervised kernel locally principal component analysis (SKLPCA), is proposed. SKLPCA is a non-linear, supervised subspace learning method, which maps the data into a potentially much higher-dimensional feature space via the kernel trick and preserves the geometric structure of the data according to prior class-label information. SKLPCA ...
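The supervised, locality-preserving part of SKLPCA is not reproduced here, but the kernel-trick step it builds on is essentially that of kernel PCA; a minimal sketch, with an assumed RBF kernel width and synthetic data, is:

# Kernel PCA: eigendecompose the centred Gram matrix instead of the covariance,
# which implicitly works in the high-dimensional feature space of the kernel.
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

X = np.random.default_rng(2).standard_normal((50, 4))
K = rbf_kernel(X)
n = len(K)
H = np.eye(n) - np.ones((n, n)) / n             # centring matrix
Kc = H @ K @ H                                  # centre the kernel in feature space
eigvals, eigvecs = np.linalg.eigh(Kc)
idx = np.argsort(eigvals)[::-1][:2]             # two leading components
alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
projection = Kc @ alphas                        # data projected onto kernel principal components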
Mika et al. [1] introduce a non-linear formulation of the Fisher discriminant based on the well-known "kernel trick", later shown to be equivalent to the Least-Squares Support Vector Machine [2, 3]. In this paper, we show that the cross-validation error can be computed very efficiently for this class of kernel machine; specifically, that leave-one-out cross-validation can be performed with a comput...
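A hedged sketch of the kind of closed-form leave-one-out computation this line of work exploits, shown for a bias-free kernel ridge regression / LS-SVM-style system (the formulation in the paper includes a bias term; the RBF kernel, labels, and regularisation value below are illustrative assumptions):

# For (K + lambda*I) alpha = y, the leave-one-out residual at point i is
# alpha_i / [(K + lambda*I)^{-1}]_{ii}, so all n held-out errors follow from
# a single matrix inverse rather than n retrainings.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

rng = np.random.default_rng(3)
X = rng.standard_normal((40, 2))
y = np.sign(X[:, 0] + 0.3 * rng.standard_normal(40))   # +/-1 targets, LS-SVM style

lam = 0.1
C_inv = np.linalg.inv(rbf_kernel(X) + lam * np.eye(len(X)))
alpha = C_inv @ y

loo_residuals = alpha / np.diag(C_inv)                  # y_i - f^{(-i)}(x_i) in closed form
loo_error = np.mean(np.sign(y - loo_residuals) != y)    # leave-one-out misclassification rate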
The cross-correlator plays a significant role in many visual perception tasks, such as object detection and tracking. Going beyond the linear cross-correlator, this paper proposes a kernel cross-correlator (KCC) that breaks traditional limitations. First, by introducing the kernel trick, the KCC extends linear cross-correlation to non-linear spaces, which is more robust to signal noise and distortions....
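A naive O(n^2) sketch of the basic idea of replacing the inner product in cross-correlation with a kernel evaluation (the actual KCC uses a more general and far more efficient formulation; the RBF kernel, signal, and shift below are made-up examples):

# Kernelised cross-correlation: the response at shift i is a kernel between the
# shifted template and the observed signal, rather than a linear inner product.
import numpy as np

def kernel_cross_correlation(x, z, gamma=0.5):
    """Response[i] = k(x cyclically shifted by i, z) with an RBF kernel."""
    n = len(x)
    response = np.empty(n)
    for i in range(n):
        response[i] = np.exp(-gamma * np.sum((np.roll(x, i) - z) ** 2))
    return response

x = np.sin(np.linspace(0, 4 * np.pi, 64))        # template signal
z = np.roll(x, 7) + 0.05 * np.random.default_rng(4).standard_normal(64)
resp = kernel_cross_correlation(x, z)
print("estimated shift:", np.argmax(resp))       # peak should sit near the true shift of 7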
The Kronecker product kernel provides the standard approach in the kernel methods literature for learning from graph data, where edges are labeled and both start and end vertices have their own feature representations. These methods allow generalization to new edges whose start and end vertices do not appear in the training data, a setting known as zero-shot or zero-data learning. Such a setti...
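A small sketch of the Kronecker product kernel idea for labeled edges, under assumed vertex features: the kernel between two edges is the product of a kernel on start vertices and a kernel on end vertices, so the edge Gram matrix is the Kronecker product of the two vertex Gram matrices, and edges with unseen vertices still receive a well-defined similarity.

# Product kernel over (start vertex, end vertex) pairs and its Kronecker structure.
import numpy as np

def rbf(a, b, gamma=1.0):
    return np.exp(-gamma * np.sum((a - b) ** 2))

def edge_kernel(edge1, edge2, gamma=1.0):
    (u1, v1), (u2, v2) = edge1, edge2
    return rbf(u1, u2, gamma) * rbf(v1, v2, gamma)   # Kronecker (product) structure

rng = np.random.default_rng(5)
start_feats = rng.standard_normal((4, 3))            # features of start vertices
end_feats = rng.standard_normal((4, 3))              # features of end vertices
edges = [(start_feats[i], end_feats[j]) for i in range(4) for j in range(4)]

# Gram matrix over all edges equals the Kronecker product of the two vertex Gram matrices
G = np.array([[edge_kernel(e1, e2) for e2 in edges] for e1 in edges])
Ku = np.array([[rbf(a, b) for b in start_feats] for a in start_feats])
Kv = np.array([[rbf(a, b) for b in end_feats] for a in end_feats])
assert np.allclose(G, np.kron(Ku, Kv))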