Search results for: reducing subspace
Number of results: 259,841
We prove, using the subspace embedding guarantee in a black box way, that one can achieve the spectral norm guarantee for approximate matrix multiplication with a dimensionality-reducing map having m = O(r̃/ε²) rows. Here r̃ is the maximum stable rank, i.e., the squared ratio of Frobenius and operator norms, of the two matrices being multiplied. This is a quantitative improvement over previous wo...
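The sketched matrix product this snippet refers to can be illustrated with a plain Gaussian sketch (a minimal numpy sketch; the paper's m = O(r̃/ε²) map may be a different, more structured embedding):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d1, d2, m = 2000, 30, 30, 200

A = rng.standard_normal((n, d1))
B = rng.standard_normal((n, d2))

# Gaussian sketch with m rows, scaled so that E[S.T @ S] = I_n,
# hence E[(S @ A).T @ (S @ B)] = A.T @ B.
S = rng.standard_normal((m, n)) / np.sqrt(m)

exact = A.T @ B
approx = (S @ A).T @ (S @ B)

# Spectral-norm error relative to the product of operator norms,
# which is the scale the spectral guarantee is stated in.
err = np.linalg.norm(exact - approx, 2) / (
    np.linalg.norm(A, 2) * np.linalg.norm(B, 2)
)
```

The error scale sqrt(r̃/m)·‖A‖·‖B‖ is why m need only grow with the stable rank, not the ambient dimension n.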
This paper demonstrates the use of Prior Subspace Analysis (PSA) as a method for transcribing drums in the presence of pitched instruments. PSA uses prior subspaces that represent the sources to be transcribed, overcoming some of the problems associated with other subspace methods such as Independent Subspace Analysis (ISA) or sub-band ISA. The use of prior knowledge results in improved robustness for ...
Proposition 1.3 (see [4, 10]). A closed subspace X of L²(ℝⁿ) is a reducing subspace if and only if X = { f ∈ L²(ℝⁿ) : supp(f̂) ⊆ S } (1.5) for some measurable set S ⊆ ℝⁿ with ãS = S. So, to be specific, one denotes a reducing subspace by L²(S)^∨ instead of X. In particular, L²(ℝⁿ) is a reducing subspace of L²(ℝⁿ). Definition 1.4 (see [5–7]). Let Bₙ be a subgroup of the integral affine grou...
We present low complexity, quickly converging robust adaptive beamformers that combine robust Capon beamformer (RCB) methods and data-adaptive Krylov subspace dimensionality reduction techniques. We extend a recently proposed reduced-dimension RCB framework, which ensures proper combination of RCBs with any form of dimensionality reduction that can be expressed using a full-rank dimension reduc...
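The data-adaptive Krylov-subspace dimension reduction this snippet combines with RCB can be sketched as follows (a minimal Arnoldi-style basis in numpy; the RCB details are omitted, and R and b below are hypothetical stand-ins for the sample covariance matrix and steering vector):

```python
import numpy as np

def krylov_basis(R, b, d):
    """Orthonormal basis Q of the Krylov subspace span{b, Rb, ..., R^(d-1) b}.

    Projecting the beamforming problem onto Q.T reduces its dimension
    from n to d while adapting the subspace to the data via R.
    """
    n = len(b)
    Q = np.zeros((n, d))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(1, d):
        v = R @ Q[:, j - 1]
        # Orthogonalize against previous columns (twice, for numerical safety).
        v -= Q[:, :j] @ (Q[:, :j].T @ v)
        v -= Q[:, :j] @ (Q[:, :j].T @ v)
        Q[:, j] = v / np.linalg.norm(v)
    return Q

rng = np.random.default_rng(3)
A = rng.standard_normal((8, 8))
R = A @ A.T + np.eye(8)      # stand-in for a sample covariance (SPD)
b = rng.standard_normal(8)   # stand-in for the steering vector
Q = krylov_basis(R, b, 3)    # 8x3 full-rank dimension-reduction matrix
```

Any such full-rank Q fits the reduced-dimension RCB framework the snippet describes.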
Many conventional statistical machine learning algorithms generalise poorly if distribution bias exists in the datasets. For example, distribution bias arises in the context of domain generalisation, where knowledge acquired from multiple source domains needs to be applied to a previously unseen target domain. We propose Elliptical Summary Randomisation (ESRand), an efficient domain generalisation...
This article introduces a tensor network subspace algorithm for the identification of specific polynomial state space models. The polynomial nonlinearity in the state space model is completely written in terms of a tensor network, thus avoiding the curse of dimensionality. We also prove how the block Hankel data matrices in the subspace method can be exactly represented by low rank tensor netwo...
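The block Hankel data matrices that such subspace identification methods start from can be built as follows (a minimal numpy illustration; this is the dense matrix, not the paper's low-rank tensor network representation of it):

```python
import numpy as np

def block_hankel(y, rows):
    """Stack delayed copies of an output sequence y (samples x channels)
    into a block Hankel matrix, as used in subspace identification."""
    y = np.atleast_2d(y)
    N, p = y.shape
    cols = N - rows + 1
    # Block row i holds y[i], y[i+1], ..., y[i+cols-1] (each a p-vector).
    return np.vstack([y[i:i + cols].T for i in range(rows)])

y = np.arange(6.0).reshape(6, 1)   # scalar output, 6 samples
H = block_hankel(y, rows=3)        # shape (3, 4), anti-diagonals constant
```

For a p-channel output the result has rows·p rows, which is exactly where the curse of dimensionality the paper targets comes from.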
Clustering is the task of grouping a set of objects so that objects in the same group (called a cluster) are more similar (in some sense) to each other than to those in other groups (clusters). The dimensionality of the data can be reduced using dimension-reduction techniques. Recently, new nonlinear methods were introduced for reducing the dimensionality of such data, called Locally Li...
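As a minimal illustration of the dimension-reduction step before clustering, a linear PCA projection via the SVD (the nonlinear methods the snippet mentions replace this linear map with a locally adaptive one):

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto its top-k principal directions."""
    Xc = X - X.mean(axis=0)                       # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                          # k-dimensional coordinates

rng = np.random.default_rng(1)
# Synthetic data: 100 points lying in a 10-D subspace of a 50-D ambient space.
X = rng.standard_normal((100, 10)) @ rng.standard_normal((10, 50))
Z = pca_reduce(X, 10)
```

Because the synthetic points lie exactly in a 10-D subspace, this projection preserves their pairwise distances, which is the property nonlinear methods aim for on curved data.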
The bio-molecular diagnosis of malignancies is a difficult learning task because of the high dimensionality and low cardinality of the data. Many supervised learning techniques, among them support vector machines, have been tried experimentally, also using feature selection methods to reduce the dimensionality of the data. As an alternative to feature selection methods, we proposed to apply rando...
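A minimal sketch of the kind of filter-style feature selection the snippet contrasts with (variance thresholding here is an illustrative choice, not the authors' method):

```python
import numpy as np

def top_variance_features(X, k):
    """Filter-style feature selection: keep the indices of the k
    features with the highest variance across samples."""
    idx = np.argsort(X.var(axis=0))[::-1][:k]
    return np.sort(idx)

rng = np.random.default_rng(2)
# Typical bio-molecular shape: few samples, many features.
X = rng.standard_normal((20, 1000))
X[:, :5] *= 10.0                 # inflate the variance of 5 features
keep = top_variance_features(X, 5)
X_reduced = X[:, keep]           # data restricted to selected features
```

Such filters keep a subset of the original coordinates, whereas the projection-based alternatives mix all features into new ones.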