Matrix spectral norm Wielandt inequalities with statistical applications
Authors
Abstract
Similar resources
Operator-valued extensions of matrix-norm inequalities
The bilinear inequality is derived from the linear one with the help of an operator-valued version of the Cauchy-Schwarz inequality. All these results, at least in their finite form, are obtained by simple and elegant methods well within the scope of a basic course on Hilbert spaces. (They can alternatively be obtained by tensor product techniques, but in the author’s view, these methods are les...
Some inequalities involving lower bounds of operators on weighted sequence spaces by a matrix norm
Let A = (a_{n,k})_{n,k≥1} and B = (b_{n,k})_{n,k≥1} be two non-negative matrices. Denote by L_{v,p,q,B}(A) the supremum of those L satisfying the following inequality: ‖Ax‖_{v,B(q)} ≥ L‖x‖_{v,B(p)}, where x ≥ 0, x ∈ l_p(v,B), and v = (v_n)_{n=1}^∞ is an increasing, non-negative sequence of real numbers. In this paper, we obtain a Hardy-type formula for L_{v,p,q,B}(H), where H is the Hausdorff matrix and 0 < q ≤ p ≤ 1. Also...
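As a toy numerical illustration of what this lower-bound constant measures (assuming B is the identity, v_n = 1, and p = q = 1, so that both norms reduce to the plain l1 norm; this reduction is my reading of the notation, not a claim of the paper), the best constant L for a non-negative matrix is its smallest column sum:

import numpy as np

# For A >= 0 and x >= 0 with the plain l1 norm, the largest L with
# ||Ax||_1 >= L * ||x||_1 is the smallest column sum of A (attained at a
# coordinate vector). This is a simplified stand-in for L_{v,p,q,B}(A).
A = np.array([[1.0, 2.0],
              [3.0, 0.5]])
analytic_L = A.sum(axis=0).min()                    # min column sum = 2.5

rng = np.random.default_rng(0)
ratios = [(A @ x).sum() / x.sum() for x in rng.random((10000, 2))]
print(analytic_L, min(ratios))                      # random search approaches 2.5 from above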
Spectral Norm of Random Kernel Matrices with Applications to Privacy
Kernel methods are an extremely popular set of techniques used for many important machine learning and data analysis applications. In addition to having good practical performance, these methods are supported by a well-developed theory. Kernel methods use an implicit mapping of the input data into a high dimensional feature space defined by a kernel function, i.e., a function returning the inne...
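As an illustration only, a small sketch of the object studied here: form a kernel (Gram) matrix from random data and compute its spectral norm. The RBF kernel, bandwidth, and data dimensions are assumptions of this sketch, not the construction analyzed in the cited paper.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))              # 200 random points in R^5

# Pairwise squared Euclidean distances, then an RBF (Gaussian) kernel matrix
sq = np.sum(X ** 2, axis=1)
D2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * (X @ X.T), 0.0)
K = np.exp(-0.5 * D2)                          # K[i, j] = exp(-||x_i - x_j||^2 / 2)

spectral_norm = np.linalg.norm(K, 2)           # largest singular value (= largest eigenvalue, K is PSD)
print("spectral norm of K:", spectral_norm)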
Spectral Methods for Matrix Rigidity with Applications
The rigidity of a matrix measures the number of entries that must be changed in order to reduce its rank below a certain value. The known lower bounds on the rigidity of explicit matrices are very weak. It is known that stronger lower bounds would have implications to complexity theory. We consider restricted variants of the rigidity problem over the complex numbers. Using spectral methods, we ...
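A toy example of this definition (my own illustration, not from the paper): the n×n identity matrix has full rank, and changing a single diagonal entry to zero already reduces the rank by one, so its rigidity for target rank n−1 is at most 1.

import numpy as np

n = 5
A = np.eye(n)
print(np.linalg.matrix_rank(A))   # full rank: 5

B = A.copy()
B[0, 0] = 0.0                     # change exactly one entry
print(np.linalg.matrix_rank(B))   # rank drops to 4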
Several Matrix Euclidean Norm Inequalities Involving Kantorovich Inequality
where λ1 ≥ · · · ≥ λn > 0 are the eigenvalues of A. It is a very useful tool for studying the inefficiency of the ordinary least-squares estimate with one regressor in the linear model. Watson [1] introduced the ratio of the variance of the best linear unbiased estimator to the variance of the ordinary least-squares estimator. A lower bound for this ratio was provided by the Kantorovich inequality 1....
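For context, the bound referred to here (in its classical form, a standard result rather than a quotation from this paper) states that for a symmetric positive definite A with eigenvalues λ1 ≥ · · · ≥ λn > 0 and any unit vector x,

\[
(x^{\top} A x)\,(x^{\top} A^{-1} x) \;\le\; \frac{(\lambda_1 + \lambda_n)^2}{4\,\lambda_1 \lambda_n},
\]

so the efficiency ratio above is bounded below by 4λ1λn/(λ1+λn)².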
Journal
Journal title: Journal of Inequalities and Applications
Year: 2014
ISSN: 1029-242X
DOI: 10.1186/1029-242x-2014-110