On the Worst-Case Approximability of Sparse PCA
Authors
Abstract
It is well known that Sparse PCA (Sparse Principal Component Analysis) is NP-hard to solve exactly on worst-case instances. What is the complexity of solving Sparse PCA approximately? Our contributions include: 1. a simple and efficient algorithm that achieves an n^{-1/3}-approximation; 2. NP-hardness of approximation to within (1 − ε), for some small constant ε > 0; 3. SSE-hardness of approximation to within any constant factor; and 4. an exp(exp(Ω(√(log log n)))) (“quasi-quasi-polynomial”) gap for the standard semidefinite program.
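For a concrete reference point, the sketch below (not taken from the paper) spells out the Sparse PCA objective and the standard SDP relaxation referred to in item 4. The use of the cvxpy modeling library, the brute-force baseline, and all function names are illustrative assumptions; in particular, the enumeration routine is not the paper's n^{-1/3}-approximation algorithm.

```python
# Minimal sketch (illustrative, not the paper's implementation) of
# Sparse PCA and its standard SDP relaxation:
#   OPT(A, k) = max { x^T A x : ||x||_2 = 1, ||x||_0 <= k }
#   SDP(A, k) = max { Tr(A X) : X PSD, Tr(X) = 1, sum_ij |X_ij| <= k }
# The SDP value upper-bounds OPT; the paper's item 4 concerns how large
# the gap between the two can be.
import numpy as np
import cvxpy as cp
from itertools import combinations

def sparse_pca_brute_force(A, k):
    """Exact OPT(A, k) by enumerating all k-subsets (exponential; tiny n only)."""
    n = A.shape[0]
    best = -np.inf
    for S in combinations(range(n), k):
        sub = A[np.ix_(S, S)]
        # Best unit vector supported on S attains the top eigenvalue of A_S.
        best = max(best, np.linalg.eigvalsh(sub)[-1])
    return best

def sparse_pca_sdp(A, k):
    """Value of the standard SDP relaxation (an upper bound on OPT(A, k))."""
    n = A.shape[0]
    X = cp.Variable((n, n), symmetric=True)
    constraints = [X >> 0, cp.trace(X) == 1, cp.sum(cp.abs(X)) <= k]
    prob = cp.Problem(cp.Maximize(cp.trace(A @ X)), constraints)
    prob.solve()
    return prob.value

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    B = rng.standard_normal((6, 6))
    A = (B + B.T) / 2          # symmetric input matrix
    k = 2
    print("OPT :", sparse_pca_brute_force(A, k))
    print("SDP :", sparse_pca_sdp(A, k))   # always >= OPT
```

On a small instance like this the two printed values let one inspect the relaxation gap directly; the paper's fourth contribution shows this gap can be as large as exp(exp(Ω(√(log log n)))) on worst-case inputs.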
Similar resources
On the Approximability of Sparse PCA
It is well known that Sparse PCA (Sparse Principal Component Analysis) is NP-hard to solve exactly on worst-case instances. What is the complexity of solving Sparse PCA approximately? Our contributions include: 1. a simple and efficient algorithm that achieves an n^{-1/3}-approximation; 2. NP-hardness of approximation to within (1 − ε), for some small constant ε > 0; 3. SSE-hardness of approximatio...
Online PCA with Optimal Regrets
We carefully investigate the online version of PCA, where in each trial a learning algorithm plays a k-dimensional subspace, and suffers the compression loss on the next instance when projected into the chosen subspace. In this setting, we give regret bounds for two popular online algorithms, Gradient Descent (GD) and Matrix Exponentiated Gradient (MEG). We show that both algorithms are essenti...
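As a side note on the protocol described in this entry, here is a minimal sketch of the compression loss, assuming the usual definition as the squared residual of the instance after projection onto the chosen k-dimensional subspace; the function and variable names are illustrative.

```python
# Illustrative sketch of the online-PCA compression loss
# (assumed definition: squared norm of x minus its projection onto the
# k-dimensional subspace chosen in that trial).
import numpy as np

def compression_loss(U, x):
    """U: (d, k) matrix with orthonormal columns spanning the chosen subspace;
    x: instance vector of dimension d. Returns ||x - U U^T x||^2."""
    residual = x - U @ (U.T @ x)
    return float(residual @ residual)

# One trial: pick a subspace, observe an instance, suffer the loss.
rng = np.random.default_rng(1)
d, k = 5, 2
U, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random orthonormal basis
x = rng.standard_normal(d)
print(compression_loss(U, x))
```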
Average case approximability of optimisation problems
This thesis combines average-case complexity theory with the approximability of optimisation problems. Both are ways of dealing with the fact that many computational problems are not solvable in polynomial time, unless P = NP. A theoretical framework is established that allows both the classification of optimisation problems with respect to their average-case approximability and the study of th...
Sum-of-Squares Lower Bounds for Sparse PCA
This paper establishes a statistical versus computational trade-off for solving a basic high-dimensional machine learning problem via a basic convex relaxation method. Specifically, we consider the Sparse Principal Component Analysis (Sparse PCA) problem, and the family of Sum-of-Squares (SoS, aka Lasserre/Parrilo) convex relaxations. It was well known that in large dimension p, a planted k-spa...
A New IRIS Segmentation Method Based on Sparse Representation
Iris recognition is one of the most reliable methods for identification. In general, it consists of image acquisition, iris segmentation, feature extraction and matching. Among them, iris segmentation has an important role in the performance of any iris recognition system. Nonlinear eye movement, occlusion, and specular reflection are the main challenges for any iris segmentation method. In thi...
Journal title: CoRR
Volume: abs/1507.05950
Issue: -
Pages: -
Publication year: 2015