Recovering the Optimal Solution by Dual Random Projection

Authors

  • Lijun Zhang
  • Mehrdad Mahdavi
  • Rong Jin
  • Tianbao Yang
  • Shenghuo Zhu
Abstract

Random projection has been widely used in data classification. It maps high-dimensional data into a low-dimensional subspace in order to reduce the computational cost of solving the related optimization problem. While previous studies have focused on analyzing the classification performance obtained with random projection, in this work we consider the recovery problem, i.e., how to accurately recover the optimal solution to the original optimization problem in the high-dimensional space from the solution learned in the subspace spanned by random projections. We present a simple algorithm, termed Dual Random Projection, that uses the dual solution of the low-dimensional optimization problem to recover the optimal solution to the original problem. Our theoretical analysis shows that, with high probability, the proposed algorithm accurately recovers the optimal solution to the original problem, provided that the data matrix is of low rank or can be well approximated by a low-rank matrix.
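
As a concrete illustration of the recovery step, the sketch below works out the dual-random-projection idea for the special case of L2-regularized squared loss (ridge regression), where both the low-dimensional solve and the dual variables have closed forms. The paper's analysis covers general convex classification losses; the loss choice, function name, and parameter names here are illustrative assumptions rather than the authors' exact formulation.

import numpy as np

def dual_random_projection_ridge(X, y, lam, m, seed=0):
    # Sketch of dual-random-projection recovery for L2-regularized squared loss.
    # X: (n, d) data matrix, y: (n,) targets, lam: regularization parameter,
    # m: projection dimension.  Names and loss are illustrative assumptions.
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # 1) Random projection R with i.i.d. N(0, 1/m) entries.
    R = rng.normal(scale=1.0 / np.sqrt(m), size=(d, m))
    X_low = X @ R                                  # projected data, shape (n, m)

    # 2) Solve the low-dimensional primal problem
    #        min_z  lam/2 ||z||^2 + 1/(2n) ||X_low z - y||^2,
    #    which has a closed-form solution.
    A = lam * np.eye(m) + (X_low.T @ X_low) / n
    z = np.linalg.solve(A, X_low.T @ y / n)

    # 3) Dual variables evaluated at the low-dimensional solution:
    #    alpha_i = loss'(x_low_i . z) = x_low_i . z - y_i for squared loss.
    alpha = X_low @ z - y

    # 4) Recover the high-dimensional solution through the dual-to-primal
    #    map, but using the ORIGINAL features: w = -(1/(lam*n)) * X^T alpha.
    return -(X.T @ alpha) / (lam * n)

A naive alternative would be to map the low-dimensional solution back as R z; the point of the dual route is that, when the data matrix is of (approximately) low rank, the vector recovered through the original features is provably close to the optimal high-dimensional solution.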

Similar papers

Sparse Learning for Large-Scale and High-Dimensional Data: A Randomized Convex-Concave Optimization Approach

In this paper, we develop a randomized algorithm and theory for learning a sparse model from large-scale and high-dimensional data, which is usually formulated as an empirical risk minimization problem with a sparsity-inducing regularizer. Under the assumption that there exists an (approximately) sparse solution with high classification accuracy, we argue that the dual solution is also sparse or...

Theory of Dual-sparse Regularized Randomized Reduction

In this paper, we study randomized reduction methods, which reduce high-dimensional features into low-dimensional space by randomized methods (e.g., random projection, random hashing), for large-scale high-dimensional classification. Previous theoretical results on randomized reduction methods hinge on strong assumptions about the data, e.g., low rank of the data matrix or a large separable mar...
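
As a side note, one of the randomized reductions mentioned above, random hashing (feature hashing), can be sketched in a few lines: each original coordinate is assigned a random bucket and a random sign, and coordinates sharing a bucket are summed. This generic construction is for illustration only and is not necessarily the specific reduction analyzed in that paper.

import numpy as np

def random_hash_reduction(X, m, seed=0):
    # Minimal feature-hashing sketch: coordinate j of the d-dimensional input
    # is sent to bucket h(j) in {0, ..., m-1} with a random sign s(j).
    # All parameter names are illustrative.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    buckets = rng.integers(0, m, size=d)          # h(j)
    signs = rng.choice([-1.0, 1.0], size=d)       # s(j)
    X_low = np.zeros((n, m))
    for j in range(d):
        X_low[:, buckets[j]] += signs[j] * X[:, j]
    return X_low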

Duals of Some Constructed $*$-Frames by Equivalent $*$-Frames

The theory of Hilbert frames has been extended to frames in Hilbert $C^*$-modules. The paper introduces equivalent $*$-frames and presents ordinary duals of a constructed $*$-frame by an adjointable and invertible operator. It also studies necessary and sufficient conditions under which $*$-frames and the ordinary duals or operator duals of other $*$-frames are equivalent. W...

Towards Making High Dimensional Distance Metric Learning Practical

In this work, we study distance metric learning (DML) for high dimensional data. A typical approach for DML with high dimensional data is to perform the dimensionality reduction first before learning the distance metric. The main shortcoming of this approach is that it may result in a suboptimal solution due to the subspace removed by the dimensionality reduction method. In this work, we presen...

Some new results on semi fully fuzzy linear programming problems

There are two interesting methods in the literature for solving fuzzy linear programming problems in which the elements of the coefficient matrix of the constraints are represented by real numbers and the rest of the parameters are represented by symmetric trapezoidal fuzzy numbers. The first method, named the fuzzy primal simplex method, assumes that an initial primal basic feasible solution is at hand. T...

Journal title:

Volume   Issue

Pages  -

Publication year: 2013