Greed Is Good: Near-Optimal Submodular Maximization via Greedy Optimization

Authors

  • Moran Feldman
  • Christopher Harshaw
  • Amin Karbasi
Abstract

It is known that greedy methods perform well for maximizing monotone submodular functions. At the same time, such methods perform poorly in the face of non-monotonicity. In this paper, we show, arguably surprisingly, that invoking the classical greedy algorithm O(√k) times leads to the (currently) fastest deterministic algorithm, called REPEATEDGREEDY, for maximizing a general submodular function subject to k-independent system constraints. REPEATEDGREEDY achieves an approximation ratio of (1 + O(1/√k))k using O(nr√k) function evaluations (here, n and r denote the size of the ground set and the maximum size of a feasible solution, respectively). We then show that by a careful sampling procedure, we can run the greedy algorithm only once and obtain the (currently) fastest randomized algorithm, called SAMPLEGREEDY, for maximizing a submodular function subject to k-extendible system constraints (a subclass of k-independent system constraints). SAMPLEGREEDY achieves a (k + 3)-approximation with only O(nr/k) function evaluations. Finally, we derive an almost matching lower bound, and show that no polynomial-time algorithm can have an approximation ratio smaller than k + 1/2 − ε. To further support our theoretical results, we compare the performance of REPEATEDGREEDY and SAMPLEGREEDY with prior art in a concrete application (movie recommendation). We consistently observe that while SAMPLEGREEDY achieves practically the same utility as the best baseline, it performs at least two orders of magnitude faster.
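
As a rough, hedged illustration of the two algorithmic templates described above (not the paper's exact pseudocode), the Python sketch below shows a classical greedy subroutine for an independence system, a REPEATEDGREEDY-style loop that reruns greedy roughly √k + 1 times on the elements left untouched by earlier runs, and a SAMPLEGREEDY-style variant that subsamples the ground set once and then runs greedy a single time. The names `f`, `is_feasible`, and `p`, the stopping rule, and the omission of the paper's unconstrained-maximization post-processing step are all simplifying assumptions.

```python
import math
import random


def greedy(ground_set, f, is_feasible):
    """Classical greedy: repeatedly add the feasible element with the
    largest positive marginal gain; stop when no element improves f."""
    solution = set()
    remaining = set(ground_set)
    while True:
        best_elem, best_gain = None, 0.0
        for e in remaining:
            candidate = solution | {e}
            if not is_feasible(candidate):
                continue
            gain = f(candidate) - f(solution)
            if gain > best_gain:
                best_elem, best_gain = e, gain
        if best_elem is None:
            return solution
        solution.add(best_elem)
        remaining.remove(best_elem)


def repeated_greedy(ground_set, f, is_feasible, k):
    """REPEATEDGREEDY-style loop (simplified): rerun greedy about sqrt(k)+1
    times, each time on the elements untouched by earlier runs, and keep
    the best solution found.  The paper additionally post-processes each
    greedy solution with unconstrained submodular maximization; omitted here."""
    remaining = set(ground_set)
    best = set()
    for _ in range(math.isqrt(k) + 1):
        s = greedy(remaining, f, is_feasible)
        if f(s) > f(best):
            best = s
        remaining -= s
    return best


def sample_greedy(ground_set, f, is_feasible, p):
    """SAMPLEGREEDY-style variant: keep each element independently with
    probability p, then run greedy once on the sample.  The paper fixes p
    as an explicit function of k; here it is a free parameter."""
    sample = {e for e in ground_set if random.random() < p}
    return greedy(sample, f, is_feasible)
```

In both skeletons the number of greedy runs and the sampling probability are left schematic; the paper's analysis chooses them as explicit functions of k to obtain the stated guarantees.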


Similar Articles

Adaptive Maximization of Pointwise Submodular Functions With Budget Constraint

We study the worst-case adaptive optimization problem with budget constraint that is useful for modeling various practical applications in artificial intelligence and machine learning. We investigate the near-optimality of greedy algorithms for this problem with both modular and non-modular cost functions. In both cases, we prove that two simple greedy algorithms are not near-optimal but the be...


Adaptive Submodular Optimization under Matroid Constraints

Many important problems in discrete optimization require maximization of a monotonic submodular function subject to matroid constraints. For these problems, a simple greedy algorithm is guaranteed to obtain near-optimal solutions. In this article, we extend this classic result to a general class of adaptive optimization problems under partial observability, where each choice can depend on obser...


Multi-document Summarization via Budgeted Maximization of Submodular Functions

We treat the text summarization problem as maximizing a submodular function under a budget constraint. We show, both theoretically and empirically, that a modified greedy algorithm can efficiently solve the budgeted submodular maximization problem near-optimally, and we derive new approximation bounds in doing so. Experiments on the DUC’04 task show that our approach is superior to the best-performing me...
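
The modified greedy referred to above is not spelled out here. As a hedged, generic illustration of how greedy is commonly adapted to a budget constraint (not necessarily the authors' exact modification), the sketch below picks, at each step, the affordable element with the largest marginal gain per unit cost; the interfaces `f`, `cost`, and `budget` are assumptions.

```python
def budgeted_greedy(ground_set, f, cost, budget):
    """Cost-benefit greedy sketch: repeatedly add the affordable element
    with the best marginal-gain-to-cost ratio.  Analyzed variants also
    compare the result against the best single feasible element; that
    refinement is omitted here."""
    solution, spent = set(), 0.0
    remaining = set(ground_set)
    while remaining:
        best_elem, best_ratio = None, 0.0
        for e in remaining:
            if spent + cost(e) > budget:
                continue
            ratio = (f(solution | {e}) - f(solution)) / cost(e)
            if ratio > best_ratio:
                best_elem, best_ratio = e, ratio
        if best_elem is None:
            break
        solution.add(best_elem)
        spent += cost(best_elem)
        remaining.remove(best_elem)
    return solution
```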


Monotone Submodular Maximization over a Matroid via Non-Oblivious Local Search

We present an optimal, combinatorial (1 − 1/e)-approximation algorithm for monotone submodular optimization over a matroid constraint. Compared to the continuous greedy algorithm (Calinescu, Chekuri, Pál and Vondrák, 2008), our algorithm is extremely simple and requires no rounding. It consists of the greedy algorithm followed by local search. Both phases are run not on the actual objective functio...


On maximizing a monotone k-submodular function subject to a matroid constraint

A k-submodular function is an extension of a submodular function in that its input is given by k disjoint subsets instead of a single subset. For unconstrained nonnegative k-submodular maximization, Ward and Živný proposed a constant-factor approximation algorithm, which was improved by the recent work of Iwata, Tanigawa and Yoshida, who presented a 1/2-approximation algorithm. Iwata et al. also pro...




Publication year: 2017