Physarum Powered Differentiable Linear Programming Layers and Applications

Authors

Abstract

Consider a learning algorithm which involves an internal call to an optimization routine such as a generalized eigenvalue problem, a cone programming problem, or even sorting. Integrating such a method as a layer within a trainable deep network in a numerically stable way is not simple; for instance, only recently have strategies emerged for differentiable eigendecomposition and differentiable sorting. We propose an efficient and differentiable solver for general linear programs that can be used in a plug-and-play manner within deep neural networks as a layer. Our development is inspired by a fascinating but not widely known link between the dynamics of the slime mold (Physarum) and mathematical optimization schemes such as steepest descent. We describe our solver and demonstrate its use in a video object segmentation task and in meta-learning for few-shot learning. We review the relevant known results and provide a technical analysis describing its applicability to our use cases. Our solver performs comparably with a customized projected gradient descent method on the first task, and outperforms the very recently proposed differentiable CVXPY solver on the second task. Experiments show that our solver converges quickly without the need for a feasible initial point. Interestingly, our scheme is easy to implement and can easily serve as a layer whenever a learning procedure needs a fast approximate solution to an LP within a larger network.
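The page does not reproduce the paper's algorithm, but the Physarum dynamics it builds on are well documented (e.g., Straszak and Vishnoi's dynamics for standard-form LPs). Below is a minimal, hedged sketch assuming the standard form min cᵀx s.t. Ax = b, x ≥ 0 with a strictly positive cost vector; the function name physarum_lp, the damping step, the iteration count, and differentiating by unrolling the loop are illustrative choices, not the authors' exact layer.

```python
import torch

def physarum_lp(A, b, c, num_iters=200, step=0.5, eps=1e-12):
    """Approximate solution of  min c^T x  s.t.  A x = b, x >= 0  via Physarum dynamics.

    A: (m, n), b: (m,), c: (n,) with strictly positive entries (assumed here).
    Every operation is a differentiable linear-algebra step, so gradients w.r.t.
    A, b, c can be obtained by autograd through the unrolled iterations.
    """
    m, n = A.shape
    x = torch.ones(n, dtype=A.dtype, device=A.device)  # positive start; feasibility not required
    I = torch.eye(m, dtype=A.dtype, device=A.device)
    for _ in range(num_iters):
        w = x / (c + eps)                       # conductances, W = diag(x / c)
        L = A @ (w.unsqueeze(1) * A.T)          # A W A^T  (m x m)
        p = torch.linalg.solve(L + eps * I, b)  # "potentials"
        q = w * (A.T @ p)                       # attracting point q = W A^T p
        x = (1.0 - step) * x + step * q         # damped Physarum update
    return x

# Tiny usage example: minimize c^T x subject to x1 + x2 + x3 = 1, x >= 0.
A = torch.ones(1, 3, dtype=torch.float64)
b = torch.tensor([1.0], dtype=torch.float64)
c = torch.tensor([3.0, 1.0, 2.0], dtype=torch.float64, requires_grad=True)
x = physarum_lp(A, b, c)          # converges toward (0, 1, 0)
(x * c).sum().backward()          # gradients flow back to c through the unrolled solver
```

Because every step is an ordinary differentiable tensor operation, the loop can be dropped into a network as a layer and trained end-to-end, which is the plug-and-play property the abstract emphasizes.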


Similar resources

Fuzzy linear programming and applications

This paper presents a survey on methods for solving fuzzy linear programs. First, LP models with soft constraints are discussed. Then, LP problems in which coefficients of the constraints and/or of the objective function may be fuzzy are outlined. Pivotal questions are the interpretation of the inequality relation in fuzzy constraints and the meaning of fuzzy objectives. In addition to the commonly a...


Differentiable Genetic Programming

We introduce the use of high order automatic differentiation, implemented via the algebra of truncated Taylor polynomials, in genetic programming. Using the Cartesian Genetic Programming encoding we obtain a high-order Taylor representation of the program output that is then used to back-propagate errors during learning. The resulting machine learning framework is called differentiable Cartesia...
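The "algebra of truncated Taylor polynomials" mentioned in this description can be illustrated with a tiny second-order forward-mode example; this is a hedged sketch, and the class and field names are assumptions unrelated to the paper's actual implementation.

```python
class Taylor2:
    """Truncated Taylor polynomial a0 + a1*h + a2*h^2 (higher orders dropped)."""
    def __init__(self, f, df=0.0, d2f=0.0):
        self.f, self.df, self.d2f = f, df, d2f   # coefficients of h^0, h^1, h^2

    def __add__(self, other):
        return Taylor2(self.f + other.f, self.df + other.df, self.d2f + other.d2f)

    def __mul__(self, other):
        # Convolution of coefficients, truncated after h^2
        return Taylor2(self.f * other.f,
                       self.f * other.df + self.df * other.f,
                       self.f * other.d2f + self.df * other.df + self.d2f * other.f)

# Differentiate p(x) = x^3 at x = 2 by seeding dx/dx = 1:
x = Taylor2(2.0, 1.0, 0.0)
p = x * x * x
print(p.f, p.df, p.d2f)   # 8.0 12.0 6.0  (the h^2 coefficient equals f''(2)/2 = 6)
```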


Generalized Colorful Linear Programming and Further Applications

Colorful linear programming (CLP) is a generalization of linear programming that was introduced by Bárány and Onn. Given k point sets C1, ..., Ck ⊂ R^d that each contain a point b ∈ R^d in their positive span, the problem is to compute a set C ⊆ C1 ∪ · · · ∪ Ck that contains at most one point from each set Ci and that also contains b in its positive span, or to state that no such set exists. CLP...
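The "positive span" (conic hull) membership condition used above is itself an LP feasibility question: b lies in the positive span of a set of points exactly when some nonnegative combination of them equals b. A small sketch using SciPy's linprog follows; the function and variable names are illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def in_positive_span(points, b):
    """True if b = sum_i lam_i * points[:, i] for some lam >= 0 (columns are the points)."""
    d, k = points.shape
    res = linprog(c=np.zeros(k), A_eq=points, b_eq=b,
                  bounds=[(0, None)] * k, method="highs")
    return res.status == 0  # status 0: a feasible (optimal) point was found

# b = (1, 1) lies in the positive span of the standard basis vectors e1, e2
C = np.array([[1.0, 0.0],
              [0.0, 1.0]])
print(in_positive_span(C, np.array([1.0, 1.0])))   # True
print(in_positive_span(C, np.array([-1.0, 1.0])))  # False
```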


Continuously Differentiable Exponential Linear Units

Exponential Linear Units (ELUs) are a useful rectifier for constructing deep learning architectures, as they may speed up and otherwise improve learning by virtue of not having vanishing gradients and by having mean activations near zero [1]. However, the ELU activation as parametrized in [1] is not continuously differentiable with respect to its input when the shape parameter α is not equal to 1...
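For concreteness, the continuously differentiable variant this line of work refers to is usually written CELU(x) = max(0, x) + min(0, α(exp(x/α) − 1)). A minimal sketch in PyTorch is shown below; the helper name is an assumption, and torch.nn.functional.celu provides an equivalent built-in.

```python
import torch

def celu(x, alpha=1.0):
    # CELU(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1)).
    # Dividing the exponent by alpha makes the left and right derivatives at
    # x = 0 both equal to 1, so the activation is continuously differentiable
    # for any alpha > 0 (plain ELU only has this property when alpha = 1).
    return torch.clamp(x, min=0) + torch.clamp(alpha * torch.expm1(x / alpha), max=0)
```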


Lecture Notes 5: Applications of Linear Programming

Proof. Let x ∈ P. We show a more general claim: if x tightly fulfills r independent constraints, then x can be expressed as a convex combination of at most n + 1 − r vertices. Substituting r = 0 we get the theorem. We prove the claim by induction on r, where the basis is r = n and we decrease r in each step. The basis of the induction is r = n and thus n constraints are fulfilled tightly. By the...
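Stated in full, with notation inferred from the excerpt (P ⊆ R^n a polytope described by linear inequalities), the claim being proved by induction is:

```latex
\textbf{Claim.} Let $P \subseteq \mathbb{R}^n$ be a polytope and let $x \in P$ satisfy $r$ linearly
independent constraints with equality. Then $x$ can be written as a convex combination
\[
  x \;=\; \sum_{i=1}^{k} \lambda_i v_i, \qquad \lambda_i \ge 0,\; \sum_{i=1}^{k} \lambda_i = 1,
  \qquad k \le n + 1 - r,
\]
of vertices $v_i$ of $P$. Setting $r = 0$ gives the theorem (every point of $P$ is a convex
combination of at most $n+1$ vertices), and the induction basis $r = n$ is immediate: a point
satisfying $n$ independent constraints with equality is itself a vertex.
```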



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i10.17081