The cyclic coordinate descent in hydrothermal optimization problems with non-regular Lagrangian

Author

  • L. Bayón
Abstract

In this paper we present an algorithm, inspired by the cyclic coordinate descent method, for solving hydrothermal optimization problems involving pumped-storage plants. The proof of convergence of the sequence generated by the algorithm is based on an appropriate adaptation of Zangwill's global convergence theorem.
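The abstract does not detail the paper's specialized Lagrangian or the hydrothermal model, so as a point of reference only, the following is a minimal sketch of plain cyclic coordinate descent on a smooth objective; the objective, step size, and finite-difference gradient are illustrative assumptions, not the paper's method.

```python
# A minimal sketch of plain cyclic coordinate descent on a smooth objective.
# Illustrative only: not the paper's hydrothermal algorithm or Lagrangian.
import numpy as np

def cyclic_coordinate_descent(f, x0, step=1e-2, sweeps=100, h=1e-6):
    """Minimize f by cycling over coordinates with a finite-difference gradient step."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(sweeps):
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            grad_i = (f(x + e) - f(x - e)) / (2 * h)  # central difference along coordinate i
            x[i] -= step * grad_i                     # descent step on coordinate i only
    return x

if __name__ == "__main__":
    # Example: a simple convex quadratic; the minimizer is (1, -2).
    quad = lambda z: (z[0] - 1.0) ** 2 + 2.0 * (z[1] + 2.0) ** 2
    print(cyclic_coordinate_descent(quad, x0=[0.0, 0.0]))
```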

Similar resources

On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods

Cyclic coordinate descent is a classic optimization method that has witnessed a resurgence of interest in Signal Processing, Statistics and Machine Learning. Reasons for this renewed interest include the simplicity, speed, and stability of the method as well as its competitive performance on l1 regularized smooth optimization problems. Surprisingly, very little is known about its non-asymptotic...
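For the l1-regularized setting this abstract refers to, coordinate descent admits an exact per-coordinate update via soft-thresholding. The sketch below is a generic lasso coordinate-descent loop under assumed data and names (X, y, lam are illustrative), not the analysis of the cited work.

```python
# A minimal sketch of cyclic coordinate descent for l1-regularized least squares (lasso).
# Illustrative assumptions: problem data X, y and penalty lam are synthetic.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.|: the exact coordinate-wise minimizer for the l1 term."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, sweeps=200):
    """Minimize 0.5*||y - Xw||^2 + lam*||w||_1 by exact coordinate minimization."""
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)   # per-coordinate curvature ||X_j||^2
    r = y - X @ w                   # running residual
    for _ in range(sweeps):
        for j in range(d):
            r += X[:, j] * w[j]     # remove coordinate j's contribution
            w[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * w[j]     # restore with the updated value
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 10))
    y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.standard_normal(50)
    print(np.round(lasso_cd(X, y, lam=1.0), 3))
```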

Speeding-Up Convergence via Sequential Subspace Optimization: Current State and Future Directions

This is an overview paper written in the style of a research proposal. In recent years we introduced a general framework for large-scale unconstrained optimization – Sequential Subspace Optimization (SESOP) – and demonstrated its usefulness for sparsity-based signal/image denoising, deconvolution, compressive sensing, computed tomography, diffraction imaging, and support vector machines. We explored its co...

On the Finite Time Convergence of Cyclic Coordinate Descent Methods

Cyclic coordinate descent is a classic optimization method that has witnessed a resurgence of interest in machine learning. Reasons for this include its simplicity, speed and stability, as well as its competitive performance on l1 regularized smooth optimization problems. Surprisingly, very little is known about its finite time convergence behavior on these problems. Most existing results eithe...

Iteration Complexity of Feasible Descent Methods for Convex Optimization

In many machine learning problems, such as the dual form of SVM, the objective function to be minimized is convex but not strongly convex. This fact causes difficulties in obtaining the complexity of some commonly used optimization algorithms. In this paper, we prove global linear convergence for a wide range of algorithms when they are applied to some non-strongly convex problems. In partic...

Inexact block coordinate descent methods with application to the nonnegative matrix factorization

This work is concerned with the cyclic block coordinate descent method, or nonlinear Gauss-Seidel method, where the solution of an optimization problem is achieved by partitioning the variables in blocks and successively minimizing with respect to each block. The properties of the objective function that guarantee the convergence of such alternating scheme have been widely investigated in the l...
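As a rough illustration of the alternating (Gauss-Seidel) scheme described above, the following sketch performs two-block coordinate descent for NMF by solving exact nonnegative least-squares subproblems for each factor. It is an assumed, simplified variant, not the inexact method studied in the cited work.

```python
# A minimal sketch of two-block (Gauss-Seidel) coordinate descent for NMF.
# Illustrative only: exact nonnegative least-squares subproblems, synthetic data.
import numpy as np
from scipy.optimize import nnls

def nmf_block_cd(V, rank, iters=30, seed=0):
    """Approximate a nonnegative matrix V (m x n) as W @ H with W >= 0, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(iters):
        # Block 1: with W fixed, each column of H is a nonnegative least-squares problem.
        for j in range(n):
            H[:, j], _ = nnls(W, V[:, j])
        # Block 2: with H fixed, each row of W is a nonnegative least-squares problem.
        for i in range(m):
            W[i, :], _ = nnls(H.T, V[i, :])
    return W, H

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    V = rng.random((20, 3)) @ rng.random((3, 15))   # synthetic rank-3 nonnegative matrix
    W, H = nmf_block_cd(V, rank=3)
    print("relative error:", np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```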


Publication date: 2007