Shape Optimization with Nonlinear Conjugate Gradient Methods

Authors

Abstract

In this chapter, we investigate recently proposed nonlinear conjugate gradient (NCG) methods for shape optimization problems. We briefly introduce the corresponding theoretical background and assess the methods' performance numerically. The obtained results confirm that NCG methods are efficient and attractive solution algorithms.
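To make the method class concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration (Fletcher–Reeves update with a backtracking line search), applied to a generic smooth function rather than to the chapter's shape-optimization setting; the function names and the quadratic test problem are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

def ncg_fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimal nonlinear CG sketch: Fletcher-Reeves beta, Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                      # safeguard: restart if not a descent direction
            d = -g
        # backtracking (Armijo) line search along d
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves update
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = ncg_fletcher_reeves(f, grad, np.zeros(2))
```

In shape optimization the same recursion runs over a shape gradient in an appropriate function space rather than a Euclidean vector, but the update structure is unchanged.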


Related articles

On nonlinear generalized conjugate gradient methods

where F(ξ) is a nonlinear operator from a real Euclidean space of dimension n, or a Hilbert space, into itself. The Euclidean norm and the corresponding inner product will be denoted by ‖·‖₁ and (·,·)₁, respectively. A more general inner product with a weight function and the corresponding norm will be denoted by (·,·)₀ and ‖·‖, respectively. In the first part of this article (Sects. 2 and 3) w...

Full text

Convergence Properties of Nonlinear Conjugate Gradient Methods

Recently, important contributions to convergence studies of conjugate gradient methods have been made by Gilbert and Nocedal [6]. They introduce a "sufficient descent condition" to establish global convergence results, whereas this condition is not needed in the convergence analyses of Newton and quasi-Newton methods. [6] hints that the sufficient descent condition, which was enforced by their ...
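For context, the sufficient descent condition referred to above is usually stated in the following standard form (this is the common formulation in the CG literature, not a quotation from [6]):

$$
g_k^\top d_k \le -c\,\|g_k\|^2 \quad \text{for some constant } c > 0 \text{ and all } k,
$$

where $g_k = \nabla f(x_k)$ is the gradient and $d_k$ the search direction at iteration $k$; it guarantees that $d_k$ is a descent direction whose quality does not degenerate relative to the gradient norm.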

Full text

A Survey of Nonlinear Conjugate Gradient Methods

This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special attention given to global convergence properties.

Full text

Nonlinear Conjugate Gradient Methods with Wolfe Type Line Search

$\dots = \frac{\|d_{k-1}\|^2}{\|g_{k-1}\|^4} + \frac{1}{\|g_k\|^2} - \frac{\beta_k^2\,(g_k^\top d_{k-1})^2}{\|g_k\|^4} \dots$

Full text

Nonlinear Conjugate Gradient Methods with Sufficient Descent Condition for Large-Scale Unconstrained Optimization

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that the generated directions are always descent directions, without any line search. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that the proposed methods are promising and comp...

Full text


Journal

Journal title: Lecture Notes in Computational Science and Engineering

Year: 2022

ISSN: 1439-7358, 2197-7100

DOI: https://doi.org/10.1007/978-3-031-20432-6_9