Search results for: sufficient descent directions

Number of results: 286,567

Journal: :Computational Statistics & Data Analysis 2010
Xuerong Meggie Wen

The requirement of a constant censoring parameter β in the Koziol–Green (KG) model is too restrictive. When covariates are present, the conditional KG model (Veraverbeke and Cadarso-Suárez, 2000), which allows β to depend on the covariates, is more realistic. In this paper, using sufficient dimension reduction methods, we provide a model-free diagnostic tool to test if β is a function of the cov...

2003
JAMES R. HOFMANN BRUCE H. WEBER

Creationists who object to evolution in the science curriculum of public schools often cite Jonathan Wells's book Icons of Evolution in their support (Wells 2000). In the third chapter of his book, Wells claims that neither paleontological nor molecular evidence supports the thesis that the history of life is an evolutionary process of descent from preexisting ancestors. We argue that Wells inapp...

1998
Alex S. Fukunaga Dennis J.-H. Huang Andrew B. Kahng

We examine the utility of the Large-Step Markov Chain (LSMC) technique [13], a variant of the iterated descent heuristic of Baum [2], for VLSI netlist bipartitioning. LSMC iteratively finds a local optimum solution according to some greedy search (in our case, the Fiduccia-Mattheyses heuristic) and then perturbs this local optimum via a "kick move" into the starting solution of the next greedy de...
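The LSMC scheme the abstract describes alternates a greedy descent with a perturbation. A minimal Python sketch of that generic loop follows; local_search, kick, and cost are hypothetical placeholders (the paper's local search is Fiduccia-Mattheyses), not the authors' code:

    def lsmc(initial, local_search, kick, cost, iterations=100):
        """Generic Large-Step Markov Chain loop: descend to a local
        optimum, perturb it with a kick move, and keep the better
        of the old and new local optima."""
        current = local_search(initial)              # first greedy descent
        best = current
        for _ in range(iterations):
            candidate = local_search(kick(current))  # kick, then re-descend
            if cost(candidate) <= cost(current):     # zero-temperature accept rule
                current = candidate
            if cost(current) < cost(best):
                best = current
        return best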

Journal: :CoRR 2014
Hilton Bristow Simon Lucey

Gradient-descent methods have exhibited fast and reliable performance for image alignment in the facial domain, but have largely been ignored by the broader vision community. They require the image function be smooth and (numerically) differentiable – properties that hold for pixel-based representations obeying natural image statistics, but not for more general classes of non-linear feature tra...
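As a concrete illustration of the smoothness requirement, here is a minimal sketch of gradient-descent alignment for a 1-D translation, assuming a linearly interpolated signal; all names and step sizes are illustrative, not from the paper:

    import numpy as np

    def align_translation(template, image, steps=200, lr=0.5):
        """Estimate a shift t minimizing sum_x (image(x+t) - template(x))^2
        by gradient descent; interpolation makes the objective smooth in t."""
        x = np.arange(len(template), dtype=float)
        t = 0.0
        for _ in range(steps):
            warped = np.interp(x + t, x, image)    # linearly interpolated warp
            grad_img = np.gradient(warped)         # numerical derivative in x
            residual = warped - template
            g = 2.0 * np.sum(residual * grad_img)  # dE/dt by the chain rule
            t -= lr * g / len(template)            # descent step
        return t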

Journal: :Foundations of data science 2022

Computing the gradient of a function provides fundamental information about its behavior. This is essential for several applications and algorithms across various fields. One common application that requires gradients is optimization, with techniques such as stochastic gradient descent, Newton's method, and trust-region methods. However, these methods usually require numerical computation at every iteration, which...
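The per-iteration cost the abstract alludes to is easy to see in a central-difference gradient, which needs two function evaluations per coordinate. A minimal sketch (illustrative, not from the paper):

    import numpy as np

    def numerical_gradient(f, x, h=1e-6):
        """Central-difference gradient of f at x: costs 2*len(x) evaluations."""
        g = np.zeros_like(x, dtype=float)
        for i in range(len(x)):
            e = np.zeros_like(x, dtype=float)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
        return g

    # One gradient-descent step on a toy quadratic:
    f = lambda x: float(np.sum(x ** 2))
    x = np.array([1.0, -2.0])
    x = x - 0.1 * numerical_gradient(f, x)  # moves toward the minimizer at 0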

Journal: :SIAM Journal on Optimization 1999
Dexuan Xie Tamar Schlick

To efficiently implement the truncated-Newton (TN) optimization method for large-scale, highly nonlinear functions in chemistry, an unconventional modified Cholesky (UMC) factorization is proposed to avoid large modifications to a problem-derived preconditioner, used in the inner loop in approximating the TN search vector at each step. The main motivation is to reduce the computational time of the o...
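For context, the preconditioner in a truncated-Newton method is applied inside an inner preconditioned-CG loop such as the following sketch (generic PCG with a truncation test; the UMC factorization itself is not reproduced here, and precond_solve is a hypothetical stand-in for applying the preconditioner's inverse):

    import numpy as np

    def tn_search_direction(hessp, grad, precond_solve, max_iter=20, tol=0.5):
        """Approximately solve H d = -g by preconditioned CG, stopping
        early (truncation) or on negative curvature."""
        d = np.zeros_like(grad)
        r = -grad.copy()                  # residual of H d = -g at d = 0
        z = precond_solve(r)
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Hp = hessp(p)                 # Hessian-vector product
            curv = p @ Hp
            if curv <= 0.0:               # negative curvature: keep current d
                break
            alpha = rz / curv
            d += alpha * p
            r -= alpha * Hp
            if np.linalg.norm(r) <= tol * np.linalg.norm(grad):
                break                     # truncation test
            z = precond_solve(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return d if d.any() else -grad    # fall back to steepest descent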

Journal: :Math. Program. Comput. 2012
Oliver Exler Thomas Lehmann Klaus Schittkowski

We present numerical results of a comparative study of codes for nonlinear and nonconvex mixed-integer optimization. The underlying algorithms are based on sequential quadratic programming (SQP) with stabilization by trust-regions, linear outer approximations, and branch-and-bound techniques. The mixed-integer quadratic programming subproblems are solved by a branch-and-cut algorithm. Second ord...

2011
Jinkui Liu Shaoheng Wang

In this paper, an efficient modified nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. An attractive property of the modified method is that the direction generated at each step is always a descent direction, without any line search. The global convergence of the modified method is established under the general Wolfe line search condition. Numerical r...
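The descent guarantee such methods aim for can be illustrated with a simple safeguard: compute a conjugate-gradient direction and restart whenever sufficient descent fails. This sketch uses a Polak-Ribiere beta for concreteness; the paper's modified formula achieves descent without an explicit safeguard, so this is not the authors' method:

    import numpy as np

    def cg_direction(g_new, g_old, d_old, c=0.1):
        """One CG direction update with the sufficient descent condition
        g^T d <= -c ||g||^2 enforced by restarting."""
        beta = g_new @ (g_new - g_old) / max(g_old @ g_old, 1e-30)
        d = -g_new + beta * d_old
        if g_new @ d > -c * (g_new @ g_new):  # sufficient descent violated?
            d = -g_new                        # restart with steepest descent
        return d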

2009
Jianguo Zhang Yunhai Xiao Zengxin Wei Joaquim J. Júdice

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that, without any line search, the generated directions always descend. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that these proposed methods are promising, and comp...
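The property that the generated directions "always descend" is usually formalized as the sufficient descent condition; in standard notation (the constant c is generic, not taken from this paper):

    g_k^\top d_k \le -c \, \|g_k\|^2 \quad \text{for all } k, \text{ for some } c > 0,

which guarantees that d_k is a descent direction whenever g_k \ne 0, independently of any line search.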

2011
Hao Fan Zhibin Zhu Anwa Zhou

In this paper, a new nonlinear conjugate gradient method is proposed for large-scale unconstrained optimization. The sufficient descent property holds without any line searches. We use a steplength technique that ensures the Zoutendijk condition holds, and the method is proved to be globally convergent. Finally, we improve the method and carry out further analysis.
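For reference, the Zoutendijk condition invoked here is the summability bound (standard notation assumed):

    \sum_{k \ge 0} \frac{(g_k^\top d_k)^2}{\|d_k\|^2} < \infty,

which, combined with sufficient descent, forces \liminf_{k \to \infty} \|g_k\| = 0, i.e. global convergence.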
