Search results for: uniformly gateaux differentiable norm

Number of results: 83779  

Journal: SIAM J. Control and Optimization, 2015
Eduardo Casas, Christopher Ryll, Fredi Tröltzsch

Abstract. Optimal sparse control problems are considered for the FitzHugh-Nagumo system including the so-called Schlögl model. The non-differentiable objective functional of tracking type includes a quadratic Tikhonov regularization term and the L1-norm of the control that accounts for the sparsity. Though the objective functional is not differentiable, a theory of second order sufficient optim...
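The non-differentiable objective described above can be illustrated with a small discretized sketch; the function name and arrays below are hypothetical, not from the paper:

```python
import numpy as np

def sparse_control_objective(y, y_d, u, alpha, beta):
    """Discretized tracking-type functional with Tikhonov and L1 terms:
    J(u) = 0.5*||y - y_d||^2 + (alpha/2)*||u||^2 + beta*||u||_1.
    The L1 term is the non-differentiable part that promotes sparsity."""
    tracking = 0.5 * np.sum((y - y_d) ** 2)
    tikhonov = 0.5 * alpha * np.sum(u ** 2)
    sparsity = beta * np.sum(np.abs(u))
    return tracking + tikhonov + sparsity

# With perfect tracking and alpha = 0, only the L1 penalty remains.
u_sparse = np.array([0.0, 0.0, 2.0, 0.0])
J = sparse_control_objective(np.zeros(4), np.zeros(4), u_sparse, 0.0, 1.0)
```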

Journal: Philosophical transactions. Series A, Mathematical, physical, and engineering sciences, 2012
U. Kohlenbach, L. Leuştean

This paper addresses new developments in the ongoing proof mining programme, i.e. the use of tools from proof theory to extract effective quantitative information from prima facie ineffective proofs in analysis. Very recently, the current authors developed a method of extracting rates of metastability (as defined by Tao) from convergence proofs in nonlinear analysis that are based on Banach lim...

Journal: Journal of Inequalities and Applications, 2023

Abstract. In this paper, we introduce a notable Jensen–Mercer inequality for a general class of convex functions, namely uniformly convex functions. We explore some interesting properties of such functions along with examples. As a result, we establish Hermite–Jensen–Mercer inequalities pertaining to them by considering the fractional integral operators. Moreover, we obtain Mercer–Ostrowski inequalities for the conformable operator via differentiable...
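As a quick numerical illustration of the Jensen–Mercer inequality mentioned above — f(a + b − Σλᵢxᵢ) ≤ f(a) + f(b) − Σλᵢf(xᵢ) for convex f, points xᵢ in [a, b], and nonnegative weights summing to one — the following sketch (names hypothetical) checks one instance:

```python
import numpy as np

def mercer_gap(f, a, b, x, lam):
    """Right-hand side minus left-hand side of the Jensen–Mercer
    inequality; a nonnegative gap confirms it for this instance."""
    lhs = f(a + b - np.dot(lam, x))
    rhs = f(a) + f(b) - np.dot(lam, [f(xi) for xi in x])
    return rhs - lhs

f = lambda t: t * t  # a convex function
gap = mercer_gap(f, 0.0, 4.0,
                 np.array([1.0, 2.0, 3.0]),
                 np.array([0.2, 0.3, 0.5]))
```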

Journal: Nonlinearity, 2021

Two quantitative notions of mixing are the decay of correlations and the decay of a mix-norm — a negative Sobolev norm — and the intensity of mixing can be measured by the decay rates of these quantities. By duality, correlations are uniformly dominated by the mix-norm; but can they decay asymptotically faster than the mix-norm? We answer this question by constructing an observable whose correlation comes arbitrarily close to achieving the decay rate of the mix-norm. Therefore the mix-norm is the sharpest in bo...
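A mix-norm of the kind mentioned above can be computed from Fourier coefficients; this sketch assumes the homogeneous H⁻¹ norm of a mean-zero periodic sample (the paper may use a different Sobolev exponent, and all names here are illustrative):

```python
import numpy as np

def h_minus_1_norm(f, L=2 * np.pi):
    """Homogeneous H^{-1} (mix-) norm of a mean-zero periodic sample:
    ||f||^2 = sum over k != 0 of |f_hat_k|^2 / |k|^2,
    via a normalized FFT on [0, L)."""
    n = f.size
    f_hat = np.fft.fft(f) / n                            # Fourier coefficients
    k = np.fft.fftfreq(n, d=1.0 / n) * (2 * np.pi / L)   # wavenumbers
    mask = k != 0                                        # drop the mean mode
    return np.sqrt(np.sum(np.abs(f_hat[mask]) ** 2 / k[mask] ** 2))

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
# sin(x) has coefficients of modulus 1/2 at k = +1 and k = -1,
# so its H^{-1} norm is sqrt(1/4 + 1/4) = 1/sqrt(2).
val = h_minus_1_norm(np.sin(x))
```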

2010
R. R. Phelps

A study is made of differentiability of the metric projection P onto a closed convex subset K of a Hilbert space H. When K has nonempty interior, the Gateaux or Fréchet smoothness of its boundary can be related with some precision to Gateaux or Fréchet differentiability properties of P. For instance, combining results in §3 with earlier work of R. D. Holmes shows that K has a C2 boundary if and ...
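One concrete instance of the metric projection P: in Euclidean space, projection onto a closed ball is explicit, and the sketch below (hypothetical names) checks its nonexpansiveness, a basic property underlying the differentiability analysis:

```python
import numpy as np

def project_ball(x, r=1.0):
    """Metric projection onto the closed ball of radius r, a closed
    convex set with smooth boundary, in Euclidean space."""
    nx = np.linalg.norm(x)
    return x if nx <= r else (r / nx) * x

# The projection is nonexpansive: ||Px - Py|| <= ||x - y||.
x, y = np.array([3.0, 0.0]), np.array([0.0, 4.0])
lhs = np.linalg.norm(project_ball(x) - project_ball(y))
rhs = np.linalg.norm(x - y)
```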

1998
Chuan Wang, Hsiao-Chun Wu, Jose C. Principe

It is well known that Principal Components Analysis (PCA) is optimal in the sense of Mean Square Error (MSE). However, estimation based on MSE is sensitive to noise and outliers, and is therefore not a robust estimator. In order to get a robust estimate, the absolute error criterion (L1 norm) could be used, but it is not differentiable at the origin; and the minimax criterion (L∞ norm) could be ...
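The robustness contrast described above is easy to see in one dimension, where the MSE-optimal location estimate is the mean and the absolute-error-optimal one is the median (illustrative data, not from the paper):

```python
import numpy as np

# A single gross outlier moves the MSE-optimal estimate (the mean)
# far more than the L1-optimal estimate (the median).
data = np.array([1.0, 1.1, 0.9, 1.0, 100.0])
mse_estimate = np.mean(data)    # minimizes the sum of squared errors
l1_estimate = np.median(data)   # minimizes the sum of absolute errors
```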

2001
Mario Romeo, Paolo Tilli

We deal with two recent conjectures of R.-C. Li [Linear Algebra Appl. 278 (1998) 317–326], involving unitarily invariant norms and Hadamard products. In the particular case of the Frobenius norm, the first conjecture is known to be true, whereas the second is still an open problem. In fact, in this paper we show that the Frobenius norm is essentially the only invariant norm which may comply wi...
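Unitary invariance of the Frobenius norm, ‖UAV‖_F = ‖A‖_F for unitary U and V, can be checked numerically; the sketch below uses a random orthogonal matrix (all names hypothetical):

```python
import numpy as np

def frobenius(a):
    """Frobenius norm: square root of the sum of squared entries."""
    return np.sqrt(np.sum(np.abs(a) ** 2))

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 3))
q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix
# Multiplying by orthogonal matrices on either side leaves the norm fixed.
invariant = np.isclose(frobenius(q @ a @ q.T), frobenius(a))
```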

2012
Krzysztof Jan Nowak

Given a quasianalytic structure, we prove that the singular locus of a quasi-subanalytic set E is a closed quasi-subanalytic subset of E. We rely on some stabilization effects linked to Gateaux differentiability and formally composite functions. An essential ingredient of the proof is a quasianalytic version of Glaeser’s composite function theorem, presented in our previous paper.

2010
D. F. Shanno

The relationship between variable-metric methods derived by norm minimization and those derived by symmetrization of rank-one updates for sparse systems is studied, and an analogue of Dennis's nonsparse symmetrization formula is derived. A new method of using norm minimization to produce a sparse analogue of any nonsparse variable-metric method is proposed. The sparse BFGS generated by this method...
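The secant condition that variable-metric updates are built around can be verified for the dense BFGS formula (a standard update, sketched here with hypothetical names; the paper's sparse variant additionally imposes a sparsity pattern):

```python
import numpy as np

def bfgs_update(B, s, y):
    """Dense BFGS update of an approximate Hessian B, constructed so
    that the updated matrix satisfies the secant condition B_new @ s = y."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

B = np.eye(3)
s = np.array([1.0, 0.0, 2.0])  # step taken
y = np.array([0.5, 0.1, 1.0])  # gradient difference; y @ s = 2.5 > 0
B_new = bfgs_update(B, s, y)
```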

2018
Christos Louizos

We propose a practical method for L0 norm regularization for neural networks: pruning the network during training by encouraging weights to become exactly zero. Such regularization is interesting since (1) it can greatly speed up training and inference, and (2) it can improve generalization. AIC and BIC, well-known model selection criteria, are special cases of L0 regularization. However, since...
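A much simpler relative of the L0 regularization described above is post-hoc magnitude pruning, sketched below with hypothetical names (the paper's method instead encourages exact zeros during training):

```python
import numpy as np

def prune(weights, threshold=1e-2):
    """Magnitude pruning: set small weights exactly to zero, one simple
    way to reduce the L0 'norm' (the count of nonzero entries)."""
    pruned = np.where(np.abs(weights) < threshold, 0.0, weights)
    return pruned, int(np.count_nonzero(pruned))

w = np.array([0.4, 1e-3, -0.2, 5e-3, 0.0])
pruned, l0 = prune(w)  # only the two large weights survive
```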
