Search results for: squared log error loss function
Number of results: 1,839,436. Filter results by year:
We develop and test robust methods for design construction, for estimation and for prediction in spatial studies. The designs are robust against misspecified variance/covariance structures, and against misspecified regression responses. Robustness against contaminated error distributions is provided by the use of generalized M-estimators in the estimation and prediction procedures. The loss fun...
Dependent Variable: Y
Method: Least Squares
Date: 05/23/00   Time: 05:55
Sample: 1 33
Included observations: 33

Variable    Coefficient    Std. Error    t-Statistic    Prob.
C           102192.4       12799.83       7.983891      0.0000
N           -9074.674       2052.674     -4.420904      0.0001
P           0.354668       0.072681       4.879810      0.0000
I           1.287923       0.543294       2.370584      0.0246

R-squared             0.618154    Mean dependent var    125634.6
Adjusted R-squared    0.578653    S.D. dependent v...
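A brief sketch of the arithmetic behind an output like this: each t-statistic is simply the coefficient divided by its standard error. The coefficient/standard-error pairs below are copied from the table; the code itself is my own illustration, not part of the original output.

```python
# Sketch: in least-squares output, t-statistic = coefficient / std. error.
# Coefficient/std.-error pairs are copied from the regression table above.
estimates = {
    "C": (102192.4, 12799.83),
    "N": (-9074.674, 2052.674),
    "P": (0.354668, 0.072681),
    "I": (1.287923, 0.543294),
}
t_stats = {name: beta / se for name, (beta, se) in estimates.items()}
for name, t in t_stats.items():
    print(f"{name}: t = {t:.6f}")
```

Recomputing the ratios reproduces the reported t-statistics to within the rounding of the displayed coefficients.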
We consider pointwise mean squared errors of several known Bayesian wavelet estimators, namely, the posterior mean, the posterior median and the Bayes Factor, where the prior imposed on the wavelet coefficients is a mixture of an atom of probability at zero and a Gaussian density. We show that for properly chosen hyperparameters of the prior, all three estimators are (up to a log-factor) asymptotically mi...
We consider binary classification problems with positive definite kernels and square loss, and study the convergence rates of stochastic gradient methods. We show that while the excess testing loss (squared loss) converges slowly to zero as the number of observations (and thus iterations) goes to infinity, the testing error (classification error) converges exponentially fast if low-noise condit...
We prove a theorem about the relative entropy of quantum states, which roughly states that if the relative entropy, S(ρ‖σ) ≜ Tr ρ(log ρ − log σ), of two quantum states ρ and σ is at most c, then ρ/2 'sits inside' σ. Using this 'substate' theorem, we give tight lower bounds for the privacy loss of bounded error quantum communication protocols for the index function problem. We also give tight l...
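For finite-dimensional states, the relative entropy in the abstract can be computed directly. A small sketch (my own illustration, using two full-rank qubit states so that the matrix logarithms exist):

```python
import numpy as np

# Sketch: quantum relative entropy S(rho||sigma) = Tr[rho (log rho - log sigma)]
# for density matrices, via eigendecomposition of the Hermitian operators.
def mat_log(H):
    vals, vecs = np.linalg.eigh(H)
    return (vecs * np.log(vals)) @ vecs.conj().T   # requires full rank

def relative_entropy(rho, sigma):
    return np.real(np.trace(rho @ (mat_log(rho) - mat_log(sigma))))

rho = np.diag([0.5, 0.5])      # maximally mixed qubit
sigma = np.diag([0.75, 0.25])  # a biased full-rank state
S = relative_entropy(rho, sigma)
```

For these commuting states the value reduces to the classical KL divergence, 0.5·log(0.5/0.75) + 0.5·log(0.5/0.25) = log 2 − 0.5·log 3.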
of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science COST FUNCTIONS FOR SUPERVISED LEARNING BASED ON A ROBUST SIMILARITY METRIC By Abhishek Singh May 2010 Chair: José C. Príncipe Major: Electrical and Computer Engineering This thesis proposes cost functions for supervised learning algorithms, based...
Modern control systems widely use networks to decrease implementation cost and increase performance. Although networked control has several advantages, it suffers from some limitations and deficiencies. Packet loss is one of the main limitations: it affects the control system under different conditions and can ultimately lead to system instability. To prevent such problems it is important to model ...
OBJECTIVE To compare the fetal loss rate of monochorionic (MC) twin pregnancies according to their amnionicity. METHODS A retrospective review of all MC pregnancy outcomes in a tertiary centre. Pregnancy outcomes were compared for monochorionic monoamniotic (MCMA) versus monochorionic diamniotic (MCDA) pregnancies. RESULTS 29 MCMA and 117 MCDA twin pregnancies were identified. The overall f...
KEYWORDS AND PHRASES: Minimax decision rule; squared error loss; Dirichlet process; compact sets; Bayes rules; isotonic regression. 1. INTRODUCTION In an often cited paper, BOHLMANN (1976) has considered linear minimax estimators for the mean of a univariate distribution in a nonparametric setting under squared error loss. To be precise, BOHLMANN assumes that F0(x) is a family of CDF...
Data clustering is a combinatorial optimization problem. This article shows that clustering is also an optimization problem for an analytic function. The mean squared error, or in this case the squared error, can be expressed as an analytic function. With an analytic function we benefit from the existence of standard optimization methods: the gradient of this function is calculated and the descent...
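A minimal sketch of the idea (my own illustration, not the article's formulation): treat the clustering squared-error objective J(C) = Σᵢ minₖ ‖xᵢ − cₖ‖² as a function of the cluster centers and run (sub)gradient descent on it.

```python
import numpy as np

# Sketch: (sub)gradient descent on the clustering squared-error objective
# J(C) = sum_i min_k ||x_i - c_k||^2  (all data and settings are illustrative).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),   # blob near (0, 0)
               rng.normal(3.0, 0.3, size=(50, 2))])  # blob near (3, 3)
C = np.array([[0.5, 0.5], [2.5, 2.5]])               # initial centers

def objective(X, C):
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    return d2.min(axis=1).sum()

J0 = objective(X, C)
lr = 0.05
for _ in range(100):
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    nearest = d2.argmin(axis=1)                   # hard assignments
    for j in range(len(C)):
        pts = X[nearest == j]
        if len(pts):                              # (sub)gradient w.r.t. c_j
            C[j] -= lr * 2.0 * (len(pts) * C[j] - pts.sum(axis=0)) / len(X)

J1 = objective(X, C)
```

Each step moves every center toward the mean of the points currently nearest to it, so the objective decreases monotonically toward a local minimum, mirroring the gradient-descent view described in the abstract.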