Search results for: squared log error loss function

Number of results: 1839436

Journal: Journal of Sciences, Islamic Republic of Iran 2011
A. Karimnezhad

Let X1, ..., Xn be a random sample from a normal distribution with unknown mean θ and known variance σ². The usual estimator of the mean, i.e., the sample mean, is the maximum likelihood estimator, which under the squared error loss function is a minimax and admissible estimator. In many practical situations, θ is known in advance to lie in an interval, say [a, b] for some a < b. In this case, the maximum likelihood estimator changes and d...
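
The restriction described in this abstract can be made concrete with a small simulation. Below is a minimal Python sketch (not from the paper; the interval [0, 1], true mean 0.3, sample size, and Monte Carlo setup are illustrative assumptions) comparing the sample mean with the interval-restricted MLE under squared error loss:

```python
import numpy as np

rng = np.random.default_rng(0)

def squared_error_loss(estimate, theta):
    """Squared error loss L(d, theta) = (d - theta)^2."""
    return (estimate - theta) ** 2

def mle_unrestricted(x):
    """Sample mean: MLE of a normal mean with known variance."""
    return np.mean(x)

def mle_restricted(x, a, b):
    """MLE when the mean is known to lie in [a, b]:
    the sample mean projected (clipped) onto the interval."""
    return np.clip(np.mean(x), a, b)

# Illustrative comparison, assuming theta = 0.3 lies in [0, 1] and sigma = 1.
theta, a, b, n = 0.3, 0.0, 1.0, 10
losses_mle, losses_restricted = [], []
for _ in range(10_000):
    x = rng.normal(theta, 1.0, size=n)
    losses_mle.append(squared_error_loss(mle_unrestricted(x), theta))
    losses_restricted.append(squared_error_loss(mle_restricted(x, a, b), theta))

print("Monte Carlo risk (sample mean):    ", np.mean(losses_mle))
print("Monte Carlo risk (restricted MLE): ", np.mean(losses_restricted))
```

Clipping the sample mean onto the interval is the restricted MLE here; since projection onto the interval can only shrink the error when the true mean lies inside it, its Monte Carlo risk is typically smaller than that of the unrestricted sample mean.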

2011
Eisa Mahmoudi Hojatollah Zakerzadeh

Estimation in a truncated parameter space is one of the most important problems in statistical inference, because the frequently used criterion of unbiasedness is useless there: in general, no unbiased estimator exists. So other optimality criteria, such as admissibility and minimaxity, have to be considered instead. In this paper we consider a subclass of the exponential families of distributi...

This paper is concerned with the problem of finding the minimax estimators of the scale parameter in a family of transformed chi-square distributions, under the asymmetric squared log error (SLE) and modified linear exponential (MLINEX) loss functions, using Lehmann's theorem [2]. We also show that the results of Podder et al. [4] for the Pareto distribution are a special case of our results for th...
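
For reference, one common parameterization of the two losses named in this abstract is sketched below in Python; the exact constants (w, c) and the forms used by the paper may differ, so treat this as an assumption-laden illustration rather than the paper's definitions:

```python
import numpy as np

def sle_loss(estimate, theta):
    """Squared log error loss: (ln(estimate) - ln(theta))^2.
    Both arguments must be positive; relative errors are what get penalized."""
    return (np.log(estimate) - np.log(theta)) ** 2

def mlinex_loss(estimate, theta, w=1.0, c=1.0):
    """Modified LINEX (MLINEX) loss, one common parameterization:
    w * [(estimate/theta)^c - c*ln(estimate/theta) - 1], with w > 0, c != 0."""
    r = estimate / theta
    return w * (r ** c - c * np.log(r) - 1.0)

# Asymmetry on the original scale: over- and under-estimating theta = 2 by the
# same multiplicative factor costs the same under SLE but not under MLINEX.
theta = 2.0
for est in (1.0, 4.0):  # theta/2 and 2*theta
    print(est, sle_loss(est, theta), mlinex_loss(est, theta, c=2.0))
```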

Journal: اندیشه آماری 2017

In this study, E-Bayesian estimation of the parameters of the two-parameter exponential distribution under the squared error loss function is obtained. The estimates and the efficiency of the proposed method are compared with the Bayesian estimator using Monte Carlo simulation.
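
Under squared error loss the Bayes estimator is the posterior mean, which is the building block of the comparison this abstract describes. Below is a minimal sketch assuming a simpler one-parameter exponential model with a conjugate gamma prior in place of the paper's two-parameter model and E-Bayesian machinery (the prior, hyperparameters, and sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Conjugate setup (illustrative, simpler than the paper's model):
# x_i ~ Exponential(rate=lam), prior lam ~ Gamma(a, rate=b)
#   =>  posterior lam ~ Gamma(a + n, rate=b + sum(x)).
def bayes_estimate_sel(x, a, b):
    """Bayes estimator of the rate under squared error loss = posterior mean."""
    n = len(x)
    return (a + n) / (b + np.sum(x))

lam_true, n = 2.0, 20
a, b = 2.0, 1.0
risks_bayes, risks_mle = [], []
for _ in range(5_000):
    x = rng.exponential(1.0 / lam_true, size=n)
    risks_bayes.append((bayes_estimate_sel(x, a, b) - lam_true) ** 2)
    risks_mle.append((n / np.sum(x) - lam_true) ** 2)

print("Monte Carlo risk, Bayes (SEL):", np.mean(risks_bayes))
print("Monte Carlo risk, MLE:        ", np.mean(risks_mle))
```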

Journal: جنگل و فرآورده های چوب
Zahra Ghorbani, MSc student in Forest Engineering, Faculty of Natural Resources, University of Tehran, Karaj, Iran; Meghdad Jourgholami, Associate Professor, Department of Forestry and Forest Economics, Faculty of Natural Resources, University of Tehran, Karaj, Iran

The log damage and value loss were examined as a function of felling and bucking operation, species, and damage type in the Kheyrud forest, Namkhaneh district, in the Hyrcanian forest of Iran. In order to calculate the wood value following tree felling and bucking, 250 trees and 167 logs consisting of beech and hornbeam species were measured. There were five types of damage recorded following the tree felling o...

Extended Abstract. The study of truncated parameter spaces is in general of interest for the following reasons: 1. They often occur in practice; in many cases certain parameter values can be excluded from the parameter space. Nearly all practical problems have a truncated parameter space, and it is almost impossible to argue in practice that a parameter is not bounded. In truncated parameter...

2016
Aarti Singh

In many machine learning tasks, we have data Z from some distribution p and the task is to minimize the risk R(f) = E_{Z∼p}[ℓ(f(Z), Z)], where ℓ is a loss function of interest: e.g. in classification Z = (X, Y) and we use the 0/1 loss ℓ(f(Z), Z) = 1{f(X) ≠ Y}, in regression Z = (X, Y) and we use the squared error ℓ(f(Z), Z) = (f(X) − Y)², and in density estimation Z = X and we use the negative log likeliho...
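
The three losses named in this excerpt are easy to state as code; the empirical risk is simply the sample average of the loss, and the toy data below are illustrative:

```python
import numpy as np

def zero_one_loss(pred, y):
    """0/1 loss for classification: 1{pred != y}."""
    return (pred != y).astype(float)

def squared_error(pred, y):
    """Squared error loss for regression: (pred - y)^2."""
    return (pred - y) ** 2

def neg_log_likelihood(density, x):
    """Negative log likelihood loss for density estimation: -log p_hat(x)."""
    return -np.log(density(x))

# Empirical risk = average loss over a sample, the plug-in estimate of R(f).
y_true = np.array([0, 1, 1, 0])
y_pred = np.array([0, 1, 0, 0])
print("empirical 0/1 risk:", zero_one_loss(y_pred, y_true).mean())

y = np.array([1.2, 0.7, 2.1])
f_x = np.array([1.0, 1.0, 2.0])
print("empirical squared-error risk:", squared_error(f_x, y).mean())

x = np.array([0.3, 1.5, 0.9])
gaussian = lambda t: np.exp(-0.5 * t**2) / np.sqrt(2 * np.pi)
print("empirical negative-log-likelihood risk:", neg_log_likelihood(gaussian, x).mean())
```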

Journal: Data Science Journal 2009
Gyan Prakash Singh D. C. Singh

It is now well recognized that the use of the squared error loss function (SELF) in Bayesian estimation may not be appropriate when positive and negative errors have different consequences. To overcome this difficulty, Varian (1975) and Zellner (1986) proposed an asymmetric loss function known as the LINEX loss function (LLF); its invariant form (Basu & Ebrahimi, 1991) for any parameter θ is...
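
For reference, here is a sketch of the LINEX loss and one common scale-invariant variant; the exact invariant form used by Basu & Ebrahimi may be parameterized differently, and the constants a and b below are illustrative assumptions:

```python
import numpy as np

def linex_loss(estimate, theta, a=1.0, b=1.0):
    """LINEX loss on the raw error d = estimate - theta:
    b * [exp(a*d) - a*d - 1].  With a > 0, over-estimation is
    penalized more heavily than under-estimation of the same size."""
    d = estimate - theta
    return b * (np.exp(a * d) - a * d - 1.0)

def invariant_linex_loss(estimate, theta, a=1.0, b=1.0):
    """A scale-free variant: the same shape applied to d = estimate/theta - 1."""
    d = estimate / theta - 1.0
    return b * (np.exp(a * d) - a * d - 1.0)

# Asymmetry: errors of +0.5 and -0.5 get different LINEX penalties,
# while squared error treats them identically.
theta = 2.0
for est in (theta + 0.5, theta - 0.5):
    print(est, linex_loss(est, theta, a=1.0), (est - theta) ** 2)
```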

[Chart: number of search results per year]