Search results for: squared log error loss function
Number of results: 1,839,436
Perhaps the most surprising result in Statistics arises in a remarkably simple estimation problem. Let X1, ..., Xp be independent random variables, with Xi ∼ N(θi, 1) for i = 1, ..., p. Writing X = (X1, ..., Xp), suppose we want to find a good estimator θ̂ = θ̂(X) of θ = (θ1, ..., θp). To define more precisely what is meant by a good estimator, we use the language of statistical decision theory...
Minimax estimation problems with restricted parameter spaces have received increasing interest within the last two decades. Some authors derived minimax and admissible estimators of bounded parameters under squared error loss and scale-invariant squared error loss. In some truncated estimation problems the most natural estimator to consider is the truncated version of a classic...
A loss function is a mapping l : Y × Y → R (sometimes R × R → R). For example, in binary classification the 0/1 loss function l(y, p) = I(y ≠ p) is often used, and in regression the squared error loss function l(y, p) = (y − p)² is often used. Other loss functions include the following: absolute loss, Huber loss, ε-insensitive loss, hinge loss, logistic loss, exponential loss, modified least squares...
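The definitions above translate directly into code. A minimal Python sketch of a few of the listed losses (the 0/1 and squared error forms follow the snippet; the Huber form with its `delta` threshold is the standard definition, assumed here rather than taken from this result):

```python
def zero_one_loss(y, p):
    """0/1 classification loss: l(y, p) = I(y != p)."""
    return float(y != p)

def squared_error_loss(y, p):
    """Squared error regression loss: l(y, p) = (y - p)**2."""
    return (y - p) ** 2

def absolute_loss(y, p):
    """Absolute (L1) loss: |y - p|."""
    return abs(y - p)

def huber_loss(y, p, delta=1.0):
    """Huber loss: quadratic within delta of the target, linear beyond.

    Interpolates between squared error (robustness to small errors)
    and absolute loss (robustness to outliers).
    """
    r = abs(y - p)
    return 0.5 * r ** 2 if r <= delta else delta * (r - 0.5 * delta)
```

For instance, `squared_error_loss(3.0, 1.0)` gives 4.0 while `absolute_loss(3.0, 1.0)` gives 2.0, which is why squared error penalizes large residuals more heavily.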
Abstract This paper addresses the problem of Bayesian estimation of the parameters of Erlang distribution under squared error loss function by assuming different independent informative priors as well as joint priors for both shape and scale parameters. The motivation is to explore the most appropriate prior for Erlang distribution among different priors. A comparison of the Bayes estimates and...
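Under squared error loss the Bayes estimator is the posterior mean. A minimal sketch of that fact in one concrete conjugate setup (an assumed illustration, not the paper's exact priors): Erlang data with known shape k and a Gamma(a, b) prior on the rate λ.

```python
def bayes_rate_estimate(x, k, a, b):
    """Posterior mean of the Erlang rate lam under a Gamma(a, b) prior.

    For x_1..x_n ~ Erlang(k, lam) with known shape k, the likelihood is
    proportional to lam**(n*k) * exp(-lam * sum(x)), so the posterior is
    Gamma(a + n*k, b + sum(x)). Its mean is the Bayes estimate under
    squared error loss.
    """
    n = len(x)
    return (a + n * k) / (b + sum(x))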
This paper considers the problem of estimating a high-dimensional vector of parameters θ ∈ Rⁿ from a noisy observation. The noise vector is i.i.d. Gaussian with known variance. For a squared-error loss function, the James-Stein (JS) estimator is known to dominate the simple maximum-likelihood (ML) estimator when the dimension n exceeds two. The JS-estimator shrinks the observed vector towards th...
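The dominance result described here can be checked numerically. A minimal sketch, assuming unit-variance noise and the classic form of the estimator that shrinks toward the origin (the setup and variable names are illustrative):

```python
import numpy as np

def james_stein(x):
    """James-Stein estimate of theta from x ~ N(theta, I_n), n >= 3.

    Shrinks the observation toward the origin by a data-dependent factor.
    """
    n = len(x)
    return (1.0 - (n - 2) / np.sum(x ** 2)) * x

# Monte Carlo comparison of total squared-error risk at theta = 0, n = 10.
rng = np.random.default_rng(0)
theta = np.zeros(10)
xs = rng.normal(loc=theta, scale=1.0, size=(5000, 10))
risk_ml = np.mean(np.sum((xs - theta) ** 2, axis=1))  # ML risk is n = 10
risk_js = np.mean(
    np.sum((np.array([james_stein(x) for x in xs]) - theta) ** 2, axis=1)
)
```

At θ = 0 the ML risk equals n = 10 while the JS risk drops to about 2, the most extreme case of the dominance for n > 2; at other θ the gain is smaller but still strict.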
In a subclass of the scale-parameter exponential family, we consider the sequential point estimation of a function of the scale parameter under the loss function given as the sum of the weighted squared error loss and a linear cost. For a fully sequential sampling scheme, second-order expansions are obtained for the expected sample size as well as for the regret of the procedure. The former resear...
Recent advances in the measurement of volatility have utilized high frequency intraday data to produce what are generally known as realised volatility estimates. It has been shown that forecasts generated from such estimates are of positive economic value in the context of portfolio allocation. This paper considers the link between the value of such forecasts and the loss function under which m...
Given a probability space (Ω, F, P), an F-measurable random variable X, and a sub-σ-algebra G ⊂ F, it is well known that the conditional expectation E[X|G] is the optimal L²-predictor (also known as the least mean square error predictor) of X among all the G-measurable random variables [8, 11]. In this paper, we provide necessary and sufficient conditions under which the conditional expecta...
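This optimality is easy to see in simulation. A sketch under assumed toy distributions (Z standard normal, X = Z plus independent mean-zero noise, so E[X | Z] = Z and the G-measurable predictors are the functions of Z):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
z = rng.normal(size=n)       # the information in G = sigma(Z)
x = z + rng.normal(size=n)   # X = Z + independent mean-zero noise

# Mean squared error of the conditional expectation E[X | Z] = Z ...
mse_cond = np.mean((x - z) ** 2)
# ... versus another G-measurable predictor, e.g. g(Z) = 0.5 * Z:
mse_other = np.mean((x - 0.5 * z) ** 2)
```

Here `mse_cond` is about 1.0 (the noise variance, the unavoidable error) while `mse_other` is about 1.25; any G-measurable predictor other than E[X|G] pays an extra penalty equal to its mean squared distance from the conditional expectation.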
In this paper we use Gaussian Process (GP) regression to propose a novel approach for predicting volatility of financial returns by forecasting the envelopes of the time series. We provide a direct comparison of their performance to traditional approaches such as GARCH. We compare the forecasting power of three approaches: GP regression on the absolute and squared returns; regression on the env...
A technique is presented for subband adaptive filtering with nonuniform filter banks. The bandwidth allocations of the subband analysis and synthesis filters are adapted to the spectral characteristics of the input data in such a manner as to minimize an objective function built from the subband error powers. The nonuniform filter bank structure allows for fast convergence times for high order systems ...