Asymptotic Theory of A Frequency Estimator
Abstract
Traditional methods of estimating the frequency of sinusoids from noisy data include periodogram maximization and nonlinear least squares. It is well known that these methods lead to efficient frequency estimates whose asymptotic standard error is of order O(n^{-3/2}). Because the problem is highly nonlinear in the parameters, the estimates must be computed by some form of iterative search. The presence of many local extrema requires such search algorithms to be started from an accurate initial guess; the required precision of the starting values is typically O(n^{-1}), which is not readily available. In this paper we investigate a recent and promising approach, the contraction-mapping (CM) method for frequency estimation. The CM method is based on an iterative filtering idea in which the estimated first-order autocorrelation of the filtered process contracts to a fixed point, and the CM frequency estimator is derived from this fixed point. We study the critical issue of how the accuracy of the initial guess for the fixed-point iteration controls the precision of the CM estimator, and we quantify the asymptotic relationship between the initial estimator and the CM estimator, together with the limiting distributions and almost sure convergence of the fixed points. It is shown that the CM algorithm can be started with poor initial values of precision O(1) and end with a final estimator whose precision can be made arbitrarily close to O(n^{-3/2}). Moreover, the procedure is guaranteed to converge. AMS 1991 subject classifications. Primary 62M10; Secondary 60G35, 93E12.
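The abstract describes the CM iteration only in words. The following Python sketch illustrates the idea under one common concrete choice: a narrow-band AR(2) filter whose poles lie at radius r and angle theta with cos(theta) = alpha, so the lag-1 sample autocorrelation of the filtered series is fed back as the next alpha. The filter family, the default pole radius r = 0.95, and the function name cm_frequency_estimate are illustrative assumptions, not the paper's actual specification.

```python
import numpy as np
from scipy.signal import lfilter

def cm_frequency_estimate(x, alpha0=0.0, r=0.95, n_iter=50, tol=1e-12):
    """Contraction-mapping (iterative filtering) frequency estimate: a sketch.

    At each step the data are passed through the AR(2) band-pass filter
    1 / (1 - 2*r*alpha*B + r^2*B^2), whose poles sit at r*exp(+/- i*theta)
    with cos(theta) = alpha.  The lag-1 sample autocorrelation of the
    filtered series becomes the next alpha.  Near the fixed point, alpha
    approximates cos(omega) of the dominant sinusoid (the approximation
    sharpens as r -> 1), so the frequency is recovered by arccos.
    """
    x = np.asarray(x, dtype=float)
    alpha = float(alpha0)
    for _ in range(n_iter):
        y = lfilter([1.0], [1.0, -2.0 * r * alpha, r * r], x)   # filter the data
        rho1 = np.dot(y[1:], y[:-1]) / np.dot(y, y)             # lag-1 autocorrelation
        if abs(rho1 - alpha) < tol:
            alpha = rho1
            break
        alpha = float(np.clip(rho1, -0.999, 0.999))
    return float(np.arccos(alpha))

# A crude O(1) starting value (alpha0 = 0, i.e. frequency pi/2) for a
# sinusoid at omega = 0.7 rad/sample observed in additive white noise.
rng = np.random.default_rng(0)
t = np.arange(2000)
x = np.cos(0.7 * t + 1.0) + rng.normal(scale=0.5, size=t.size)
print(cm_frequency_estimate(x))   # expected to land near 0.7
```

In the CM literature the filter is typically sharpened (r pushed toward 1) as the iteration proceeds, which is how the final estimator can approach the O(n^{-3/2}) precision quoted in the abstract; the sketch keeps r fixed for readability.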
Similar works
Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data
Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimators, in which the bandwidth is varied depending on the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...
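As a hedged illustration (not taken from the paper above, which works in the left-truncated setting), the basic k-nearest-neighbor kernel density estimator with its location-dependent bandwidth can be sketched as follows; the Gaussian kernel and the function name knn_kernel_density are assumptions made for the example.

```python
import numpy as np

def knn_kernel_density(x_grid, data, k):
    """k-nearest-neighbor kernel density estimate (illustrative sketch).

    The bandwidth at each evaluation point x is R_k(x), the distance from x
    to its k-th nearest sample point, so the smoothing adapts to the local
    concentration of the data:
        f_hat(x) = (1 / (n * R_k(x))) * sum_i K((x - X_i) / R_k(x)).
    """
    data = np.asarray(data, dtype=float)
    x_grid = np.atleast_1d(np.asarray(x_grid, dtype=float))
    gauss = lambda u: np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    est = np.empty_like(x_grid)
    for j, x in enumerate(x_grid):
        r_k = np.sort(np.abs(data - x))[k - 1]        # k-th nearest-neighbor distance
        est[j] = np.mean(gauss((x - data) / r_k)) / r_k
    return est

# Example: estimate a standard normal density from 500 draws.
rng = np.random.default_rng(1)
sample = rng.normal(size=500)
grid = np.linspace(-3, 3, 7)
print(knn_kernel_density(grid, sample, k=25))
```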
Asymptotic Behaviors of the Lorenz Curve for Left Truncated and Dependent Data
The purpose of this paper is to provide some asymptotic results for a nonparametric estimator of the Lorenz curve and Lorenz process in the case where the data are assumed to be strong mixing and subject to random left truncation. First, we show that the nonparametric estimator of the Lorenz curve is uniformly strongly consistent for the associated Lorenz curve. Also, a strong Gaussian approximation for ...
Some Asymptotic Results of Kernel Density Estimator in Length-Biased Sampling
In this paper, we prove the strong uniform consistency and asymptotic normality of the kernel density estimator proposed by Jones [12] for length-biased data. The approach is based on the invariance principle for empirical processes proved by Horváth [10]. Simulations are carried out for different cases to demonstrate both consistency and asymptotic normality, and the method is illustrated by ...
Second order optimality for estimators in time series regression models
We consider the second order asymptotic properties of an efficient frequency domain regression coefficient estimator β̂ proposed by Hannan [4]. This estimator is a semiparametric estimator based on nonparametric spectral estimators. We derive the second order Edgeworth expansion of the distribution of β̂. Then it is shown that the second order asymptotic properties are independent of the bandwidt...
Asymptotic properties of the sample mean in adaptive sequential sampling with multiple selection criteria
We extend the method of adaptive two-stage sequential sampling to include designs where more than one criterion is used in deciding on the allocation of additional sampling effort. These criteria, or conditions, can be a measure of the target population or a measure of some related population. We develop a Murthy estimator for the design that is an unbiased estimator for t...