The expectation-maximization algorithm for autoregressive models with normal inverse Gaussian innovations
Authors
Abstract
In this paper, we study the autoregressive (AR) model with normal inverse Gaussian (NIG) innovations. The NIG distribution is semi-heavy-tailed and helpful in capturing extreme observations present in data. The expectation-maximization (EM) algorithm is used to estimate the parameters of the considered AR(p) model. The efficacy of the estimation procedure is shown on simulated data for the AR(1) and AR(2) models. A comparative study is presented in which classical estimation algorithms, namely the Yule-Walker and conditional least squares methods, are incorporated along with the EM method for parameter estimation. In the simulation study, maximum likelihood estimation (MLE) by the iterative Newton-Raphson method is also compared. Real-life applications are demonstrated on the NASDAQ stock market index and US gasoline price data. The studies show that the residuals support a good fit of the model to the financial data as well.
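As an illustrative sketch of one of the classical baselines named in the abstract (not the authors' implementation, and simulated here with normal rather than NIG innovations for simplicity), conditional least squares for an AR(p) model amounts to an ordinary regression of x_t on its p lagged values:

```python
import numpy as np

def fit_ar_cls(x, p):
    """Conditional least squares for an AR(p) model:
    regress x[t] on (1, x[t-1], ..., x[t-p])."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Design matrix: intercept column plus p lagged-value columns
    X = np.column_stack([np.ones(n - p)] +
                        [x[p - k:n - k] for k in range(1, p + 1)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # (intercept, phi_1, ..., phi_p)

# Simulate a stationary AR(2) process and recover its coefficients
rng = np.random.default_rng(0)
phi1, phi2 = 0.5, -0.3
x = np.zeros(2000)
for t in range(2, len(x)):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + rng.standard_normal()
print(fit_ar_cls(x, 2))
```

With 2000 observations the estimated lag coefficients land close to the true values (0.5, -0.3); the EM, Yule-Walker, and MLE procedures of the paper target the same coefficients under the NIG innovation assumption.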
Similar resources
Inverse Gaussian Autoregressive Models
A first-order autoregressive process with inverse Gaussian marginals is introduced. The innovation distributions are obtained under certain special cases. The unknown parameters are estimated using different methods, and these estimators are shown to be consistent and asymptotically normal. The behavior of the estimators for small samples is studied through simulation experiments. On Sums of Tri...
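As a small, hedged illustration related to the snippet above (for the plain inverse Gaussian distribution itself, not the autoregressive process the paper studies), the maximum likelihood estimators of the IG(μ, λ) parameters are available in closed form: μ̂ is the sample mean and λ̂ = n / Σ(1/xᵢ − 1/μ̂):

```python
import numpy as np

def invgauss_mle(x):
    """Closed-form MLE for the inverse Gaussian IG(mu, lam):
    mu_hat = sample mean, lam_hat = n / sum(1/x_i - 1/mu_hat)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    lam = len(x) / np.sum(1.0 / x - 1.0 / mu)
    return mu, lam

# numpy's Wald sampler is the inverse Gaussian with (mean, scale) parameters
rng = np.random.default_rng(2)
sample = rng.wald(2.0, 5.0, 100_000)
print(invgauss_mle(sample))
```

On a large sample both estimates concentrate near the true (mean, scale) pair used in the simulation.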
متن کاملthe algorithm for solving the inverse numerical range problem
برد عددی ماتریس مربعی a را با w(a) نشان داده و به این صورت تعریف می کنیم w(a)={x8ax:x ?s1} ، که در آن s1 گوی واحد است. در سال 2009، راسل کاردن مساله برد عددی معکوس را به این صورت مطرح کرده است : برای نقطه z?w(a)، بردار x?s1 را به گونه ای می یابیم که z=x*ax، در این پایان نامه ، الگوریتمی برای حل مساله برد عددی معکوس ارانه می دهیم.
Gaussian Mixture Models and Expectation Maximization
The goal of the assignment is to use the Expectation Maximization (EM) algorithm to estimate the parameters of a two-component Gaussian mixture in two dimensions. This involves estimating the mean vector μk and covariance matrix Σk for both distributions as well as the mixing coefficients (or prior probabilities) πk for each component k. EM works by first choosing an arbitrary parameter set. In...
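A minimal self-contained sketch of EM for a two-component Gaussian mixture in two dimensions (a generic illustration under assumed initialization choices, not the implementation graded in the assignment above) alternates responsibility computation with weighted maximum-likelihood updates:

```python
import numpy as np

def em_gmm(X, K=2, iters=100):
    """EM for a K-component Gaussian mixture with full covariances.
    Uses a deterministic farthest-point initialization of the means."""
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                      # mixing coefficients pi_k
    mu = [X[0]]                                   # farthest-point init
    for _ in range(1, K):
        d2 = np.min([((X - m) ** 2).sum(axis=1) for m in mu], axis=0)
        mu.append(X[np.argmax(d2)])
    mu = np.array(mu)
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
    for _ in range(iters):
        # E-step: responsibilities r[i,k] proportional to
        # pi_k * N(x_i | mu_k, Sigma_k), computed in log space for stability
        logr = np.empty((n, K))
        for k in range(K):
            diff = X - mu[k]
            quad = np.einsum('ij,jk,ik->i', diff,
                             np.linalg.inv(Sigma[k]), diff)
            logdet = np.linalg.slogdet(Sigma[k])[1]
            logr[:, k] = (np.log(pi[k])
                          - 0.5 * (quad + logdet + d * np.log(2 * np.pi)))
        r = np.exp(logr - logr.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood parameter updates
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = ((r[:, k, None] * diff).T @ diff / Nk[k]
                        + 1e-6 * np.eye(d))
    return pi, mu, Sigma

# Two well-separated clusters: the recovered means approach the true centers
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([-3.0, 0.0], 0.5, (200, 2)),
               rng.normal([3.0, 0.0], 0.5, (200, 2))])
pi, mu, Sigma = em_gmm(X)
print(pi, mu)
```

The small diagonal ridge added to each covariance keeps the inversions well conditioned; the log-space responsibility computation avoids underflow when components are far apart.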
An Expectation-Maximization Algorithm for the Fractal Inverse Problem
We present an Expectation-Maximization algorithm for the fractal inverse problem: the problem of fitting a fractal model to data. In our setting the fractals are Iterated Function Systems (IFS), with similitudes as the family of transformations. The data is a point cloud in R^H with arbitrary dimension H. Each IFS defines a probability distribution on R^H, so that the fractal inverse problem can b...
An Expectation-Maximization algorithm for Learning the Latent Gaussian Model with Gaussian Likelihood
In this note, we derive an expectation-maximization (EM) algorithm for a latent Gaussian model with Gaussian likelihood. This model contains many popular models as special cases, such as factor analysis and linear regression. Our derived EM algorithm is general, and contains almost all the updates required for the special cases. We also describe a modification of the algorithm in the presence of...
Journal
Journal title: Communications in Statistics - Simulation and Computation
Year: 2023
ISSN: 0361-0918, 1532-4141
DOI: https://doi.org/10.1080/03610918.2023.2186334