Search results for: markovian chain

Number of results: 303054

Journal: Computational Statistics & Data Analysis, 2013
Brian L. Mark, Yariv Ephraim

We study properties and parameter estimation of finite-state homogeneous continuous-time bivariate Markov chains. Only one of the two processes of the bivariate Markov chain is observable. The general form of the bivariate Markov chain studied here makes no assumptions on the structure of the generator of the chain, and hence, neither the underlying process nor the observable process is necessa...
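
As a rough illustration of the setting, the sketch below simulates a finite-state continuous-time Markov chain from an assumed generator matrix Q and then keeps only one coordinate of each state, mimicking a partially observed bivariate chain; the generator values and the two-coordinate state encoding are hypothetical, and the paper's estimation procedure is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical generator of a 4-state chain viewed as pairs (hidden, observable),
# encoded as state = 2 * hidden + observable; values are illustrative only.
Q = np.array([
    [-3.0,  1.0,  1.0,  1.0],
    [ 0.5, -2.0,  0.5,  1.0],
    [ 1.0,  0.5, -2.5,  1.0],
    [ 0.5,  0.5,  1.0, -2.0],
])

def simulate_ctmc(Q, t_end, state=0):
    """Simulate a continuous-time Markov chain with generator Q up to time t_end."""
    t, path = 0.0, [(0.0, state)]
    while True:
        rate = -Q[state, state]
        t += rng.exponential(1.0 / rate)       # exponential holding time
        if t >= t_end:
            break
        probs = Q[state].copy()                # jump probabilities proportional
        probs[state] = 0.0                     # to the off-diagonal rates
        probs /= rate
        state = rng.choice(len(Q), p=probs)
        path.append((t, state))
    return path

path = simulate_ctmc(Q, t_end=10.0)
# In the partially observed setting only the second coordinate would be seen.
observable = [(t, s % 2) for t, s in path]
print(observable[:5])
```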

Journal: Mathematics, 2021

In this paper, we propose a new weak order 2.0 numerical scheme for solving stochastic differential equations with Markovian switching (SDEwMS). Using Malliavin analysis, we theoretically prove that the scheme has a local convergence rate of order 3.0. Combining this with a special property of the Markov chain, we study the effects of changes in the state space on the convergence rate of the scheme. Two numerical experiments are given to verify the theoretical results.
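
For context, the minimal sketch below simulates an SDE with Markovian switching using a plain Euler–Maruyama discretization and a first-order approximation of the regime switches; the drift/diffusion coefficients and the generator are assumptions for illustration, and this is not the paper's weak order 2.0 scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative SDEwMS: dX = a(r) X dt + b(r) X dW, with r(t) a two-state
# continuous-time Markov chain. All parameter values are hypothetical.
a = {0: -1.0, 1: 0.5}        # drift coefficient per regime
b = {0: 0.3, 1: 0.8}         # diffusion coefficient per regime
Q = np.array([[-2.0, 2.0],
              [1.0, -1.0]])  # generator of the switching chain

def euler_sdewms(x0, r0, T, n_steps):
    dt = T / n_steps
    x, r = x0, r0
    for _ in range(n_steps):
        # Switch regime with probability ~ (exit rate) * dt, a first-order approx.
        if rng.random() < -Q[r, r] * dt:
            r = 1 - r
        dW = rng.normal(0.0, np.sqrt(dt))
        x = x + a[r] * x * dt + b[r] * x * dW
    return x

# Weak-sense quantity: an expectation estimated over many independent paths.
samples = [euler_sdewms(1.0, 0, T=1.0, n_steps=200) for _ in range(2000)]
print("E[X_T] approx.:", np.mean(samples))
```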

2017
Austin R. Benson, David F. Gleich, Lek-Heng Lim

Random walks are a fundamental model in applied mathematics and are a common example of a Markov chain. The limiting stationary distribution of the Markov chain represents the fraction of the time spent in each state during the stochastic process. A standard way to compute this distribution for a random walk on a finite set of states is to compute the Perron vector of the associated transition ...
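
A minimal sketch of that standard computation, using a hypothetical 3-state row-stochastic matrix P and NumPy's eigendecomposition to extract the Perron (stationary) vector:

```python
import numpy as np

# Hypothetical row-stochastic transition matrix of a random walk on 3 states.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# The stationary distribution is the left Perron eigenvector of P (eigenvalue 1).
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi /= pi.sum()                       # normalize to a probability vector
print("stationary distribution:", pi)

# Sanity check: pi P = pi, the long-run fraction of time spent in each state.
assert np.allclose(pi @ P, pi)
```

Power iteration on P.T would give the same vector and is the usual choice for large sparse chains.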

Journal: IEEE Access, 2021

A model of a cellular communication network divided into zones is considered, in which a user's service time depends on the zone in which the user is located. The arrival flow of users is defined by a marked Markovian arrival process. The number of users that can be served simultaneously is finite. If this limit has been reached, then a new user is lost, except for users who arrive having already started service elsewhere (handover users). For short-term storage of such users, there is a buffer of finite ca...

Journal: Neural Processing Letters, 2023

In this article, input-to-state stability theory is used to investigate a stochastic Cohen–Grossberg bidirectional associative memory neural network with time-varying delay. In addition, Markovian jump parameters are considered in the model, determined by a continuous-time, discrete-state Markov chain. By utilizing a Lyapunov functional and the weak infinitesimal generator, algebraic conditions are derived for the criter...

2009
Miquel Trias, Alberto Vecchio, John Veitch

Bayesian analysis of LISA data sets based on Markov chain Monte Carlo methods has been shown to be a challenging problem, in part due to the complicated structure of the likelihood function consisting of several isolated local maxima that dramatically reduces the efficiency of the sampling techniques. Here we introduce a new fully Markovian algorithm, a Delayed Rejection Metropolis-Hastings Mark...
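
For orientation, the snippet below is a plain random-walk Metropolis–Hastings sampler on a toy bimodal target, illustrating why isolated maxima hurt mixing; the target, proposal step, and tuning are assumptions, and neither the delayed-rejection variant nor the LISA likelihood is reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy bimodal log-density standing in for a likelihood with isolated maxima.
def log_target(x):
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    x, chain = x0, []
    for _ in range(n_samples):
        proposal = x + step * rng.normal()          # symmetric random-walk proposal
        log_alpha = log_target(proposal) - log_target(x)
        if np.log(rng.random()) < log_alpha:
            x = proposal                            # accept
        chain.append(x)                             # rejection keeps the current state
    return np.array(chain)

chain = metropolis_hastings(20000, step=2.0)
print("sample mean:", chain.mean())
```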

2004
Hsing Luh

We consider a queueing model with finite capacities. External arrivals follow a Coxian distribution. Due to the limitation of the capacity, arrivals may be lost if the buffer is full. Our goal is to study the probability of blocking. In order to obtain the steady-state probability distribution of this model, we construct an embedded Markov chain at the departure points. The solution is solved ...
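
As a simplified stand-in for the blocking analysis, the sketch below estimates the blocking probability of a single-server finite-capacity queue by simulation, using exponential interarrival and service times in place of the Coxian arrivals; the rates and capacity are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def blocking_probability(arrival_rate, service_rate, capacity, n_arrivals):
    """Estimate the blocking probability of a single-server queue with a
    finite buffer via discrete-event simulation (exponential stand-ins)."""
    t, queue, next_departure = 0.0, 0, np.inf
    blocked = 0
    for _ in range(n_arrivals):
        t += rng.exponential(1.0 / arrival_rate)
        # Process all departures that occur before this arrival.
        while next_departure <= t:
            queue -= 1
            next_departure = (next_departure + rng.exponential(1.0 / service_rate)
                              if queue > 0 else np.inf)
        if queue >= capacity:
            blocked += 1                 # arrival lost: buffer full
        else:
            queue += 1
            if queue == 1:               # server was idle, start service now
                next_departure = t + rng.exponential(1.0 / service_rate)
    return blocked / n_arrivals

print("estimated blocking probability:",
      blocking_probability(arrival_rate=0.9, service_rate=1.0,
                           capacity=5, n_arrivals=200_000))
```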

1998
Yiqiang Q. Zhao

A new procedure for computing stationary probabilities for an overloaded Markovian model is proposed in terms of the rotated Markov chain. There are two advantages to using this procedure: i) This procedure allows us to approximate an overloaded finite model using a stable infinite Markov chain. This will make the study easier when the infinite model has a simpler solution. ii) Numerically, this proc...

2015
P. Vijaya Laxmi, K. Jyothsna

This paper presents the analysis of a finite buffer renewal input queue wherein the customers can decide either to join the queue with a probability or to balk. The service process is a Markovian service process (MSP) governed by an underlying m-state Markov chain. Employing the supplementary variable and embedded Markov chain techniques, the steady-state system length distributions at pre-arrival...
