Search results for: markovian process
Number of results: 1,318,363
The Petri Box Calculus (PBC) combines two well-known paradigms for the design of concurrent systems: process algebras and Petri nets. In our first proposal of sPBC (stochastic PBC) [12] we defined a Markovian extension of finite PBC, i.e., we obtained a Markovian process algebra for which both an operational semantics and a denotational semantics (based on stochastic Petri nets) were defined. Our goal in this ...
In this paper, a bond market model and the related term structure of interest rates are studied, in which the prices of zero-coupon bonds are driven by a jump-diffusion process. A criterion is derived on the deterministic forward rate volatilities under which the short rate process is Markovian. In the case that the volatilities depend on the short rate, sufficient conditions are presented for the existe...
In this paper a method, called the circulant matching method, is proposed to approximate the superposition of a number of discrete-time batch Markovian arrival sources by a circulant batch Markovian process, while matching the stationary cumulative distribution and the autocorrelation sequence of the input rate process. Special attention is paid to periodic sources. The method is applied to the superpos...
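A minimal sketch of the quantity such a matching targets, under assumed notation (not the paper's algorithm): for a rate process modulated by a circulant Markov chain, the stationary distribution is uniform and the autocorrelation sequence of the rates can be computed directly from powers of the transition matrix; the method would tune the circulant matrix so that this sequence matches the superposed input traffic. The first column and the per-state rates below are made up for illustration.

```python
import numpy as np
from scipy.linalg import circulant

def circulant_transition(first_col):
    """Circulant stochastic matrix built from its first column."""
    P = circulant(first_col)
    assert np.allclose(P.sum(axis=1), 1.0), "rows must sum to 1"
    return P

def rate_autocorrelation(P, rates, max_lag):
    """Autocorrelation of rates[X_n] for a stationary chain with matrix P.
    A circulant (doubly stochastic) chain has the uniform stationary law."""
    n = len(rates)
    pi = np.full(n, 1.0 / n)              # stationary distribution
    mean = pi @ rates
    var = pi @ (rates - mean) ** 2
    acf, Pk = [], np.eye(n)
    for _ in range(max_lag + 1):
        cov = (pi * rates) @ Pk @ rates - mean ** 2
        acf.append(cov / var)
        Pk = Pk @ P
    return np.array(acf)

# Toy example: 4-state circulant chain with hypothetical per-state batch rates.
P = circulant_transition(np.array([0.7, 0.1, 0.1, 0.1]))
rates = np.array([0.0, 1.0, 2.0, 3.0])
print(rate_autocorrelation(P, rates, max_lag=5))
```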
Markovian process calculi constitute a useful framework for reasoning about the functional and performance aspects of concurrent systems. This is achieved by means of behavioral equivalences that take into account both the action names and their exponentially distributed durations. A notable extension to the expressiveness of Markovian process calculi derives from the adoption of GSPN-like immed...
We consider a Markovian regime-switching risk model (also called the Markov-modulated risk model) with stochastic premium income, in which the premium income and the claim occurrence are driven by the Markovian regime-switching process. The purpose of this paper is to study the integral equations satisfied by the expected discounted penalty function. In particular, the discount interest force p...
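As a rough numerical companion to the quantity studied here, the following Monte Carlo sketch estimates an expected discounted penalty in a two-regime model where the same Markov chain modulates both a Poisson premium-income stream and a Poisson claim stream. All rates, claim and premium sizes, the penalty function w, and the finite horizon are hypothetical; the paper itself derives integral equations rather than simulating.

```python
import numpy as np

rng = np.random.default_rng(2)
Q = np.array([[-0.5, 0.5], [1.0, -1.0]])   # regime-switching generator (assumed)
prem_rate = [2.0, 1.0]                     # premium-arrival intensity per regime
claim_rate = [0.8, 1.2]                    # claim-arrival intensity per regime
delta, horizon = 0.03, 100.0               # discount force and truncation horizon

def discounted_penalty(u, n_paths=2_000, w=lambda before, deficit: 1.0):
    """Monte Carlo estimate of E[e^{-delta*T} w(surplus before ruin, deficit)]."""
    total = 0.0
    for _ in range(n_paths):
        t, x, regime = 0.0, u, 0
        while t < horizon:
            rates = [-Q[regime, regime], prem_rate[regime], claim_rate[regime]]
            t += rng.exponential(1.0 / sum(rates))
            event = rng.choice(3, p=np.array(rates) / sum(rates))
            if event == 0:
                regime = 1 - regime                       # regime switch
            elif event == 1:
                x += rng.exponential(1.0)                 # stochastic premium income
            else:
                before, x = x, x - rng.exponential(1.5)   # claim
                if x < 0:                                 # ruin occurs
                    total += np.exp(-delta * t) * w(before, -x)
                    break
    return total / n_paths

print(discounted_penalty(u=5.0))   # w = 1 gives the discounted probability of ruin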
Under certain conditions, the state space of a homogeneous Markov process can be partitioned to construct an aggregated Markovian process. However, the verification of these conditions requires expensive computations. In this note, we present a necessary condition for having a Markovian aggregated process. This condition is based on properties of the eigenvalues of certain submatrices of the tr...
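For context, the expensive direct verification alluded to above is, in the discrete-time case, the classical strong-lumpability check: an aggregation is Markovian for every initial law exactly when, within each block of the partition, all states have the same total transition probability into every block. The sketch below implements that direct check, not the eigenvalue-based necessary condition of the note; the matrix and partition are made up for illustration.

```python
import numpy as np

def is_strongly_lumpable(P, partition, tol=1e-10):
    """P: (n, n) stochastic matrix; partition: list of lists of state indices."""
    for block in partition:
        for target in partition:
            # Probability of jumping from each state of `block` into `target`.
            row_sums = P[np.ix_(block, target)].sum(axis=1)
            if not np.allclose(row_sums, row_sums[0], atol=tol):
                return False
    return True

P = np.array([[0.5, 0.3, 0.1, 0.1],
              [0.3, 0.5, 0.1, 0.1],
              [0.2, 0.2, 0.4, 0.2],
              [0.2, 0.2, 0.2, 0.4]])
print(is_strongly_lumpable(P, [[0, 1], [2, 3]]))   # True for this example
```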
We study the evolution of entanglement and nonlocality of a non-interacting qubit-qutrit system under the effect of random telegraph noise (RTN) in independent and common environments, in Markovian and non-Markovian regimes. We investigate the dynamics of the qubit-qutrit system for different initial states. Such systems could exist in distant astronomical objects. A monotone decay of the nonlocalit...
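A simplified numerical sketch of the kind of dynamics studied here, under assumptions not taken from the paper: RTN dephasing acting only on the qubit of a maximally entangled qubit-qutrit pair, with the entanglement negativity evaluated at a fixed time. The coupling, switching rates, initial state, and the fast/slow labels are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)

def rtn_dephasing(nu, gamma, t_max, n_steps=2000, n_traj=4000):
    """Average phase factor E[exp(i*nu*integral of c)] for telegraph noise
    c(t) = +-1 that flips with rate gamma, estimated by Monte Carlo."""
    dt = t_max / n_steps
    c = rng.choice([-1.0, 1.0], size=n_traj)
    phase = np.zeros(n_traj)
    D = np.empty(n_steps + 1, dtype=complex)
    D[0] = 1.0
    for k in range(1, n_steps + 1):
        flips = rng.random(n_traj) < gamma * dt     # flip probability per step
        c = np.where(flips, -c, c)
        phase += nu * c * dt
        D[k] = np.mean(np.exp(1j * phase))
    return D

def negativity(D):
    """Negativity of the (|0,0> + |1,1>)/sqrt(2) state in the 2x3 space,
    with its coherence scaled by the dephasing factor D."""
    rho = np.zeros((6, 6), dtype=complex)
    i00, i11 = 0, 4                      # index of |q, level> is 3*q + level
    rho[i00, i00] = rho[i11, i11] = 0.5
    rho[i00, i11] = 0.5 * D
    rho[i11, i00] = 0.5 * np.conj(D)
    # Partial transpose over the qubit (swap the 2x2 block indices).
    pt = rho.reshape(2, 3, 2, 3).transpose(2, 1, 0, 3).reshape(6, 6)
    eig = np.linalg.eigvalsh(pt)
    return float(np.abs(eig[eig < 0]).sum())

t_max = 10.0
for label, gamma in [("fast RTN (Markovian-like)", 10.0),
                     ("slow RTN (non-Markovian-like)", 0.1)]:
    D = rtn_dephasing(nu=1.0, gamma=gamma, t_max=t_max)
    print(label, "negativity at t =", t_max, ":", round(negativity(D[-1]), 4))
```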
We introduce a new approach to distribution fitting, called Decision on Beliefs (DOB). The objective is to identify the probability distribution function (PDF) of a random variable X with the greatest possible confidence. It is known that f_X is a member of S = {f_1, ..., f_m}. To reach this goal and select f_X from this set, we utilize stochastic dynamic programming and formulate this proble...
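A heavily simplified sketch of the selection problem behind this idea: maintain a belief over the candidate set S = {f_1, ..., f_m}, update it with each new observation of X, and stop once one candidate reaches a confidence target. The stochastic-dynamic-programming stopping rule of the paper is replaced here by a fixed threshold, and the candidate set and data are hypothetical.

```python
import numpy as np
from scipy import stats

candidates = {                      # S = {f_1, ..., f_m}
    "exponential(1)": stats.expon(scale=1.0),
    "gamma(2, 0.5)":  stats.gamma(a=2.0, scale=0.5),
    "lognormal(0,1)": stats.lognorm(s=1.0),
}

def decide_on_beliefs(samples, threshold=0.99):
    names = list(candidates)
    belief = np.full(len(names), 1.0 / len(names))    # uniform prior over S
    for x in samples:
        likes = np.array([candidates[n].pdf(x) for n in names])
        belief = belief * likes
        belief /= belief.sum()                        # Bayes update of beliefs
        if belief.max() >= threshold:
            break
    return names[int(belief.argmax())], belief

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=0.5, size=500)      # true f_X is gamma(2, 0.5)
print(decide_on_beliefs(data))
```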
Interest rates have traditionally been modeled in the literature as following continuous-time Markov processes, and more specifically diffusions. By contrast, recent term structure models often imply non-Markovian continuous-time dynamics. Can discretely sampled interest rate data help decide which continuous-time models are sensible? First, how reasonable is the Markovian assumption? A test of...
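An informal sketch of the kind of diagnostic such a test builds on: under the Markovian assumption, the two-step transition law of the discretized series should agree with the Chapman-Kolmogorov composition of the one-step law (P2 approximately equal to P1 @ P1). This is only an illustrative check on binned, simulated data, not the paper's formal test statistic.

```python
import numpy as np

def empirical_transition(states, n_bins, step):
    """Row-normalised transition counts between observations `step` apart."""
    counts = np.zeros((n_bins, n_bins))
    for i, j in zip(states[:-step], states[step:]):
        counts[i, j] += 1
    return counts / counts.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
# Hypothetical short-rate proxy: a discretised mean-reverting AR(1) series.
r = np.zeros(20_000)
for t in range(1, len(r)):
    r[t] = 0.97 * r[t - 1] + 0.03 * rng.standard_normal()
bins = np.quantile(r, np.linspace(0, 1, 6)[1:-1])
states = np.digitize(r, bins)                 # 5 occupancy bins

P1 = empirical_transition(states, 5, step=1)
P2 = empirical_transition(states, 5, step=2)
print("max |P2 - P1@P1| =", np.abs(P2 - P1 @ P1).max())
```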
Stochastic differential games are considered in a non-Markovian setting. Typically, in stochastic differential games the modulating process of the diffusion equation describing the state flow is taken to be Markovian. Then Nash equilibria or other types of solution such as Pareto equilibria are constructed using Hamilton-Jacobi-Bellman (HJB) equations. But in a non-Markovian setting the HJB met...