Arrival probability in the stochastic networks with an established discrete time Markov chain
Authors
Abstract:
The probable lack of some arcs and nodes in stochastic networks is considered in this paper, and its effect is shown as the arrival probability from a given source node to a given sink node. A discrete-time Markov chain with an absorbing state is established in a directed acyclic network, and the probability of transition from the initial state to the absorbing state is computed. Wait states are assumed wherever two nodes have a physical connection but no immediate communication. The numerical results show that the proposed method detects the critical nodes and arcs, and it can be used to anticipate probable congestion in communication and transportation networks.
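To make the construction above concrete, the sketch below sets up a small absorbing discrete-time Markov chain on a directed acyclic network and computes the probability of eventually reaching the sink from the source with the standard absorbing-chain formula B = (I - Q)^(-1) R. The network, the transition probabilities, the self-loop "wait" states, and the extra absorbing "lost" state standing in for a missing arc or node are all assumptions made for illustration, not the paper's data.

```python
import numpy as np

# Assumed 5-state chain: 0 = source, 1 and 2 = intermediate nodes,
# 3 = sink (absorbing), 4 = "lost" (absorbing), standing in for the
# effect of a missing arc or node. Self-loops on transient states model
# the "wait" states (physical connection, no immediate communication).
P = np.array([
    #   0     1     2     3     4
    [0.10, 0.45, 0.35, 0.00, 0.10],  # source
    [0.00, 0.10, 0.30, 0.50, 0.10],  # node 1
    [0.00, 0.00, 0.20, 0.70, 0.10],  # node 2
    [0.00, 0.00, 0.00, 1.00, 0.00],  # sink
    [0.00, 0.00, 0.00, 0.00, 1.00],  # lost
])

transient = [0, 1, 2]
sink = 3

# Standard absorbing-chain computation: B = (I - Q)^(-1) R gives, for each
# transient state, the probability of eventually being absorbed in the sink.
Q = P[np.ix_(transient, transient)]
R = P[np.ix_(transient, [sink])]
B = np.linalg.solve(np.eye(len(transient)) - Q, R)

print("arrival probability from source to sink:", B[0, 0])
```

Here B[0, 0] plays the role of the arrival probability; in a network with no missing arcs or nodes the "lost" state would receive no mass and this value would be one.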
Similar articles
The critical node problem in stochastic networks with discrete-time Markov chain
The length of the stochastic shortest path is defined as the arrival probability from a source node to a destination node. The uncertainty of the network topology causes unstable connections between nodes. A discrete-time Markov chain is devised according to the uniform distribution of existing arcs where the arrival probability is computed as a finite transition probability from the initial st...
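As a rough illustration of ranking nodes by their effect on the arrival probability (an assumed reading of the critical-node idea, not the cited paper's algorithm), the sketch below removes one intermediate node at a time, redirects the flow into it to the absorbing "lost" state, and measures how much the arrival probability drops; the node with the largest drop is the most critical. The five-state chain is the same assumed example as in the sketch further above.

```python
import numpy as np

def arrival_probability(P, transient, sink, source=0):
    """Absorption probability into `sink` starting from `source` for an
    absorbing discrete-time Markov chain with transition matrix P."""
    Q = P[np.ix_(transient, transient)]
    R = P[np.ix_(transient, [sink])]
    B = np.linalg.solve(np.eye(len(transient)) - Q, R)
    return B[transient.index(source), 0]

# Assumed chain: source 0, intermediate nodes 1 and 2, sink 3, "lost" 4.
P = np.array([
    [0.10, 0.45, 0.35, 0.00, 0.10],
    [0.00, 0.10, 0.30, 0.50, 0.10],
    [0.00, 0.00, 0.20, 0.70, 0.10],
    [0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
transient, sink, lost = [0, 1, 2], 3, 4

baseline = arrival_probability(P, transient, sink)
for node in (1, 2):
    P_cut = P.copy()
    P_cut[:, lost] += P_cut[:, node]   # redirect flow into the node to "lost"
    P_cut[:, node] = 0.0
    P_cut[node, :] = 0.0
    P_cut[node, lost] = 1.0            # the removed node itself is lost
    drop = baseline - arrival_probability(P_cut, transient, sink)
    print(f"removing node {node} lowers the arrival probability by {drop:.3f}")
```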
Discrete Time Markov Chain (DTMC)
A. A stochastic process is a collection of random variables {X_t, t ∈ T}.
B. A sample path or realization of a stochastic process is the collection of values assumed by the random variables in one realization of the random process, e.g.
C. The state space is the collection of all possible values the random variables can take on, i.e. it is the sample space of the random variables. For example...
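A minimal concrete instance of these definitions (the two-state transition matrix is assumed): the state space is {0, 1}, and the loop below generates one sample path, i.e. one realization of the collection {X_t}.

```python
import numpy as np

# Assumed two-state DTMC on the state space {0, 1}.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

rng = np.random.default_rng(0)
x = 0                          # X_0
path = [x]
for t in range(10):            # X_1, ..., X_10 of one realization
    x = rng.choice(2, p=P[x])  # next state drawn from row X_t of P
    path.append(int(x))

print("sample path:", path)
```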
analysis of ruin probability for insurance companies using markov chain
In this thesis we show how the Sparre Andersen insurance risk model can be formulated using Markov chains. Then, using matrix-analytic methods, we compute the ruin probability, the surplus at the time of ruin, and the deficit when ruin occurs. Our aim in this thesis is considerably more computational and practical than the methods previously proposed for computing this probability. Initially, we show...
Optimal route choice in stochastic time-varying transportation networks considering on-time arrival probability
This paper studies the problem of finding a priori optimal paths that guarantee the maximum likelihood of arriving on time in a stochastic time-varying transportation network. The reliable path can help travelers better plan their trips by measuring the risk of being late under uncertain conditions. We first identify a set of mathematical relationships between the on-time arrival probability, mean and var...
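To make the relationship between the on-time arrival probability and the mean and variance of path travel time concrete, here is a small sketch under an assumed normal approximation of the total path travel time; the cited paper's actual distributional model may differ, and the deadline and the two hypothetical paths are made up for illustration.

```python
from math import erf, sqrt

def on_time_probability(mean, variance, deadline):
    """P(travel time <= deadline), assuming the path travel time is
    normally distributed (an approximation made only for this sketch)."""
    z = (deadline - mean) / sqrt(variance)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Two hypothetical paths against a 60-minute deadline:
# a faster but riskier path versus a slower but more reliable one.
print(on_time_probability(mean=50.0, variance=150.0, deadline=60.0))  # ~0.79
print(on_time_probability(mean=55.0, variance=16.0, deadline=60.0))   # ~0.89
```

With these numbers the slower but less variable path is the more reliable choice, which is the kind of trade-off a reliability-based route criterion captures.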
Discrete Time Stochastic Networks
In this section we study the relation between throughput and queue-length distributions in discrete-time closed cyclic networks when the number of customers and the number of nodes grow unboundedly. To state our findings more precisely: For each positive integer N, consider a cyclic network with n(N) nodes and k(N) customers. We suppose that there are only finitely many types of nodes. That i...
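A toy simulation loosely in the spirit of this setting (all parameter values are assumed): a closed cyclic network of n single-server nodes with k customers and geometric service times, used to estimate the throughput and the mean queue lengths.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, q, slots = 5, 3, 0.5, 100_000   # nodes, customers, service prob., slots

queue = np.zeros(n, dtype=int)
queue[0] = k                          # all customers start at node 0
queue_sum = np.zeros(n)               # running sums for mean queue lengths
completions_at_0 = 0                  # throughput counter at node 0

for _ in range(slots):
    # Each busy node finishes its current service with probability q ...
    done = (queue > 0) & (rng.random(n) < q)
    # ... and the finished customer moves on to the next node of the cycle.
    queue[done] -= 1
    queue[np.roll(done, 1)] += 1      # successor of node i is (i + 1) mod n
    completions_at_0 += int(done[0])
    queue_sum += queue

print("estimated throughput at node 0 (per slot):", completions_at_0 / slots)
print("estimated mean queue lengths:", queue_sum / slots)
```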
Journal title
Volume 2, Issue 1
Pages 74–89
Publication date: 2014-05-01