Optimizing Red Blood Cells Consumption Using Markov Decision Process
Authors
Abstract:
In healthcare systems, one of the important classes of decisions concerns perishable products such as red blood cell (RBC) units, whose consumption management across periods can contribute greatly to the optimality of the system. The main goal of this paper is to enhance the ability of the medical community to organize the consumption of RBC units so that unit orders are delivered on time, with a focus on minimizing the total cost of the system. In each medical center, such as a hospital or clinic, decision makers consider a one-day period for their policy making about the supply and demand of RBCs: based on the inventory status of the previous day, decisions are made for the following day. We use a Markov decision process (MDP) as a sequential decision-making approach for the blood inventory problem, considering red blood cell consumption. The proposed MDP model for RBC consumption management is solved using a sequential approximation algorithm. We perform a case study of the proposed model using blood consumption data from Zanjan, Iran, and discuss the results for several blood types. In terms of the total cost of the system, the LIFO-LIFO policy is the best RBC consumption policy among all the policies considered. To analyze the importance of some of the model parameters, a sensitivity analysis is performed on the shortage cost.
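To make the solution approach concrete, the sketch below implements value iteration, the successive-approximation scheme commonly used for such models, on a toy perishable-inventory MDP with a FIFO or LIFO issuing rule. It is an illustrative sketch, not the authors' implementation: the shelf life, capacities, cost coefficients, and Poisson demand parameters are assumptions chosen only to keep the state space small.

```python
# Illustrative sketch only (not the paper's model): value iteration for a tiny
# perishable blood-inventory MDP. Shelf life, capacities, costs and the demand
# distribution are assumed values, not data from the study.
import itertools
from math import exp, factorial

SHELF_LIFE = 2          # units survive two one-day periods
MAX_PER_AGE = 3         # at most 3 units of each remaining age on hand
MAX_ORDER = 3           # daily order quantity 0..3
GAMMA = 0.95            # discount factor
ORDER_C, HOLD_C, SHORT_C, OUTDATE_C = 2.0, 0.1, 10.0, 5.0
MEAN_DEMAND = 2.0       # Poisson daily demand, truncated for the sketch

def demand_pmf(max_d=8):
    probs = [exp(-MEAN_DEMAND) * MEAN_DEMAND**d / factorial(d) for d in range(max_d)]
    probs.append(1.0 - sum(probs))          # lump the tail into the last value
    return list(enumerate(probs))

def issue(inv, demand, policy):
    """Issue units against demand; inv is ordered (oldest, ..., freshest)."""
    inv = list(inv)
    ages = range(len(inv)) if policy == "FIFO" else reversed(range(len(inv)))
    for a in ages:
        used = min(inv[a], demand)
        inv[a] -= used
        demand -= used
    return tuple(inv), demand               # leftover demand = shortage

def step(state, order, demand, policy):
    """One day: receive the order as freshest stock, meet demand, age inventory."""
    inv = state[:-1] + (min(state[-1] + order, MAX_PER_AGE),)
    inv, shortage = issue(inv, demand, policy)
    outdated = inv[0]                       # units in the oldest slot expire tonight
    next_state = inv[1:] + (0,)             # everything else ages by one day
    cost = (ORDER_C * order + HOLD_C * sum(inv)
            + SHORT_C * shortage + OUTDATE_C * outdated)
    return next_state, cost

def value_iteration(policy="FIFO", tol=1e-6):
    states = list(itertools.product(range(MAX_PER_AGE + 1), repeat=SHELF_LIFE))
    pmf = demand_pmf()
    V = {s: 0.0 for s in states}
    while True:
        V_new, order_rule = {}, {}
        for s in states:
            best = None
            for a in range(MAX_ORDER + 1):
                q = 0.0
                for d, p in pmf:
                    s2, c = step(s, a, d, policy)
                    q += p * (c + GAMMA * V[s2])
                if best is None or q < best[0]:
                    best = (q, a)
            V_new[s], order_rule[s] = best
        if max(abs(V_new[s] - V[s]) for s in states) < tol:
            return V_new, order_rule
        V = V_new

if __name__ == "__main__":
    for policy in ("FIFO", "LIFO"):
        V, rule = value_iteration(policy)
        print(policy, "expected discounted cost from empty stock:", round(V[(0, 0)], 2))
```

Running the script prints the expected discounted cost from an empty inventory under each issuing rule; the model in the paper additionally distinguishes blood types and is driven by the real consumption data from Zanjan.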
similar resources
Modeling Healthcare Data using Markov Decision Process
Outline: Objective; Background (stochastic tools used in healthcare, MDP in healthcare); Preliminaries; Optimality Equations and the Principle of Optimality; Solving MDPs; Examples; References. Objective: to discuss the construction and evaluation of the Markov Decision Process (MDP), to investigate the role of MDPs in healthcare, and to identify the most appropriate solution techniques for finite and infinite-hor...
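As a point of reference for the "Optimality Equations and the Principle of Optimality" item in this outline, the standard infinite-horizon discounted optimality equation can be written as follows (a textbook form, not quoted from the cited resource):

```latex
V^{*}(s) \;=\; \max_{a \in A(s)} \Big\{ r(s,a) \;+\; \gamma \sum_{s' \in S} p(s' \mid s, a)\, V^{*}(s') \Big\},
\qquad s \in S,\; 0 \le \gamma < 1 .
```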
Quantile Markov Decision Process
In this paper, we consider the problem of optimizing the quantiles of the cumulative rewards of Markov Decision Processes (MDPs), which we refer to as Quantile Markov Decision Processes (QMDPs). Traditionally, the goal of a Markov Decision Process (MDP) is to maximize the expected cumulative reward over a defined (possibly infinite) horizon. In many applications, however, a decision maker may ...
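To illustrate the contrast drawn above: a standard MDP maximizes the expected (discounted) cumulative reward, whereas a quantile formulation optimizes a quantile of the return distribution. The notation below is an illustrative rendering, not necessarily the exact formulation used in that paper:

```latex
\text{standard objective:}\quad \max_{\pi}\; \mathbb{E}^{\pi}\!\left[\sum_{t=0}^{T} \gamma^{t} r_{t}\right]
\qquad
\text{quantile objective:}\quad \max_{\pi}\; q_{\alpha}^{\pi}\!\left(\sum_{t=0}^{T} \gamma^{t} r_{t}\right),
\qquad
q_{\alpha}^{\pi}(X) = \inf\{x : \mathbb{P}^{\pi}(X \le x) \ge \alpha\}.
```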
Regulation of nitric oxide consumption by hypoxic red blood cells.
The homeostasis of nitric oxide (NO) is attained through a balance between its production and consumption. Shifts in NO bioavailability have been linked to a variety of diseases. Although the regulation of NO production has been well documented, its consumption is largely thought to be unregulated. Here, we have demonstrated that under hypoxic conditions, NO accelerates its own consumption by i...
A generalized Markov decision process
In this paper we present a generalized Markov decision process that subsumes the traditional discounted, infinite-horizon, finite state and action Markov decision process, Veinott's discounted decision processes, and Koehler's generalization of these two problem classes. ...
Using Markov Decision Process for Construction Site Management in Cameroon
Building construction sites are enclosed and restricted environments with minimal space to maneuver objects. Thus the organisational management of cost control, time and quality of construction projects is a huge challenge. Methods of project management commonly used by companies in developing countries are essentially planning schedules. Often, these methods are based on the assumption that co...
full textMy Resources
Journal title
Volume 4, Issue 2
Pages 113-132
Publication date: 2019-05-01