Search results for: stochastic dynamic programming
Number of results: 805,092
OF THE DISSERTATION Scheduling Policy Design using Stochastic Dynamic Programming by Robert Glaubius Doctor of Philosophy in Computer Science Washington University in St. Louis, 2009 Research Advisor: Professor William D. Smart Scheduling policies for open soft real-time systems must be able to balance the competing concerns of meeting their objectives under exceptional conditions while achievi...
Stochastic dynamic programming models are extensively used for sequential decision making when outcomes are uncertain. These models have been widely applied in different business contexts such as inventory control, capacity expansion, cash management, etc. The objective in these models is to deduce optimal policies based on expected reward criteria. However, in many cases, managers are concerne...
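The expected-reward objective described above can be illustrated with a minimal backward-induction sketch for a finite-horizon inventory-control instance. All specifics (stock limit, demand distribution, prices, and costs) are illustrative assumptions, not taken from the paper:

```python
# Finite-horizon stochastic dynamic program for inventory control.
# States: stock levels 0..MAX_STOCK; actions: order quantities;
# demand is random. All numbers below are illustrative assumptions.
MAX_STOCK = 5
HORIZON = 3
DEMAND = {0: 0.2, 1: 0.5, 2: 0.3}          # assumed demand distribution
PRICE, ORDER_COST, HOLD_COST = 4.0, 2.0, 0.5

def step(stock, order, demand):
    """One-period reward and next stock level."""
    sold = min(stock + order, demand)
    held = stock + order - sold
    reward = PRICE * sold - ORDER_COST * order - HOLD_COST * held
    return reward, held

def backward_induction():
    V = {s: 0.0 for s in range(MAX_STOCK + 1)}   # terminal values
    policy = []
    for _ in range(HORIZON):
        newV, pi = {}, {}
        for s in range(MAX_STOCK + 1):
            best, best_a = float("-inf"), 0
            for a in range(MAX_STOCK - s + 1):   # keep stock within capacity
                q = sum(p * (lambda rs: rs[0] + V[rs[1]])(step(s, a, d))
                        for d, p in DEMAND.items())
                if q > best:
                    best, best_a = q, a
            newV[s], pi[s] = best, best_a
        V = newV
        policy.insert(0, pi)                      # policy[t] maps state -> order
    return V, policy

V, policy = backward_induction()
```

The expected-reward criterion appears in the one-step lookahead: each action is scored by its expected immediate reward plus the expected value-to-go under the demand distribution.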
We develop an exact dynamic programming algorithm for partially observable stochastic games (POSGs). The algorithm is a synthesis of dynamic programming for partially observable Markov decision processes (POMDPs) and iterative elimination of dominated strategies in normal form games. We prove that it iteratively eliminates very weakly dominated strategies without first forming the normal form r...
A finite element method for stochastic dynamic programming is developed. The computational method is valid for a general class of optimal control problems that are nonlinear and perturbed by general Markov noise in continuous time, including jump Poisson noise. Stability and convergence of the method are verified and its storage utilization efficiency over the traditional finite difference meth...
Resource allocation and management is an important part of the future network-based defence. In order to provide an adequate situation picture for commanders in the field, sensor platforms must be guided appropriately and the needs of different users must be prioritized correctly. Other important problems which require resource allocation include determining where soldiers ...
Proper investment decision making is key to success for every investor in their efforts to keep pace with the competitive business environment. Mitigation of exposure to risk plays a vital role, since investors are now directly exposed to the uncertain decision environment. The uncertainty (and risk) of an investment increases with the growing number of competing investors entering the mar...
Markov decision processes (MDPs) have proven to be popular models for decision-theoretic planning, but standard dynamic programming algorithms for solving MDPs rely on explicit, state-based specifications and computations. To alleviate the combinatorial problems associated with such methods, we propose new representational and computational techniques for MDPs that exploit certain types of prob...
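The explicit, state-based dynamic programming that this abstract contrasts against can be sketched as tabular value iteration over an enumerated state space. The two-state MDP below is a hand-written illustration, not an example from the paper:

```python
# Tabular value iteration on an explicit MDP -- the state-based baseline
# whose combinatorial cost structured representations aim to avoid.
# The two-state transition/reward tables are illustrative assumptions.
GAMMA = 0.9
STATES = [0, 1]
ACTIONS = ["stay", "go"]
# P[(s, a)] -> list of (next_state, probability); R[(s, a)] -> reward
P = {
    (0, "stay"): [(0, 1.0)],
    (0, "go"):   [(1, 0.9), (0, 0.1)],
    (1, "stay"): [(1, 1.0)],
    (1, "go"):   [(0, 1.0)],
}
R = {(0, "stay"): 0.0, (0, "go"): 0.0, (1, "stay"): 1.0, (1, "go"): 0.0}

def value_iteration(tol=1e-9):
    """Iterate the Bellman optimality operator to convergence."""
    V = {s: 0.0 for s in STATES}
    while True:
        newV = {
            s: max(R[(s, a)] + GAMMA * sum(p * V[s2] for s2, p in P[(s, a)])
                   for a in ACTIONS)
            for s in STATES
        }
        if max(abs(newV[s] - V[s]) for s in STATES) < tol:
            return newV
        V = newV

V = value_iteration()
```

Note that both the tables and the Bellman backup enumerate every state explicitly; factored or otherwise structured techniques replace these flat tables with compact representations.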
Excerpted Section A. MARKOV CHAIN APPROXIMATION Another approach to finite differences is the well developed Markov Chain Approximation (MCA) of Kushner [3, 4]. Recent developments are surveyed and further advanced by Kushner [5], and by Kushner and Dupuis [6], with special attention to methods for jump and reflected diffusions. This method applies a Markov chain approximation to continuous tim...
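The Markov chain approximation idea can be sketched in one dimension: for a diffusion dX = b(X) dt + sigma(X) dW, the standard upwind construction yields a grid chain whose one-step conditional mean and variance locally match the drift and diffusion. The drift and diffusion functions below are assumed examples, not taken from Kushner's texts:

```python
# Sketch of a Markov chain approximation (MCA) for a 1D diffusion
# dX = b(X) dt + sigma(X) dW on a uniform grid of spacing h.
def mca_transition(x, h, b, sigma):
    """Locally consistent up/down transition probabilities at grid point x."""
    bx, s2 = b(x), sigma(x) ** 2
    Q = s2 + h * abs(bx)                        # normalizer
    p_up = (s2 / 2 + h * max(bx, 0.0)) / Q      # move to x + h
    p_down = (s2 / 2 + h * max(-bx, 0.0)) / Q   # move to x - h
    dt = h * h / Q                              # interpolation time step
    return p_up, p_down, dt

# Local consistency check: the chain's conditional mean matches b(x)*dt
# exactly, and its variance matches sigma(x)^2*dt up to O(h).
b = lambda x: -x          # mean-reverting drift (assumed example)
sigma = lambda x: 1.0     # constant diffusion (assumed example)
h = 0.1
p_up, p_down, dt = mca_transition(0.5, h, b, sigma)
mean = h * (p_up - p_down)                  # approx b(x) * dt
var = h * h * (p_up + p_down) - mean ** 2   # approx sigma(x)^2 * dt
```

Once such a chain is built at every grid point, the continuous-time control problem reduces to a discrete Markov decision problem solvable by standard dynamic programming, which is the core of the MCA method.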
We provide an American version of the Geometric Dynamic Programming Principle of Soner and Touzi [22] for stochastic target problems. This opens the doors to a wide range of applications, particularly in risk control in finance and insurance, in which a controlled stochastic process has to be maintained in a given set on a time interval [0, T ]. As an example of application, we show how it can ...