Stochastic modeling for inventory and production planning in the paper industry
Author(s) - Karen K. Yin, G. George Yin, Hu Liu
Publication year - 2004
Publication title - AIChE Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.958
H-Index - 167
eISSN - 1547-5905
pISSN - 0001-1541
DOI - 10.1002/aic.10251
Subject(s) - Markov decision process, mathematical optimization, randomness, Markov chain, Markov process, production planning, curse of dimensionality, Hamilton–Jacobi–Bellman equation, production (economics), computer science, Bellman equation, operations research, mathematics, economics, artificial intelligence, statistics, machine learning, macroeconomics
Problem formulations and solution procedures for production planning and inventory management in manufacturing systems under uncertainty are discussed. Markov decision processes and controlled Markovian dynamic systems are used in the models. Considering an inventory problem in discrete time and formulating it by a finite‐state Markov chain lead to a Markov decision process model. Using the policy‐improvement algorithm yields the optimal inventory policy. In controlled dynamic system modeling, the random demand and capacity processes involved in planning are described by two finite‐state continuous‐time Markov chains. Such an approach enables us to embed the randomness in the differential equations of the system. The optimal production rates that minimize an expected cost are obtained by numerically solving the corresponding Hamilton–Jacobi–Bellman (HJB) equations. To overcome the so‐called curse of dimensionality, frequently encountered in computation, we resort to a hierarchical approach. Illustrative examples using data collected from a large paper manufacturer are provided. © 2004 American Institute of Chemical Engineers AIChE J, 50: 2877–2890, 2004
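
For the discrete-time formulation, the route described in the abstract is: model the inventory as a finite-state Markov chain, cast it as a Markov decision process, and obtain the optimal policy with the policy-improvement (policy-iteration) algorithm. The sketch below is only a minimal illustration of that procedure, not the authors' model: it assumes a hypothetical single-item inventory with Poisson demand, order-up-to actions, and made-up holding, ordering, and shortage costs.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical single-item inventory MDP (illustrative numbers, not taken from the paper).
MAX_STOCK = 20                       # on-hand inventory states 0..MAX_STOCK
DEMAND_MEAN = 4.0                    # Poisson demand per period (assumed)
HOLD, ORDER, SHORT = 1.0, 2.0, 10.0  # per-unit holding, ordering, shortage costs (assumed)
GAMMA = 0.95                         # one-period discount factor

N = MAX_STOCK + 1
pmf = poisson.pmf(np.arange(N), DEMAND_MEAN)

def step(s, up_to):
    """Transition distribution and expected one-period cost for an order-up-to action."""
    level = max(s, up_to)                          # stock on hand after ordering
    probs = np.zeros(N)
    for d in range(level):                         # demand d leaves level - d units
        probs[level - d] += pmf[d]
    probs[0] += 1.0 - pmf[:level].sum()            # demand >= level empties the stock
    exp_hold = sum((level - d) * pmf[d] for d in range(level + 1))
    exp_short = DEMAND_MEAN - sum(d * pmf[d] for d in range(level)) \
                - level * (1.0 - pmf[:level].sum())
    cost = ORDER * (level - s) + HOLD * exp_hold + SHORT * exp_short
    return probs, cost

def policy_iteration():
    policy = np.zeros(N, dtype=int)                # initial policy: never order
    while True:
        # Policy evaluation: solve (I - gamma * P_pi) V = c_pi exactly.
        P, c = np.zeros((N, N)), np.zeros(N)
        for s in range(N):
            P[s], c[s] = step(s, policy[s])
        V = np.linalg.solve(np.eye(N) - GAMMA * P, c)
        # Policy improvement: greedy order-up-to level in every state.
        new_policy = policy.copy()
        for s in range(N):
            q = [step(s, a)[1] + GAMMA * step(s, a)[0] @ V for a in range(s, N)]
            new_policy[s] = s + int(np.argmin(q))
        if np.array_equal(new_policy, policy):     # stable policy => optimal
            return policy, V
        policy = new_policy

if __name__ == "__main__":
    policy, V = policy_iteration()
    print("optimal order-up-to level per stock state:", policy)
```

Policy evaluation here is a direct linear solve, which is exact for a state space this small; for larger chains one would iterate the evaluation step approximately or switch to value iteration.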
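
For the continuous-time formulation, demand and capacity are modeled as finite-state continuous-time Markov chains and the optimal production rates come from numerically solving the associated HJB equations. The sketch below is a minimal numerical illustration under simplified assumptions that are not the paper's: a single product, constant capacity, a two-state Markov demand process, linear holding/backlog costs, and a basic time-discretized value iteration on a surplus grid. The hierarchical decomposition the paper uses against the curse of dimensionality is not reproduced here.

```python
import numpy as np

# Hypothetical single-product surplus model with Markov-modulated demand
# (illustrative values only; not the model or data used in the paper).
RHO = 0.1                              # discount rate
U_MAX = 6.0                            # constant production capacity (assumed)
DEMAND = np.array([2.0, 5.0])          # demand rate in modes 0 (low) and 1 (high)
Q = np.array([[-0.5, 0.5],             # generator of the two-state demand chain
              [1.0, -1.0]])
H_COST, B_COST = 1.0, 5.0              # holding and backlog cost rates

xs = np.linspace(-20.0, 20.0, 201)     # surplus (inventory minus backlog) grid
us = np.linspace(0.0, U_MAX, 25)       # discretized production rates
DT = 0.05                              # time step of the discretized dynamics

def running_cost(x):
    return H_COST * np.maximum(x, 0.0) + B_COST * np.maximum(-x, 0.0)

def solve_hjb(tol=1e-5, max_iter=5000):
    """Fixed-point iteration on a time-discretized version of the HJB equation."""
    n_modes = len(DEMAND)
    V = np.zeros((n_modes, xs.size))
    disc = np.exp(-RHO * DT)
    for _ in range(max_iter):
        V_new = np.empty_like(V)
        for i in range(n_modes):
            stay = 1.0 + Q[i, i] * DT                  # prob. of remaining in mode i over DT
            best = np.full(xs.size, np.inf)
            for u in us:
                x_next = xs + (u - DEMAND[i]) * DT     # surplus drift over one step
                # np.interp clamps x_next to the grid, a crude boundary treatment.
                cont = stay * np.interp(x_next, xs, V[i])
                for j in range(n_modes):
                    if j != i:
                        cont += Q[i, j] * DT * np.interp(x_next, xs, V[j])
                best = np.minimum(best, running_cost(xs) * DT + disc * cont)
            V_new[i] = best
        if np.max(np.abs(V_new - V)) < tol:
            break
        V = V_new
    return V_new

if __name__ == "__main__":
    V = solve_hjb()
    i0 = int(np.argmin(np.abs(xs)))
    print("value at zero surplus (low/high demand):", V[0, i0], V[1, i0])
```

The per-step minimization over the production-rate grid plays the role of the pointwise minimization in the HJB equation; refining DT and the surplus and control grids tightens the approximation.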
