
MARKOV KARAR SÜRECİ İLE MODELLENEN STOKASTİK VE ÇOK AMAÇLI ÜRETİM/ENVANTER PROBLEMLERİNİN HEDEF PROGRAMLAMA YAKLAŞIMI İLE ÇÖZÜLMESİ

SOLVING STOCHASTIC AND MULTI-OBJECTIVE PRODUCTION/INVENTORY PROBLEMS MODELED BY MARKOV DECISION PROCESS WITH GOAL PROGRAMMING APPROACH

Journal Name: DEÜ SBE Dergisi, Vol. 11, No. 3

Publication Year:

Abstract (2. Language): 
When making plans for the future, the Markov Decision Process (MDP), one of the stochastic approaches, can assist managers in making decisions that involve uncertainty. Methods such as value iteration, policy iteration, or linear programming can be used to solve MDPs when only a single objective, such as profit maximization or cost minimization, is considered. However, the decisions businesses make in a competitive environment require multiple, and usually conflicting, objectives to be considered simultaneously. Goal programming (GP) can be used to solve such problems. The aim of this study is to provide an integrated perspective in which the MDP and GP approaches are used together to solve stochastic multi-objective decision problems. To this end, the production/inventory system of a business operating in the automotive supplier industry is considered.
Abstract (Original Language): 
Geleceğe yönelik planlar yapılırken belirsizlik içeren kararların verilmesinde stokastik yaklaşımlardan biri olan Markov karar süreçleri (MDP) yöneticilere destek sağlayabilmektedir. Kar maksimizasyonu, maliyet minimizasyonu gibi tek bir amaç ele alındığında MDP’lerin çözümünde değer iterasyonu, politika iterasyonu veya doğrusal programlama gibi yöntemler kullanılabilmektedir. Ancak, işletmelerin rekabet ortamında aldıkları kararlar, birden fazla ve çoğunlukla da birbiriyle çatışan amaçların eş zamanlı olarak ele alınmasını gerektirmektedir. Hedef programlama (GP) yaklaşımı bu tür sorunların çözümünde kullanılabilmektedir. Çalışmanın amacı, stokastik yapıdaki çok amaçlı karar problemlerinin çözümü için MDP ve GP yaklaşımlarının bir arada kullanıldığı bütünleşik bir bakış açısı ortaya koymaktır. Bu doğrultuda otomotiv yan sanayinde faaliyet gösteren bir işletmenin üretim/envanter sistemi ele alınmıştır.
Pages: 75-96
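
The abstract above refers to value iteration and linear programming as solution methods for MDPs, and to goal programming for handling several conflicting objectives at once. The sketch below is only an illustration of those two ideas, not the paper's actual model: it solves a hypothetical two-state production/inventory MDP by value iteration and then a small weighted goal programming problem written as a linear program with deviation variables. All state and action names, transition probabilities, rewards, goal targets, and weights are invented placeholders.

import numpy as np
from scipy.optimize import linprog

# --- Toy MDP (hypothetical data): states = {0: low stock, 1: high stock},
# actions = {"produce", "idle"}. P[a][s][s'] is the transition probability,
# R[a][s] the expected one-step reward, gamma the discount factor.
P = {
    "produce": np.array([[0.3, 0.7],
                         [0.2, 0.8]]),
    "idle":    np.array([[0.9, 0.1],
                         [0.6, 0.4]]),
}
R = {
    "produce": np.array([4.0, 2.0]),
    "idle":    np.array([1.0, 3.0]),
}
gamma = 0.9

# Value iteration: repeat V <- max_a [ R_a + gamma * P_a V ] until convergence.
V = np.zeros(2)
for _ in range(1000):
    Q = {a: R[a] + gamma * P[a] @ V for a in P}
    V_new = np.maximum(Q["produce"], Q["idle"])
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new
policy = {s: max(P, key=lambda a: (R[a] + gamma * P[a] @ V)[s]) for s in (0, 1)}
print("optimal values:", V, "greedy policy:", policy)

# --- Weighted goal programming sketch (hypothetical goals): decision
# variables x1, x2 (e.g. production quantities) with two conflicting goals,
# profit 5*x1 + 4*x2 >= 100 and machine hours 2*x1 + 3*x2 <= 40. The profit
# shortfall d1m and the overtime d2p are penalized with weights 3 and 1.
# Variable order: [x1, x2, d1m, d1p, d2m, d2p], all nonnegative.
c = [0, 0, 3, 0, 0, 1]                      # minimize 3*d1m + 1*d2p
A_eq = [[5, 4, 1, -1, 0, 0],                # 5*x1 + 4*x2 + d1m - d1p = 100
        [2, 3, 0, 0, 1, -1]]                # 2*x1 + 3*x2 + d2m - d2p = 40
b_eq = [100, 40]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6, method="highs")
print("GP solution x:", res.x[:2], "deviations:", res.x[2:])

In the GP part, each goal is written as an equality with a pair of deviation variables; only the undesirable deviations (profit shortfall, overtime) enter the objective, which is the standard weighted goal programming construction the abstract alludes to.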

