In: Analytical and Stochastic Modelling Techniques and Applications, Marco Gribaudo, Eduard Sopin, Irina Kochetkova (Eds.), Springer Nature, Zug, pp. 100-114, 2020
In this study, we consider the dynamic resource-constrained multi-project scheduling problem (DRCMPSP), in which projects generate rewards at completion, completions later than a due date incur tardiness costs, and new projects arrive randomly during execution, disturbing the existing scheduling plan. We model this problem as a discrete Markov decision process and explore the computational limitations of solving it by dynamic programming. We run and compare four solution approaches on small problem instances: a dynamic programming algorithm that determines a policy maximising the average profit per unit time net of charges for late project completion, a genetic algorithm that generates a schedule maximising the total reward of ongoing projects and updates the schedule with each new project arrival, a rule-based algorithm that prioritises processing of tasks with the highest processing durations, and a worst-decision algorithm that seeks a non-idling policy minimising the average profit per unit time. The average profit per unit time of the policy generated by each algorithm is evaluated and compared. The genetic algorithm performs closest to the optimal policies of the dynamic programming algorithm, but its results remain notably suboptimal, by up to 67.2%. The alternative scheduling algorithms are close to optimal when the project arrival probability is low, but their performance deteriorates quickly as this probability increases.
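As a rough illustration of the rule-based approach mentioned above, the following Python sketch dispatches ready tasks in decreasing order of processing duration subject to a single resource capacity. The Task class, the capacity parameter, and the greedy selection loop are illustrative assumptions for exposition only, not the implementation used in the paper.

    # Sketch of a longest-duration-first dispatching rule (illustrative only).
    # At each decision epoch, ready tasks are started in decreasing order of
    # processing duration as long as free resource capacity remains.
    from dataclasses import dataclass
    from typing import List


    @dataclass
    class Task:
        project_id: int
        duration: int   # remaining processing time
        demand: int     # resource units required while in process


    def longest_duration_first(ready_tasks: List[Task], capacity: int) -> List[Task]:
        """Return the subset of ready tasks to start now, favouring long tasks."""
        started: List[Task] = []
        free = capacity
        for task in sorted(ready_tasks, key=lambda t: t.duration, reverse=True):
            if task.demand <= free:
                started.append(task)
                free -= task.demand
        return started


    if __name__ == "__main__":
        tasks = [Task(1, 5, 2), Task(1, 3, 1), Task(2, 8, 2), Task(2, 2, 1)]
        print([t.duration for t in longest_duration_first(tasks, capacity=4)])
        # With capacity 4, the two longest tasks (durations 8 and 5) are started;
        # the shorter tasks wait for the next decision epoch.

Such a dispatching rule makes no use of arrival probabilities or due dates, which is consistent with its reported performance: near-optimal when new projects arrive rarely, but increasingly poor as the arrival probability grows.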