T8

16 April 2018, 17:00 António José Lopes Rodrigues

Markov decision processes (MDPs): definition; solution approaches; a prototype example (maintenance problem); on modelling MDPs.
Application: solving the prototype example by exhaustive enumeration and evaluation of the alternative policies.
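The exhaustive-enumeration approach above can be sketched in a few lines: list every deterministic stationary policy, evaluate each one, and keep the cheapest. The sketch below uses illustrative numbers (a 4-state machine-deterioration chain with a "do nothing" / "replace" choice and a discounted-cost criterion), not the actual data of the lecture's maintenance example; each policy is evaluated exactly by solving the linear system v = c_pi + gamma * P_pi * v.

```python
import itertools
import numpy as np

# Illustrative maintenance MDP (assumed numbers, not the lecture's data).
# States: machine condition 0 (as new) .. 3 (inoperable).
# Actions: 0 = do nothing, 1 = replace with a new machine.
gamma = 0.9

# P[a][s, s'] = transition probability, C[a][s] = immediate cost.
P = {
    0: np.array([[0.7, 0.2, 0.1, 0.0],    # doing nothing lets the
                 [0.0, 0.6, 0.3, 0.1],    # machine deteriorate
                 [0.0, 0.0, 0.5, 0.5],
                 [0.0, 0.0, 0.0, 1.0]]),  # state 3 is absorbing
    1: np.array([[1.0, 0.0, 0.0, 0.0]] * 4),  # replacing resets to state 0
}
C = {
    0: np.array([0.0, 1.0, 3.0, 10.0]),  # running cost grows with wear
    1: np.array([6.0] * 4),              # fixed replacement cost
}

def evaluate(policy):
    """Exact discounted cost of a stationary policy:
    solve (I - gamma * P_pi) v = c_pi for the value vector v."""
    n = len(policy)
    P_pi = np.array([P[policy[s]][s] for s in range(n)])
    c_pi = np.array([C[policy[s]][s] for s in range(n)])
    return np.linalg.solve(np.eye(n) - gamma * P_pi, c_pi)

# Exhaustive enumeration: all 2^4 = 16 deterministic stationary policies,
# ranked by the discounted cost when starting from a new machine (state 0).
best = min(itertools.product([0, 1], repeat=4), key=lambda pi: evaluate(pi)[0])
print("best policy:", best)
print("its value vector:", evaluate(best))
```

With these numbers the enumeration selects the intuitive threshold policy (run the machine while nearly new, replace it once badly worn). Exhaustive enumeration is only viable for small problems, since the number of policies grows as |A|^|S|; that is precisely what motivates the policy- and value-iteration methods covered among the solution approaches.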