TY - JOUR
T1 - Optimal inspection and maintenance policies for infrastructure systems
T2 - Facility and network problems
AU - Madanat, Samer
AU - Smilowitz, Karen
AU - Lago, Alejandro
PY - 1999
Y1 - 1999
N2 - State-of-the-art infrastructure management systems use Markov decision processes (MDPs) as a methodology for maintenance and rehabilitation (M&R) decision making. The underlying assumptions in this methodology are that an inspection is performed at the beginning of every year and that inspections reveal the true condition state of the facility, with no error. As a result, after an inspection, the decision maker can apply the activity prescribed by the optimal policy for that condition state of the facility. Previous research has developed a methodology for M&R activity selection that accounts for the presence of both forecasting and measurement uncertainty. This methodology is the latent Markov decision process (LMDP), an extension of the traditional MDP that does not necessarily assume the measurement of facility condition to be free of error. Both a transient and a steady-state formulation of the facility-level LMDP are presented. The methodology is extended to include network-level constraints. This can be achieved by extending the LMDP model to the network-level problem through the use of randomized policies. In addition, both a transient and a steady-state formulation of the network-level LMDP are presented. A case study application demonstrates the expected savings in life-cycle costs that result from increasing the measurement accuracy used in facility inspections and from scheduling inspection decisions optimally.
UR - http://www.scopus.com/inward/record.url?scp=0033323577&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0033323577&partnerID=8YFLogxK
DO - 10.3141/1667-01
M3 - Article
AN - SCOPUS:0033323577
SN - 0361-1981
SP - 1
EP - 7
JO - Transportation Research Record
JF - Transportation Research Record
IS - 1667
ER -