The objective of infrastructure management is to provide optimal maintenance, rehabilitation, and replacement (MR&R) policies for a system of facilities over a planning horizon. While most approaches in the literature treat the decision-making process as a finite resource allocation problem, they often do not account for the impact of construction activities on the road network. State-of-the-art Markov decision process (MDP)-based optimization approaches in infrastructure management, while optimal for solving budget allocation problems, become internally inconsistent once network constraints are introduced. In contrast, approximate dynamic programming (ADP) can solve such complex problem formulations by combining simulation techniques with lower-dimensional value function approximations. In this paper, an ADP framework is proposed that provides better results than randomized-policy frameworks in the presence of network constraints.
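To make the ADP idea concrete, the following is a minimal sketch, not the paper's actual framework: a toy single-facility deterioration MDP whose value function is approximated with a low-dimensional linear basis and learned by simulation via TD(0)-style updates. All numbers (transition probability, repair and user costs, discount factor) are hypothetical placeholders.

```python
import random

# Toy deterioration MDP (hypothetical parameters, not from the paper):
# condition states 0 (best) .. 4 (worst); actions: 0 = do nothing, 1 = repair.
STATES = range(5)
ACTIONS = (0, 1)
GAMMA = 0.95

def step(state, action, rng):
    """Simulate one transition; repair resets condition, otherwise it may worsen."""
    if action == 1:
        return 0, 10.0                         # repair cost (assumed)
    nxt = min(state + (rng.random() < 0.4), 4) # 40% chance of deteriorating
    return nxt, float(state)                   # user cost grows with deterioration

def features(state):
    """Lower-dimensional basis: constant, condition, condition squared."""
    return (1.0, state, state * state)

def v_hat(w, state):
    """Linear value function approximation."""
    return sum(wi * fi for wi, fi in zip(w, features(state)))

def q_hat(w, state, action, rng, n_samples=30):
    """Sampled one-step lookahead cost-to-go using the simulator."""
    total = 0.0
    for _ in range(n_samples):
        nxt, cost = step(state, action, rng)
        total += cost + GAMMA * v_hat(w, nxt)
    return total / n_samples

def adp_train(n_iters=3000, alpha=0.002, seed=0):
    """TD(0)-style ADP: simulate, bootstrap on the approximation, update weights."""
    rng = random.Random(seed)
    w = [0.0, 0.0, 0.0]
    state = 0
    for _ in range(n_iters):
        action = min(ACTIONS, key=lambda a: q_hat(w, state, a, rng))
        nxt, cost = step(state, action, rng)
        td_error = cost + GAMMA * v_hat(w, nxt) - v_hat(w, state)
        w = [wi + alpha * td_error * fi for wi, fi in zip(w, features(state))]
        # Occasionally restart from a random state to keep exploring.
        state = nxt if rng.random() > 0.1 else rng.choice(list(STATES))
    return w

if __name__ == "__main__":
    w = adp_train()
    policy = {s: min(ACTIONS, key=lambda a: q_hat(w, s, a, random.Random(1)))
              for s in STATES}
    print("weights:", w)
    print("greedy policy:", policy)
```

The sketch illustrates the two ingredients named above: transitions are obtained from a simulator rather than an enumerated transition matrix, and the value function lives in a three-dimensional weight space instead of the full state space, which is what allows ADP to scale to network-level formulations.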