Shadow fading is a significant contributor to channel variations in wireless communication. In most cases, shadow fading is assumed to follow a log-normal distribution that models the loss at a given location. In a mobile network, however, it is also important to know how shadow fading is correlated in both space and time, since this correlation can greatly affect application-layer behavior and service quality: if the correlation is strong over time and space, shadow fading can result in long outages. This paper characterizes shadow fading so that its impact on application-layer quality of service can be studied accurately. We assume that shadow fading is exponentially correlated in space. To study correlated shadow fading and the resulting outage durations, a first-order Markov chain model is developed and validated. The Markov chain is constructed by partitioning the entire shadow fading range into a finite number of intervals, and its state transition matrix is derived from the joint probability distribution of correlated log-normal shadow fading. Based on the proposed model, the frequency and duration of outages near the edge of a single cell are analyzed. To validate the model, correlated Gaussian random fields are simulated, and the resulting outage frequencies and durations are compared with those predicted by the Markov chain. The comparison shows that the proposed Markov chain model is an efficient way to describe both the channel variations and the outage behavior experienced by the user.
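The construction summarized above can be sketched in code. The following is a minimal illustration, not the paper's actual implementation: it assumes Gudmundson-style exponential spatial correlation realized as an AR(1) process in the dB domain, with illustrative parameter values (`sigma`, `d_c`, `step`, `margin`, and the state-interval edges are all assumptions). It generates exponentially correlated Gaussian shadow fading, partitions the fading range into a finite number of intervals to form Markov states, estimates the empirical state transition matrix, and measures the lengths of outage runs where the fade exceeds an assumed cell-edge margin.

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the paper)
sigma = 8.0    # shadow fading standard deviation in dB
d_c = 50.0     # decorrelation distance in metres
step = 5.0     # spatial sampling step in metres
margin = 6.0   # assumed fade margin at the cell edge, in dB

# Exponential spatial correlation between successive samples (Gudmundson model)
rho = np.exp(-step / d_c)

rng = np.random.default_rng(0)
n = 200_000

# AR(1) recursion produces a zero-mean Gaussian process (shadow fading in dB)
# whose autocorrelation decays as rho**k, i.e. exponentially with distance.
s = np.empty(n)
s[0] = sigma * rng.standard_normal()
noise = sigma * np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
for i in range(1, n):
    s[i] = rho * s[i - 1] + noise[i]

# Partition the shadow fading range into a finite number of intervals (states)
edges = np.array([-np.inf, -12, -8, -4, 0, 4, 8, 12, np.inf])  # dB
states = np.digitize(s, edges) - 1
n_states = len(edges) - 1

# Empirical state transition matrix of the first-order Markov chain
P = np.zeros((n_states, n_states))
for a, b in zip(states[:-1], states[1:]):
    P[a, b] += 1.0
P /= P.sum(axis=1, keepdims=True)  # rows are all visited with these parameters

# Outage: samples where the shadow fade exceeds the assumed margin
outage = s < -margin

# Lengths of consecutive outage runs (run length * step = outage distance)
padded = np.concatenate(([0], outage.astype(int), [0]))
changes = np.flatnonzero(np.diff(padded))      # run start/end indices, interleaved
run_lengths = changes[1::2] - changes[0::2]
mean_outage_distance = run_lengths.mean() * step
```

In a mobility scenario the spatial step maps to a time step through the user's speed, so the same run-length statistics translate directly into outage durations.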