The Gauss-Markov source produces $U_i = aU_{i-1} + Z_i$ for $i \geq 1$, where $U_0 = 0$, $|a| < 1$, and $Z_i \sim \mathcal{N}(0, \sigma^2)$ are i.i.d. Gaussian random variables. We consider lossy compression of a block of $n$ samples of the Gauss-Markov source under squared error distortion. We obtain the Gaussian approximation for the Gauss-Markov source with the excess-distortion criterion for any distortion $d > 0$, and we show that the dispersion has a reverse waterfilling representation. This is the first finite-blocklength result for lossy compression of sources with memory. We prove that the finite-blocklength rate-distortion function $R(n, d, \epsilon)$ approaches the rate-distortion function $R(d)$ as $R(n, d, \epsilon) = R(d) + \sqrt{\frac{V(d)}{n}}\, Q^{-1}(\epsilon) + o\!\left(\frac{1}{\sqrt{n}}\right)$, where $V(d)$ is the dispersion, $\epsilon \in (0, 1)$ is the excess-distortion probability, and $Q^{-1}$ is the inverse of the Q-function. We give a reverse waterfilling integral representation for the dispersion $V(d)$, which parallels that of the rate-distortion function for Gaussian processes. Remarkably, for all $0 < d \leq \frac{\sigma^2}{(1+|a|)^2}$, $R(n, d, \epsilon)$ of the Gauss-Markov source coincides with that of $Z_i$, the i.i.d. Gaussian noise driving the process, up to the second-order term. Among the novel technical tools developed in this paper are a sharp approximation of the eigenvalues of the covariance matrix of $n$ samples of the Gauss-Markov source and a construction of a typical set using the maximum likelihood estimate of the parameter $a$ based on $n$ observations.
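
As an illustrative aside (not part of the paper), the sketch below simulates a block of the Gauss-Markov source and evaluates the classical reverse-waterfilling characterization of $R(d)$ for the stationary AR(1) power spectral density $S(\omega) = \sigma^2/(1 - 2a\cos\omega + a^2)$; the function names and the numerical bisection for the water level are assumptions made for illustration, and the dispersion $V(d)$ is not computed since its exact integral is stated in the paper rather than in this abstract.

```python
import numpy as np


def gauss_markov_block(n, a, sigma, rng=None):
    """Generate one block of n samples of U_i = a*U_{i-1} + Z_i with U_0 = 0."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.normal(0.0, sigma, size=n)
    u = np.empty(n)
    prev = 0.0
    for i in range(n):
        prev = a * prev + z[i]
        u[i] = prev
    return u


def rate_distortion_reverse_waterfilling(d, a, sigma, num_freq=200_000):
    """Classical reverse waterfilling for the stationary AR(1) spectrum.

    S(w) = sigma^2 / (1 - 2*a*cos(w) + a^2).
    Find the water level theta with
        d = (1/2pi) * integral of min(theta, S(w)) dw over [-pi, pi],
    then
        R(d) = (1/2pi) * integral of max(0, 0.5*log(S(w)/theta)) dw  (nats/sample).
    Integrals are approximated by averages over a uniform frequency grid.
    """
    w = np.linspace(-np.pi, np.pi, num_freq)
    S = sigma**2 / (1.0 - 2.0 * a * np.cos(w) + a**2)

    def distortion(theta):
        return np.mean(np.minimum(theta, S))

    # Bisection on theta: distortion(theta) is increasing in theta.
    lo, hi = 0.0, S.max()
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if distortion(mid) < d:
            lo = mid
        else:
            hi = mid
    theta = 0.5 * (lo + hi)
    rate = np.mean(np.maximum(0.0, 0.5 * np.log(S / theta)))
    return rate, theta


if __name__ == "__main__":
    a, sigma = 0.5, 1.0
    u = gauss_markov_block(10, a, sigma)
    # Edge of the regime d <= sigma^2/(1+|a|)^2 highlighted in the abstract,
    # where the water level sits below the entire spectrum.
    d = sigma**2 / (1 + abs(a)) ** 2
    R, theta = rate_distortion_reverse_waterfilling(d, a, sigma)
    print(f"R({d:.3f}) = {R:.4f} nats/sample, water level theta = {theta:.4f}")
```

In the regime $0 < d \leq \sigma^2/(1+|a|)^2$ the water level lies below the minimum of $S(\omega)$, so no spectral components are clipped, which is consistent with the abstract's observation that the second-order performance matches that of the i.i.d. innovation process $Z_i$ there.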