The connection between Bayesian statistics and the technique of regularization for inverse problems has received significant attention in recent years. For example, Bayes' law is frequently used as motivation for variational regularization methods of Tikhonov type. In this setting, the regularization function corresponds to the negative log of the prior probability density; the fit-to-data function corresponds to the negative log of the likelihood; and the regularized solution corresponds to the maximizer of the posterior density function, known as the maximum a posteriori (MAP) estimator of the unknown, which in our case is an image. Much of the work in this direction has focused on the development of techniques for efficient computation of MAP estimators (or regularized solutions). Less explored in the inverse problems community, and of interest to us in this paper, is the problem of sampling from the posterior density. To do this, we use a Markov chain Monte Carlo (MCMC) method that has previously appeared in the Bayesian statistics literature, is straightforward to implement, and provides a means of both estimation and uncertainty quantification for the unknown. Additionally, we show how to use the preconditioned conjugate gradient method to compute image samples in cases where direct methods are not feasible. Finally, the MCMC method also provides samples of the noise and prior precision (inverse-variance) parameters, which eliminates the need for regularization parameter selection. We focus on linear models with independent and identically distributed Gaussian noise and define the prior using a Gaussian Markov random field. For our numerical experiments, we consider test cases from both image deconvolution and computed tomography, and our results show that the approach is effective and surprisingly computationally efficient, even in large-scale cases.
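To make the setup concrete, the following is a minimal sketch of a hierarchical Gibbs sampler of the kind described above, applied to a small illustrative 1D deblurring problem (all sizes, hyperparameter values, and variable names here are assumptions, not the paper's specific choices). Each sweep draws the image from its Gaussian full conditional by solving a perturbed linear system with (unpreconditioned) conjugate gradients, then draws the noise and prior precisions from their Gamma conditionals:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
n = 64

# Hypothetical forward model A: a row-normalized Gaussian blur matrix.
t = np.arange(n)
A = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 2.0) ** 2)
A /= A.sum(axis=1, keepdims=True)

# GMRF prior precision L: 1D discrete Laplacian (Dirichlet boundary, full rank).
L = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)).toarray()
C = np.linalg.cholesky(L)  # L = C C^T, used to perturb the right-hand side

# Synthetic ground truth and noisy blurred data.
x_true = np.zeros(n)
x_true[20:40] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(n)

alpha, beta = 1.0, 1e-4  # weakly informative Gamma hyperparameters (assumed)
lam, delta = 1.0, 1.0    # initial noise / prior precisions

samples = []
for k in range(200):
    # 1) x | lam, delta, b is Gaussian with precision M = lam*A^T A + delta*L.
    #    Solving M x = lam*A^T b + w with w ~ N(0, M) yields a draw from it;
    #    here w = sqrt(lam)*A^T z1 + sqrt(delta)*C z2 has covariance exactly M.
    M = lam * (A.T @ A) + delta * L
    rhs = (lam * A.T @ b
           + np.sqrt(lam) * A.T @ rng.standard_normal(n)
           + np.sqrt(delta) * C @ rng.standard_normal(n))
    x, info = cg(M, rhs)
    # 2) Conjugate Gamma updates for the precisions (numpy uses a scale
    #    parameter, i.e. the reciprocal of the Gamma rate).
    lam = rng.gamma(alpha + n / 2, 1.0 / (beta + 0.5 * np.sum((A @ x - b) ** 2)))
    delta = rng.gamma(alpha + n / 2, 1.0 / (beta + 0.5 * x @ (L @ x)))
    samples.append(x)

# Posterior mean after discarding burn-in; the sample spread gives pointwise
# uncertainty, and the lam/delta chains replace manual parameter tuning.
x_mean = np.mean(samples[50:], axis=0)
```

Because each image draw only requires matrix-vector products with A, A^T, and L, the same loop scales to large images where forming or factorizing M directly is infeasible, which is where preconditioning the CG solve becomes important.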