As simulation continues to replace experimentation in the design cycle, the need to quantify uncertainty in model outputs due to uncertainties in the model parameters becomes critical. For distributed parameter models, current approaches assume that the mean and variance of the parameters are known and then apply recently developed, efficient numerical methods for approximating stochastic partial differential equations. In practice, however, such statistical descriptions of the model parameters are rarely available. A number of recent works have investigated adapting existing variational methods for parameter estimation to account for parametric uncertainty. In this paper, we formulate the parameter identification problem as an infinite-dimensional constrained optimization problem, for which we establish the existence of minimizers and the first-order necessary conditions. A spectral approximation of the uncertain observations (via a truncated Karhunen-Loève expansion) then allows the infinite-dimensional problem to be approximated by a smooth, albeit high-dimensional, deterministic optimization problem, the so-called 'finite noise' problem, posed in a space of functions with bounded mixed derivatives. We prove convergence of 'finite noise' minimizers to the corresponding infinite-dimensional solutions and devise a gradient-based strategy for computing them numerically. Finally, we illustrate the method with a numerical example.
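For reference, the truncated Karhunen-Loève expansion behind the 'finite noise' approximation takes the standard form sketched below; the notation is illustrative and not taken from the paper. For a second-order random field $z(x,\omega)$, here standing in for the uncertain observations,
\[
  z(x,\omega) \;\approx\; z_N(x,\omega) \;=\; \bar{z}(x) + \sum_{n=1}^{N} \sqrt{\lambda_n}\,\varphi_n(x)\,Y_n(\omega),
\]
where $\bar{z}$ is the mean field, $(\lambda_n,\varphi_n)$ are the leading eigenpairs of the covariance operator of $z$, and $Y_1,\dots,Y_N$ are uncorrelated random variables. Substituting $z_N$ for $z$ replaces the stochastic problem by a deterministic one in the $N$ coordinates $y_n = Y_n(\omega)$, which is the sense in which the 'finite noise' problem is finite, though possibly high, dimensional.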