In this paper, we propose a class of dynamic conditional Gaussian graphical models (DCGGMs) based on a set of non-identically distributed observations whose distribution changes smoothly with time or with some other condition. Specifically, a DCGGM describes a dynamic output network influenced by conditioning input variables, which are encoded by a set of smoothly varying parameters. Moreover, we propose a joint smooth graphical Lasso estimator for DCGGMs, which combines a kernel smoother with a sparse group Lasso penalty, and we design an efficient accelerated proximal gradient algorithm to solve this estimator. Theoretically, we establish the asymptotic consistency and sparsistency of our model in the high-dimensional setting. In particular, we develop a consistency theory for dynamic graphical models in which the effective sample size for estimating a local graphical model scales as n^{4/5} when the bandwidth parameter h of the kernel smoother is chosen as h ≍ n^{-1/5} to capture the dynamics. Finally, extensive numerical experiments on both synthetic and real datasets are provided to support the effectiveness of the proposed method.
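To make the estimation idea concrete, the following is a minimal, hypothetical sketch of kernel-smoothed precision-matrix estimation with a sparse group Lasso penalty, solved by (plain, non-accelerated) proximal gradient steps. All function names, the Epanechnikov kernel choice, the row-wise group structure, and the eigenvalue projection used to keep the iterate positive definite are illustrative assumptions, not the paper's actual formulation or algorithm.

```python
import numpy as np

def soft_threshold(x, t):
    # Elementwise soft-thresholding: prox of the l1 penalty.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_sparse_group_lasso(X, lam1, lam2):
    # Prox of lam1*||X||_1 + lam2*sum_g ||X_g||_2, treating each row
    # of X as one group (a simplifying assumption for illustration).
    # The composite prox is soft-thresholding followed by group shrinkage.
    Z = soft_threshold(X, lam1)
    norms = np.maximum(np.linalg.norm(Z, axis=1, keepdims=True), 1e-12)
    return Z * np.maximum(1.0 - lam2 / norms, 0.0)

def kernel_weights(t0, times, h):
    # Epanechnikov kernel weights for smoothing around condition value t0
    # with bandwidth h (e.g. h ~ n^{-1/5}), normalized to sum to one.
    u = (times - t0) / h
    w = 0.75 * np.maximum(1.0 - u**2, 0.0)
    return w / w.sum()

def estimate_local_precision(X, times, t0, h, lam1=0.05, lam2=0.05,
                             eta=0.1, iters=50):
    # Proximal gradient on the kernel-smoothed Gaussian negative
    # log-likelihood tr(S(t0) Theta) - log det Theta plus the penalty.
    n, p = X.shape
    w = kernel_weights(t0, times, h)
    S = (X * w[:, None]).T @ X          # locally weighted second-moment matrix
    Theta = np.eye(p)
    for _ in range(iters):
        G = S - np.linalg.inv(Theta)    # gradient of the smooth loss
        Theta = prox_sparse_group_lasso(Theta - eta * G, eta * lam1, eta * lam2)
        Theta = 0.5 * (Theta + Theta.T) # re-symmetrize
        vals, vecs = np.linalg.eigh(Theta)
        # Clip eigenvalues to keep the iterate positive definite.
        Theta = vecs @ (np.maximum(vals, 1e-3)[:, None] * vecs.T)
    return Theta
```

Re-running `estimate_local_precision` over a grid of `t0` values traces out the time-varying network; the paper's accelerated variant would add a momentum term to the gradient step.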