Most real-world graphs are dynamic, i.e., they change over time through a sequence of update operations. While the regression problem has been studied for static graphs and temporal graphs, it has not been investigated for general dynamic graphs. In this paper, we study the theory of regression over dynamic graphs. First, we present the notion of an update-efficient matrix embedding, which defines sufficient conditions for a matrix embedding to be used effectively for dynamic graph regression (under the $l_2$ norm). Then, we show that given an $n \times m$ update-efficient matrix embedding (e.g., the adjacency matrix), after an update operation in the graph the exact optimal solution of linear regression can be updated in $O(nm)$ time for the revised graph. Moreover, we show that this also holds when the matrix embedding is the Laplacian matrix and the update operations are restricted to edge insertions/deletions. Finally, through experiments on synthetic and real-world graphs, we demonstrate the high efficiency of updating the solution of graph regression.
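To illustrate the kind of update cost the abstract refers to, the following is a minimal sketch, not the paper's algorithm: it assumes a ridge-regularized least-squares solution over an $n \times m$ embedding $X$ (e.g., an adjacency matrix) and refreshes it after one row of $X$ changes, using Sherman-Morrison rank-one updates of the cached Gram inverse. All names (`sherman_morrison_update`, `update_row`, `lam`) are hypothetical, introduced only for this example.

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Return (A + u v^T)^{-1} given A^{-1} (Sherman-Morrison), in O(m^2)."""
    Au = A_inv @ u
    vA = v @ A_inv
    return A_inv - np.outer(Au, vA) / (1.0 + v @ Au)

def update_row(X, y, A_inv, b, i, new_row):
    """Replace row i of X by new_row and refresh the cached quantities
    A_inv = (X^T X + lam*I)^{-1} and b = X^T y.
    Changing one row x -> x' changes X^T X by x' x'^T - x x^T
    (two rank-one terms) and b by (x' - x) * y[i]."""
    old_row = X[i].copy()
    X[i] = new_row
    A_inv = sherman_morrison_update(A_inv, -old_row, old_row)  # remove old row
    A_inv = sherman_morrison_update(A_inv, new_row, new_row)   # add new row
    b = b + (new_row - old_row) * y[i]
    beta = A_inv @ b   # O(m^2), i.e., O(nm) when m = n as for adjacency matrices
    return X, A_inv, b, beta

# Hypothetical usage: flip one entry of a row (a single edge change).
rng = np.random.default_rng(0)
n = m = 100
X = (rng.random((n, m)) < 0.05).astype(float)   # stand-in for an adjacency matrix
y = rng.standard_normal(n)
lam = 1e-6                                      # small ridge term for invertibility
A_inv = np.linalg.inv(X.T @ X + lam * np.eye(m))
b = X.T @ y
new_row = X[3].copy()
new_row[7] = 1.0                                # e.g., insert edge (3, 7)
X, A_inv, b, beta = update_row(X, y, A_inv, b, 3, new_row)
```

Under these assumptions each update touches only rank-one corrections of the cached inverse and of $X^\top y$, so the per-update cost stays quadratic in the embedding width rather than requiring a full refit; this is only meant to make the $O(nm)$ claim concrete, and the paper's actual construction may differ.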