In recent years, many works in learning theory have stepped beyond the classical assumption that samples are independently and identically distributed (i.i.d.) and have investigated learning performance with non-independent samples, such as mixing sequences (e.g., alpha-mixing, beta-mixing, phi-mixing), deriving results similar to those obtained under the classical i.i.d. assumption. Negatively associated (NA) sequences form a significant class of dependent random variables and play an important role among non-independent sequences; they are widely applied in subjects such as probability theory, statistics, and stochastic processes. It is therefore essential to study the learning performance of learning processes whose samples are drawn from an NA process. Since such samples are clearly not i.i.d., the results of classical learning theory do not apply directly. In this paper, we study the consistency of least-square regularized regression with NA samples. We establish an error bound for least-square regularized regression with NA samples and prove that its learning rate is m^(epsilon-1), which tends to m^(-1) as epsilon approaches 0, where m denotes the number of samples. A simulation experiment on the convergence rate for NA samples shows that the least-square regularized regression algorithm is consistent for NA samples. This result generalizes the classical result for i.i.d. samples.
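The setting described above can be sketched numerically. The following is a minimal illustration, not the paper's actual experiment: NA inputs are produced by sampling without replacement from a finite population (a classical construction known to yield negative association), and least-square regularized (ridge) regression is fit in a simple polynomial hypothesis space. The population, noise level, degree, and regularization parameter are all illustrative choices; the point is only that the test error tends to shrink as the sample size m grows.

```python
import numpy as np

def na_sample(m, rng):
    # Sampling without replacement from a finite population is a
    # standard way to obtain negatively associated (NA) variables.
    population = np.linspace(-1.0, 1.0, 10 * m)
    return rng.choice(population, size=m, replace=False)

def ridge_regression_error(m, lam, rng):
    # Least-square regularized (ridge) regression on NA inputs.
    x = na_sample(m, rng)
    f_true = lambda t: np.sin(np.pi * t)          # target function (illustrative)
    y = f_true(x) + 0.1 * rng.standard_normal(m)  # noisy observations
    # Polynomial feature map (degree <= 7) as the hypothesis space.
    Phi = np.vander(x, 8, increasing=True)
    # Solve (Phi^T Phi / m + lam I) w = Phi^T y / m, rescaled by m.
    w = np.linalg.solve(Phi.T @ Phi + lam * m * np.eye(8), Phi.T @ y)
    # Mean-squared test error against the true function on a fine grid.
    t = np.linspace(-1.0, 1.0, 500)
    Pt = np.vander(t, 8, increasing=True)
    return np.mean((Pt @ w - f_true(t)) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for m in (50, 200, 800, 3200):
        print(m, ridge_regression_error(m, lam=1e-4, rng=rng))
```

Running the loop prints the test error for increasing m; the decay of this error with m is what the learning rate m^(epsilon-1) quantifies in the paper.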