Consider the partly linear model $Y_i = X_i'\beta_0 + g_0(T_i) + e_i$, where $\{(T_i, X_i)\}_1^\infty$ is a strictly stationary sequence of random variables, the $e_i$'s are i.i.d. random errors, the $Y_i$'s are real-valued responses, $\beta_0$ is a $d$-vector of parameters, $X_i$ is a $d$-vector of explanatory variables, and $T_i$ is another explanatory variable ranging over a nondegenerate compact interval. Based on a segment of observations $(T_1, X_1', Y_1), \ldots, (T_n, X_n', Y_n)$, this article investigates the rates of convergence of the M-estimators for $\beta_0$ and $g_0$ obtained from the minimization problem
$$\min_{\beta \in \mathbb{R}^d,\; g_n \in \mathcal{F}_n} \; \sum_{i=1}^{n} \rho\bigl(Y_i - X_i'\beta - g_n(T_i)\bigr),$$
where $\mathcal{F}_n$ is a space of B-spline functions of order $m+1$ and $\rho(\cdot)$ is a suitably chosen function. Under some regularity conditions, it is shown that the estimator of $g_0$ achieves the optimal global rate of convergence of estimators for nonparametric regression, and the estimator of $\beta_0$ is asymptotically normal. The M-estimators here include regression quantile estimators, $L_1$-estimators, $L_p$-norm estimators, Huber-type M-estimators, and the usual least squares estimator. Applications of the asymptotic theory to testing the hypothesis $H_0 : A'\beta_0 = \tilde{\beta}$ are also discussed, where $\tilde{\beta}$ is a given vector and $A$ is a known $d \times d_0$ matrix with rank $d_0$.
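The minimization above can be illustrated numerically. The following is a minimal sketch, not the paper's procedure: it fits the partly linear model by minimizing a Huber-type $\rho$ over $\beta$ and the coefficients of a cubic B-spline basis for $g$, on simulated data. All names (`huber`, the knot layout, the simulated model) are illustrative choices, and the optimizer, sample size, and tuning constant are assumptions for the demo.

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize

# Simulated data from Y_i = X_i' beta_0 + g_0(T_i) + e_i  (illustrative choices)
rng = np.random.default_rng(0)
n, d = 200, 2
T = np.sort(rng.uniform(0.0, 1.0, n))
X = rng.normal(size=(n, d))
beta0 = np.array([1.0, -0.5])
g0 = lambda t: np.sin(2 * np.pi * t)
Y = X @ beta0 + g0(T) + rng.normal(scale=0.1, size=n)

# Cubic B-spline space F_n on [0, 1]: degree 3 (order m+1 = 4), 5 interior knots
degree = 3
interior = np.linspace(0.0, 1.0, 7)[1:-1]
knots = np.concatenate([[0.0] * (degree + 1), interior, [1.0] * (degree + 1)])
K = len(knots) - degree - 1                     # number of basis functions
B = BSpline.design_matrix(T, knots, degree).toarray()   # n x K basis matrix

def huber(r, c=1.345):
    """Huber rho function, one common choice for rho(.)."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r**2, c * a - 0.5 * c**2)

def objective(theta):
    # theta packs (beta, spline coefficients); minimize sum_i rho(residual_i)
    beta, coef = theta[:d], theta[d:]
    resid = Y - X @ beta - B @ coef
    return huber(resid).sum()

res = minimize(objective, np.zeros(d + K), method="BFGS")
beta_hat = res.x[:d]           # M-estimator of beta_0
g_hat = B @ res.x[d:]          # fitted values of g_0 at the T_i
```

Swapping `huber` for `np.abs` gives the $L_1$ (median regression) estimator, and for `lambda r: 0.5 * r**2` the least squares estimator, matching the family of $\rho$ functions covered by the theory.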