Correspondence analysis (CA) and principal component analysis (PCA) are often used to describe multivariate data. In certain applications they have been used for estimation in latent variable models. The theoretical basis for such inference is assessed in generalized linear models where the linear predictor equals $\alpha_j + x_i\beta_j$ or $a_j - b_j(x_i - u_j)^2$ $(i = 1, \ldots, n;\ j = 1, \ldots, m)$, and $x_i$ is treated as a latent fixed effect. The PCA eigenvectors and CA column scores are evaluated as estimators of $\beta_j$ and of $u_j$. With $m$ fixed and $n \to \infty$, consistent estimators cannot be obtained, because of the incidental parameters problem, unless sufficient "moment" conditions are imposed on the $x_i$. PCA is equivalent to maximum likelihood estimation for the linear Gaussian model and gives a consistent estimator of $\beta_j$ (up to a scale change) when the second sample moment of the $x_i$ is positive and finite in the limit. It is inconsistent for the Poisson and Bernoulli distributions, but when $b_j$ is constant its first and/or second eigenvectors can consistently estimate $u_j$ (up to a location and scale change) for the quadratic Gaussian model. In contrast, the CA estimator is always inconsistent. For finite samples, however, the CA column scores often have high correlations with the $u_j$, especially when the response curves are spread out relative to one another. The correlations obtained from PCA are usually weaker, although the second PCA eigenvector can sometimes do much better than the first, and for incidence data with tightly clustered response curves its performance is comparable to that of CA. For small sample sizes, PCA and particularly CA are competitive alternatives to maximum likelihood and may be preferred because of their computational ease.
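
As a minimal illustration of the setting (not the paper's own simulation design), the sketch below generates data from the two models with assumed parameter values and sample sizes, then computes the standard PCA loadings and CA column scores with NumPy and reports their correlations with $\beta_j$ and $u_j$.

```python
# Illustrative sketch only: all parameter values, sample sizes, and noise
# levels are assumptions chosen for demonstration, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 10                       # n sites (latent x_i), m columns/items

# --- Linear Gaussian model: y_ij = alpha_j + x_i * beta_j + noise ----------
x = rng.uniform(-1.5, 1.5, size=n)   # latent scores, treated as fixed effects
alpha = rng.normal(size=m)
beta = rng.normal(size=m)
Y = alpha + np.outer(x, beta) + 0.3 * rng.normal(size=(n, m))

# PCA: first right singular vector of the column-centered data matrix.
Yc = Y - Y.mean(axis=0)
_, _, Vt = np.linalg.svd(Yc, full_matrices=False)
pca_loadings = Vt[0]                 # estimates beta_j up to sign and scale
print("PCA loadings vs beta_j |corr| =",
      abs(np.corrcoef(pca_loadings, beta)[0, 1]))

# --- Poisson model with Gaussian response curves ---------------------------
# log mu_ij = a - b * (x_i - u_j)^2, with constant tolerance b across columns.
u = rng.uniform(-1.5, 1.5, size=m)   # column optima along the latent gradient
a, b = 2.0, 1.0
mu = np.exp(a - b * (x[:, None] - u[None, :]) ** 2)
N = rng.poisson(mu)
N = N[N.sum(axis=1) > 0]             # drop any empty rows (rare) before CA

# CA: SVD of the standardized residuals of the correspondence matrix.
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
_, _, Vt2 = np.linalg.svd(S, full_matrices=False)
ca_scores = Vt2[0] / np.sqrt(c)      # first-axis column standard coordinates
print("CA column scores vs u_j |corr| =",
      abs(np.corrcoef(ca_scores, u)[0, 1]))
```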