Compared with the standard support vector machine, the generalized eigenvalue proximal support vector machine copes well with the "XOR" problem. However, it is based on the squared Frobenius norm and is therefore sensitive to outliers and noise. To improve robustness, this paper introduces the capped L1-norm, which combines the non-squared L1-norm with a "capped" operation, into the generalized eigenvalue proximal support vector machine, and further proposes a novel capped L1-norm proximal support vector machine, called CPSVM. Owing to the capped L1-norm, CPSVM can effectively remove extreme outliers and suppress the effect of noisy data. CPSVM can also be viewed as a weighted generalized eigenvalue proximal support vector machine and is solved through a series of generalized eigenvalue problems. Experimental results on an artificial dataset, several UCI datasets, and an image dataset demonstrate the effectiveness of CPSVM.
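To illustrate how a method of this kind can be solved through a series of generalized eigenvalue problems, the sketch below fits one proximal plane (close to class A, far from class B) with an iteratively reweighted scheme, where a capped L1-style reweighting assigns zero weight to points whose residual exceeds a threshold. The function name `capped_l1_gepsvm_plane`, the threshold `eps`, the regularizer `delta`, and the exact reweighting rule are illustrative assumptions, not the authors' formulation.

```python
import numpy as np
from scipy.linalg import eigh

def capped_l1_gepsvm_plane(A, B, eps=2.0, delta=1e-4, n_iter=20):
    """Sketch (assumed scheme): fit a plane w·x + b = 0 near class A and
    away from class B by solving a sequence of weighted generalized
    eigenvalue problems with capped L1-style reweighting."""
    Ae = np.hstack([A, np.ones((A.shape[0], 1))])  # augment with bias column
    Be = np.hstack([B, np.ones((B.shape[0], 1))])
    wa = np.ones(Ae.shape[0])                      # per-sample weights, class A
    wb = np.ones(Be.shape[0])                      # per-sample weights, class B
    z = None
    for _ in range(n_iter):
        # Weighted scatter matrices; delta*I keeps both positive definite
        G = (Ae * wa[:, None]).T @ Ae + delta * np.eye(Ae.shape[1])
        H = (Be * wb[:, None]).T @ Be + delta * np.eye(Be.shape[1])
        vals, vecs = eigh(G, H)        # generalized symmetric eigenproblem
        z = vecs[:, 0]                 # eigenvector of the smallest eigenvalue
        ra = np.abs(Ae @ z)            # residuals of class A to the plane
        rb = np.abs(Be @ z)
        # Capped L1 reweighting: weight 1/|r| below the cap, 0 above it,
        # so extreme outliers are removed from the next eigenproblem
        wa = np.where(ra < eps, 1.0 / np.maximum(ra, 1e-8), 0.0)
        wb = np.where(rb < eps, 1.0 / np.maximum(rb, 1e-8), 0.0)
    return z[:-1], z[-1]               # plane normal w and offset b
```

In this sketch the zeroed weights are what "remove extreme outliers": a point whose residual exceeds `eps` simply drops out of the next weighted eigenvalue problem, so a single corrupted sample cannot dominate the squared objective.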