This paper proposes a new generalized criterion for the training of feed-forward neural networks. Depending on the optimization strategy used, this criterion leads to a variety of fast learning algorithms for single-layered as well as multilayered neural networks. The simplest algorithm devised on the basis of this generalized criterion is the fast delta rule algorithm, proposed for the training of single-layered neural networks. The application of a similar optimization strategy to multilayered neural networks, in conjunction with the proposed generalized criterion, provides the fast back-propagation algorithm. Another set of fast algorithms with better convergence properties is derived on the basis of the same strategy that recently provided a family of Efficient LEarning Algorithms for Neural NEtworks (ELEANNE). This optimization strategy yields the Fast ELEANNE 3, a second-order learning algorithm, and a simplified version of this algorithm, called the Fast ELEANNE 4. The Fast ELEANNE 3 and Fast ELEANNE 4, proposed for the training of single-layered neural networks, provide the basis for the derivation of the Fast ELEANNE 5, the Fast ELEANNE 6, and the Fast ELEANNE 7, which are proposed for the training of multilayered neural networks. Several experiments verify that the fast algorithms developed in this paper train neural networks faster than the corresponding learning algorithms in the literature.
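For orientation only, the conventional delta rule against which the fast variants are positioned is the gradient-descent update on the squared output error of a single-layered network with differentiable activation $f$; the fast delta rule of this paper is instead derived from the generalized criterion, so the expression below is merely the standard baseline and not the proposed algorithm:
$$
\Delta \mathbf{w}_k \;=\; \eta \,\bigl(t_k - y_k\bigr)\, f'\!\bigl(\mathbf{w}_k^{\mathsf T}\mathbf{x}\bigr)\, \mathbf{x},
\qquad
y_k \;=\; f\!\bigl(\mathbf{w}_k^{\mathsf T}\mathbf{x}\bigr),
$$
where $\mathbf{x}$ is the input vector, $t_k$ the desired response of the $k$th output unit, and $\eta$ the learning rate.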