In this paper the main features of a recently proposed nonlinear mathematical model for GaAs FETs are discussed, with special emphasis on large-signal performance prediction in the presence of low-frequency dispersive effects due to traps. The model is extended to account for low-frequency dispersion caused by thermal phenomena, which may become important when large power dissipation is involved. It is also shown that the modelling approach used to predict the low-frequency deviations of the drain current characteristics can easily be embedded in classical equivalent-circuit models. Experimental and simulated results confirming the validity of the proposed model are presented and discussed.
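As an illustrative sketch only (the symbols below are assumptions and not necessarily the paper's notation), a dispersion-aware drain-current description of this kind is often written as a purely static characteristic plus a correction term driven by the slow trap and thermal state variables, which under low-frequency dynamic operation depend on the average (bias) voltages and the average dissipated power:

\begin{equation}
i_D(t) \simeq I_{DC}\!\bigl(v_{GS}(t),\,v_{DS}(t)\bigr)
\;+\; \Delta i_D\!\bigl(v_{GS}(t),\,v_{DS}(t);\,V_{GS0},\,V_{DS0},\,P_{0}\bigr),
\end{equation}

where $I_{DC}$ is the static (DC) drain-current characteristic, $V_{GS0}$ and $V_{DS0}$ denote the average gate and drain voltages controlling the slowly responding trap states, and $P_{0}$ is the average dissipated power controlling the channel temperature. The correction $\Delta i_D$ vanishes under static operation, which is consistent with embedding such a term as an additional controlled source in a classical equivalent-circuit model.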