Land surface temperature (LST) is an important component of the surface energy budget. To determine the brightness temperature (BT) measured by a satellite for a given LST, the atmospheric influence on the radiance emitted by the surface has to be accounted for. Provided that the current state of the atmosphere (vertical moisture and temperature profiles) and the surface emissivity are sufficiently well known, the BT can be calculated with a radiative transfer model (RTM), e.g., the Moderate Resolution Transmittance Code 3 (MODTRAN-3). RTMs do not linearise the atmospheric effect, and variations of surface emissivity, elevation, and view angle (i.e., the path through the atmosphere) are readily accounted for. However, RTMs are very expensive in terms of computing time and therefore not well suited to simulating large quantities of data. To overcome this limitation, the main objective of this study is to investigate whether MODTRAN-3 can be substituted by a feed-forward neural network (NN). Training and validation data consist of inputs and outputs of MODTRAN-3 for 84 TOVS Initial Guess Retrieval (TIGR) profiles in the middle latitudes (45°N to 55°N). The RTM was run for these atmospheric situations and a range of LSTs, surface elevations, and view angles; in total, 4032 radiative transfer calculations were performed. A NN developed manually for these data is evaluated using forward calculations for atmospheric profiles from European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. Grid cells of the ECMWF analyses containing clouds are identified using cloud masks derived from Meteosat IR and VIS data. The manually developed NN is compared against a NN developed by the evolutionary algorithm "Evolutionärer Netzwerk Optimierer" (ENZO, "evolutionary network optimiser").
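The surrogate idea described above, fitting a fast feed-forward NN to input-output pairs of an expensive RTM, can be sketched as follows. This is a hypothetical toy example: `toy_rtm` is a smooth stand-in function, not MODTRAN-3 physics, and the network size and training scheme are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the RTM: a smooth function of
# LST, surface elevation, and view angle (NOT MODTRAN-3 physics).
def toy_rtm(x):
    lst, elev, angle = x[:, 0], x[:, 1], x[:, 2]
    return (0.9 * lst - 2.0 * elev + 0.5 * np.cos(angle)).reshape(-1, 1)

# Training set: the analogue in the paper is the set of 4032 MODTRAN-3 runs.
X = rng.uniform(-1.0, 1.0, size=(512, 3))
y = toy_rtm(X)

# One-hidden-layer feed-forward network with tanh activation.
n_hidden = 16
W1 = rng.normal(0.0, 0.5, (3, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

lr = 0.05
losses = []
for epoch in range(3000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation of the mean-squared-error loss.
    g_pred = 2.0 * err / len(X)
    gW2 = h.T @ g_pred;  gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ g_h;     gb1 = g_h.sum(0)
    # Gradient-descent update.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

Once trained, evaluating the network is just two small matrix products, which is the source of the large speed-up over running the RTM itself.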
ENZO evolves populations of NNs: it uses mutation and crossover to optimise NN topology (layers, number of neurons, and weights) and trains (fine-tunes) the NNs with the Stuttgart Neural Network Simulator (SNNS). The NN developed by ENZO has an rms validation error of 0.25 K (untrained TIGR situations) and an rms verification error of 0.31 K (cloud-free situations from ECMWF analyses). ENZO is shown to produce substantially smaller NNs than the manual approach: its NN has 58% fewer hidden neurons and about 90% fewer weights than the manually developed NN. Furthermore, the NN developed by ENZO is more robust and generalises better. A major advantage of the NN is its computational speed, which is estimated to be of the order of 10⁴ times faster than MODTRAN-3. © 2002 Elsevier Science Inc. All rights reserved.
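The evolutionary search that ENZO performs can be illustrated with a minimal sketch: candidate topologies (here reduced to a single hidden-layer width) are trained briefly, scored by validation error plus a penalty on the number of weights, and the best survive to be mutated. This is not ENZO's actual algorithm (which also uses crossover and SNNS-based training); all functions and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target function standing in for RTM output (not MODTRAN-3 physics).
def target(X):
    return np.sin(2.0 * X[:, :1]) + 0.3 * X[:, 1:2]

Xtr = rng.uniform(-1, 1, (256, 2)); ytr = target(Xtr)
Xva = rng.uniform(-1, 1, (128, 2)); yva = target(Xva)

def train_and_score(n_hidden, seed, epochs=400, lr=0.1):
    """Briefly train a one-hidden-layer net; return validation MSE."""
    r = np.random.default_rng(seed)
    W1 = r.normal(0.0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = r.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(Xtr @ W1 + b1)
        g = 2.0 * (h @ W2 + b2 - ytr) / len(Xtr)
        g_h = (g @ W2.T) * (1.0 - h ** 2)
        W2 -= lr * (h.T @ g);   b2 -= lr * g.sum(0)
        W1 -= lr * (Xtr.T @ g_h); b1 -= lr * g_h.sum(0)
    hv = np.tanh(Xva @ W1 + b1)
    return float(np.mean((hv @ W2 + b2 - yva) ** 2))

def fitness(n_hidden):
    # Validation error plus a parsimony penalty on the weight count,
    # echoing ENZO's pressure toward small networks.
    n_weights = 2 * n_hidden + n_hidden + n_hidden + 1
    return train_and_score(n_hidden, seed=n_hidden) + 1e-3 * n_weights

# Evolution loop: score, select the best half (elitism), mutate the width.
pop = [int(n) for n in rng.integers(2, 24, size=6)]
history = []
for gen in range(5):
    scored = sorted((fitness(n), n) for n in set(pop))
    history.append(scored[0])
    parents = [n for _, n in scored[:3]]
    pop = parents + [max(1, p + int(rng.integers(-3, 4))) for p in parents]

best_fit, best_n = min(history)
```

Because the fitness includes a per-weight penalty, the search prefers the smallest network that still fits the validation data, which is the mechanism behind the parsimonious topologies reported above.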