It has recently been proposed that thermally assisted electroluminescence may, in principle, provide a means to convert solar or waste heat into electricity. The basic concept is to place an intermediate active emitter between a heat source and a photovoltaic (PV) cell. The active emitter is a forward-biased light-emitting diode (LED) operated at a bias voltage V_b below the bandgap E_g (i.e., qV_b < E_g), such that the average emitted photon energy exceeds the average electrical energy supplied to inject the charge carriers, with the difference drawn from heat. The basic requirement for this conversion mechanism is that the emitter can act as an optical refrigerator. For this process to work efficiently, however, several materials challenges must be addressed and overcome. Here, we outline a preliminary analysis of the conversion efficiency and power density as functions of temperature, bandgap energy, and bias voltage, taking into account realistic high-temperature radiative and non-radiative recombination rates as well as radiative heat loss in the absorber/emitter. This analysis indicates that both the overall efficiency and the net generated power increase with increasing bandgap energy and increasing temperature, at least for temperatures up to 1000 K, even though the internal quantum yield for radiative recombination decreases with increasing temperature. The photon escape efficiency, on the other hand, is a crucial design parameter that needs to be optimized.
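To illustrate the condition qV_b < E_g, the following minimal sketch evaluates the ideal above-bandgap emission from a biased emitter using the generalized Planck law and compares the mean emitted photon energy with the electrical energy qV_b supplied per emitted photon. It assumes unity above-gap emissivity and unity internal quantum yield; the parameter values (E_g, V_b, T) and the function names are illustrative assumptions, not taken from the analysis in this paper.

```python
# Illustrative sketch only (not the authors' model): ideal above-bandgap
# emission from a forward-biased emitter at temperature T with photon
# chemical potential mu = q*V_b, via the generalized Planck law. It shows
# how, for qV_b < E_g, the mean emitted photon energy exceeds qV_b, so part
# of the emitted power must be drawn from heat. Parameter values below are
# hypothetical, not taken from the paper.
import numpy as np

q  = 1.602176634e-19   # elementary charge, C
h  = 6.62607015e-34    # Planck constant, J s
c  = 2.99792458e8      # speed of light, m / s
kB = 1.380649e-23      # Boltzmann constant, J / K

def _trapezoid(y, x):
    """Simple trapezoidal integration (kept local for NumPy-version safety)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def above_gap_emission(E_g_eV, V_b, T, n_pts=20000, E_max_eV=None):
    """Photon flux (m^-2 s^-1) and optical power (W m^-2) emitted above the gap,
    assuming unity emissivity for E >= E_g and Bose-Einstein occupation with
    chemical potential qV_b (generalized Planck law)."""
    E_g, mu = E_g_eV * q, V_b * q
    E_max = (E_max_eV if E_max_eV is not None else E_g_eV + 1.0) * q
    E = np.linspace(E_g, E_max, n_pts)
    phi_E = (2.0 * np.pi / (h**3 * c**2)) * E**2 / (np.exp((E - mu) / (kB * T)) - 1.0)
    return _trapezoid(phi_E, E), _trapezoid(E * phi_E, E)

if __name__ == "__main__":
    E_g_eV, V_b, T = 1.4, 1.0, 600.0          # hypothetical example values
    flux, P_opt = above_gap_emission(E_g_eV, V_b, T)
    P_elec = q * V_b * flux                   # ideal electrical input at unity quantum yield
    print(f"mean emitted photon energy: {P_opt / flux / q:.2f} eV  (qV_b = {V_b:.2f} eV)")
    print(f"fraction of emitted power drawn from heat: {(P_opt - P_elec) / P_opt:.2f}")
```

In this idealized picture the fraction of emitted power supplied by heat grows as V_b is reduced below E_g/q, while the total emitted flux drops; the trade-off between these two trends is what the efficiency and power-density analysis in the paper quantifies, including the non-radiative losses and escape efficiency neglected in this sketch.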