Let $P = (p_1, \ldots, p_n)$ be a probability distribution on a set $\Omega = \{\omega_1, \ldots, \omega_n\}$ with $n$ elements, $n \in \mathbb{N} \setminus \{1\}$. Then the term $S_2(P) = 1 - \sum_{i=1}^{n} p_i^2$, frequently called the Gini-Simpson index or, in information theory, the quadratic entropy, is used in many different areas of research and application and was therefore reinvented several times. In this note we give a concise history of this index and closely related measures, as well as of its generalisation to all values of the parameter $\alpha \in (0, \infty) \setminus \{1\}$ of the class of entropies of order $\alpha$ introduced by Havrda and Charvát (1967) and reinvented by Tsallis (1988) for use in statistical physics; the limiting case for $\alpha \to 1$ is Shannon's entropy $S_1(P) = -\sum_{i=1}^{n} p_i \ln p_i$. We also give a brief note on weighted versions of the Gini-Simpson index. In addition to these central historical features, our note presents contributions on the axiomatics of entropies and on the early history of the application of the concept of entropy in thermodynamics. We also provide an entry on Rényi's class of entropies, linked with Hill's diversity numbers.
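As a minimal numerical sketch of the quantities named above (function names are ours; the entropy of order alpha is written in the Tsallis normalisation, under which the case alpha = 2 reduces exactly to the Gini-Simpson index):

```python
import math

def entropy_of_order(p, alpha):
    """Entropy of order alpha (Tsallis normalisation):
    S_alpha(P) = (1 - sum_i p_i^alpha) / (alpha - 1), for alpha != 1."""
    if alpha == 1:
        raise ValueError("alpha = 1 is the Shannon limit; use shannon_entropy")
    return (1.0 - sum(pi ** alpha for pi in p)) / (alpha - 1.0)

def gini_simpson(p):
    """Gini-Simpson index (quadratic entropy): S_2(P) = 1 - sum_i p_i^2."""
    return 1.0 - sum(pi ** 2 for pi in p)

def shannon_entropy(p):
    """Shannon entropy S_1(P) = -sum_i p_i ln p_i, with 0 ln 0 := 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

P = [0.5, 0.3, 0.2]
print(gini_simpson(P))               # = entropy_of_order(P, 2) = 0.62
print(entropy_of_order(P, 1.000001)) # close to shannon_entropy(P)
print(shannon_entropy(P))
```

Evaluating the order-alpha entropy at values of alpha slightly above 1 illustrates numerically that it converges to Shannon's entropy in the limit.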