Mathematical tools that are widely employed for the distribution-free analysis of the finite-sample performance of empirical estimators, such as the Efron-Stein inequality or Azuma's martingale inequality and its derivatives, rely on the availability of tight bounds on the Lipschitz constants of the estimators. Lipschitz constants can be easily derived for simple functionals of the empirical measure, such as the Shannon or the Tsallis entropy. However, obtaining tight bounds for more general entropy functionals that cannot be decomposed into sums of identical terms is considerably more involved. The goal of this paper is the derivation of (empirical) Lipschitz constants for the maximum likelihood estimator of the family of Rényi entropies of order $\lambda > 0$, $\lambda \neq 1$. Analytic solutions for the optimal constants are obtained for the most important special cases, namely for $0 < \lambda < 1$, for $1 < \lambda < 2$, as well as for $\lambda = 2$ (collision entropy) and $\lambda = 3$. For the remaining cases, where no analytic solution is available (i.e., $\lambda > 2$, $\lambda \neq 3$), an efficient way to compute the optimal constants is derived by reducing the complexity of the underlying optimization problem from exponential, $\Omega(n^{\|A\|-1})$, to linear, $O(n)$, where $n$ is the number of available samples and $\|A\| \geq 2$ is the number of symbols of the source (under the assumption that $\|A\|$ does not grow with $n$). The optimal constants are subsequently used for the distribution-free performance analysis of the maximum likelihood estimator of the Rényi entropy; this includes variance and concentration bounds, both under the assumption of independence and under strong mixing conditions.
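
For reference, a minimal sketch of the quantity under study, in notation assumed here rather than fixed by the abstract: let $P = (p(a))_{a \in A}$ denote the source distribution over the alphabet $A$ and let $\hat{p}_n$ denote the empirical (maximum likelihood) measure built from $n$ samples. The Rényi entropy of order $\lambda$ and its plug-in (maximum likelihood) estimator are then
\[
H_\lambda(P) = \frac{1}{1-\lambda} \log \sum_{a \in A} p(a)^{\lambda},
\qquad
\hat{H}_\lambda = \frac{1}{1-\lambda} \log \sum_{a \in A} \hat{p}_n(a)^{\lambda},
\qquad \lambda > 0,\ \lambda \neq 1,
\]
and the Lipschitz constants discussed above quantify how much $\hat{H}_\lambda$ can change when a single one of the $n$ samples is replaced.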