Economists attempting to estimate linear models are frequently constrained by data scarcity, in the form of short time series, and by parameter non-constancy. In such cases, a realistic alternative is often to guess rather than to estimate the parameters of these models. An algorithm of repetitively guessing (drawing) parameters from iteratively changing distributions, with the objective of minimizing the squared ex-post prediction errors, weighted by penalty weights and subject to a learning process, has recently been introduced, and sufficient conditions for its convergence have been described theoretically. In this paper, Repetitive Stochastic Guesstimation (RSG) and Simulated Annealing (SA) are compared on the problem of estimating the coefficients of a linear regression when only small and undersized samples are available. A robust alternative to RSG, based on bootstrap confidence intervals, is constructed: Repetitive Stochastic Bootstrapped Guesstimation (RSGBOOT). A Monte Carlo experiment is designed to compare the performance of RSG, RSGBOOT and SA. In the second part, confidence intervals for the RSG point estimators are built in a Bayesian framework. Again, a Monte Carlo analysis is conducted for the case of a linear regression equation.
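The abstract does not spell out the RSG update rules, so the following Python sketch only illustrates the general idea it describes: candidate coefficients are drawn from a distribution that tightens over iterations, and a candidate is kept when it reduces a penalty-weighted sum of squared ex-post prediction errors. The function name rsg_sketch, the uniform proposal, the shrink factor, and the acceptance rule are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def rsg_sketch(X, y, beta0, radius0=1.0, shrink=0.95, n_iter=500, penalty=None, seed=0):
    """Illustrative guesstimation loop (assumed details, not the paper's exact rules).

    A candidate coefficient vector is drawn around the current best guess,
    kept if it lowers the penalty-weighted sum of squared prediction errors,
    and the sampling interval shrinks over iterations (the 'learning' step).
    """
    rng = np.random.default_rng(seed)
    if penalty is None:
        penalty = np.ones(len(y))          # equal penalty weights by default (assumption)
    best = np.asarray(beta0, dtype=float)
    best_loss = np.sum(penalty * (y - X @ best) ** 2)
    radius = radius0
    for _ in range(n_iter):
        cand = best + rng.uniform(-radius, radius, size=best.shape)
        loss = np.sum(penalty * (y - X @ cand) ** 2)
        if loss < best_loss:               # keep only improving guesses
            best, best_loss = cand, loss
        radius *= shrink                   # tighten the guessing distribution
    return best, best_loss

# Undersized-sample illustration: 8 observations, 3 coefficients
rng = np.random.default_rng(1)
X = rng.normal(size=(8, 3))
y = X @ np.array([1.5, -0.7, 0.3]) + rng.normal(scale=0.1, size=8)
beta_hat, loss = rsg_sketch(X, y, beta0=np.zeros(3))
```

A Monte Carlo comparison of the kind described above would repeat such a run over many simulated small samples and contrast the resulting estimates with those of SA and RSGBOOT.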