The objective of the study is twofold. On the one hand, it defines a specific framework for comparative studies of different statistical and machine learning methods in the context of regression analysis. On the other, it takes a well-known economics problem and applies this framework using different algorithms: OLS, a neural network, a decision tree, and k-nearest neighbor. The methodology is based on the study of error curves, that is, the behavior of the root mean square error (RMSE) as the sample size and the capacity (degrees of freedom) of each analytical method vary. Using state-of-the-art techniques, we build more than 13,920 models to test the methodology by recovering a restricted version of the Black-Scholes call option pricing formula with noise, where the instantaneous standard deviation of the noise is 0.78. The results show that, given the level of noise, neural networks provide the best estimation, with an average RMSE of 0.7825 for a training sample of 6,000 records. OLS is the second best, with an average RMSE of 0.7861, and is the first best for sample sizes smaller than 1,125. The k-nearest neighbor achieved an average RMSE of 0.8380, which is comparable to the worst performer, CART, which attained an average RMSE of 0.8721.(1)
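The experimental setup described above can be sketched in a few lines: simulate noisy Black-Scholes call prices, fit a simple model, and compute the RMSE on the training sample. This is only an illustrative sketch; the strike, rate, volatility, input ranges, and the OLS feature set are assumptions, since the abstract specifies only the noise level (sd 0.78) and the sample size (6,000).

```python
# Illustrative sketch of the error-curve methodology: noisy Black-Scholes
# call prices fitted by OLS, then scored with RMSE.
# Noise sd = 0.78 and n = 6,000 come from the paper; all other
# parameter values and the feature set are hypothetical choices.
import math
import numpy as np

def bs_call(S, K, T, r=0.05, sigma=0.2):
    """Black-Scholes European call price (no dividends)."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    # Standard normal CDF via the error function (avoids a scipy dependency)
    N = np.vectorize(lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))
    return S * N(d1) - K * np.exp(-r * T) * N(d2)

rng = np.random.default_rng(0)
n = 6000                                   # training-sample size from the paper
S = rng.uniform(80.0, 120.0, n)            # spot prices (assumed range)
T = rng.uniform(0.1, 1.0, n)               # times to maturity (assumed range)
K = 100.0                                  # fixed strike ("restricted" version)
y = bs_call(S, K, T) + rng.normal(0.0, 0.78, n)  # noise sd 0.78 as in the paper

# OLS on a small polynomial feature set (hypothetical specification)
X = np.column_stack([np.ones(n), S, T, S * T, S**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rmse = math.sqrt(np.mean((X @ beta - y) ** 2))
print(f"OLS RMSE: {rmse:.4f}")
```

Repeating this loop over a grid of sample sizes and model capacities, and averaging the RMSE over replications, produces the error curves that the study uses to compare the four methods.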