In this paper, we propose a simple bias-reduced log-periodogram regression estimator, $\hat{d}_r$, of the long-memory parameter, $d$, that eliminates the first- and higher-order biases of the Geweke and Porter-Hudak (1983) (GPH) estimator. The bias-reduced estimator is the same as the GPH estimator except that one includes the frequencies raised to the powers $2k$ for $k = 1, \ldots, r$, for some positive integer $r$, as additional regressors in the pseudo-regression model that yields the GPH estimator. The reduction in bias is obtained using assumptions on the spectrum only in a neighborhood of the zero frequency. Following the work of Robinson (1995b) and Hurvich, Deo, and Brodsky (1998), we establish the asymptotic bias, variance, and mean-squared error (MSE) of $\hat{d}_r$, determine the asymptotic MSE-optimal choice of the number of frequencies, $m$, to include in the regression, and establish the asymptotic normality of $\hat{d}_r$. These results show that the bias of $\hat{d}_r$ goes to zero at a faster rate than that of the GPH estimator when the normalized spectrum at zero is sufficiently smooth, while its variance is increased only by a multiplicative constant. We show that the bias-reduced estimator $\hat{d}_r$ attains the optimal rate of convergence for a class of spectral densities that includes those that are smooth of order $s \geq 1$ at zero, provided $r \geq (s - 2)/2$ and $m$ is chosen appropriately. For $s > 2$, the GPH estimator does not attain this rate. The proof uses results of Giraitis, Robinson, and Samarov (1997). We specify a data-dependent plug-in method for selecting the number of frequencies $m$ to minimize the asymptotic MSE for a given value of $r$. Monte Carlo simulation results for stationary Gaussian ARFIMA$(1, d, 1)$ and ARFIMA$(2, d, 0)$ models show that the bias-reduced estimators perform well relative to the standard log-periodogram regression estimator.
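
To make the construction concrete, the following is a minimal sketch (not the authors' code) of the pseudo-regression underlying $\hat{d}_r$: the log periodogram at the first $m$ Fourier frequencies is regressed on a constant, $-2\log\lambda_j$ (the Robinson/Hurvich-Deo-Brodsky form of the GPH regressor), and the additional even-power regressors $\lambda_j^{2k}$, $k = 1, \ldots, r$; the OLS coefficient on $-2\log\lambda_j$ is the estimate of $d$. The function name, the periodogram normalization, and the choice of $m$ in the usage line are illustrative assumptions, not the paper's data-dependent plug-in rule.

```python
import numpy as np

def bias_reduced_lp_regression(x, m, r):
    """Illustrative sketch: bias-reduced log-periodogram regression.

    x : 1-d array, the observed time series
    m : number of Fourier frequencies used in the regression
    r : number of additional even-power regressors (r = 0 reduces to
        the plain log-periodogram regression)
    Returns the OLS estimate of the long-memory parameter d.
    """
    n = len(x)
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / n                            # Fourier frequencies lambda_j
    dft = np.fft.fft(x - np.mean(x))
    I = np.abs(dft[1:m + 1]) ** 2 / (2.0 * np.pi * n)    # periodogram I(lambda_j)
    y = np.log(I)
    # Regressors: constant, -2*log(lambda_j), and lambda_j^(2k) for k = 1,...,r
    cols = [np.ones(m), -2.0 * np.log(lam)]
    cols += [lam ** (2 * k) for k in range(1, r + 1)]
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]                                       # coefficient on -2*log(lambda_j)

# Hypothetical usage with an ad hoc bandwidth m (illustration only):
# d_hat = bias_reduced_lp_regression(x, m=int(len(x) ** 0.7), r=1)
```
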