Fast-moving objects are important targets for deep space exploration and astronomical missions. Accurately locating these objects, especially at the sub-pixel level, is challenged by motion blur caused by their fast motion and by the defocusing of optical sensors. In this letter, we explore a new method for removing these effects. Instead of estimating each effect individually, e.g., via image restoration, the proposed method estimates the combined effect of defocusing and blurring in one step. Based on the observed similarity between the appearance of fast-moving objects (such as asteroids) and 2D Gaussian surfaces, we use a 2D Gaussian surface as the model for fitting and locating the objects. Experimental results demonstrate that the proposed algorithm locates the centroid of the LAGEOS satellite in real optical sensor images, achieves an accuracy of 0.15 pixels on simulated data when the blur length is below 15 pixels, and still achieves sub-pixel accuracy when 0 dB Gaussian white noise is superimposed. These results suggest that the proposed method has potential applications in navigation and orbital debris surveillance.
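The central idea, fitting a 2D Gaussian surface to the blurred object image and taking the fitted peak position as the sub-pixel centroid, can be illustrated with a short sketch. The code below is only an illustration under assumptions of my own, not the authors' implementation: it fits a rotated 2D Gaussian with generic NumPy/SciPy least-squares (`scipy.optimize.curve_fit`), and the synthetic patch, parameter names, and initial-guess heuristics are all hypothetical.

```python
# Sketch (not the letter's exact method): sub-pixel centroid estimation by fitting
# a rotated 2D Gaussian surface to an image patch containing a blurred object.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_2d(coords, amp, x0, y0, sx, sy, theta, offset):
    """Rotated 2D Gaussian surface, returned as a flattened array for curve_fit."""
    x, y = coords
    ct, st = np.cos(theta), np.sin(theta)
    a = ct**2 / (2 * sx**2) + st**2 / (2 * sy**2)
    b = -st * ct / (2 * sx**2) + st * ct / (2 * sy**2)
    c = st**2 / (2 * sx**2) + ct**2 / (2 * sy**2)
    g = offset + amp * np.exp(-(a * (x - x0)**2
                                + 2 * b * (x - x0) * (y - y0)
                                + c * (y - y0)**2))
    return g.ravel()

def fit_centroid(patch):
    """Return the (x, y) sub-pixel centroid of a bright blob in an image patch."""
    h, w = patch.shape
    y, x = np.mgrid[0:h, 0:w]
    # Initial guess: brightest pixel as the centre, moderate widths, median background.
    y0, x0 = np.unravel_index(np.argmax(patch), patch.shape)
    p0 = [patch.max() - np.median(patch), x0, y0, 3.0, 3.0, 0.0, np.median(patch)]
    popt, _ = curve_fit(gaussian_2d, (x, y), patch.ravel(), p0=p0)
    return popt[1], popt[2]  # fitted x0, y0

# Synthetic test: an elongated, noisy blob with a known sub-pixel centre.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:64, 0:64]
truth = gaussian_2d((xx, yy), 200.0, 31.6, 28.3, 6.0, 2.5, 0.4, 10.0).reshape(64, 64)
patch = truth + rng.normal(0.0, 5.0, truth.shape)
print(fit_centroid(patch))  # expected to be close to (31.6, 28.3)
```

The elongated Gaussian in the synthetic test mimics the streaked appearance of a fast-moving object; in practice the fitted peak, rather than the brightest pixel, serves as the sub-pixel location estimate.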