Regression analysis when the underlying regression function has jumps is a research problem with many applications. In practice, jumps often represent structural changes in a related process, so it is important to detect them accurately from observed noisy data. Several jump detectors have been proposed in the literature, most of which are based on local constant or local linear kernel smoothing. For a given application, which method is more appropriate to use? Would local quadratic or higher-order local polynomial kernel smoothing provide a better jump detector in certain cases? These practical questions have not yet been well addressed. To answer them, in this paper we study both theoretical and numerical properties of jump detectors based on various local polynomial kernel smoothing methods, and provide guidelines on their practical use. Besides a simulation study, two real-data examples are presented to demonstrate cases in which specific jump detectors are more appropriate than competing methods.
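To illustrate the general idea behind such jump detectors (this is a generic sketch, not the specific procedure studied in this paper), one common construction compares one-sided local linear kernel fits on either side of each point: a large discrepancy between the left- and right-sided fits flags a possible jump. The Gaussian kernel, bandwidth value, and all function names below are assumptions made for the sketch.

```python
import numpy as np

def local_linear_onesided(x0, x, y, h, side):
    """Weighted least-squares local linear fit at x0 using only data on
    one side of x0 (side=+1: right, side=-1: left), with a one-sided
    Gaussian kernel of bandwidth h. Returns the fitted value at x0."""
    d = x - x0
    w = np.where(d * side >= 0, np.exp(-0.5 * (d / h) ** 2), 0.0)
    X = np.column_stack([np.ones_like(d), d])
    A = X.T @ (w[:, None] * X)
    b = X.T @ (w * y)
    # lstsq handles the (singular) boundary cases with few one-sided points
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coef[0]

def jump_detector(x, y, h):
    """Difference between right- and left-sided local linear fits at each
    design point; a large |difference| suggests a nearby jump."""
    return np.array([local_linear_onesided(t, x, y, h, +1)
                     - local_linear_onesided(t, x, y, h, -1) for t in x])

# Demo: a smooth regression function with a jump of size 2 at x = 0.5
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * x) + 2.0 * (x >= 0.5) + rng.normal(0.0, 0.1, x.size)
d = jump_detector(x, y, h=0.05)
jump_loc = x[np.argmax(np.abs(d))]  # estimated jump location
```

Replacing the local linear fits with local constant (Nadaraya-Watson) or local quadratic fits in `local_linear_onesided` yields the other detectors in the family this paper compares; the bias-variance trade-off among those choices is exactly what the theoretical and numerical comparisons address.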