When analyzing time series which are supposed to exhibit long-range dependence (LRD), a basic issue is the estimation of the LRD parameter, for example the Hurst parameter H ∈ (1/2, 1). Conventional estimators of H easily lead to spurious detection of long memory if the time series includes a shift in the mean. This defect has fatal consequences in change-point problems: tests for a level shift rely on H, which has to be estimated beforehand, but this estimation is distorted by the level shift. We investigate two block approaches to adapt estimators of H to the case where the time series contains a jump, and we compare them in simulations with other natural techniques as well as with estimators based on the trimming idea. These techniques improve the estimation of H if there is indeed a change in the mean; in the absence of such a change, they affect the usual estimation only slightly. As an adaptation, we recommend an overlapping-blocks approach: if the underlying estimator is consistent, the adaptation preserves this property, and it performs well in simulations.
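To make the overlapping-blocks idea concrete, here is a minimal sketch, not the paper's exact procedure: it assumes a standard rescaled-range (R/S) regression as the underlying estimator of H and aggregates the blockwise estimates by the median, so that the few blocks containing the jump cannot dominate. The function names, the block length, and the step size are illustrative choices, not the paper's tuning.

```python
import numpy as np

def rs_hurst(x):
    """Estimate H for one block via the rescaled-range (R/S) statistic:
    the slope of log E[R/S] against log segment length."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Candidate segment lengths, log-spaced between 8 and the block length.
    sizes = np.unique(np.floor(np.logspace(np.log10(8), np.log10(n), 10)).astype(int))
    points = []
    for m in sizes:
        k = n // m
        if k < 1 or m < 2:
            continue
        segs = x[:k * m].reshape(k, m)
        devs = np.cumsum(segs - segs.mean(axis=1, keepdims=True), axis=1)
        r = devs.max(axis=1) - devs.min(axis=1)   # range of cumulative deviations
        s = segs.std(axis=1, ddof=1)              # segment standard deviation
        valid = s > 0
        if valid.any():
            points.append((m, (r[valid] / s[valid]).mean()))
    ms, rs = zip(*points)
    slope, _ = np.polyfit(np.log(ms), np.log(rs), 1)
    return slope

def overlapping_blocks_hurst(x, block_len=256, step=64):
    """Estimate H on overlapping blocks and take the median, so that a
    single level shift distorts only the blocks actually containing it."""
    ests = [rs_hurst(x[i:i + block_len])
            for i in range(0, len(x) - block_len + 1, step)]
    return float(np.median(ests))

# Illustration with a series that has no long memory (true H = 1/2)
# but a level shift in the middle:
rng = np.random.default_rng(0)
x = rng.standard_normal(2048)
x[1024:] += 2.0                      # shift in the mean
print(rs_hurst(x))                   # typically inflated by the shift
print(overlapping_blocks_hurst(x))   # typically closer to 1/2
```

Aggregating the blockwise estimates by the median rather than the mean is one simple way to keep the few blocks that straddle the jump from driving the final estimate, while blocks free of the jump estimate H as usual.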