Electrocardiogram (ECG) signal quality can be seriously degraded by artifacts such as electromyogram (EMG) noise, reducing its clinical value. Various denoising methods have been proposed in the literature to address this issue. We examined how simulated EMG artifacts affect the heartbeat detection performance of different ECG denoising pipelines at four EMG intensity levels (N1, ..., N4), by adding simulated EMG to ECG signals from two datasets: the MIT-BIH arrhythmia dataset (dataset 1) and data we collected from 20 healthy participants (dataset 2). Five pipelines were investigated: CEEMDAN (Cd), CEEMDAN-ICA (Cd-I), SWT (WT), CEEMDAN-SWT (Cd-W), and CEEMDAN-ICA-SWT (Cd-I-W). For dataset 1, Cd and WT provided the best heartbeat detection performance at noise levels N1-N2 and N3-N4, respectively. For dataset 2, WT and Cd-I performed best at N1, while WT performed best at N2-N4. We further investigated the robustness of the pipelines against varying EMG intensity and observed that the pipelines involving SWT (WT, Cd-W, Cd-I-W) were considerably more robust to noise than the others. Cd-I-W and WT were the most robust pipelines for datasets 1 and 2, respectively. Therefore, the expected artifact level in the ECG recording environment should be assessed when selecting an appropriate denoising method.
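To illustrate the SWT-based denoising step shared by the WT, Cd-W, and Cd-I-W pipelines, a minimal sketch using the PyWavelets library is given below. The wavelet choice ('db4'), decomposition level, and universal soft-threshold rule are illustrative assumptions, not the exact configuration used in this study.

```python
# Minimal sketch of SWT-based ECG denoising (illustrative; wavelet, level,
# and threshold rule are assumptions, not the study's exact configuration).
import numpy as np
import pywt

def swt_denoise(ecg, wavelet="db4", level=4):
    """Denoise an ECG segment with the stationary wavelet transform (SWT)."""
    n = len(ecg)
    # SWT requires the signal length to be a multiple of 2**level; pad by reflection.
    pad = (-n) % (2 ** level)
    x = np.pad(ecg, (0, pad), mode="reflect")

    coeffs = pywt.swt(x, wavelet, level=level)  # list of (cA, cD) pairs per level

    denoised = []
    for cA, cD in coeffs:
        # Noise estimate and universal threshold from the detail coefficients (assumed rule).
        sigma = np.median(np.abs(cD)) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(len(cD)))
        denoised.append((cA, pywt.threshold(cD, thr, mode="soft")))

    # Inverse SWT, trimmed back to the original length.
    return pywt.iswt(denoised, wavelet)[:n]

# Usage example: clean a segment with simulated EMG added at some intensity level.
# noisy = clean_ecg + simulated_emg
# cleaned = swt_denoise(noisy)
```

In the combined pipelines (Cd-W, Cd-I-W), a step like this would be applied after CEEMDAN decomposition (and ICA, where used), rather than directly to the raw signal.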