The electrocardiogram (ECG) plays a key role in assessing human health conditions and in guiding targeted treatment. Attempts to automate ECG interpretation using different algorithmic paradigms have existed for decades. However, a robust system that can automate the detection of a wide range of arrhythmia categories from ECGs acquired from a multitude of sources remains a challenge, given that signal quality and the extent of associated artifacts vary. The approach described in this paper takes the application of Deep Learning (DL) to arrhythmia detection in a new direction by using class-discriminative visualization to improve interpretability and transparency as an additional step for validating the algorithm. A Convolutional Neural Network (CNN) covering 20 rhythm categories was developed using 193,492 single-lead, ten-second ECG strips. External validation was performed against a test set of 5606 strips collected using Holter devices (n=4447), patches (n=311), diagnostic ECG machines (n=806), and smartwatches (n=42), covering a wide variety of devices available on the current market. To compute the Intersection over Union (IoU) score, Grad-CAM (Gradient-weighted Class Activation Mapping) analysis, which highlights the regions the model weighted for inference, was performed on 266 strips against Certified Cardiographic Technician annotations. The model, with a weighted average F1 score of 0.94 across the 20 rhythm categories, had high sensitivity and specificity for critical arrhythmias such as Atrial Fibrillation (0.997, 0.955) and Ventricular Tachycardia (0.999, 0.915). In addition, the mean IoU score was 0.56, indicating that the arrhythmic region had a strong influence on the predictions made by the model.
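
As a minimal illustration of the IoU metric reported above, the following Python sketch scores the overlap between a thresholded Grad-CAM saliency region and a technician-annotated arrhythmic region on a single ten-second strip. The sampling rate, threshold, and sample intervals are illustrative assumptions and do not reflect the paper's actual implementation.

```python
import numpy as np

def interval_iou(pred_mask: np.ndarray, gt_mask: np.ndarray) -> float:
    """IoU between two boolean masks defined over the samples of an ECG strip."""
    intersection = np.logical_and(pred_mask, gt_mask).sum()
    union = np.logical_or(pred_mask, gt_mask).sum()
    return float(intersection / union) if union > 0 else 0.0

# Assumed setup: a 10-second strip sampled at 250 Hz (2500 samples).
fs, duration = 250, 10
n = fs * duration

# Hypothetical Grad-CAM activation map, thresholded to a binary salient region.
grad_cam = np.zeros(n)
grad_cam[900:1800] = 1.0              # model attends to samples 900-1800
pred_mask = grad_cam >= 0.5

# Hypothetical technician annotation of the arrhythmic region (samples 1000-2000).
gt_mask = np.zeros(n, dtype=bool)
gt_mask[1000:2000] = True

print(f"IoU = {interval_iou(pred_mask, gt_mask):.2f}")  # prints IoU = 0.73
```

In this scheme, an IoU near 1 means the model's salient region coincides with the annotated arrhythmia, while an IoU near 0 means the model based its prediction on unrelated parts of the strip; a mean value of 0.56 over the 266 analyzed strips therefore indicates substantial, though not perfect, overlap.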