In this paper, we derive a stochastic maximum principle for optimal control problems of backward Markovian regime-switching systems involving impulse controls. The control system is described by a backward stochastic differential equation that involves impulse controls and is modulated by a continuous-time, finite-state Markov chain. Beyond the regime switching, the distinguishing features of our problem are that the control variable consists of two parts, a regular control and an impulse control, and that the domain of the regular control is not necessarily convex. We establish both necessary and sufficient conditions for optimality. We then apply the theoretical results to a linear-quadratic problem with impulse control and Markovian regime switching.
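For concreteness, a controlled backward system of this type can be sketched as follows. This is only an illustrative form under standard assumptions, with hypothetical notation (the generator $f$, impulse coefficient $C$, and martingale terms are placeholders for those of the paper's precise setting):

```latex
% Sketch: backward regime-switching system with impulse control.
% \alpha(\cdot): continuous-time, finite-state Markov chain (the regime),
% u(\cdot): regular control, W(\cdot): Brownian motion,
% \eta(\cdot): impulse process with impulse times \tau_j and sizes \eta_j.
\begin{equation*}
\begin{cases}
dY(t) = -f\bigl(t, Y(t), Z(t), u(t), \alpha(t)\bigr)\,dt
        + Z(t)\,dW(t) + C(t)\,d\eta(t), \\[4pt]
Y(T) = \xi, \qquad
\eta(t) = \sum_{j \ge 1} \eta_j \,\mathbf{1}_{\{\tau_j \le t\}},
\end{cases}
\end{equation*}
```

Here the pair $(Y(\cdot), Z(\cdot))$ is the solution of the backward equation, and the control consists of the regular part $u(\cdot)$ together with the impulse part $\{(\tau_j, \eta_j)\}_{j \ge 1}$.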