Ultra-Reliable Low-Latency Communications (URLLC) is a prerequisite for advancing industrial automation. To verify whether communications systems meet these stringent requirements, physical-layer simulations are a powerful tool. Such simulations rely on a large number of channel realizations to obtain statistically significant results. Relying exclusively on real-world measured channels is challenging, e.g., due to the measurement time required. In this work, we propose a method to derive a channel model that mimics not only the mean but also the temporal behavior of data from an industrial channel measurement campaign. The approach captures large-scale time variation through a Markov process, as well as the fading of individual channel components through Doppler-filtered random processes based on empirical distributions. The model is validated in link-level simulations by comparing the synthesized channels to the original measured data using performance metrics relevant to URLLC, including mean and maximum outage durations. Our validation shows that the model performs well; in particular, the outage durations caused by consecutive packet losses closely match the real channel characteristics. In future work, the model can be used to investigate how extensive measurements need to be in order to make statements about the performance of communications systems.
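The two modeling ingredients named above can be sketched as follows. This is a minimal illustration, not the authors' implementation: a two-state Markov chain stands in for the large-scale time variation, and complex Gaussian noise shaped by a Jakes-like Doppler spectrum stands in for the fading of an individual channel component. All function names, the state levels, the transition matrix, and the Doppler/sampling parameters are assumptions chosen for the sketch.

```python
import numpy as np

def doppler_filtered_fading(n_samples, f_d, f_s, rng):
    """Rayleigh fading generated by frequency-domain Doppler filtering of
    complex Gaussian noise with a Jakes-like spectrum (illustrative)."""
    # White complex Gaussian noise in the frequency domain
    noise = rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples)
    freqs = np.fft.fftfreq(n_samples, d=1.0 / f_s)
    # Jakes Doppler spectrum S(f) ~ 1/sqrt(1 - (f/f_d)^2) for |f| < f_d
    mask = np.abs(freqs) < f_d
    spectrum = np.zeros(n_samples)
    spectrum[mask] = 1.0 / np.sqrt(1.0 - (freqs[mask] / f_d) ** 2)
    h = np.fft.ifft(noise * np.sqrt(spectrum))
    return h / np.sqrt(np.mean(np.abs(h) ** 2))  # normalize to unit mean power

def markov_large_scale(n_samples, states_db, P, rng):
    """Markov chain selecting a large-scale gain level in dB per sample
    (state levels and transition matrix P are hypothetical)."""
    s = 0
    gains = np.empty(n_samples)
    for t in range(n_samples):
        gains[t] = states_db[s]
        s = rng.choice(len(states_db), p=P[s])
    return 10.0 ** (gains / 20.0)  # dB -> linear amplitude

rng = np.random.default_rng(0)
n, f_d, f_s = 4096, 50.0, 1000.0   # samples, max. Doppler [Hz], sample rate [Hz]
h_fast = doppler_filtered_fading(n, f_d, f_s, rng)
g_slow = markov_large_scale(n, states_db=np.array([0.0, -15.0]),
                            P=np.array([[0.99, 0.01], [0.05, 0.95]]), rng=rng)
h = g_slow * h_fast                # combined synthetic channel realization
```

In the measurement-based model described above, the fading distributions would be empirical rather than the Rayleigh assumption used here, and the Markov states and transition probabilities would be fitted to the measured data.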