In this paper, we present a first-order projection-free method, namely, the universal conditional gradient sliding (UCGS) method, for computing $\varepsilon$-approximate solutions to convex differentiable optimization problems over bounded domains. For objective functions with Hölder continuous gradients under the Euclidean norm, we show that UCGS terminates with an $\varepsilon$-solution after at most $O((M_\nu D_X^{1+\nu}/\varepsilon)^{2/(1+3\nu)})$ gradient evaluations and $O((M_\nu D_X^{1+\nu}/\varepsilon)^{4/(1+3\nu)})$ linear objective optimizations, where $\nu \in (0,1]$ and $M_\nu > 0$ are the exponent and constant of the Hölder condition and $D_X$ is the diameter of the constraint set. Furthermore, UCGS performs these computations without requiring any specific knowledge of the smoothness information $\nu$ and $M_\nu$. In the weakly smooth case $\nu \in (0,1)$, both complexity results improve on the current state-of-the-art $O((M_\nu D_X^{1+\nu}/\varepsilon)^{1/\nu})$ bounds [Y. Nesterov, Math. Program., 171 (2018), pp. 311–330; S. Ghadimi, Math. Program., 173 (2019), pp. 431–464] for first-order projection-free methods, achieved by the conditional gradient method. Within the class of sliding-type algorithms following from the work of [G. Lan and Y. Zhou, SIAM J. Optim., 26 (2016), pp. 1379–1409; Y. Chen, G. Lan, Y. Ouyang, and W. Zhang, Comput. Optim. Appl., 73 (2019), pp. 159–199], to the best of our knowledge, this is the first time a sliding-type algorithm has improved not only the gradient complexity but also the overall complexity of computing an approximate solution. In the smooth case $\nu = 1$, UCGS matches the state-of-the-art complexity result achieved by the conditional gradient sliding method [G. Lan and Y. Zhou, SIAM J. Optim., 26 (2016), pp. 1379–1409], while adding features that allow for practical implementation.