In contrast to online convex optimization algorithms designed for a specific type of loss function, universal algorithms automatically adapt to various loss functions. This capability eliminates the need for users to correctly classify the type of loss function, thereby lowering the barrier to employing online convex optimization techniques. Previous studies have proposed algorithms with minimax optimal theoretical guarantees for several types of loss functions. However, for general convex functions, these algorithms fail to attain tighter, problem-dependent guarantees that reflect the structure of the problem. To address this issue, we introduce the Universal online convex optimization Algorithm with adaptivity to Gradient-Variation (UAGV). This novel algorithm automatically adapts to both general convex and strongly convex loss functions. Furthermore, when the loss functions are smooth, UAGV enjoys a gradient-variation regret bound for general convex loss functions, which is a problem-dependent bound. The algorithm adopts a two-layer structure, with a meta-algorithm in the upper layer and several expert algorithms in the lower layer. We equip the meta-algorithm with an optimism term, which can be interpreted as a prediction of the loss vector; when the optimism term closely matches the loss vector, the meta-algorithm achieves small regret. We therefore carefully design the surrogate loss function and the optimism term according to the form of the gradient-variation bound, which strengthens the meta-algorithm's ability to combine the decisions generated by the expert algorithms and helps establish the corresponding theoretical guarantee. Experimental results on several datasets indicate that UAGV effectively tracks the best expert algorithm and outperforms existing universal algorithms when optimizing smooth general convex functions; on certain datasets, its regret is more than 14% smaller than that of existing algorithms. © 2024 Science Press. All rights reserved.
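To make the two-layer structure concrete, below is a minimal sketch of a meta-algorithm with an optimism term, using a generic optimistic Hedge update to combine expert decisions. This is only an illustration of the structure described above, not the paper's actual method: the function name `optimistic_hedge_weights`, the learning rate, the toy experts, and the choice of last round's loss vector as the optimism term are all assumptions for demonstration; UAGV's surrogate losses and optimism term are designed specifically to match the gradient-variation bound.

```python
import numpy as np

def optimistic_hedge_weights(cum_loss, optimism, lr):
    """One optimistic Hedge step: weight experts by cumulative loss plus a
    prediction of the next loss vector (the optimism term)."""
    logits = -lr * (cum_loss + optimism)
    logits -= logits.max()                 # subtract max for numerical stability
    w = np.exp(logits)
    return w / w.sum()

# Hypothetical toy run: K experts over T rounds with random losses.
rng = np.random.default_rng(0)
K, T, lr = 5, 100, 0.5
cum_loss = np.zeros(K)
prev_loss = np.zeros(K)                    # optimism: predict last round's loss vector
for t in range(T):
    w = optimistic_hedge_weights(cum_loss, prev_loss, lr)
    expert_decisions = rng.normal(size=K)  # stand-ins for the experts' outputs
    decision = w @ expert_decisions        # meta-algorithm combines expert decisions
    loss_vec = rng.random(K)               # surrogate losses suffered by the experts
    cum_loss += loss_vec
    prev_loss = loss_vec
```

If the optimism term (here, the previous loss vector) predicts the incoming loss vector well, the meta-algorithm's regret against the best expert shrinks, which is the mechanism the abstract attributes to UAGV's meta-level design.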