We study the smooth minimax optimization problem $\min_x \max_y f(x, y)$, where $f$ is $\ell$-smooth and strongly concave in $y$ but possibly nonconvex in $x$. Most existing works focus on finding first-order stationary points of the function $f(x, y)$ or of its primal function $P(x) \triangleq \max_y f(x, y)$; few address second-order stationary points. In this paper, we propose a novel approach for minimax optimization, called Minimax Cubic Newton (MCN), which finds an $(\varepsilon, \kappa^{1.5}\sqrt{\rho\varepsilon})$-second-order stationary point of $P(x)$ with $\mathcal{O}(\kappa^{1.5}\sqrt{\rho}\,\varepsilon^{-1.5})$ calls to a second-order oracle and $\tilde{\mathcal{O}}(\kappa^{2}\sqrt{\rho}\,\varepsilon^{-1.5})$ calls to a first-order oracle, where $\kappa$ is the condition number and $\rho$ is the Lipschitz constant of the Hessian of $f(x, y)$. In addition, we propose an inexact variant of MCN for high-dimensional problems that avoids calling the expensive second-order oracle; instead, it solves the cubic sub-problem inexactly via gradient descent and matrix Chebyshev expansion. This strategy still reaches the desired approximate second-order stationary point with high probability, while requiring only $\tilde{\mathcal{O}}(\kappa^{1.5}\ell\,\varepsilon^{-2})$ Hessian-vector oracle calls and $\tilde{\mathcal{O}}(\kappa^{2}\sqrt{\rho}\,\varepsilon^{-1.5})$ first-order oracle calls. To the best of our knowledge, this is the first work to establish non-asymptotic convergence to second-order stationary points for minimax problems without convex-concave assumptions.
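To make the structure concrete, the following is a minimal sketch of the outer/inner loop pattern the abstract describes: an inner gradient-ascent maximization over $y$, a primal Hessian formed via the Schur complement, and a cubic-regularized Newton step on $x$ whose sub-problem is itself solved inexactly by gradient descent. The toy objective, the matrices `A` and `B`, the constant `mu`, and all step sizes and iteration counts are illustrative assumptions; this is not the paper's algorithm with its stated parameters or guarantees, and it omits the matrix Chebyshev expansion used in the inexact variant.

```python
import numpy as np

# Illustrative toy objective (an assumption, not from the paper):
#   f(x, y) = x^T A y - 0.5*mu*||y||^2 + 0.5*x^T B x,
# which is strongly concave in y and possibly nonconvex in x.
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))
B = rng.standard_normal((d, d))
B = 0.1 * (B + B.T)            # symmetric, possibly indefinite in x
mu = 1.0                       # strong-concavity modulus in y

def grad_x(x, y):
    return A @ y + B @ x

def grad_y(x, y):
    return A.T @ x - mu * y

def argmax_y(x, y, steps=200, lr=0.5 / mu):
    """Inner maximization: plain gradient ascent on the strongly concave y-problem."""
    for _ in range(steps):
        y = y + lr * grad_y(x, y)
    return y

def primal_hessian(x):
    """Hessian of P(x) = max_y f(x, y) via the Schur complement
    H = f_xx - f_xy f_yy^{-1} f_yx; here f_yy = -mu*I, so the cross term is A A^T / mu."""
    return B + A @ A.T / mu

def cubic_step(g, H, M, iters=500, lr=1e-2):
    """Approximately minimize the cubic model <g, s> + 0.5 s^T H s + (M/6)||s||^3
    by gradient descent, mirroring the inexact sub-problem solver."""
    s = np.zeros_like(g)
    for _ in range(iters):
        model_grad = g + H @ s + 0.5 * M * np.linalg.norm(s) * s
        s = s - lr * model_grad
    return s

x, y = rng.standard_normal(d), np.zeros(d)
M = 10.0                       # cubic regularization weight (illustrative choice)
for t in range(30):
    y = argmax_y(x, y)                     # y ~= y*(x)
    g = grad_x(x, y)                       # grad P(x) = grad_x f(x, y*(x))
    s = cubic_step(g, primal_hessian(x), M)
    x = x + s
    print(t, np.linalg.norm(g))            # primal gradient norm should shrink
```

The cubic step, rather than a plain Newton step, is what allows escaping saddle points of $P(x)$: when the primal Hessian has a negative eigenvalue, the minimizer of the cubic model moves along that direction, which is why the method can target second-order rather than only first-order stationary points.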