In this paper, a timestamp-based Nesterov's accelerated gradient algorithm is proposed for Nash equilibrium seeking over communication networks in strongly monotone games. Unlike well-known consensus-based Nash equilibrium seeking methods, each player updates its local estimates of all players' actions by combining Nesterov's accelerated gradient method with a timestamp-based broadcasting protocol. We prove convergence to an epsilon-approximate Nash equilibrium under a fixed step size. Simulation results demonstrate that the proposed algorithm outperforms some well-known projected gradient approaches: the number of iterations required to reach the Nash equilibrium is greatly reduced.
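To make the two ingredients named above concrete, the following is a minimal sketch, not the paper's exact algorithm: each player keeps a timestamped local estimate of every player's action, merges neighbors' estimates by keeping the freshest copy of each component, and updates its own action with a fixed-step Nesterov-style momentum step. The names `grads`, `neighbors`, `step`, and `momentum` are illustrative assumptions, not symbols from the paper.

```python
import numpy as np

def timestamped_nesterov_ne_seeking(grads, neighbors, n_players, dim,
                                    step=0.01, momentum=0.9, iters=1000):
    """Sketch of timestamp-based NE seeking with Nesterov momentum.

    Assumptions (hypothetical interfaces, not from the paper):
      grads[i](x)  -- player i's partial gradient of its cost at the
                      joint action estimate x (shape (n_players, dim)).
      neighbors[i] -- iterable of player i's neighbors in the network.
    """
    est = np.zeros((n_players, n_players, dim))    # est[i] = player i's view of all actions
    stamp = np.zeros((n_players, n_players), int)  # stamp[i, j] = freshness of i's copy of j
    vel = np.zeros((n_players, dim))               # momentum terms

    for t in range(1, iters + 1):
        # Timestamp-based broadcast: adopt a neighbor's copy of a
        # component only if its timestamp is strictly fresher.
        for i in range(n_players):
            for k in neighbors[i]:
                newer = stamp[k] > stamp[i]
                est[i][newer] = est[k][newer]
                stamp[i][newer] = stamp[k][newer]
        # Nesterov-style accelerated step on each player's own action,
        # evaluated at the momentum look-ahead point.
        for i in range(n_players):
            look_ahead = est[i].copy()
            look_ahead[i] = est[i][i] + momentum * vel[i]
            g = grads[i](look_ahead)
            vel[i] = momentum * vel[i] - step * g
            est[i][i] = est[i][i] + vel[i]
            stamp[i][i] = t  # own action is always the freshest copy
    return np.stack([est[i][i] for i in range(n_players)])
```

In this sketch the timestamp merge plays the role that weighted averaging plays in consensus-based methods: stale components are overwritten rather than averaged, which is the structural difference the abstract highlights.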