Temporal knowledge graph (TKG) reasoning has seen widespread use for modeling real-world events, particularly in extrapolation settings. Nevertheless, most previous studies are embedding-based models, which require both entity and relation embeddings to make predictions and thus ignore the semantic correlations among different entities and relations within the same timestamp. This can lead to random, nonsensical predictions when unseen entities or relations occur. Furthermore, many existing models exhibit limitations in handling highly correlated historical facts with extensive temporal depth: they either overlook such facts or overly accentuate the relationships between recurring past occurrences and their current counterparts. Due to the dynamic nature of TKGs, effectively capturing the evolving semantics between different timestamps is also challenging. To address these shortcomings, we propose the recurrent semantic evidence-aware graph neural network (RE-SEGNN), a novel graph neural network that learns the semantics of entities and relations simultaneously. For the former challenge, our model predicts a plausible answer to a missing quadruple based on semantics when it encounters unseen entities or relations. For the latter, we build on the well-established observation that both the recency and the frequency of semantic history confer a higher reference value on the present. We use the Hawkes process to compute the semantic trend, which allows the semantics of recent facts to receive more attention than those of distant facts. Experimental results show that RE-SEGNN outperforms all state-of-the-art models on entity prediction across six widely used datasets and on relation prediction across five of them. Furthermore, a case study shows how our model handles unseen entities and relations.
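
As a minimal illustrative sketch (not necessarily the paper's exact formulation), a standard Hawkes-process intensity captures precisely this recency-and-frequency effect: each past occurrence of a semantic pattern at time $t_i$ adds an excitation that decays exponentially with age,
\[
\lambda(t) = \mu + \sum_{t_i < t} \alpha \, e^{-\beta (t - t_i)},
\]
where $\mu$ is a base rate, $\alpha$ scales the contribution of each historical fact, and $\beta$ controls how quickly old facts lose influence. Frequent and recent occurrences thus yield a larger $\lambda(t)$, i.e., a higher reference value at the current timestamp.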