Knowledge graphs (KGs) are often incomplete, missing many facts that hold in reality. To address this issue, researchers have proposed numerous knowledge graph completion (KGC) models to fill in the missing triples. A full KG often consists of interconnected local KGs owned by multiple organizations, forming a cross-domain KG. Federated learning makes it possible to train a federated KGC model on the entire cross-domain KG, rather than relying solely on the local KG of a single client. However, existing methods often neglect the latent information shared among local KGs. We therefore propose a neighbor prediction-enhanced federated knowledge graph completion (NP-FedKGC) model that improves KGC by mining this latent information. Specifically, we first obtain embeddings of entities and relations from each client's local KG. Second, using these embeddings as labels, we train a separate neighbor prediction model for each client, which is then applied to enhance that client's local KG. Third, the enhanced local KGs of all clients are used to train the final federated KGC model. Comprehensive experimental results show that the proposed NP-FedKGC model outperforms three baseline models, FedE, FedR, and FedM, in terms of MRR and Hits@1/3/10.
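The pipeline above can be sketched in miniature. This is an illustrative toy, not the authors' implementation: the data, function names, and embedding dimension are assumptions; the "neighbor prediction model" is replaced by a simple neighbor-sharing heuristic, and the federated step is a plain FedE-style embedding average.

```python
# Toy sketch of the NP-FedKGC pipeline (illustrative assumptions throughout).
import random
from collections import defaultdict

DIM = 4  # toy embedding dimension (assumption)

def init_embeddings(triples, seed):
    """Step 1 (sketch): assign each entity and relation a small random vector.
    A real system would train these on the client's local KG."""
    rng = random.Random(seed)
    ents, rels = set(), set()
    for h, r, t in triples:
        ents.update((h, t))
        rels.add(r)
    vec = lambda: [rng.uniform(-1.0, 1.0) for _ in range(DIM)]
    return {e: vec() for e in ents}, {r: vec() for r in rels}

def train_neighbor_predictor(triples):
    """Step 2 (sketch): record each head entity's observed (relation, tail)
    neighbors. A real model would be trained with embeddings as labels."""
    neighbors = defaultdict(set)
    for h, r, t in triples:
        neighbors[h].add((r, t))
    return neighbors

def enhance_kg(triples, neighbors):
    """Step 2-3 (sketch): if two entities share a neighbor, propagate each
    other's remaining neighbors as candidate triples (latent information)."""
    enhanced = set(triples)
    ents = list(neighbors)
    for a in ents:
        for b in ents:
            if a != b and neighbors[a] & neighbors[b]:
                for r, t in neighbors[b] - neighbors[a]:
                    enhanced.add((a, r, t))
    return enhanced

def federated_average(client_ent_embs):
    """Step 4 (sketch): FedE-style aggregation - average the embeddings of
    entities that appear on multiple clients."""
    sums = defaultdict(lambda: [0.0] * DIM)
    counts = defaultdict(int)
    for embs in client_ent_embs:
        for e, v in embs.items():
            counts[e] += 1
            sums[e] = [s + x for s, x in zip(sums[e], v)]
    return {e: [x / counts[e] for x in vec] for e, vec in sums.items()}
```

For example, with a local KG `[("alice","knows","bob"), ("alice","knows","carol"), ("dave","knows","bob")]`, the heuristic notices that `alice` and `dave` share the neighbor `("knows","bob")` and adds the candidate triple `("dave","knows","carol")` before federated training; shared entities such as `alice` would then have their embeddings averaged across clients.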