Federated learning (FL) has gained considerable momentum and is widely used to train models in distributed scenarios. However, FL still faces major challenges: high communication cost, degraded performance under heterogeneous data and models, and the risk of privacy leakage. In this paper, we propose a communication-efficient, privacy-preserving personalized FL scheme. First, we develop a personalized FL method based on feature-fusion mutual learning, which achieves communication-efficient and personalized learning by training a shared model, a private model, and a fusion model reciprocally on each client. Specifically, only the shared model is exchanged with the global model, which reduces communication cost; the private model is personalized to the client; and the fusion model adaptively fuses local and global knowledge at different training stages. Second, to further reduce communication cost and protect the privacy of gradients, we design a privacy-preserving gradient compression method. In this method, we construct a chaotic encrypted cyclic measurement matrix that provides strong privacy protection together with lightweight compression. Moreover, we present a sparsity-based adaptive iterative hard thresholding algorithm to improve flexibility and reconstruction performance. Finally, extensive experiments on different datasets and models show that our scheme achieves more competitive results than other benchmarks in terms of model performance and privacy.
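
To make the client-side mutual-learning idea concrete, the following is a minimal PyTorch-style sketch rather than the paper's exact procedure: a shared, a private, and a fusion model are trained reciprocally on one mini-batch, each learning from the local labels and from the detached soft predictions of the other two. The model names, the loss weight `alpha`, the temperature, and the use of KL-based mutual distillation are illustrative assumptions; the paper's feature-fusion mechanism is not reproduced here.

```python
import torch.nn.functional as F

def mutual_learning_step(shared_model, private_model, fusion_model,
                         x, y, optimizers, alpha=0.5, temperature=1.0):
    """One client-side step: each model learns from the labels and from the
    detached soft predictions of the other two (deep-mutual-learning style)."""
    logits = {
        "shared": shared_model(x),    # only this model is exchanged with the server
        "private": private_model(x),  # stays on the client (personalized)
        "fusion": fusion_model(x),    # intended to blend local and global knowledge
    }
    losses = {}
    for name, out in logits.items():
        ce = F.cross_entropy(out, y)  # supervised loss on local labels
        kd = 0.0
        for other, other_out in logits.items():
            if other == name:
                continue
            # KL divergence to the other model's (detached) soft targets
            kd = kd + F.kl_div(
                F.log_softmax(out / temperature, dim=1),
                F.softmax(other_out.detach() / temperature, dim=1),
                reduction="batchmean",
            )
        losses[name] = ce + alpha * kd / 2.0
    # Cross-terms are detached, so one backward pass on the sum gives each
    # model gradients only from its own loss.
    for opt in optimizers.values():
        opt.zero_grad()
    sum(losses.values()).backward()
    for opt in optimizers.values():
        opt.step()
    return {k: float(v) for k, v in losses.items()}
```

Here `shared_model`, `private_model`, `fusion_model`, and the `optimizers` dict are hypothetical placeholders for whatever architectures and optimizers a client uses; only the shared model's parameters would be uploaded for aggregation.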
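
To illustrate the compression side, below is a rough NumPy sketch under our own assumptions, not the paper's construction: a logistic-map chaotic sequence keyed by a secret value seeds a partial circulant ("cyclic") measurement matrix, the gradient vector is compressed as y = Φg, and whoever holds the key regenerates Φ and recovers g with plain iterative hard thresholding (IHT). The paper's adaptive, sparsity-based thresholding is not reproduced; a fixed sparsity k is assumed.

```python
import numpy as np

def logistic_map_sequence(length, x0=0.7, mu=3.99, burn_in=1000):
    """Chaotic sequence from the logistic map; x0 acts as the secret key."""
    x = x0
    for _ in range(burn_in):                      # discard the transient
        x = mu * x * (1.0 - x)
    seq = np.empty(length)
    for i in range(length):
        x = mu * x * (1.0 - x)
        seq[i] = x
    return seq

def chaotic_circulant_matrix(m, n, key=0.7, row_seed=42):
    """Partial circulant matrix whose generating row is keyed chaotically."""
    row = np.sign(logistic_map_sequence(n, x0=key) - 0.5)   # +/-1 entries
    full = np.empty((n, n))
    for i in range(n):
        full[i] = np.roll(row, i)                 # cyclic shifts of the keyed row
    rows = np.random.RandomState(row_seed).choice(n, size=m, replace=False)
    return full[rows] / np.sqrt(m)                # keep m rows, normalize columns

def iht_reconstruct(y, phi, k, iters=200, step=None):
    """Basic IHT: gradient step on ||y - phi g||^2, then keep the k largest entries."""
    if step is None:
        step = 1.0 / np.linalg.norm(phi, 2) ** 2  # conservative, non-divergent step
    g = np.zeros(phi.shape[1])
    for _ in range(iters):
        g = g + step * phi.T @ (y - phi @ g)
        g[np.argsort(np.abs(g))[:-k]] = 0.0       # hard threshold to sparsity k
    return g

# Toy usage: compress a synthetic sparse "gradient" and reconstruct it.
n, m, k = 256, 96, 10
grad = np.zeros(n)
grad[np.random.choice(n, k, replace=False)] = np.random.randn(k)
phi = chaotic_circulant_matrix(m, n, key=0.7)
y = phi @ grad                                    # compressed (and keyed) upload
rec = iht_reconstruct(y, phi, k)
print("relative error:", np.linalg.norm(rec - grad) / np.linalg.norm(grad))
```

The key `x0`, the row-selection seed, and the fixed sparsity `k` are all illustrative; in practice the matrix parameters would be derived from a shared secret, and the sparsity level would be estimated adaptively as the paper proposes.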