Emotion recognition is important in brain-computer interface (BCI) applications, and building a recognition model that is robust across subjects and sessions is critical for emotion-based BCI systems. The electroencephalogram (EEG) is a widely used tool for recognizing different emotional states. However, EEG signals have small amplitudes, low signal-to-noise ratios, and nonstationary properties, resulting in large distribution differences across subjects. To address these problems, this article proposes a new emotion recognition method based on a multisource associate domain adaptation (DA) network that considers both domain-invariant and domain-specific features. First, separate branches were constructed for the multiple source domains, under the assumption that EEG data from different domains share the same low-level features. Second, domain-specific features were extracted through one-to-one associate DA between each source and the target. Then, a weight for each source was computed from the distribution distance between that source and the target, and the outputs of the multiple source-specific classifiers were combined according to these weights. Finally, EEG emotion recognition experiments were conducted on the SEED, DEAP, and SEED-IV datasets. In the cross-subject experiments, the average accuracy was 86.16% on SEED, 65.59% on DEAP, and 59.29% on SEED-IV. In the cross-session experiments, the accuracies on SEED and SEED-IV were 91.10% and 66.68%, respectively. The proposed method achieves better classification results than state-of-the-art DA methods.
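To make the multisource structure concrete, the following is a minimal PyTorch sketch, not the authors' implementation, of how a shared low-level extractor, per-source branches, and distribution-distance-based classifier weighting could fit together. The linear-MMD distance, all module and function names, and the feature dimensions are illustrative assumptions; the associate-DA alignment loss itself is omitted.

```python
# Hypothetical sketch of the multisource weighting scheme described above.
# Assumptions: a shared low-level MLP extractor, one branch (domain-specific
# extractor + classifier) per source domain, and a cheap linear-kernel MMD as
# the "distribution distance" used to weight the source classifiers.
import torch
import torch.nn as nn
import torch.nn.functional as F


def linear_mmd(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Squared distance between feature means (linear-kernel MMD estimate)."""
    return (x.mean(dim=0) - y.mean(dim=0)).pow(2).sum()


class MultiSourceNet(nn.Module):
    def __init__(self, in_dim: int, hid_dim: int, n_classes: int, n_sources: int):
        super().__init__()
        # Shared low-level features, assumed common to all EEG domains.
        self.shared = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        # One domain-specific extractor and classifier per source domain.
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Linear(hid_dim, hid_dim), nn.ReLU())
            for _ in range(n_sources)
        )
        self.classifiers = nn.ModuleList(
            nn.Linear(hid_dim, n_classes) for _ in range(n_sources)
        )

    def forward(self, target_x: torch.Tensor, source_xs: list[torch.Tensor]):
        t_shared = self.shared(target_x)
        logits, dists = [], []
        for branch, clf, s_x in zip(self.branches, self.classifiers, source_xs):
            t_feat = branch(t_shared)                 # target through branch k
            s_feat = branch(self.shared(s_x))         # source k through branch k
            logits.append(clf(t_feat))
            dists.append(linear_mmd(s_feat, t_feat))  # distribution distance
        # Smaller source-target distance -> larger weight for that classifier.
        weights = F.softmax(-torch.stack(dists), dim=0)
        probs = torch.stack([F.softmax(l, dim=1) for l in logits])  # (K, B, C)
        return (weights.view(-1, 1, 1) * probs).sum(dim=0)          # (B, C)


# Toy usage: 3 source subjects, 310-dim EEG features, 3 emotion classes.
net = MultiSourceNet(in_dim=310, hid_dim=64, n_classes=3, n_sources=3)
target = torch.randn(8, 310)
sources = [torch.randn(16, 310) for _ in range(3)]
print(net(target, sources).shape)  # torch.Size([8, 3])
```

Taking a softmax over the negative distances gives sources whose feature distributions lie closer to the target a larger vote, which mirrors the abstract's step of weighting the source classifiers by distribution distance.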