Background: Brain network features contain rich emotion-related information and can be highly effective for emotion recognition. However, emotions change continuously and dynamically, and current functional brain network features based on the sliding-window method cannot capture the dynamic characteristics of different emotions, which leads to serious loss of functional connectivity information.

New method: In this study, we proposed a new framework for emotion recognition based on EEG source signals and dynamic functional brain networks (dyFBN). We constructed emotion-related dyFBN with a dynamic phase linearity measurement (dyPLM) at every time point and extracted the second-order root mean square (RMS) feature from the dyFBN. In addition, a multi-feature fusion strategy was employed, integrating sensor-level frequency features with connectivity information.

Results: The subject-independent and subject-dependent recognition accuracies are 83.50% and 88.93%, respectively. The optimal subset selected from the fused features highlights the interplay between dynamic features and sensor features and identifies the right superiortemporal, left isthmuscingulate, and left parsorbitalis as crucial brain regions for emotion recognition.

Comparison with existing methods: Compared with current methods, the subject-independent and subject-dependent recognition accuracies improve by 11.46% and 10.19%, respectively. Moreover, fusing the RMS features with sensor features also yields higher accuracy than fusions built on the features of existing methods.

Conclusions: These findings demonstrate the validity of the proposed framework, which leads to better emotion recognition.
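
As a concrete illustration of the second-order RMS feature, the minimal sketch below assumes the dyFBN is stored as a T x N x N connectivity tensor (one dyPLM adjacency matrix per time point) and takes the RMS of each connection's time course; the function name, shapes, and random data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rms_edge_features(dyfbn: np.ndarray) -> np.ndarray:
    """Second-order RMS feature over a dynamic functional brain network.

    dyfbn: array of shape (T, N, N), a time-resolved connectivity tensor
    (e.g., one dyPLM matrix between N source regions at each of T time
    points). Returns an (N, N) matrix whose entries are the RMS of each
    connection's time course.
    """
    return np.sqrt(np.mean(dyfbn ** 2, axis=0))

# Illustrative usage: 200 time points, 68 cortical source regions.
rng = np.random.default_rng(0)
dyfbn = rng.random((200, 68, 68))
rms = rms_edge_features(dyfbn)           # (68, 68) edge-wise RMS
features = rms[np.triu_indices(68, 1)]   # vectorize the upper triangle
```

In this sketch, vectorizing the upper triangle of the symmetric RMS matrix gives one feature vector per trial, which could then be concatenated with sensor-level frequency features for the fusion step described above.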