In mobile edge computing environments, multiple edge servers are often deployed in task-dense areas, and their service coverage areas may overlap. Users within these overlapping areas must decide which server to offload their tasks to. Unreasonable offloading decisions, however, may lead to load imbalance, reducing the number of served users and degrading the latency and energy consumption of task offloading. Furthermore, the dynamic arrival of user tasks adds to the complexity of task offloading and resource allocation. It is therefore crucial to design an effective task offloading and resource allocation strategy for environments with multiple edge servers. In this paper, we propose a task offloading and resource allocation strategy that aims to meet task latency requirements while maximizing the number of served users and minimizing the average energy consumption of all completed tasks. To obtain timely information about user tasks and the status of edge servers, we adopt a central controller that manages multiple edge servers. We then model the problem as a parameterized action Markov decision process and solve it with the parameterized deep Q-network (P-DQN) algorithm, a deep reinforcement learning algorithm. Additionally, we conduct experiments to evaluate the performance of the proposed strategy against five benchmark strategies. The results demonstrate its superiority in terms of the number of served users and the average energy consumption per task while meeting task latency constraints.
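To illustrate the parameterized-action structure the abstract refers to, the following is a minimal sketch of how P-DQN-style action selection can couple a discrete choice (which edge server to offload to) with a continuous parameter (the fraction of resources allocated). All names, network sizes, and dimensions here are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch of parameterized-action selection in the spirit of P-DQN.
# Assumes M edge servers (discrete choice), each paired with one continuous
# resource-allocation fraction; state contents and sizes are hypothetical.
import torch
import torch.nn as nn

M = 4           # hypothetical number of edge servers
STATE_DIM = 16  # hypothetical state: task sizes, deadlines, server loads

class ParamActor(nn.Module):
    """Maps a state to one continuous parameter (resource fraction) per server."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, M), nn.Sigmoid(),  # fractions in (0, 1)
        )

    def forward(self, state):
        return self.net(state)

class QNet(nn.Module):
    """Scores each discrete server choice given the state and all parameters."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + M, 64), nn.ReLU(),
            nn.Linear(64, M),  # one Q-value per server
        )

    def forward(self, state, params):
        return self.net(torch.cat([state, params], dim=-1))

actor, qnet = ParamActor(), QNet()
state = torch.randn(1, STATE_DIM)        # placeholder observation
params = actor(state)                    # continuous parameters for all servers
server = qnet(state, params).argmax(-1)  # discrete choice: target server
allocation = params[0, server]           # parameter tied to that choice
print(f"offload to server {server.item()}, resource fraction {allocation.item():.2f}")
```

In P-DQN, the Q-network picks the discrete action while the actor supplies the continuous parameters for every discrete action, so both parts can be trained jointly; the sketch shows only the forward (action-selection) pass, not the training loop.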