The advancement of Industrial Internet of Things (IIoT) applications has increased the demand for efficient and energy-aware task scheduling in fog-cloud environments. This paper presents a novel multi-objective, energy-efficient task scheduling model based on reinforcement learning (MEETS-RL) for fog-cloud IIoT systems. The proposed method is organized into three layers: IIoT, fog, and cloud. Tasks are captured from various industries through industrial devices such as actuators, sensors, and control systems deployed near fog nodes. The task classification stage uses the ID3 algorithm to classify tasks by their priority, Quality of Service (QoS), and processing requirements. After classification, appropriate fog nodes are selected for task execution, considering factors such as energy consumption, processing capability, and proximity to IIoT devices. Tasks that cannot be handled by the fog nodes are offloaded to cloud data centers using the first fit (FF) algorithm, based on available resources, QoS requirements, and fog node location. Once suitable fog nodes or cloud data centers have been identified, the tasks are scheduled for execution using a reinforcement learning-based task scheduling algorithm. Experimental evaluations demonstrate that the proposed MEETS-RL model outperforms existing task scheduling models, including first come first served (FCFS), greedy for energy (GfE), shortest job first (SJF), round robin (RR), and laxity-based priority with ant colony system (LBP-ACS). The proposed architecture achieved an accuracy of 98.8%, a task completion rate of 95%, and an energy consumption of 0.4 J. It offers a comprehensive solution for modern IIoT systems that demand flexibility, scalability, and efficient resource management, while enhancing energy efficiency through the application of reinforcement learning to task scheduling.
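To make the energy-aware scheduling step concrete, the sketch below shows one common way such a reinforcement learning scheduler can be set up: tabular Q-learning with an epsilon-greedy policy, where the state is a task class, the action is the candidate execution node, and the reward is the negative energy consumed by the placement. This formulation, along with all node names, energy values, and hyperparameters, is an illustrative assumption and does not reproduce the exact MEETS-RL design.

```python
# Minimal, hypothetical sketch of an energy-aware RL scheduling loop.
# All names, costs, and hyperparameters are illustrative assumptions.
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate

q_table = defaultdict(float)             # Q[(state, action)] -> estimated value

def choose_node(state, candidate_nodes):
    """Epsilon-greedy selection of an execution node for the current task."""
    if random.random() < EPSILON:
        return random.choice(candidate_nodes)
    return max(candidate_nodes, key=lambda a: q_table[(state, a)])

def update(state, action, reward, next_state, candidate_nodes):
    """Standard Q-learning update after observing the energy cost of a placement."""
    best_next = max(q_table[(next_state, a)] for a in candidate_nodes)
    q_table[(state, action)] += ALPHA * (
        reward + GAMMA * best_next - q_table[(state, action)]
    )

# Toy usage: schedule a stream of high-priority tasks over two fog nodes and one cloud DC.
nodes = ["fog-1", "fog-2", "cloud-1"]
energy_cost = {"fog-1": 0.3, "fog-2": 0.5, "cloud-1": 1.2}   # joules per task (made up)

state = "high_priority"
for _ in range(1000):
    node = choose_node(state, nodes)
    reward = -energy_cost[node]          # lower energy consumption -> higher reward
    update(state, node, reward, state, nodes)

# Typically converges to the lowest-energy node for this task class.
print(max(nodes, key=lambda a: q_table[(state, a)]))
```

In a full pipeline of the kind the abstract describes, the candidate node set passed to the scheduler would come from the earlier classification, fog node selection, and first-fit offloading stages rather than being fixed as in this toy example.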