To meet the ever-increasing demands of Internet of Things (IoT) applications and to serve the maximum number of users with elastic services using minimal resources, researchers are applying cloud computing technology in many areas. The main principle of this emerging technology is to provide users and enterprises with the computation and storage resources they require, while charging customers only for what they use. Recently, a new application built on cloud computing, called cloud gaming, has been introduced. This powerful application aims to deliver a high-quality gaming experience over the Internet to gamers anywhere and at any time. The advantages of this combination for the gaming industry are lower costs for users, who can play on thin client devices, and easier development for developers compared to traditional gaming. Accordingly, we consider a key research challenge facing cloud computing systems: the impact of latency, generally called lag, on Quality of Experience (QoE). The main focus of this work is to help the research community understand the big picture of the cloud gaming field, starting from the very basics of this promising application and then moving through the major difficulties that must be addressed to provide users with good performance. The performance of our system is studied using a dedicated simulator written in Java, named CloudSim. The results compare the response and execution times of the same number of cloudlets in a cloud system under two different allocation policies: space-shared and time-shared. The cloudlet response and execution times under the time-shared policy are lower than under the space-shared policy. Also, increasing the number of cloudlets increases the throughput. In addition, it is shown that users closer to the data center receive the service faster than users farther away.
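In CloudSim, the space-shared versus time-shared comparison described above corresponds to the choice of cloudlet scheduler attached to each virtual machine. The minimal sketch below, written against the classic CloudSim 3.x API, runs the same batch of cloudlets once with CloudletSchedulerTimeShared and once with CloudletSchedulerSpaceShared and prints each cloudlet's start, finish, and execution time. The class name SchedulerComparison and all host, VM, and cloudlet parameter values are illustrative assumptions for this sketch, not the configuration used in this study.

import java.util.ArrayList;
import java.util.Calendar;
import java.util.LinkedList;
import java.util.List;

import org.cloudbus.cloudsim.Cloudlet;
import org.cloudbus.cloudsim.CloudletScheduler;
import org.cloudbus.cloudsim.CloudletSchedulerSpaceShared;
import org.cloudbus.cloudsim.CloudletSchedulerTimeShared;
import org.cloudbus.cloudsim.Datacenter;
import org.cloudbus.cloudsim.DatacenterBroker;
import org.cloudbus.cloudsim.DatacenterCharacteristics;
import org.cloudbus.cloudsim.Host;
import org.cloudbus.cloudsim.Pe;
import org.cloudbus.cloudsim.Storage;
import org.cloudbus.cloudsim.UtilizationModelFull;
import org.cloudbus.cloudsim.Vm;
import org.cloudbus.cloudsim.VmAllocationPolicySimple;
import org.cloudbus.cloudsim.VmSchedulerTimeShared;
import org.cloudbus.cloudsim.core.CloudSim;
import org.cloudbus.cloudsim.provisioners.BwProvisionerSimple;
import org.cloudbus.cloudsim.provisioners.PeProvisionerSimple;
import org.cloudbus.cloudsim.provisioners.RamProvisionerSimple;

/** Illustrative sketch: run an identical cloudlet workload under time-shared
 *  and space-shared cloudlet scheduling and compare the resulting times.
 *  Parameter values are placeholders, not the settings used in the paper. */
public class SchedulerComparison {

    public static void main(String[] args) throws Exception {
        runScenario(new CloudletSchedulerTimeShared(), "time-shared");
        runScenario(new CloudletSchedulerSpaceShared(), "space-shared");
    }

    static void runScenario(CloudletScheduler scheduler, String label) throws Exception {
        CloudSim.init(1, Calendar.getInstance(), false); // re-initialise the engine for each scenario
        createDatacenter("Datacenter_" + label);

        DatacenterBroker broker = new DatacenterBroker("Broker_" + label);
        int brokerId = broker.getId();

        // One single-core VM; the scheduler argument is the only difference between runs.
        Vm vm = new Vm(0, brokerId, 1000, 1, 512, 1000, 10000, "Xen", scheduler);
        List<Vm> vmList = new ArrayList<Vm>();
        vmList.add(vm);
        broker.submitVmList(vmList);

        // Identical batch of cloudlets submitted in both scenarios.
        List<Cloudlet> cloudlets = new ArrayList<Cloudlet>();
        for (int id = 0; id < 4; id++) {
            Cloudlet cl = new Cloudlet(id, 40000, 1, 300, 300,
                    new UtilizationModelFull(), new UtilizationModelFull(),
                    new UtilizationModelFull());
            cl.setUserId(brokerId);
            cl.setVmId(vm.getId());
            cloudlets.add(cl);
        }
        broker.submitCloudletList(cloudlets);

        CloudSim.startSimulation();
        CloudSim.stopSimulation();

        List<Cloudlet> finished = broker.getCloudletReceivedList();
        for (Cloudlet cl : finished) {
            System.out.printf("%s  cloudlet %d  start %.2f  finish %.2f  exec %.2f%n",
                    label, cl.getCloudletId(), cl.getExecStartTime(),
                    cl.getFinishTime(), cl.getActualCPUTime());
        }
    }

    /** One host with a single 1000-MIPS core, which is enough for this illustration. */
    static Datacenter createDatacenter(String name) throws Exception {
        List<Pe> peList = new ArrayList<Pe>();
        peList.add(new Pe(0, new PeProvisionerSimple(1000)));

        Host host = new Host(0, new RamProvisionerSimple(2048),
                new BwProvisionerSimple(10000), 1000000, peList,
                new VmSchedulerTimeShared(peList));
        List<Host> hostList = new ArrayList<Host>();
        hostList.add(host);

        DatacenterCharacteristics characteristics = new DatacenterCharacteristics(
                "x86", "Linux", "Xen", hostList, 10.0, 3.0, 0.05, 0.001, 0.0);

        return new Datacenter(name, characteristics,
                new VmAllocationPolicySimple(hostList), new LinkedList<Storage>(), 0);
    }
}

On a single-core VM, the space-shared scheduler executes the cloudlets one after another, while the time-shared scheduler interleaves them on the same core; this difference is what produces the differing response and execution times discussed above.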