Renting servers in the cloud is a generalization of the bin packing problem, motivated by job allocation to servers in cloud computing applications. Jobs arrive in an online manner and need to be assigned to servers; their durations and sizes are known at the time of arrival. There is an infinite supply of identical servers, each with one unit of computational capacity per unit of time. A server can be rented at any time and remains rented until all jobs assigned to it finish. The cost of an assignment is the sum of the durations of the rental periods of all servers. The goal is to assign jobs to servers so as to minimize the overall cost while satisfying the server capacity constraints. We focus on analyzing two natural algorithms, NextFit and FirstFit, for the case of jobs of equal duration. It is known that the competitive ratios of NextFit and FirstFit are at most 3 and 4, respectively, in this case. We prove a tight bound of 2 on the competitive ratio of NextFit. For FirstFit, we establish a lower bound of ≈2.519 on the competitive ratio, even when jobs have only two distinct arrival times, 0 and t. Using the weight function technique, we show that this bound is almost tight when there are only two arrival times: we obtain an upper bound of 2.565 on the asymptotic competitive ratio of FirstFit. In fact, we show an upper bound of [Formula presented] on the asymptotic competitive ratio for any t>0.559. For the case when jobs have arrival times 0 and 1 and duration 2, we show a lower bound of ≈1.89 and an upper bound of 2 on the strict competitive ratio of FirstFit. Finally, we show an upper bound of 3/2 on the competitive ratio for long-running uniform servers.

© 2024 The Author(s)
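To make the model concrete, the following is a minimal sketch (not from the paper) of the cost model and of the NextFit and FirstFit assignment rules described above, under the stated assumptions: each job has a known arrival time, duration, and size at most 1; a server has unit capacity at every moment; and a server is rented from the arrival of its first job until its last job finishes. The `Job`, `fits`, and `cost` helpers are illustrative names introduced here.

```python
from dataclasses import dataclass

@dataclass
class Job:
    arrival: float
    duration: float
    size: float  # fraction of a server's unit capacity, 0 < size <= 1

    @property
    def finish(self) -> float:
        return self.arrival + self.duration

def load_at(server_jobs, t):
    """Total size of jobs on one server that are running at time t."""
    return sum(j.size for j in server_jobs if j.arrival <= t < j.finish)

def fits(server_jobs, job):
    """Capacity check at every event point inside the new job's interval.
    Since jobs are processed in arrival order, checking at job.arrival and
    at later arrivals already on the server covers all load maxima."""
    points = {job.arrival} | {
        j.arrival for j in server_jobs if job.arrival <= j.arrival < job.finish
    }
    return all(load_at(server_jobs, t) + job.size <= 1 for t in points)

def cost(servers):
    """Rental cost: each server is rented from its first job's arrival
    until its last job's finish; total cost is the sum over servers."""
    return sum(max(j.finish for j in s) - min(j.arrival for j in s)
               for s in servers)

def next_fit(jobs):
    """NextFit keeps a single open server; a job that does not fit there
    closes it (to new jobs) and opens a fresh server."""
    servers = []
    for job in sorted(jobs, key=lambda j: j.arrival):
        if servers and fits(servers[-1], job):
            servers[-1].append(job)
        else:
            servers.append([job])
    return servers

def first_fit(jobs):
    """FirstFit scans all servers in the order they were opened and places
    the job on the first one with enough residual capacity."""
    servers = []
    for job in sorted(jobs, key=lambda j: j.arrival):
        for s in servers:
            if fits(s, job):
                s.append(job)
                break
        else:
            servers.append([job])
    return servers
```

For example, with three equal-duration jobs `Job(0, 1, 0.6)`, `Job(0, 1, 0.6)`, `Job(0, 1, 0.3)`, FirstFit packs the third job onto the first server (load 0.9), while NextFit can no longer use that server and packs it onto the second; both use two servers here, but on longer sequences the two rules diverge, which is what the competitive analysis above quantifies.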