Simulations performed for the GUSBAD catalog of gamma-ray bursts suggest that the apparent duration of a burst decreases as its amplitude is decreased. We see no evidence for this effect in the BATSE catalog. We show that for a burst at the detection limit, the typical signal-to-noise ratio at the edges of the T-90 duration interval is around 1.5, suggesting that T-90 must be quite uncertain. The situation for T-50 is less severe. Simulations using the exact procedure employed to derive the durations listed in the BATSE catalog would be useful in quantifying the effect.
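The sensitivity of T-90 to noise near the detection limit can be illustrated with a toy Monte Carlo. The sketch below is not the BATSE or GUSBAD procedure; it assumes a simple boxcar burst on a constant, perfectly known background, with all parameters (bin width, background level, burst amplitudes) chosen purely for illustration. T-90 is taken as the interval containing the central 90% (5% to 95%) of the background-subtracted cumulative counts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not BATSE values): a boxcar burst on a
# constant background, binned at 1.024 s.
n_bins = 400
dt = 1.024                       # bin width, s
background = 1000.0              # background counts per bin
true_start, true_end = 150, 250  # burst occupies bins 150..249 (~102 s)

def measure_t90(amplitude, n_trials=500):
    """Estimate T90 from noisy light curves: the interval containing
    the central 90% (5%-95%) of the background-subtracted cumulative
    counts. Returns (mean, standard deviation) over the trials."""
    t90s = []
    for _ in range(n_trials):
        rate = np.full(n_bins, background)
        rate[true_start:true_end] += amplitude
        counts = rng.poisson(rate).astype(float)
        net = counts - background        # background assumed known exactly
        cum = np.cumsum(net)
        total = cum[-1]
        if total <= 0:
            continue                     # burst lost in the noise; skip
        # First bins where the (noisy, non-monotonic) cumulative net
        # counts cross the 5% and 95% levels.
        i5 = np.argmax(cum >= 0.05 * total)
        i95 = np.argmax(cum >= 0.95 * total)
        t90s.append((i95 - i5) * dt)
    return float(np.mean(t90s)), float(np.std(t90s))

for amp in (200.0, 50.0, 10.0):   # strong burst -> near the detection limit
    mean_t90, std_t90 = measure_t90(amp)
    print(f"amplitude {amp:6.1f}: T90 = {mean_t90:6.1f} +/- {std_t90:5.1f} s")
```

In this toy setup the scatter in the recovered T-90 grows sharply as the amplitude drops toward the noise level, since the 5% and 95% crossings are then set largely by background fluctuations. Whether the resulting bias pushes the apparent duration up or down depends on the details of the duration algorithm, which is why the abstract calls for simulations using the exact BATSE procedure.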