The use of reverse transcription (RT) followed by polymerase chain reaction (PCR) to detect HIV-1 RNA in serum was first described in 1988 [1]. Plasma-associated HIV-1 RNA (viral load) was first introduced as a surrogate marker in clinical practice in the mid-1990s, with multiple reports describing plasma viral load quantification and the relationship of copy number to stage of HIV disease and response to antiretroviral therapy. Concurrently, three different methods to quantify viral load were being developed, leading to the first United States Food and Drug Administration (FDA)-approved RT-PCR-based assay for determining prognosis and monitoring antiretroviral therapy in 1997 and to the approval of an ultrasensitive version in 1999. More recently, in November 2001, the FDA approved a nucleic acid sequence-based amplification (NASBA) test for quantifying HIV-1 in human plasma.

The current use of highly active antiretroviral therapy (HAART) has resulted in a dramatic reduction in viral replication (viral load), with concomitant reductions in AIDS-defining conditions and death. The degree of viral load reduction has made it possible to distinguish differences in potency between antiretroviral agents and regimens and in the durability of treatment responses. Quantification of HIV RNA in plasma is an important surrogate marker for assessing the risk of disease progression and monitoring antiretroviral therapy in routine HIV clinical practice, and it is now a required surrogate marker in antiretroviral clinical efficacy studies. Guidelines for the intended use and interpretation of viral load results have recently been published (see below) [2,3]. This article reviews current methodologies, factors that affect performance and interpretation, and the intended use of HIV-1 viral load testing.