The determination of soil water content by time domain reflectometry (TDR) involves two steps: the measurement of the propagation velocity of an electromagnetic pulse along a transmission line and the conversion of this measurement to an estimate of soil water content. The objective of this study was to identify and quantify errors associated with propagation velocity measurements. We developed new TDR techniques, involving the use of remotely switched diodes and differential waveform detection, that can be used to quantify and minimize propagation velocity errors. These errors are presented in terms of time interval errors. We show that the dominant time interval error term relates to the transition time of reflected pulses and that absolute time interval errors cannot be assumed to be <200 ps. We identify the presence of dissolved ions and the use of long cables as major sources of additional transition time errors. For transition times >2 ns, the time interval error caused by transition time effects can be estimated as the root mean square sum of the basic 200-ps error and an error equal to 10% of the transition time. We show that although instrument time base errors are the main source of precision errors, they are less important in determining the absolute accuracy of propagation velocity measurements. We show that system costs can be reduced without compromising accuracy by using 75-Ω coaxial cable, which is not only inexpensive but also superior to standard 50-Ω cable.
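As an illustration of the error estimate quoted above, the short Python sketch below combines the basic 200-ps error and a term equal to 10% of the transition time as a root mean square sum, for transition times >2 ns. The function name, parameter names, and the example value are ours, chosen for illustration only; they do not appear in the paper.

```python
import math

def transition_time_error_ps(transition_time_ps, base_error_ps=200.0,
                             fractional_error=0.10):
    """Estimate the time interval error (ps) due to transition time effects.

    Illustrative sketch of the rule stated in the abstract: for transition
    times >2 ns, the error is the root mean square sum of a basic 200-ps
    term and a term equal to 10% of the transition time.
    """
    if transition_time_ps <= 2000.0:
        raise ValueError("Rule stated only for transition times > 2 ns")
    return math.hypot(base_error_ps, fractional_error * transition_time_ps)

# Hypothetical example: a 5-ns transition time gives
# sqrt(200^2 + 500^2) ~= 539 ps of time interval error.
print(round(transition_time_error_ps(5000.0)))  # -> 539
```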