Raw surface and ground waters used for drinking purposes can vary markedly in their chemical and biological composition. Inorganic content (such as salts, bicarbonate, clay and metal ions), organic content (natural organic matter and anthropogenic compounds, including pollutants) and micro-organisms present in raw water are key drivers for the treatment processes that provide safe and aesthetically acceptable drinking water. Conventional treatment at large-scale water treatment plants (WTPs) involves the use of inorganic coagulants to remove turbidity and colour and, more recently, to maximise removal of organic compounds. The rationale for the latter is that minimising the concentration of organics in treated water lowers the levels of disinfection by-products formed after chlorination and reduces the substrates available for microbial growth in the water distribution system. The removal of organic matter achievable with inorganic coagulants depends on the character and concentration of the organics and on the turbidity and alkalinity of the raw water; it is also influenced by the type of coagulant used, its dose rate and the pH at which coagulation occurs. To date, few attempts have been made to model the relationships between raw water quality parameters and the doses of coagulants and pH control reagents required for removal of organics, colour and turbidity. In this paper, mathematical models are described that relate raw water quality parameters (ultraviolet light-absorbing compounds, coloured compounds, turbidity and the pH buffering capacity of the raw water) to dose rates of the coagulants alum and ferric chloride and of pH control reagents. Also described are models that relate the concentration and character of organics in raw water to targeted percentage removals of organics. The aim of these models is to provide water treatment operators with a tool for predicting chemical reagent doses and treatment conditions that achieve selected removal of organics, based on raw water quality data.
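To illustrate the kind of tool the abstract describes, the sketch below shows a hypothetical empirical dose model of the general form such work often takes: a simple function mapping raw water quality parameters (UV absorbance, colour, turbidity) to a coagulant dose. The functional form, coefficient values and parameter names here are invented for illustration only; they are not the fitted models reported in the paper.

```python
def predict_alum_dose(uv254, colour, turbidity,
                      a=60.0, b=0.2, c=0.1, d=5.0):
    """Hypothetical linear dose model (illustrative only).

    uv254     : UV absorbance at 254 nm (1/cm), a surrogate for organics
    colour    : true colour (Pt-Co units)
    turbidity : turbidity (NTU)
    a, b, c, d: assumed regression coefficients, NOT from the paper

    Returns an alum dose in mg/L.
    """
    return a * uv254 + b * colour + c * turbidity + d


# Example: a moderately coloured, moderately turbid raw water.
dose = predict_alum_dose(uv254=0.15, colour=30.0, turbidity=8.0)
print(f"Predicted alum dose: {dose:.1f} mg/L")
```

In practice, a plant operator would supply routinely measured raw water quality data to a calibrated model of this kind, rather than the assumed coefficients used here.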