Many Americans studied medicine in Britain during the 18th century, but the major influx to continental Europe began after 1815, once the French Revolution's reforms of health care and medical teaching had reached their zenith. In France (and later in Germany), Americans received rigorous training in medicine, surgery, pathology, and clinical science, and they brought these skills back to the US. Their training had taken place in countries with government-run, relatively egalitarian health care systems. On their return, however, they did not seek to transplant such a system to the US; instead they introduced European medical science and techniques, along with elements of the European system of medical education.