Ultrasound (US) is an indispensable imaging modality for clinical diagnoses such as fetal and cardiac assessment. Many US applications for image-guided procedures have also been proposed and attempted, because US is minimally invasive, low in cost, and highly portable. However, to obtain US images, a US imaging probe must be held manually and kept in contact with the patient's body. To address this issue, we have proposed a robotic system for automatic probe scanning. The system consists of a probe-scanning robot, navigation software, an optical tracking device, and a US imaging device. The robot, which has six degrees of freedom, is composed of a frame mechanism and a probe-holding mechanism. The frame mechanism uses six pneumatic actuators to reduce its weight, and the probe-holding mechanism has one DC motor; the probe-holding mechanism is connected to the pneumatic actuators by wires. The robot controls the position and orientation of the B-scan plane based on the transformation between an optical tracker attached to the US probe and the B-scan plane. The navigation system, which is connected to the tracking device and to the US imaging device via a VGA cable, computes the relative position between a tracked therapeutic tool and the B-scan plane and sends it to the robot, so that the position of the B-scan plane can be controlled to follow the tool. The navigation system also displays the B-scan plane, textured with the actual echogram, together with a three-dimensional tool model, so that the relative position of the tool and the plane can be monitored. To validate the basic performance of the system, phantom tests were conducted using a phantom made of gelatin and poly(ethylene glycol). In the tests, a needle was inserted into the phantom while the B-scan plane was controlled in real time to contain the tracked needle. The needle remained continuously visualized throughout insertion, confirming that the system has great potential for automatic US image-guided procedures.

(C) 2013 The Authors. Published by Elsevier B.V.
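
The plane-control principle described above reduces to a chain of rigid-body transforms: the tracker reports the pose of the marker on the probe, a fixed calibration relates that marker to the B-scan plane, and the tool tip is then expressed in the B-scan frame so its out-of-plane offset can be driven to zero. The sketch below is not from the paper; it assumes a NumPy representation of 4x4 homogeneous transforms, and all function and variable names are hypothetical.

```python
# Minimal sketch of the transform chain for keeping a tracked tool in the
# B-scan plane. All names here are illustrative, not from the original system.
import numpy as np

def invert_transform(T):
    """Invert a 4x4 rigid-body transform [R t; 0 1]."""
    R, t = T[:3, :3], T[:3, 3]
    Tinv = np.eye(4)
    Tinv[:3, :3] = R.T
    Tinv[:3, 3] = -R.T @ t
    return Tinv

def bscan_pose(T_world_marker, T_marker_bscan):
    """Pose of the B-scan plane in the tracker (world) frame.

    T_world_marker: marker pose on the probe, reported by the optical tracker.
    T_marker_bscan: fixed marker-to-B-scan calibration (assumed known).
    """
    return T_world_marker @ T_marker_bscan

def tool_in_bscan_frame(p_tool_world, T_world_bscan):
    """Express the tracked tool tip in the B-scan plane frame.

    With the plane spanned by the local x/y axes, the returned z component
    is the out-of-plane offset the robot would drive toward zero to keep
    the needle visible in the echogram.
    """
    p = np.append(p_tool_world, 1.0)           # homogeneous coordinates
    p_b = invert_transform(T_world_bscan) @ p
    return p_b[:3]                             # (x_in_plane, y_in_plane, z_offset)
```

Under these assumptions, the navigation software would evaluate `tool_in_bscan_frame` on each tracking update and send the resulting offset to the robot as the control error.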