Inertial measurement units (IMUs), such as those embedded in many smartwatches, enable tracking the orientation of the body joint to which the wearable is attached. Tracking the user's entire body would thus require many wearables, which is difficult to achieve in daily life. This paper presents a calibration-free method to match the data stream from a body-worn IMU to the body joint estimates recognized and located by a camera in the environment. This allows networked cameras to associate detected body joints of nearby humans with the orientation stream from a nearby wearable; that orientation stream can then be transformed and complemented with the user's full-body pose estimates. Results show that in multi-user environments where users try to perform actions synchronously, our calibration-free method obtains a perfect match among 42 joint candidates within 83 to 155 samples (2.8 to 5.2 seconds), depending on the scenario. This facilitates a seamless combination of wearable- and vision-based tracking for robust, full-body user tracking.