Robust tracking of persons in real-world scenarios using a statistical computer vision approach

Cited by: 4
Authors
Rigoll, G [1 ]
Breit, H [1 ]
Wallhoff, F [1 ]
Affiliation
[1] Tech Univ Munich, Inst Human Machine Commun, D-80290 Munich, Germany
Keywords
person tracking; hidden Markov models; Kalman filter; statistical object modeling; background adaptation
DOI
10.1016/j.imavis.2003.09.014
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
We present a novel approach to robust and flexible person tracking based on an algorithm that combines two powerful stochastic modeling techniques. The first is the Pseudo-2D Hidden Markov Model (P2DHMM), used to capture the shape of a person within an image frame; the second is the well-known Kalman filter, which takes the P2DHMM output and tracks the person by estimating a bounding-box trajectory that indicates the person's location throughout the video sequence. The two algorithms cooperate in a feedback loop, and this cooperation makes person tracking possible even in the presence of background motion, caused for instance by moving objects such as cars or by camera operations such as panning or zooming. We consider this a major advantage over most other tracking algorithms, which typically cannot cope with background motion. Furthermore, the tracked person is not required to wear special equipment (e.g. sensors) or special clothing. We also show how the approach can be extended to include on-line background adaptation. Our results are confirmed by several tracking examples in real scenarios, shown at the end of the article and provided on the web server of our institute. (C) 2003 Elsevier B.V. All rights reserved.
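To illustrate the second stage of the pipeline described above, the sketch below shows a minimal constant-velocity Kalman filter smoothing a bounding-box centre trajectory. This is only an assumption-laden stand-in for the paper's filter: the motion model, the noise covariances, and the `measured_centers` input (standing in for per-frame P2DHMM centroids) are illustrative choices, not the authors' actual parameters, and the cooperative P2DHMM/Kalman feedback described in the abstract is omitted.

```python
import numpy as np

# Minimal sketch of the Kalman-filtering stage: smoothing a
# bounding-box centre trajectory under a constant-velocity model.
# All matrices below are illustrative assumptions, not paper values.

dt = 1.0  # assume one time unit per video frame
F = np.array([[1, 0, dt, 0],   # state transition for [cx, cy, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],    # we observe only the centre (cx, cy)
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)           # process noise covariance (assumed)
R = 4.0 * np.eye(2)            # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle; z is the measured box centre,
    e.g. the centroid of the person segmentation."""
    x_pred = F @ x                        # predict state
    P_pred = F @ P @ F.T + Q              # predict covariance
    y = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y                # corrected state
    P_new = (np.eye(4) - K @ H) @ P_pred  # corrected covariance
    return x_new, P_new

# Hypothetical per-frame centroids standing in for the P2DHMM output.
measured_centers = [np.array([120.0, 80.0]),
                    np.array([123.0, 81.0]),
                    np.array([127.0, 83.0])]

x = np.array([measured_centers[0][0], measured_centers[0][1], 0.0, 0.0])
P = 100.0 * np.eye(4)          # large initial uncertainty
for z in measured_centers:
    x, P = kalman_step(x, P, z)
    print("smoothed centre:", x[:2])
```

In the paper's actual scheme, the filter's prediction would in turn constrain where the P2DHMM searches in the next frame; that coupling is what lets the combination tolerate background motion, and it is not reproduced in this sketch.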
Pages: 571-582
Page count: 12
Related papers
50 items in total
  • [21] Automated grasp planning and execution for real-world objects using computer vision and tactile probing
    Bender, PA
    Bone, GM
INTERNATIONAL JOURNAL OF ROBOTICS & AUTOMATION, 2004, 19 (01): 15-27
  • [22] Normative Agents for Real-world Scenarios
    Beheshti, Rahmatollah
AAMAS'14: PROCEEDINGS OF THE 2014 INTERNATIONAL CONFERENCE ON AUTONOMOUS AGENTS & MULTIAGENT SYSTEMS, 2014: 1749-1750
  • [23] Formalizing Real-world Threat Scenarios
    Tavolato, Paul
    Luh, Robert
    Eresheim, Sebastian
PROCEEDINGS OF THE 8TH INTERNATIONAL CONFERENCE ON INFORMATION SYSTEMS SECURITY AND PRIVACY (ICISSP), 2021: 281-289
  • [24] Capability and accuracy of usual statistical analyses in a real-world setting using a federated approach
    Jegou, Romain
    Bachot, Camille
    Monteil, Charles
    Boernert, Eric
    Chmiel, Jacek
    Boucher, Mathieu
    Pau, David
PLOS ONE, 2024, 19 (11)
  • [25] Performance of Real-world Functional Tasks Using an Updated Oral Electronic Vision Device in Persons Blinded by Trauma
    Grant, Patricia
    Maeng, Meesa
    Arango, Tiffany
    Hogle, Rich
    Szlyk, Janet
    Seiple, William
OPTOMETRY AND VISION SCIENCE, 2018, 95 (09): 766-773
  • [26] Enhancing explainability in real-world scenarios: Towards a robust stability measure for local interpretability
    Sepulveda, Eduardo
    Vandervorst, Felix
    Baesens, Bart
    Verdonck, Tim
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 274
  • [27] A real-world look at synthetic vision
    Wilson, JR
AEROSPACE AMERICA, 2000, 38 (10): 24-30
  • [28] A real world oriented interface technique using computer vision
    Ohashi, T
    Yoshida, T
    Ejima, T
DESIGN OF COMPUTING SYSTEMS: SOCIAL AND ERGONOMIC CONSIDERATIONS, 1997, 21: 949-952
  • [30] A Statistical Roadmap for Journey from Real-World Data to Real-World Evidence
    Fang, Yixin
    Wang, Hongwei
    He, Weili
THERAPEUTIC INNOVATION & REGULATORY SCIENCE, 2020, 54 (04): 749-757