KINECT-BASED UNIVERSAL RANGE SENSOR FOR LABORATORY EXPERIMENTS

Cited: 0
Authors
Zhang, Mingshao [1 ]
Zhang, Zhou [1 ]
Aziz, El-Sayed [1 ]
Esche, Sven K. [1 ]
Chassapis, Constantin [1 ]
Affiliation
[1] Stevens Inst Technol, Hoboken, NJ 07030 USA
Keywords
Microsoft Kinect; Object recognition; Motion tracking; Educational laboratory; SCENE FLOW; OBJECT RECOGNITION; SHAPE; SYSTEM;
DOI
not available
CLC Number
G40 [Education]
Subject Classification Code
040101; 120403
Abstract
The Microsoft Kinect is part of a wave of new sensing technologies. Its RGB-D camera provides high-quality, synchronized streams of both color and depth data. Compared with traditional 3-D tracking techniques that compute depth from the images of two separate RGB cameras, the Kinect produces more robust and reliable results for object recognition and motion tracking. Its low cost also opens up applications in many areas where traditional, more expensive 3-D scanners are impractical. To use the Kinect as a range sensor, algorithms must be designed to first recognize the objects of interest and then track their motions. Although a large number of algorithms for both 2-D and 3-D object detection have been published, reliable and efficient algorithms for 3-D object motion tracking are rare, especially ones that use the Kinect as a range sensor. In this paper, algorithms for object recognition and tracking that exploit both RGB and depth data in different scenarios are introduced. Subsequently, efficient methods for scene segmentation, including background and noise filtering, are discussed. Building on these two kinds of methods, a prototype system that works efficiently and stably in various applications related to educational laboratories is presented.
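The background- and noise-filtering step described in the abstract can be sketched as a simple depth-difference segmentation against a pre-captured background depth map. This is an illustrative sketch only, not the authors' implementation: the function name, the tolerance values, and the synthetic frames below are assumptions, and the valid-depth window merely approximates the Kinect's reliable sensing range (roughly 0.4 m to 4 m, with zero marking an invalid reading).

```python
import numpy as np

def segment_foreground(depth, background, tol_mm=30, min_depth=400, max_depth=4000):
    """Return a boolean mask of foreground pixels in a Kinect-style depth frame.

    A pixel is foreground when (a) its raw depth (in millimeters) lies inside
    the sensor's reliable range, and (b) it is closer to the camera than the
    pre-captured static background by more than tol_mm. Invalid (zero) and
    out-of-range readings are treated as noise and excluded.
    """
    valid = (depth >= min_depth) & (depth <= max_depth)
    # Cast to signed integers before subtracting so uint16 cannot wrap around.
    moved = (background.astype(np.int32) - depth.astype(np.int32)) > tol_mm
    return valid & moved

# Synthetic 8x8 "frames": a flat background wall at 2000 mm,
# with a 3x3 object placed 1200 mm from the camera.
background = np.full((8, 8), 2000, dtype=np.uint16)
frame = background.copy()
frame[2:5, 3:6] = 1200          # object in front of the wall
frame[0, 0] = 0                 # invalid (no-return) pixel, must be ignored

mask = segment_foreground(frame, background)
print(mask.sum())               # -> 9 foreground pixels (the 3x3 object)
```

In a real pipeline the background map would be averaged over several empty-scene frames and the mask cleaned with a median or morphological filter before being passed on to object recognition.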
Pages: 9