Real-time activity recognition for energy efficiency in buildings

Citations: 63
Authors
Ahmadi-Karvigh, Simin [1 ]
Ghahramani, Ali [1 ]
Becerik-Gerber, Burcin [1 ]
Soibelman, Lucio [1 ]
Affiliations
[1] Univ Southern Calif, Sonny Astani Dept Civil & Environm Engn, KAP 217,3620 South Vermont Ave, Los Angeles, CA 90089 USA
Funding
National Science Foundation (NSF);
Keywords
Building energy efficiency; Building automation; Activity recognition; Appliance control; Waste detection; INTERLAYER ADHESION; CONSUMPTION; SYSTEM; USAGE; INFORMATION; INTENSITY; STRENGTH; SIZE;
DOI
10.1016/j.apenergy.2017.11.055
Chinese Library Classification (CLC)
TE [Petroleum and natural gas industry]; TK [Energy and power engineering];
Discipline Code
0807 ; 0820 ;
Abstract
More than half of the electricity in residential and commercial buildings is consumed by lighting systems and appliances. Consumption by these service systems is directly associated with occupant activities. By recognizing activities and identifying the associated potential energy savings, more effective strategies can be developed to design better buildings and automation systems. With this motivation, using inductive and deductive reasoning, we introduce a framework to detect, in real time, occupant activities, potential wasted energy consumption, and peak-hour usage that could be shifted to non-peak hours. Our framework consists of three sub-algorithms for action detection, activity recognition, and waste estimation. As real-time input, the action detection algorithm receives data from the sensing system, consisting of plug meters and sensors, and detects actions that have occurred (e.g., turning on an appliance) via our unsupervised clustering models. Detected actions are then used by the activity recognition algorithm to recognize activities (e.g., preparing food) through semantic reasoning over our constructed ontology. Based on the recognized activities, the waste estimation algorithm identifies potential waste and estimates potential savings. To evaluate the performance of our framework, an experimental study was carried out for two weeks in an office with five occupants and in two single-occupancy apartments. Following the experiment, the performance of the action detection and activity recognition algorithms was evaluated against ground-truth labels for actions and activities. Average accuracy was 97.6% for action detection using a Gaussian Mixture Model with Principal Component Analysis and 96.7% for activity recognition. In addition, 35.5% of the consumption of an appliance or lighting system, on average, was identified as potential savings.
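The abstract's action-detection step (unsupervised clustering of plug-meter data with PCA and a Gaussian Mixture Model) can be sketched as follows. This is a minimal illustration of the general idea, not the authors' implementation: the use of scikit-learn, the simulated power trace, the two-feature representation, and all variable names are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulated plug-meter readings (watts): idle noise, an appliance
# drawing ~60 W, then idle again.
power = np.concatenate([
    rng.normal(2, 0.5, 100),    # idle
    rng.normal(60, 2.0, 100),   # appliance on
    rng.normal(2, 0.5, 100),    # idle again
])

# Feature per time step: [current reading, change from previous step].
delta = np.diff(power, prepend=power[0])
features = np.column_stack([power, delta])

# Reduce dimensionality, then fit an unsupervised mixture model.
reduced = PCA(n_components=2).fit_transform(features)
gmm = GaussianMixture(n_components=2, random_state=0).fit(reduced)
labels = gmm.predict(reduced)

# Time steps where the cluster label changes mark candidate actions
# (here, the appliance turning on and off).
events = np.nonzero(np.diff(labels))[0]
print("detected action indices:", events)
```

On this toy trace the two clusters separate the idle and active power regimes, so the label transitions fall at the simulated on/off events; real deployments would need more components and features per appliance.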
Pages: 146-160
Page count: 15
Related Papers
50 records
  • [41] Enhancing Energy Efficiency in Resource Allocation for Real-Time Cloud Services
    Bagheri, Zahra
    Zamanifar, Kamran
    [J]. 2014 7th International Symposium on Telecommunications (IST), 2014, : 701 - 706
  • [42] ActiRecognizer: Design and implementation of a real-time human activity recognition system
    Cao, Liang
    Wang, Yufeng
    Jin, Qun
    Ma, Jianhua
    [J]. 2017 INTERNATIONAL CONFERENCE ON CYBER-ENABLED DISTRIBUTED COMPUTING AND KNOWLEDGE DISCOVERY (CYBERC), 2017, : 266 - 271
  • [43] A hierarchical approach to real-time activity recognition in body sensor networks
    Wang, Liang
    Gu, Tao
    Tao, Xianping
    Lu, Jian
    [J]. PERVASIVE AND MOBILE COMPUTING, 2012, 8 (01) : 115 - 130
  • [44] On the Development of a Real-Time Multi-sensor Activity Recognition System
    Banos, Oresti
    Damas, Miguel
    Guillen, Alberto
    Herrera, Luis-Javier
    Pomares, Hector
    Rojas, Ignacio
    Villalonga, Claudia
    Lee, Sungyoung
    [J]. AMBIENT ASSISTED LIVING: ICT-BASED SOLUTIONS IN REAL LIFE SITUATIONS, 2015, 9455 : 176 - 182
  • [45] System-level energy-efficiency for real-time tasks
    Yang, Chuan-Yue
    Chen, Jian-Jia
    Hung, Chia-Mei
    Kuo, Tei-Wei
    [J]. 10TH IEEE INTERNATIONAL SYMPOSIUM ON OBJECT AND COMPONENT-ORIENTED REAL-TIME DISTRIBUTED COMPUTING, PROCEEDINGS, 2007, : 266 - +
  • [46] MobiRAR: Real-Time Human Activity Recognition Using Mobile Devices
    Cuong Pham
    [J]. 2015 SEVENTH INTERNATIONAL CONFERENCE ON KNOWLEDGE AND SYSTEMS ENGINEERING (KSE), 2015, : 144 - 149
  • [47] Gaze-Based Real-Time Activity Recognition for Proactive Interfaces
    Cig, Cagla
    Sezgin, Tevfik Metin
    [J]. 2015 23RD SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2015, : 694 - 697
  • [48] Real-Time Human Activity Recognition on Embedded Equipment: A Comparative Study
    Najeh, Houda
    Lohr, Christophe
    Leduc, Benoit
    [J]. APPLIED SCIENCES-BASEL, 2024, 14 (06):
  • [49] Deep Learning Models for Real-time Human Activity Recognition with Smartphones
    Shaohua Wan
    Lianyong Qi
    Xiaolong Xu
    Chao Tong
    Zonghua Gu
    [J]. Mobile Networks and Applications, 2020, 25 : 743 - 755
  • [50] Real-time Activity Recognition on Smartphones Using Deep Neural Networks
    Zhang, Licheng
    Wu, Xihong
    Luo, Dingsheng
    [J]. IEEE 12TH INT CONF UBIQUITOUS INTELLIGENCE & COMP/IEEE 12TH INT CONF ADV & TRUSTED COMP/IEEE 15TH INT CONF SCALABLE COMP & COMMUN/IEEE INT CONF CLOUD & BIG DATA COMP/IEEE INT CONF INTERNET PEOPLE AND ASSOCIATED SYMPOSIA/WORKSHOPS, 2015, : 1236 - 1242