Paper Reading AI Learner

Transfer Learning in Deep Learning Models for Building Load Forecasting: Case of Limited Data

2023-01-25 16:05:47
Menna Nawar, Moustafa Shomer, Samy Faddel, Huangjie Gong

Abstract

Precise load forecasting in buildings could increase the bill-savings potential and facilitate optimized strategies for power generation planning. With the rapid evolution of computer science, data-driven techniques, in particular deep learning models, have become a promising solution for the load forecasting problem. These models have shown accurate forecasting results; however, they require an abundant amount of historical data to maintain their performance. For new buildings and buildings with low-resolution measuring equipment, it is difficult to obtain enough historical data, leading to poor forecasting performance. In order to adapt deep learning models to buildings with limited and scarce data, this paper proposes a Building-to-Building Transfer Learning framework to overcome this problem and enhance the performance of deep learning models. The transfer learning approach was applied to the Transformer model, a recent architecture chosen for its efficacy in capturing data trends. The performance of the algorithm was tested on a large commercial building with limited data. The results showed that the proposed approach improved forecasting accuracy by 56.8% compared to conventional deep learning trained from scratch. The paper also compared the proposed Transformer model to other sequential deep learning models such as Long Short-Term Memory (LSTM) and Recurrent Neural Network (RNN). The Transformer model outperformed the other models, reducing the root mean square error to 0.009, compared to 0.011 for LSTM and 0.051 for RNN.
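The building-to-building transfer idea in the abstract — pretrain on a data-rich source building, then fine-tune on a data-scarce target building instead of training from scratch — can be illustrated with a minimal sketch. This is not the paper's method: it substitutes a toy linear autoregressive forecaster for the Transformer, and all data, variable names, and hyperparameters below are hypothetical stand-ins chosen only to show the warm-start-vs.-scratch comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: both buildings share the same underlying load
# dynamics (w_true); the forecaster is a linear model over lagged loads.
n_lags = 8
w_true = rng.normal(size=n_lags)

def make_data(n, noise):
    X = rng.normal(size=(n, n_lags))          # lagged-load features
    y = X @ w_true + noise * rng.normal(size=n)
    return X, y

Xs, ys = make_data(2000, 0.01)                # source: abundant history
Xt, yt = make_data(4, 0.01)                   # target: only 4 samples
Xtest, ytest = make_data(500, 0.01)           # held-out target data

# 1) Pretrain on the data-rich source building (least squares).
w_src = np.linalg.lstsq(Xs, ys, rcond=None)[0]

# 2) Transfer: fine-tune with a few gradient steps from the source weights.
w_ft = w_src.copy()
for _ in range(50):
    grad = Xt.T @ (Xt @ w_ft - yt) / len(yt)
    w_ft -= 0.1 * grad

# Baseline: train from scratch on the scarce target data only
# (underdetermined, so lstsq returns the minimum-norm fit).
w_scratch = np.linalg.lstsq(Xt, yt, rcond=None)[0]

def rmse(w):
    return float(np.sqrt(np.mean((Xtest @ w - ytest) ** 2)))

transfer_rmse, scratch_rmse = rmse(w_ft), rmse(w_scratch)
print(f"transfer RMSE: {transfer_rmse:.4f}  scratch RMSE: {scratch_rmse:.4f}")
```

With the shared dynamics assumed above, the warm-started model stays close to the true weights while the scratch model overfits its four noisy samples — the same qualitative gap the paper reports between transfer learning and training from scratch under limited data.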


URL

https://arxiv.org/abs/2301.10663

PDF

https://arxiv.org/pdf/2301.10663.pdf

