
Evaluation of Summer Zone Temperature and Load Forecasting Performance of Transformer Architecture according to Training Dataset Size Change


  • Choi, Wonjun (School of Architecture, Chonnam National University)
  • Received : 2022.11.08
  • Accepted : 2023.01.07
  • Published : 2023.02.28

Abstract

The data that fully reflect the dynamics of a building can only be collected after the building is completed. Therefore, data for training machine learning models are insufficient during the operation stage of a building. In addition, the dynamics of buildings and energy systems frequently change because of aging, commissioning, component replacement, and retrofitting, so deep learning models must be retrained to reflect the changed system dynamics. Performance benchmarks of deep learning architectures should therefore be designed in consideration of these specificities of the building-energy field. This study benchmarks the time-series forecasting performance of three deep learning architectures: the multilayer perceptron (MLP) and long short-term memory (LSTM), which are widely used, and the transformer, which was developed relatively recently but has high potential. For reproducible benchmarks, a publicly accessible data generator and the open-source Python library DeepTimeSeries were developed. The dependence of forecasting performance on the training dataset size was evaluated by varying the size from 0.3 to 0.9 years. The forecasting targets were the zone air temperatures and thermal loads. Among the three architectures, the transformer performed best. In particular, when the training dataset was small, the transformer forecasted peaks and dips better than the other architectures, which showed unstable performance in that regime. The results suggest that the transformer has high potential for time-series forecasting in the building-energy field, where the amount of available data is limited in most cases.
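The evaluation protocol described in the abstract can be illustrated with a minimal sketch. The snippet below is not the DeepTimeSeries API or the authors' benchmark code; the synthetic hourly series, the 24-hour lookback window, and the scikit-learn MLPRegressor stand-in model are placeholder assumptions used only to show how the training window can be swept from 0.3 to 0.9 years against a fixed test period.

```python
# Minimal sketch of the training-size sweep (not the DeepTimeSeries API).
# All names and numbers below are illustrative placeholders.
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.neural_network import MLPRegressor

HOURS_PER_YEAR = 8760
rng = np.random.default_rng(0)

# Synthetic hourly series standing in for a zone air temperature signal.
t = np.arange(2 * HOURS_PER_YEAR)
series = 22.0 + 5.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0.0, 0.5, t.size)

def make_windows(y, lookback=24, horizon=1):
    """Convert a 1-D series into (lookback -> horizon-step-ahead) supervised pairs."""
    X, Y = [], []
    for i in range(len(y) - lookback - horizon + 1):
        X.append(y[i:i + lookback])
        Y.append(y[i + lookback + horizon - 1])
    return np.asarray(X), np.asarray(Y)

X, Y = make_windows(series)

# Fixed evaluation period taken from the end of the record (e.g., a summer stretch).
test_len = int(0.1 * HOURS_PER_YEAR)
X_test, Y_test = X[-test_len:], Y[-test_len:]

# Sweep the training dataset size from 0.3 to 0.9 years, as in the study.
for years in (0.3, 0.5, 0.7, 0.9):
    n_train = int(years * HOURS_PER_YEAR)
    X_train, Y_train = X[:n_train], Y[:n_train]

    # MLPRegressor is only a stand-in for the MLP/LSTM/transformer models compared in the paper.
    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0)
    model.fit(X_train, Y_train)

    rmse = float(np.sqrt(mean_squared_error(Y_test, model.predict(X_test))))
    print(f"training size = {years:.1f} years -> test RMSE = {rmse:.3f}")
```

Keeping the test window fixed while only the training window grows is what makes the error values comparable across training dataset sizes.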

Acknowledgement

This research was supported by a Chonnam National University research grant (Project No. 2022-2664), and by the National Research Foundation of Korea (No. RS-2022-00165723) and the National IT Industry Promotion Agency (No. ITAS03182201100200010 00100100) with funding from the Ministry of Science and ICT in 2022, for which the author expresses gratitude.
