Comparing Load-Forecasts of Residential Heatpumps with Transformer and XGB on Field Data



Publisher

Graz : Technische Universität Graz

Abstract

Accurate device-level forecasting of residential heat pump power consumption is a key enabler for advanced prosumer energy management, yet forecasting performance is often limited by user-driven variability and incomplete measurement data. This work examines day-ahead forecasting of individual air-source heat pumps at 15-minute resolution using a large field dataset from 308 devices, combining lagged load, weather, and calendar features. A globally trained XGBoost (XGB) model, locally trained XGB and Transformer models, and benchmark methods (linear regression and 24 h persistence) are comparatively evaluated with respect to R², normalized RMSE (nRMSE), and a peak error metric. Results reveal pronounced performance heterogeneity across devices, with the global XGB model achieving R² > 0.8 and nRMSE < 0.05 for series with regular daily peaks, but negative R² and large peak errors for highly irregular time series. Transformers do not consistently outperform XGB and tend to overfit to noisy data despite considerable model capacity and training effort. Feature ablation experiments identify lagged load values as the dominant predictors and indicate that temperature and periodic features alone yield poor forecasts. Generalization analyses show that models trained on time series with consistent patterns transfer reasonably well to unseen, regular time series, whereas models calibrated on irregular time series generalize poorly and are sometimes inferior to persistence, highlighting the central role of the intrinsic load structure in forecastability. The findings underscore that, in the studied setting, robust, computationally efficient tree-based ensembles remain competitive with deep learning methods.
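The baselines and metrics named in the abstract can be illustrated with a short sketch. At 15-minute resolution, 24 h persistence simply repeats the load from 96 steps earlier; R² and nRMSE then score that forecast. This is a minimal illustration, not the paper's code: the normalization of nRMSE (here by the mean load) and the synthetic two-day load series are assumptions.

```python
import numpy as np

def persistence_24h(load, steps_per_day=96):
    """24 h persistence baseline at 15-min resolution:
    predict each value by the value one day (96 steps) earlier.
    Returns (prediction, ground truth) for the second day onward."""
    return load[:-steps_per_day], load[steps_per_day:]

def r2_score(y_true, y_pred):
    # Coefficient of determination; negative when the forecast is
    # worse than predicting the mean, as reported for irregular series.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def nrmse(y_true, y_pred):
    # RMSE normalized by the mean load -- the normalization choice is
    # an assumption; the paper may normalize by range or peak instead.
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / y_true.mean()

# Toy example: two synthetic days with a repeating daily peak plus noise,
# the kind of "regular" series on which persistence already does well.
rng = np.random.default_rng(0)
day = 1.0 + np.sin(np.linspace(0, 2 * np.pi, 96)) ** 2
load = np.tile(day, 2) + 0.05 * rng.standard_normal(192)
y_pred, y_true = persistence_24h(load)
print(f"R2 = {r2_score(y_true, y_pred):.3f}, nRMSE = {nrmse(y_true, y_pred):.3f}")
```

On such a regular series persistence scores a high R² and low nRMSE, which is why the abstract treats it as a meaningful benchmark that irregular-series models can fail to beat.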


License

CC BY 4.0