Residential Photovoltaic Generation Forecast via Long Short-Term Memory and Transformer
Abstract
Accurate short-term photovoltaic (PV) power forecasting is increasingly important for maximizing on-site self-consumption and ensuring reliable grid integration as decentralized PV deployment grows. This paper presents a systematic comparison of Long Short-Term Memory (LSTM) and Transformer-based architectures for deterministic short-term PV power forecasting using exclusively publicly available data from multiple climatic regions. The dataset combines multi-year 15-minute PV power measurements from nine plants with corresponding meteorological variables and is processed through a unified pipeline comprising outlier treatment, interpolation, feature scaling, and correlation-based feature selection. Several feature subsets and input window lengths are evaluated, and Bayesian hyperparameter optimization is employed to refine the configurations of both architectures. The results indicate that using all meteorological variables except cloud coverage, together with a 2-day input window, yields the best performance. Under this configuration, the Transformer outperforms the LSTM, achieving average test errors of MSE = 0.0038 and MAE = 0.0265, compared to 0.0055 and 0.0343 for the LSTM. An analysis of time-resolved residuals shows that both models exhibit their largest errors around noon, while the Transformer provides a consistently narrower error distribution over the diurnal cycle. These findings highlight the advantages of attention-based sequence modeling for PV applications and offer practical guidance on feature design, input horizon selection, and hyperparameter ranges for future data-driven PV forecasting studies.
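The preprocessing steps named in the abstract — interpolation of gaps, feature scaling, and correlation-based feature selection — could be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the min-max scaling choice, and the correlation threshold are all assumptions.

```python
# Illustrative sketch of the preprocessing pipeline described in the
# abstract: linear interpolation of missing values, min-max scaling to
# [0, 1], and Pearson-correlation-based feature selection. All names and
# the 0.3 threshold are hypothetical, not taken from the paper.
from math import sqrt

def interpolate_gaps(series):
    """Linearly interpolate runs of None between known neighbours."""
    out = list(series)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while j < len(out) and out[j] is None:
                j += 1
            lo = out[i - 1] if i > 0 else out[j]
            hi = out[j] if j < len(out) else lo
            for k in range(i, j):
                frac = (k - i + 1) / (j - i + 1)
                out[k] = lo + (hi - lo) * frac
            i = j
        else:
            i += 1
    return out

def min_max_scale(series):
    """Scale a series to the [0, 1] range."""
    lo, hi = min(series), max(series)
    span = hi - lo or 1.0
    return [(x - lo) / span for x in series]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sqrt(sum((a - mx) ** 2 for a in x))
    vy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (vx * vy) if vx and vy else 0.0

def select_features(features, target, threshold=0.3):
    """Keep features whose |Pearson r| with the target exceeds threshold."""
    return [name for name, vals in features.items()
            if abs(pearson(vals, target)) > threshold]
```

In practice a library such as pandas or scikit-learn would handle these steps; the point here is only to make the pipeline's logic concrete.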
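The 2-day input window at 15-minute resolution implies 2 × 96 = 192 input steps per sample. A hedged sketch of how such supervised (input, target) pairs might be built from a series — the one-step forecast horizon here is an assumption for illustration:

```python
# Building sliding-window (input, target) pairs from 15-minute data with a
# 2-day look-back, as used in the best configuration reported in the
# abstract. The one-step prediction horizon is an illustrative assumption.
STEPS_PER_DAY = 24 * 4          # 15-minute resolution -> 96 samples per day
WINDOW = 2 * STEPS_PER_DAY      # 2-day input window -> 192 steps

def make_windows(series, window=WINDOW, horizon=1):
    """Slide a fixed-length window over the series; each window predicts
    the value `horizon` steps after its end."""
    pairs = []
    for start in range(len(series) - window - horizon + 1):
        x = series[start:start + window]
        y = series[start + window + horizon - 1]
        pairs.append((x, y))
    return pairs
```

Both the LSTM and the Transformer would then consume the same 192-step windows, which is what makes the input-horizon comparison in the paper well posed.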
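The time-resolved residual analysis can likewise be made concrete: grouping absolute errors by 15-minute time-of-day slot exposes where over the diurnal cycle the models err most (the abstract reports the peak around noon). This sketch assumes the test series starts at midnight; the function name is hypothetical.

```python
# Hedged sketch of a time-resolved residual analysis: mean absolute error
# per 15-minute time-of-day slot. Assumes the series begins at midnight.
SLOTS_PER_DAY = 24 * 4  # 96 fifteen-minute slots per day

def diurnal_mae(y_true, y_pred, slots=SLOTS_PER_DAY):
    """Return the mean absolute error for each time-of-day slot."""
    sums = [0.0] * slots
    counts = [0] * slots
    for i, (t, p) in enumerate(zip(y_true, y_pred)):
        slot = i % slots
        sums[slot] += abs(t - p)
        counts[slot] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]
```

Plotting the returned list against time of day would reproduce the kind of diurnal error profile the paper uses to compare the two architectures.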
