Combining methods to better analyse solar panel degradation
Solar panels have to withstand the elements, from precipitation in its various forms to wind and sunshine, throughout the year. It is well known in the industry that this weathering degrades their rated power output over their lifetime. Though the effect is small (less than 1% per year), in an industry where every little counts, the exact figures used in models are hotly debated within and between markets. In the UK an annual degradation rate of 0.4% is usually assumed, although some jurisdictions recognise lower rates.
Part of the challenge in testing this assumption is detecting such a small decrease amid the noise of other factors, such as irradiance, temperature and general measurement uncertainty. With so many assets now operating for extended periods, Lightsource bp’s Strategy and Innovation teams have analysed this wealth of solar plant data to try to find an accurate number for this phenomenon.
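One common way to strip out irradiance and temperature effects before looking for a trend is to convert measured energy into a temperature-corrected performance ratio. The sketch below is purely illustrative and is not Lightsource bp’s actual pipeline: the plant capacity, irradiance figures and the temperature coefficient `gamma` are all invented for the example.

```python
# Illustrative sketch only: weather-normalising measured energy into a
# performance ratio (PR) so a small degradation trend becomes visible amid
# irradiance and temperature variation. All numbers are invented.

def performance_ratio(measured_kwh, poa_irradiance_kwh_m2, capacity_kw,
                      cell_temp_c, gamma=-0.0035, t_ref=25.0):
    """Temperature-corrected performance ratio.

    Expected energy = capacity x plane-of-array irradiance (relative to the
    1 kW/m2 reference condition), adjusted by the module's power temperature
    coefficient gamma (assumed -0.35 %/degC here).
    """
    expected_kwh = capacity_kw * poa_irradiance_kwh_m2 * (
        1 + gamma * (cell_temp_c - t_ref))
    return measured_kwh / expected_kwh

# A hot summer day and a cold winter day with very different raw yields
# produce comparable PRs once normalised:
pr_summer = performance_ratio(4464.0, 6.0, 1000.0, 45.0)  # ~0.80
pr_winter = performance_ratio(1684.0, 2.0, 1000.0, 10.0)  # ~0.80
```

Once the series is expressed as a performance ratio, the residual year-to-year drift is a much cleaner proxy for degradation than raw energy output.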
Several methodologies exist; at Lightsource bp we have used the recognised NREL method, implemented in RdTools, alongside the methodology of Meyers et al. (2020) to generate the degradation results below. Results vary across portfolios, but both the means and the medians of the two methods sit below the rates assumed in current industry performance models.
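The NREL approach in RdTools is built around a year-on-year comparison: each normalised daily value is compared with the same day one year later, so seasonal effects largely cancel, and the median of the implied annual changes gives a robust degradation rate. The snippet below is a simplified NumPy illustration of that idea on simulated data, not the production RdTools code; the noise level and the -0.4 %/yr "true" rate are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(3650)          # ten years of daily performance-ratio values
true_rate = -0.004              # -0.4 %/yr, the UK industry assumption
pr = 0.80 * (1 + true_rate) ** (days / 365) \
     + rng.normal(0.0, 0.02, days.size)  # measurement noise

# Year-on-year method: compare each day with the same calendar day one year
# later, then take the median of the implied annual changes. The median is
# robust to outliers such as outages or panel replacements.
yoy = pr[365:] / pr[:-365] - 1
rd_estimate = np.median(yoy)    # recovers a rate close to -0.4 %/yr
```

With ten years of daily data the median pins the rate down to within a fraction of a percentage point even under heavy noise, which is why sites with long operating histories are so valuable for this kind of study.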
What can be seen here is the challenge of accurately calculating this degradation: some sites even show a positive uplift. This is thought to result from changes in O&M practices, soiling, the relative effects of shading, and panel and inverter replacements, along with the general uncertainty of energy metering.
Even with these uncertainties, solar panel degradation is lower than the assumed 0.4% per year; if it were not, we would be seeing more ‘signal’ through the ‘noise’. This study includes sites that have been operating for ten years and more, and if the UK industry assumption of 0.4% per year were valid, the data, even with its uncertainties, would demonstrate much larger degradation.