Explainable Artificial Intelligence and Trust in the Energy Sector (2020 - )
Today, because of the complexity of the underlying machine learning models, AI often appears as a “black box”: the internal learning and optimization processes are not completely comprehensible. To tackle this trade-off between performance and transparency, methods of “Explainable Artificial Intelligence” (XAI) have been developed to increase the transparency of the underlying models without decreasing their performance. In this context, the PhD project aims to apply XAI methods to machine learning models in two use cases in the field of energy, especially photovoltaics, to increase the models’ explainability and hence their trustworthiness.
The first use case is linked to the existing HEIBRiDS project “Optimization of solar energy yield…”. The control strategies in this project incorporate predictive and prescriptive data analytics based on machine learning approaches, which, however, represent black boxes. The XAI research proposed here makes it possible both to improve the trust in and acceptability of AI-based solutions and to generate findings that sustainably improve product design and system configurations. The second use case relates to the “Combinatorial materials discovery” pursued in the HZB research groups Unold/Schorr and Abdi/van de Krol, which focuses on the exploration of light-absorbing semiconductors and catalysts for solar energy conversion devices by combinatorial high-throughput methods. These methods generate high-dimensional datasets that have to be searched and analyzed for structure-property-function relationships and from which guidance for further experiments is sought. For this purpose, the project aims to use machine learning approaches to automate the time-consuming analysis of these multidimensional datasets. The introduction of machine learning in combination with XAI methods is hence expected to accelerate development processes, to provide new physical understanding, and eventually to support the efficient development of materials for solar energy conversion devices.
The results of the PhD project will be synthesized to contribute knowledge regarding a) the validity and reliability of XAI methods in the field of photovoltaics, b) new opportunities for data pre-processing and data analytics based on XAI outcomes, c) consequent effects on the development of (and trust towards) energy systems, and d) the findings’ transferability to other fields in the energy sector.
Peer-reviewed Publications (journal or conference)
- C. Utama, C. Meske, J. Schneider, and C. Ulbrich (2022). Reactive power control in photovoltaic systems through (explainable) artificial intelligence. Applied Energy, 328. https://doi.org/10.1016/j.apenergy.2022.120004
- C. Utama, B. Karg, C. Meske, and S. Lucia (2022). Explainable artificial intelligence for deep learning-based model predictive controllers. In Proceedings of the 26th International Conference on System Theory, Control and Computing (ICSTCC), 464-471. https://doi.org/10.1109/ICSTCC55426.2022.9931794
- C. Utama, C. Meske, J. Schneider, R. Schlatmann, C. Ulbrich (2023). Explainable artificial intelligence for photovoltaic fault detection: A comparison of instruments. Solar Energy, 249, 139–151. https://doi.org/10.1016/j.solener.2022.11.018
Other (presentations at conferences or preprints)
- C. Utama, C. Meske, J. Schneider, and C. Ulbrich. Reactive power control in photovoltaic systems through (explainable) artificial intelligence. (Poster presentation), 8th World Conference on Photovoltaic Energy Conversion (WCPEC-8), Milan, Italy, 26-30 September 2022.
- C. Utama, B. Karg, C. Meske, and S. Lucia. Explainable artificial intelligence for deep learning-based model predictive controllers. (Oral presentation), 26th International Conference on System Theory, Control and Computing (ICSTCC), Online, 19-21 October 2022.