Optimal Control of Renewable Energy Communities subject to Network Peak Fees with Model Predictive Control and Reinforcement Learning Algorithms

29 Jan 2024 · Samy Aittahar, Adrien Bolland, Guillaume Derval, Damien Ernst

We propose in this paper an optimal control framework for renewable energy communities (RECs) equipped with controllable assets. Such RECs allow their members to exchange production surplus through an internal market. The objective is to control these assets so as to minimise the sum of the individual electricity bills. These bills account for the electricity exchanged within the REC and with the retailers. For large companies in particular, another important component of the bills is the cost related to power peaks; in our framework, these peaks are determined from the energy exchanges with the retailers. We compare rule-based control strategies with the following two control algorithms: the first is derived from model predictive control techniques, and the second is built with reinforcement learning techniques. We also compare variants of these algorithms that neglect the peak power costs. Results confirm that policies accounting for the power peaks lead to a significantly lower sum of electricity bills, and thus better control strategies, at the cost of higher computation time. Furthermore, policies trained with reinforcement learning approaches appear promising for real-time control of the communities, where model predictive control policies may be computationally expensive in practice. These findings encourage pursuing the efforts toward the development of scalable control algorithms, operating from a centralised standpoint, for renewable energy communities equipped with controllable assets.
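To make the billing structure described in the abstract concrete, the sketch below illustrates one possible form of a member's bill: an energy term covering retailer and internal REC exchanges, plus a network peak fee charged on the maximum power drawn from the retailer. This is a minimal illustration, not the paper's actual cost model; all function names, prices, and the quarter-hourly resolution are assumptions made for the example.

```python
import numpy as np

def community_bill(import_kwh, export_kwh, internal_kwh,
                   import_price=0.25, export_price=0.05,
                   internal_price=0.15, peak_fee_per_kw=30.0,
                   step_hours=0.25):
    """Illustrative electricity bill for one REC member over a billing period.

    import_kwh / export_kwh: energy exchanged with the retailer per time step.
    internal_kwh: energy bought on the internal REC market per time step.
    The peak term charges the maximum power drawn from the retailer,
    mirroring the capacity-based peak fees described in the abstract.
    (All prices and names here are hypothetical.)
    """
    energy_cost = (np.sum(import_kwh) * import_price
                   - np.sum(export_kwh) * export_price
                   + np.sum(internal_kwh) * internal_price)
    # Peak power (kW) inferred from the largest per-step retailer import.
    peak_kw = np.max(import_kwh) / step_hours
    return energy_cost + peak_fee_per_kw * peak_kw

# Example: 96 quarter-hourly steps (one day) of synthetic data.
rng = np.random.default_rng(0)
imports = rng.uniform(0.0, 2.0, 96)   # kWh bought from the retailer
exports = rng.uniform(0.0, 0.5, 96)   # kWh sold back to the retailer
internal = rng.uniform(0.0, 1.0, 96)  # kWh bought inside the REC
print(f"Bill: {community_bill(imports, exports, internal):.2f} EUR")
```

Under such a cost structure, a controller that ignores the peak term can schedule assets in a way that minimises energy cost yet draws a large instantaneous power from the retailer, which is precisely the behaviour the peak-aware MPC and reinforcement learning policies compared in the paper are meant to avoid.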
