2023 IEEE Belgrade PowerTech


Optimal Energy Scheduling of Flexible Industrial Prosumers Via Reinforcement Learning

This research introduces an energy management system (EMS) that aims to minimize electricity operating costs using reinforcement learning (RL) with linear function approximation. The proposed EMS uses a Q-learning with tile coding (QLTC) algorithm and is compared to a deterministic mixed-integer linear programming (MILP) formulation with perfect forecast information. The comparison is performed in a case study of an industrial manufacturing company in the Netherlands, using measured electricity consumption, PV generation, and wholesale electricity prices during one week of operation. The results show that the proposed EMS is able to adjust the prosumer's power consumption in response to favourable prices. The electricity costs obtained with the QLTC algorithm are within 99% of those obtained with the MILP approach. Furthermore, the results demonstrate that the QLTC approach can generalize previously learned control policies even in the case of missing data and can deploy actions within 80% of the MILP's optimal solution.
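For readers unfamiliar with the method, the following minimal Python sketch illustrates Q-learning with tile coding, i.e. a linear Q-function over binary tile features. It is not the authors' implementation; the state bounds, tiling resolution, action set, and hyperparameters are assumptions chosen purely for illustration.

import numpy as np

# Illustrative sketch only (not the paper's implementation): Q-learning with
# tile coding, i.e. a linear Q-function over binary tile features.
# State bounds, numbers of tilings/bins, and hyperparameters are assumptions.

class TileCoder:
    """Maps a continuous state to the indices of its active tiles, one per tiling."""
    def __init__(self, n_tilings=8, n_bins=10, low=(0.0, 0.0), high=(1.0, 1.0)):
        self.n_tilings = n_tilings
        self.n_bins = n_bins
        self.low = np.asarray(low, dtype=float)
        self.high = np.asarray(high, dtype=float)
        self.dim = len(self.low)
        self.n_features = n_tilings * n_bins ** self.dim

    def features(self, state):
        scaled = (np.asarray(state, dtype=float) - self.low) / (self.high - self.low)
        active = []
        for t in range(self.n_tilings):
            offset = t / (self.n_tilings * self.n_bins)      # shift each tiling slightly
            idx = np.clip(((scaled + offset) * self.n_bins).astype(int), 0, self.n_bins - 1)
            flat = np.ravel_multi_index(tuple(idx), (self.n_bins,) * self.dim)
            active.append(t * self.n_bins ** self.dim + int(flat))
        return active


class QLTCAgent:
    """Q-learning agent with a linear Q-function over tile-coded features."""
    def __init__(self, coder, n_actions, alpha=0.1, gamma=0.99, epsilon=0.1):
        self.coder = coder
        self.n_actions = n_actions
        self.alpha = alpha / coder.n_tilings                 # step size per active tile
        self.gamma = gamma
        self.epsilon = epsilon
        self.w = np.zeros((n_actions, coder.n_features))     # weights of the linear approximator

    def q(self, state, action):
        return self.w[action, self.coder.features(state)].sum()

    def act(self, state):
        if np.random.rand() < self.epsilon:                  # epsilon-greedy exploration
            return np.random.randint(self.n_actions)
        return int(np.argmax([self.q(state, a) for a in range(self.n_actions)]))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning target; the reward could be the negative electricity cost.
        target = reward + self.gamma * max(self.q(next_state, a) for a in range(self.n_actions))
        td_error = target - self.q(state, action)
        for f in self.coder.features(state):
            self.w[action, f] += self.alpha * td_error

In the EMS setting described above, the state would encode quantities such as the time of day, wholesale price, PV generation, and consumption, the actions would be discrete power set-points, and the reward would reflect the electricity cost; those details are not reproduced here.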

Nick van den Bovenkamp
Delft University of Technology
Netherlands

Juan S Giraldo
University of Twente
Netherlands

Edgar Mauricio Salazar Duque
Eindhoven University of Technology
Netherlands

Pedro P Vergara
Delft University of Technology
Netherlands

Charalambos Konstantinou
King Abdullah University of Science and Technology
Saudi Arabia

Peter Palensky
Delft University of Technology
Netherlands
