Please use this identifier to cite or link to this item:
https://hdl.handle.net/11499/56535
Title: Autonomous Micro-Grids: A Reinforcement Learning-Based Energy Management Model in Smart Cities
Authors: Özkan, E.; Kök, I.; Özdemir, S.
Keywords: Artificial intelligence; Deep reinforcement learning (DRL); Energy management system (EMS); Micro-grid; Curve fitting; Deep learning; Electric loads; Electric power transmission; Electric power utilization; Energy management; Energy management systems; Environmental impact; Learning systems; Power quality; Real time systems; Renewable energy resources; Smart power grids; Deep reinforcement learning; Electricity-consumption; Energy management system; Energy source; Government; IS; Management Model; Microgrid; Reinforcement learnings; Renewable energies; Renewable energy source; Reinforcement learning
Publisher: Institute of Electrical and Electronics Engineers Inc.
Abstract: The growing electricity consumption of communities has raised concerns about the environmental impact of traditional energy sources. To mitigate these concerns, governments are promoting the use of renewable energy sources. However, the intermittent nature of renewable energy poses significant challenges for grid stability. Micro-grids have emerged as promising solutions to address these challenges. A micro-grid is an independent electric system that can generate, distribute, and manage electricity within a small area. It offers many advantages, such as peak load reduction, minimized load variability, and enhanced power quality. Energy management systems (EMS) within micro-grids play a significant role in overcoming operational challenges. They are designed to control micro-grid systems with the goal of flattening, smoothing, and reducing the electrical demand curve. This helps to reduce the operational costs of electricity generation, transmission, and distribution. Reinforcement Learning (RL) has been an important research area for EMS. By leveraging historical and real-time data, RL enables effective control of EMS within micro-grids. However, despite the advancements in this area, much of this research is challenging to reproduce. In this work, we use SAC and PPO RL agents in a micro-grid architecture. We make use of the CityLearn framework to test our agents. We compare our agents with a Rule-Based Controller (RBC). Our test results show that our solution improves micro-grid performance by effectively smoothing electricity consumption. © 2023 IEEE.
Description: 2023 International Symposium on Networks, Computers and Communications, ISNCC 2023 -- 23 October 2023 through 26 October 2023 -- 194993
URI: https://doi.org/10.1109/ISNCC58260.2023.10323891
https://hdl.handle.net/11499/56535
ISBN: 9798350335590
Appears in Collections: Mühendislik Fakültesi Koleksiyonu; Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
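To illustrate the agent setup described in the abstract above, the following is a minimal sketch (not the authors' code) of training an SAC agent on a CityLearn micro-grid environment with stable-baselines3; PPO can be substituted the same way. The schema name, wrapper imports, environment arguments, and hyperparameters are illustrative assumptions and may differ across CityLearn versions.

```python
# Hypothetical sketch: SAC control of a CityLearn micro-grid environment.
# Assumes the `citylearn` and `stable-baselines3` packages; import paths,
# the dataset name, and keyword arguments follow recent CityLearn
# documentation and may need adjusting for the version actually used.
from citylearn.citylearn import CityLearnEnv
from citylearn.wrappers import NormalizedObservationWrapper, StableBaselines3Wrapper
from stable_baselines3 import SAC

# Build the environment from a bundled schema (placeholder dataset name;
# swap in the schema matching the micro-grid under study).
env = CityLearnEnv('citylearn_challenge_2022_phase_1', central_agent=True)
env = NormalizedObservationWrapper(env)  # scale observations for the neural policy
env = StableBaselines3Wrapper(env)       # expose a single-agent Gym-style interface

# Soft Actor-Critic agent; replace SAC with stable_baselines3.PPO for the PPO run.
model = SAC('MlpPolicy', env, learning_rate=3e-4, verbose=1)
model.learn(total_timesteps=100_000)

model.save('sac_microgrid_ems')  # keep the trained policy for later evaluation
```

After training, the policy would be rolled out over the simulated period and its consumption profile compared against the rule-based controller baseline, for example using the evaluation metrics (ramping, peak demand, net electricity consumption) that the CityLearn framework reports for a completed episode.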
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.