Dynamic Pricing Based on Demand Response Using Actor–Critic Agent Reinforcement Learning



Ismail A., BAYSAL M.

Energies, vol.16, no.14, 2023 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 16 Issue: 14
  • Publication Date: 2023
  • DOI: 10.3390/en16145469
  • Journal Name: Energies
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Aerospace Database, Agricultural & Environmental Science Database, CAB Abstracts, Communication Abstracts, Compendex, INSPEC, Metadex, Veterinary Science Database, Directory of Open Access Journals, Civil Engineering Abstracts
  • Keywords: actor–critic agent, Australian Energy Market Operator (AEMO), demand response, dynamic pricing, long short-term memory (LSTM), pricing prediction, real-time pricing (RTP), reinforcement learning
  • Yıldız Technical University Affiliated: Yes

Abstract

Eco-friendly technologies for sustainable energy development require the efficient utilization of energy resources. Real-time pricing (RTP), also known as dynamic pricing, offers advantages over other pricing systems by enabling demand response (DR) actions. However, existing methods for determining and controlling DR are limited in managing increasing demand and predicting future pricing. This paper presents a novel approach that addresses these limitations: leveraging actor–critic agent reinforcement learning (RL), a dynamic pricing DR model is proposed for efficient energy management. The model’s learning framework was trained on DR and real-time pricing data from the Australian Energy Market Operator (AEMO) spanning 17 years. The efficacy of the RL-based dynamic pricing approach was evaluated in two prediction cases: actual versus predicted demand and actual versus predicted price. Long short-term memory (LSTM) models were first employed to predict price and demand, and their results were then refined by the deep RL model. The proposed approach achieved an accuracy of 99% for price prediction 30 min ahead. These results demonstrate the efficiency of the proposed RL-based model in accurately predicting both demand and price for effective energy management.
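The actor–critic idea described in the abstract can be illustrated with a deliberately simplified, hypothetical sketch. None of the prices, the demand curve, or the hyperparameters below come from the paper; this is a minimal stateless actor–critic (a policy-gradient bandit with a learned baseline) on a toy demand-response problem, where the agent picks a price, the critic tracks a running estimate of expected revenue, and the actor's softmax preferences are nudged by the advantage signal.

```python
import math
import random

# Hypothetical candidate prices (illustrative only, not from the paper).
PRICES = [6.0, 8.0, 10.0, 12.0]

def demand(price, rng):
    # Toy demand-response curve: demand falls linearly with price, plus noise.
    return max(0.0, 100.0 - 5.0 * price) + rng.gauss(0.0, 1.0)

def softmax(prefs):
    m = max(prefs)
    exps = [math.exp(p - m) for p in prefs]
    s = sum(exps)
    return [e / s for e in exps]

def train(steps=5000, alpha=0.05, beta=0.2, seed=0):
    rng = random.Random(seed)
    prefs = [0.0] * len(PRICES)   # actor: action preferences over prices
    baseline = 0.0                # critic: running estimate of expected reward
    for _ in range(steps):
        pi = softmax(prefs)
        a = rng.choices(range(len(PRICES)), weights=pi)[0]
        # Reward is revenue = price * demand, scaled down for stable updates.
        reward = PRICES[a] * demand(PRICES[a], rng) / 100.0
        advantage = reward - baseline       # critic's error signal
        baseline += beta * advantage        # critic update
        for i in range(len(PRICES)):        # actor: policy-gradient update
            grad = (1.0 if i == a else 0.0) - pi[i]
            prefs[i] += alpha * advantage * grad
    return prefs

prefs = train()
best = PRICES[max(range(len(PRICES)), key=lambda i: prefs[i])]
print(best)  # should favor a price near the revenue-maximizing one
```

In this toy setting, the noiseless revenue is maximized at a price of 10, so a trained agent's greedy price should land at or near it. The paper's actual model additionally conditions on LSTM forecasts of demand and price, which a stateless sketch like this omits.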