Energies, vol. 16, no. 14, 2023 (SCI-Expanded)
Eco-friendly technologies for sustainable energy development require the efficient utilization of energy resources. Real-time pricing (RTP), also known as dynamic pricing, offers advantages over other pricing schemes by enabling demand response (DR) actions. However, existing methods for determining and controlling DR have limitations in managing increasing demand and predicting future pricing. This paper presents a novel approach that addresses these limitations in the context of dynamic pricing systems for sustainable energy development. Leveraging actor–critic reinforcement learning (RL), a dynamic pricing DR model is proposed for efficient energy management. The model was trained on DR and real-time pricing data extracted from the Australian Energy Market Operator (AEMO) spanning a period of 17 years. The efficacy of the RL-based dynamic pricing approach was evaluated in two prediction cases: actual versus predicted demand and actual versus predicted price. First, long short-term memory (LSTM) models were employed to predict price and demand; the results were then refined with the deep RL model. The proposed approach achieved an accuracy of 99% for 30 min ahead price prediction. The results demonstrate the efficiency of the proposed RL-based model in accurately predicting both demand and price for effective energy management.
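To illustrate the forecasting stage described above, the following is a minimal sketch of a 30 min ahead LSTM price forecaster. The window length, layer sizes, and data handling here are illustrative assumptions, not the paper's exact configuration; the input is assumed to be a univariate price series resampled to 30 min intervals, as in AEMO market data exports.

```python
# Minimal sketch of a 30 min ahead LSTM price forecaster; window length,
# layer sizes, and preprocessing are assumptions for illustration.
import numpy as np
from tensorflow import keras

WINDOW = 48  # one day of 30 min steps used as model input (assumption)

def make_windows(series: np.ndarray, window: int = WINDOW):
    """Slice a 1-D series into (samples, window, 1) inputs and next-step targets."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y

model = keras.Sequential([
    keras.layers.Input(shape=(WINDOW, 1)),
    keras.layers.LSTM(64),
    keras.layers.Dense(1),  # predicted price for the next 30 min interval
])
model.compile(optimizer="adam", loss="mse")

# prices: a normalized price series, e.g. loaded from AEMO CSV exports
# X, y = make_windows(prices)
# model.fit(X, y, epochs=20, validation_split=0.1)
```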
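Likewise, the refinement stage can be sketched as a one-step actor–critic update. This is a generic temporal-difference actor–critic, not the paper's specific agent: the state vector (e.g., recent demand plus the LSTM price forecast), the discrete price-adjustment actions, the network sizes, and the reward are all assumptions for illustration.

```python
# Minimal one-step actor-critic update sketch; state features, action set,
# network sizes, and reward definition are illustrative assumptions.
import tensorflow as tf
from tensorflow import keras

N_FEATURES, N_ACTIONS = 4, 3  # assumed state size and price-adjustment actions
GAMMA = 0.99

actor = keras.Sequential([
    keras.layers.Input(shape=(N_FEATURES,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(N_ACTIONS, activation="softmax"),  # action probabilities
])
critic = keras.Sequential([
    keras.layers.Input(shape=(N_FEATURES,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),  # state-value estimate V(s)
])
opt = keras.optimizers.Adam(1e-3)

def train_step(state, action, reward, next_state, done):
    """One TD actor-critic update; state/next_state have shape (1, N_FEATURES)."""
    with tf.GradientTape(persistent=True) as tape:
        v = critic(state)[0, 0]
        v_next = critic(next_state)[0, 0]
        target = reward + GAMMA * v_next * (1.0 - done)
        td_error = tf.stop_gradient(target) - v
        critic_loss = tf.square(td_error)  # drive V(s) toward the TD target
        log_prob = tf.math.log(actor(state)[0, action] + 1e-8)
        actor_loss = -log_prob * tf.stop_gradient(td_error)  # policy gradient
    opt.apply_gradients(zip(tape.gradient(critic_loss, critic.trainable_variables),
                            critic.trainable_variables))
    opt.apply_gradients(zip(tape.gradient(actor_loss, actor.trainable_variables),
                            actor.trainable_variables))
    del tape
```

In such a setup, the reward would typically encode the energy-management objective, for instance penalizing the gap between scheduled demand and available supply at the dynamically set price.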