Edge-Intelligence-Based Embedded Control for Demand Response Management in Smart Cities
Abstract
Demand response (DR) programs are essential for ensuring grid stability in environments with fluctuating energy consumption, especially within rapidly expanding smart cities. This paper presents an edge-intelligence-based embedded control framework designed to improve DR responsiveness and user-level energy optimization. The system uses distributed IoT controllers equipped with lightweight reinforcement learning agents to modulate loads based on price signals, grid stress conditions, and individual consumption behaviors. The embedded controllers operate autonomously while maintaining secure communication with utility servers for event coordination. Field-level implementation shows that the framework achieves considerable gains in peak-load reduction, user comfort maintenance, and communication efficiency. The reinforcement learning algorithm demonstrates strong adaptability, learning optimal response patterns within a few operational cycles. The proposed approach significantly reduces the computational burden on centralized systems while enabling high-granularity control across diverse consumer clusters.
KEYWORDS: Demand response, Edge intelligence, Smart cities, Reinforcement learning, IoT controllers
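To make the abstract's description concrete, the sketch below illustrates one plausible shape for a "lightweight reinforcement learning agent" on an embedded DR controller: a tabular Q-learning policy that selects a load level from a discretized price tier and a grid-stress flag. The state space, action set, reward terms, and all parameter values here are illustrative assumptions, not the paper's actual algorithm.

```python
import random

# Illustrative assumption: a small discretized state space suitable for a
# memory-constrained embedded controller.
PRICE_TIERS = 3              # e.g. low / medium / high tariff signal
STRESS_LEVELS = 2            # grid stress flag: 0 = normal, 1 = stressed
ACTIONS = [0.25, 0.5, 1.0]   # fraction of nominal load to run

def reward(price_tier, stress, load_frac):
    """Assumed reward shape trading energy cost against user comfort."""
    cost = (price_tier + 1) * load_frac        # pricier tiers penalize load
    comfort_penalty = (1.0 - load_frac) ** 2   # curtailment reduces comfort
    stress_penalty = 2.0 * stress * load_frac  # discourage load under stress
    return -(cost + comfort_penalty + stress_penalty)

def train(episodes=5000, alpha=0.2, epsilon=0.1, seed=0):
    """Epsilon-greedy tabular Q-learning over single-step interactions."""
    rng = random.Random(seed)
    q = {(p, s): [0.0] * len(ACTIONS)
         for p in range(PRICE_TIERS) for s in range(STRESS_LEVELS)}
    for _ in range(episodes):
        state = (rng.randrange(PRICE_TIERS), rng.randrange(STRESS_LEVELS))
        if rng.random() < epsilon:             # explore
            a = rng.randrange(len(ACTIONS))
        else:                                  # exploit current estimate
            a = max(range(len(ACTIONS)), key=lambda i: q[state][i])
        r = reward(state[0], state[1], ACTIONS[a])
        q[state][a] += alpha * (r - q[state][a])  # one-step (bandit) update
    return q

def best_action(q, state):
    """Greedy load fraction for a given (price tier, stress) state."""
    return ACTIONS[max(range(len(ACTIONS)), key=lambda i: q[state][i])]
```

Under this assumed reward, the learned policy curtails load under high prices and grid stress while keeping moderate load when the grid is relaxed, matching the behavior the abstract attributes to the deployed agents.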
Full Text: PDF (pp. 22–32)