Coordinated Load Balancing in Mobile Edge Computing Network: a Multi-Agent DRL Approach
Published in IEEE International Conference on Communications (ICC)
Abstract
Mobile edge computing (MEC) networks have recently been adopted to accommodate the fast-growing number of mobile devices that perform complicated tasks with limited hardware capability. Edge nodes equipped with communication, computation, and caching capabilities are increasingly being deployed in MEC networks. Because these resources are physically separated, careful coordination and scheduling are essential for efficient resource utilization and optimal network performance. In this paper, we study mobility load balancing for heterogeneous MEC networks with communication, computation, and caching capabilities. Specifically, we propose to tackle this problem with a multi-agent deep reinforcement learning-based framework: users served by overloaded edge nodes are handed over to less-loaded ones so as to minimize the load of the most heavily loaded base station in the network. In this framework, the handover decision for each user is made based on the user's own observation, which comprises the user's task at hand and the load status of the MEC network. Simulation results show that our proposed multi-agent deep reinforcement learning-based approach can reduce the time-average maximum load by up to 30% and the end-to-end delay by 50% compared to baseline algorithms.
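To make the described framework concrete, below is a minimal sketch of one way such a system could look: independent per-user DQN-style agents, each observing its own task demand plus the per-node load and choosing a serving edge node, with a shared reward equal to the negative of the maximum node load (mirroring the min-max objective in the abstract). This is an illustrative assumption, not the authors' implementation; all names (QNet, step), the network sizes, and the one-shot (contextual-bandit) training setup are hypothetical.

```python
# Hypothetical sketch: independent per-user DQN-style handover agents.
# Not the paper's implementation; names, sizes, and setup are assumed.

import random
import numpy as np
import torch
import torch.nn as nn

N_NODES = 4             # edge nodes in the toy network (assumed)
N_USERS = 8             # user agents (assumed)
OBS_DIM = 1 + N_NODES   # [own task demand, load of each edge node]

class QNet(nn.Module):
    """Tiny Q-network mapping an observation to one Q-value per edge node."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(OBS_DIM, 64), nn.ReLU(),
            nn.Linear(64, N_NODES))

    def forward(self, x):
        return self.net(x)

def step(loads, demands, actions):
    """Add each user's demand to its chosen node; reward = -max node load."""
    new_loads = loads.copy()
    for user, node in enumerate(actions):
        new_loads[node] += demands[user]
    return new_loads, -float(new_loads.max())

qnets = [QNet() for _ in range(N_USERS)]  # one independent agent per user
optims = [torch.optim.Adam(q.parameters(), lr=1e-3) for q in qnets]
eps = 0.2  # epsilon-greedy exploration rate

for episode in range(500):
    demands = np.random.uniform(0.1, 1.0, size=N_USERS)   # task sizes
    loads = np.random.uniform(0.0, 2.0, size=N_NODES)     # background load
    obs = [np.concatenate(([demands[u]], loads)).astype(np.float32)
           for u in range(N_USERS)]

    # Each agent picks a target edge node from its own local observation.
    actions = []
    for u in range(N_USERS):
        if random.random() < eps:
            actions.append(random.randrange(N_NODES))
        else:
            with torch.no_grad():
                qvals = qnets[u](torch.from_numpy(obs[u]))
            actions.append(int(qvals.argmax()))

    _, reward = step(loads, demands, actions)

    # One-step Q-learning update per agent on the shared global reward
    # (one-shot assignment, so no bootstrapped next-state term).
    for u in range(N_USERS):
        qvals = qnets[u](torch.from_numpy(obs[u]))
        target = qvals.detach().clone()
        target[actions[u]] = reward
        loss = nn.functional.mse_loss(qvals, target)
        optims[u].zero_grad()
        loss.backward()
        optims[u].step()
```

Rewarding every agent with the same global quantity (the negative maximum load) keeps the independent learners cooperatively aligned with the min-max load-balancing objective, at the cost of a noisier credit-assignment signal than per-agent rewards would give.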