Publications

CKD: Communication Traffic Prediction with Continual Knowledge Distillation

Published

IEEE International Conference on Communications (ICC)

Date

2022.05.16

Abstract

Accurate traffic volume estimation and prediction are essential for advanced communication network functions, such as automated operations and predictive resource allocation. Although machine learning (ML)-based approaches have achieved great success toward this goal, existing approaches suffer from two drawbacks that limit their real-world applicability. First, ML-based prediction models developed in the past may become obsolete, since communication traffic patterns and volumes keep changing in the real world, leading to prediction errors. Second, most Base Stations (BSs) can store only a small amount of data due to limited storage capacity and high storage costs, which prevents training an accurate prediction model. In this paper, we propose a novel framework that adapts the prediction model to constantly changing traffic using only a small amount of current traffic data. Specifically, the framework first learns as much knowledge as possible from historical traffic data using a proposed two-branch neural network design, which includes a prediction module and a reconstruction module. Then, the framework transfers knowledge from the old (past) prediction model to the new (current) model using a proposed continual knowledge distillation technique, thereby updating the model. Evaluations on a real-world dataset show that the proposed framework reduces the Mean Absolute Error (MAE) of traffic prediction by up to 9.62% compared to state-of-the-art prediction methods.
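The overall shape of the training objective described above can be sketched as a combined loss: a prediction term, a reconstruction term (the second branch), and a distillation term that keeps the new model's outputs close to the old model's. The function name, the loss weights `alpha` and `beta`, and the use of plain MAE for every term are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def ckd_loss(pred, recon, target, history, old_pred, alpha=0.5, beta=0.5):
    """Illustrative combined objective for the two-branch + distillation idea.

    pred     -- new model's traffic forecast
    recon    -- reconstruction branch's output for the input window
    target   -- ground-truth future traffic
    history  -- the observed input window being reconstructed
    old_pred -- the old (past) model's forecast, used as a soft target
    alpha, beta -- assumed weighting factors (not from the paper)
    """
    prediction_loss = np.mean(np.abs(pred - target))        # MAE on future traffic
    reconstruction_loss = np.mean(np.abs(recon - history))  # MAE on reconstructed input
    distillation_loss = np.mean(np.abs(pred - old_pred))    # stay close to old model
    return prediction_loss + alpha * reconstruction_loss + beta * distillation_loss
```

With this shape, a model trained on only a few current samples is still regularized by the old model's behavior (via `distillation_loss`), which is the intuition behind transferring historical knowledge without storing the historical data itself.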