ERAFL: Efficient Resource Allocation for Federated Learning Training in Smart Homes


Due to its privacy-by-design nature, federated learning (FL) has been widely adopted in smart home settings, from device anomaly detection to traffic characterization. However, FL training may suffer from a few slow participants, so-called stragglers, which cannot finish their local training in time due to limited computation capacity or cannot transmit model updates quickly due to bandwidth limitations. To mitigate this effect, prior studies suggest offloading either the entire FL training task or some layers of the learning model to an edge device. While the former diminishes the privacy benefit of FL, the latter may not suffice when multiple FL tasks compete for local computation and bandwidth resources in a smart home. Different from these approaches, we propose ERAFL, which runs on a home gateway and allocates computation and communication resources among competing FL training tasks to avoid both the straggler effect and the privacy loss incurred when training data is offloaded to external computing devices. To preserve the user's privacy, rather than offloading the data completely, we propose to split the training data between the local and remote devices and train two models in parallel. To guarantee that the final global model achieves a certain level of accuracy, ERAFL bounds the amount of offloaded data, or, with the user's agreement, performs the training either locally or remotely on the complete dataset. We formulate an optimization problem that minimizes the amount of offloaded data while guaranteeing that the deadline of each FL training task is met. Our simulation results show that ERAFL outperforms baseline approaches that do not consider partial offloading or resource allocation, both in the fraction of FL tasks meeting their deadlines and in the amount of offloaded data. Moreover, the accuracy and convergence speed of our data-splitting approach are comparable to those of traditional FL.
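To illustrate the flavor of the deadline-driven splitting decision the abstract describes, the sketch below computes, for a single FL task, the smallest fraction of training data that must be offloaded so that both the local pass over the retained data and the remote pass (upload plus edge training) finish by the deadline. This is a simplified per-task illustration under assumed cost models, not the paper's actual multi-task optimization; all function and parameter names are hypothetical.

```python
def min_offload_fraction(n_samples, t_local_per_sample, t_edge_per_sample,
                         bytes_per_sample, bandwidth, deadline):
    """Smallest offload fraction x in [0, 1] meeting the deadline.

    Illustrative cost model (assumption, not from the paper):
      local time  = (1 - x) * n_samples * t_local_per_sample
      remote time = x * n_samples * (bytes_per_sample / bandwidth
                                     + t_edge_per_sample)
    Returns 0.0 if no offloading is needed, None if even the minimal
    required offload cannot meet the deadline on the remote side
    (i.e., the task is an unavoidable straggler under this model).
    """
    local_full = n_samples * t_local_per_sample
    if local_full <= deadline:
        return 0.0  # the device can finish everything locally in time
    # Local constraint: (1 - x) * local_full <= deadline
    x = 1.0 - deadline / local_full
    # Remote constraint: x * remote_per_unit_fraction <= deadline
    remote_per_unit_fraction = n_samples * (
        bytes_per_sample / bandwidth + t_edge_per_sample)
    if x * remote_per_unit_fraction > deadline:
        return None  # splitting alone cannot save this task
    return x


# Example: 1000 samples, 10 ms/sample locally, 2 ms/sample on the edge,
# 10 KB/sample over a 1 MB/s uplink, 6 s deadline -> offload 40%.
print(min_offload_fraction(1000, 0.01, 0.002, 1e4, 1e6, 6.0))  # -> 0.4
```

Minimizing the sum of such fractions across tasks sharing the gateway's compute and uplink would then correspond to the kind of objective the paper formulates.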

Conference paper
Proceedings of the IEEE/IFIP Network Operations and Management Symposium (NOMS)