Abstract
Federated Learning (FL) is a machine learning approach that enables the
creation of shared models for powerful applications while allowing data to
remain on devices. This approach offers benefits such as improved data
privacy and security, as well as reduced latency. In some systems, however,
direct communication between clients and the server may not be possible, for
example in remote areas that lack proper communication infrastructure. To
overcome this challenge,
a new framework called FedEx (Federated Learning via Model Express Delivery) is
proposed. This framework employs mobile transporters, such as UAVs, to
establish indirect communication channels between the server and clients.
Acting as intermediaries, these transporters carry model information between
the two sides.
Indirect communication introduces new challenges for convergence analysis and
optimization, as the delay caused by the transporters' movement affects both
global model dissemination and local model collection. To address this, two
algorithms, FedEx-Sync and FedEx-Async, are proposed for synchronous and
asynchronous learning at the transporter level.
Additionally, a bi-level optimization algorithm is proposed to solve the joint
client assignment and route planning problem. Experimental validation on two
public datasets in a simulated network yields results consistent with the
theory and demonstrates the efficacy of FedEx.