As traditional centralized learning networks (CLNs) face increasing challenges in terms of privacy preservation, communication overhead, and scalability, federated learning networks (FLNs) have been proposed as a promising alternative paradigm to support the training of machine learning (ML) models. In contrast to the centralized data storage and processing of CLNs, FLNs exploit a large number of edge devices (EDs) to store data and perform model training distributively. In this way...
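
To make the contrast concrete, the following is a minimal sketch of one federated averaging round, in which each ED updates a shared model on its own local data and only model parameters are returned for aggregation, so raw data never leaves the devices. The NumPy implementation, the linear-regression local objective, and the names `local_update` and `federated_averaging_round` are illustrative assumptions rather than a specific method from this work.

```python
import numpy as np

def local_update(weights, local_data, lr=0.1, epochs=1):
    """Hypothetical local training step on one ED.
    A linear-regression gradient step stands in for the device's model."""
    X, y = local_data
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_averaging_round(global_weights, device_datasets):
    """One round: each ED trains locally on its own data; the server
    aggregates the resulting models, weighted by local dataset size."""
    updates, sizes = [], []
    for data in device_datasets:
        updates.append(local_update(global_weights, data))
        sizes.append(len(data[1]))
    # Weighted average of local models; only parameters are exchanged
    return np.average(np.stack(updates), axis=0,
                      weights=np.array(sizes, dtype=float))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Simulate three EDs, each holding its own local dataset
    devices = []
    for n in (50, 80, 30):
        X = rng.normal(size=(n, 2))
        y = X @ true_w + 0.01 * rng.normal(size=n)
        devices.append((X, y))
    w = np.zeros(2)
    for _ in range(20):
        w = federated_averaging_round(w, devices)
    print("estimated weights:", w)
```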