Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 남해운 | - |
dc.date.accessioned | 2024-05-15T23:52:15Z | - |
dc.date.available | 2024-05-15T23:52:15Z | - |
dc.date.issued | 2023-06-01 | - |
dc.identifier.citation | IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, v. 20, NO 2, Page. 1517-1528 | en_US |
dc.identifier.issn | 1932-4537 | en_US |
dc.identifier.issn | 2373-7379 | en_US |
dc.identifier.uri | https://information.hanyang.ac.kr/#/eds/detail?an=edseee.10130085&dbId=edseee | en_US |
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/190294 | - |
dc.description.abstract | Federated learning is an approach for training a global model on the server by utilizing the personal data of end users while preserving data privacy. The users, referred to as clients, are responsible for performing local training using their respective datasets. Once trained, the clients forward their local models to the server, where the models are aggregated to update the global model. In practice, clients' datasets contain different classes of labels regardless of the number of samples. In other words, the data is not independent and identically distributed (non-IID) among clients in terms of label classes, which creates heterogeneity among them. Hence, the local model weights updated by clients vary widely due to the heterogeneity of their local datasets. Thus, the process of aggregating the diverse local models of clients has a significant impact on the performance of global training. When the server aggregates the local models by calculating a weighted average based solely on the number of samples available at each client, the aggregation process may misguide global training. To address this issue, our paper proposes a novel reweighting method called FedCLS that operates based on the volume and variance of local datasets among clients. By taking the heterogeneity of data into account during aggregation in federated learning, the proposed method aims to reach the global minimum. The simulation results show that the proposed method achieves a 28% performance improvement compared to conventional federated learning methods. | en_US |
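The abstract describes server-side aggregation that weights local models solely by sample count, and a class-aware alternative. The sketch below contrasts the two; the function names and the specific class-count scaling are illustrative assumptions, not the actual FedCLS formula (which is defined in the paper itself).

```python
# Illustrative sketch of server-side aggregation in federated learning.
# fedavg_weights: conventional weighting by sample count only.
# class_aware_weights: a hypothetical reweighting that also accounts for
# how many label classes each client's dataset covers (an assumption for
# illustration, not the exact FedCLS method).

def fedavg_weights(sample_counts):
    """Weights proportional to each client's sample count (conventional)."""
    total = sum(sample_counts)
    return [n / total for n in sample_counts]

def class_aware_weights(sample_counts, class_counts):
    """Hypothetical class-aware reweighting: scale each client's sample
    count by the number of label classes it holds, then normalize."""
    raw = [n * c for n, c in zip(sample_counts, class_counts)]
    total = sum(raw)
    return [r / total for r in raw]

def aggregate(models, weights):
    """Weighted average of client model parameter vectors."""
    dim = len(models[0])
    return [sum(w * m[i] for w, m in zip(weights, models))
            for i in range(dim)]

# Client A: 60 samples but only 1 label class; client B: 40 samples, 5 classes.
# Sample-count weighting favors A; class-aware weighting shifts toward B.
print(fedavg_weights([60, 40]))            # A dominates
print(class_aware_weights([60, 40], [1, 5]))  # B dominates
```

Here a client with many samples but few label classes would dominate a plain weighted average; the class-aware variant reduces its influence, which is the kind of bias the abstract attributes to sample-count-only aggregation.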
dc.language | en_US | en_US |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | en_US |
dc.relation.ispartofseries | v. 20, NO 2;1517-1528 | - |
dc.subject | Federated learning | en_US |
dc.subject | deep neural networks | en_US |
dc.subject | heterogeneous network | en_US |
dc.subject | distributed learning | en_US |
dc.subject | unbiased aggregation | en_US |
dc.title | FedCLS: Class-Aware Federated Learning in a Heterogeneous Environment | en_US |
dc.type | Article | en_US |
dc.relation.no | 2 | - |
dc.relation.volume | 20 | - |
dc.identifier.doi | 10.1109/TNSM.2023.3278023 | en_US |
dc.relation.page | 1517-1528 | - |
dc.relation.journal | IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT | - |
dc.contributor.googleauthor | Bhatti, Dost Muhammad Saqib | - |
dc.contributor.googleauthor | Nam, Haewoon | - |
dc.relation.code | 2023038240 | - |
dc.sector.campus | E | - |
dc.sector.daehak | COLLEGE OF ENGINEERING SCIENCES[E] | - |
dc.sector.department | SCHOOL OF ELECTRICAL ENGINEERING | - |
dc.identifier.pid | hnam | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.