
Full metadata record

dc.contributor.author: 남해운 (Nam, Haewoon)
dc.date.accessioned: 2024-05-15T23:52:15Z
dc.date.available: 2024-05-15T23:52:15Z
dc.date.issued: 2023-06-01
dc.identifier.citation: IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, v. 20, no. 2, pp. 1517-1528
dc.identifier.issn: 1932-4537
dc.identifier.issn: 2373-7379
dc.identifier.uri: https://information.hanyang.ac.kr/#/eds/detail?an=edseee.10130085&dbId=edseee
dc.identifier.uri: https://repository.hanyang.ac.kr/handle/20.500.11754/190294
dc.description.abstract: Federated learning trains a global model on the server using the personal data of end users while preserving data privacy. The users, referred to as clients, perform local training on their respective datasets. Once trained, the clients forward their local models to the server, where the models are aggregated to update the global model. In practice, client datasets contain different classes of labels regardless of the number of samples; in other words, the data is non-independent and identically distributed (non-iid) among clients in terms of label classes, which creates heterogeneity among them. The local model weights updated by clients therefore vary widely due to the heterogeneity of their local datasets, so the process of aggregating these diversified local models has a substantial impact on global training performance. When the server aggregates the local models by computing a weighted average based solely on the number of samples available at each client, the aggregation process may misguide global training. To address this issue, our paper proposes a novel reweighting method called FedCLS that operates on the volume and variance of the clients' local datasets. By taking the heterogeneity of the data into account during aggregation, the proposed method aims to reach the global minimum. Simulation results show that the proposed method achieves a 28% performance improvement over conventional federated learning methods.
dc.language: en_US
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.subject: Federated learning
dc.subject: deep neural networks
dc.subject: heterogeneous network
dc.subject: distributed learning
dc.subject: unbiased aggregation
dc.title: FedCLS: Class-Aware Federated Learning in a Heterogeneous Environment
dc.type: Article
dc.relation.journal: IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT
dc.relation.volume: 20
dc.relation.no: 2
dc.relation.page: 1517-1528
dc.relation.ispartofseries: v. 20, no. 2; 1517-1528
dc.identifier.doi: 10.1109/TNSM.2023.3278023
dc.contributor.googleauthor: Bhatti, Dost Muhammad Saqib
dc.contributor.googleauthor: Nam, Haewoon
dc.relation.code: 2023038240
dc.sector.campus: E
dc.sector.daehak: COLLEGE OF ENGINEERING SCIENCES[E]
dc.sector.department: SCHOOL OF ELECTRICAL ENGINEERING
dc.identifier.pid: hnam
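The abstract describes server-side aggregation as a weighted average of client models and argues that weighting by sample count alone can misguide training under non-iid label distributions. The sketch below illustrates the general idea under an assumed, hypothetical weighting that combines each client's sample volume with the variance of its label histogram; it is not the paper's exact FedCLS formula, which is not given in this record.

```python
import numpy as np

def aggregate(client_weights, client_sizes, client_label_counts):
    """Reweighted model aggregation for federated learning (illustrative sketch).

    client_weights: list of 1-D numpy arrays, one flattened local model per client.
    client_sizes: number of training samples per client (plain FedAvg weights by
        these alone).
    client_label_counts: per-client label histograms; their variance serves as a
        proxy for class imbalance. The combination below (volume divided by
        1 + variance) is a hypothetical choice, not the published FedCLS rule.
    """
    sizes = np.asarray(client_sizes, dtype=float)
    # Higher variance across a client's label counts -> more imbalanced classes
    variances = np.array([np.var(c) for c in client_label_counts], dtype=float)
    # Penalize imbalanced clients so they pull the global model less
    scores = sizes / (1.0 + variances)
    coeffs = scores / scores.sum()  # normalize so coefficients sum to 1
    return sum(c * w for c, w in zip(coeffs, client_weights))
```

With identical label variances this reduces to the usual sample-count weighting; when one client's labels are skewed, its contribution to the global model shrinks accordingly.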
Appears in Collections:
COLLEGE OF ENGINEERING SCIENCES[E](공학대학) > ELECTRICAL ENGINEERING(전자공학부) > Articles
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
