Full metadata record

| DC Field | Value | Language |
| --- | --- | --- |
| dc.contributor.author | 정두석 (Jeong, Doo Seok) | - |
| dc.date.accessioned | 2022-12-14T01:16:12Z | - |
| dc.date.available | 2022-12-14T01:16:12Z | - |
| dc.date.issued | 2021-12 | - |
| dc.identifier.citation | Advances in Neural Information Processing Systems, v. 34, Page. 28274-28285 | en_US |
| dc.identifier.issn | 1049-5258 | en_US |
| dc.identifier.uri | https://arxiv.org/abs/2110.02550 | en_US |
| dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/178291 | - |
| dc.description.abstract | Backward propagation of errors (backpropagation) is a method to minimize objective functions (e.g., loss functions) of deep neural networks by identifying optimal sets of weights and biases. Imposing constraints on weight precision is often required to alleviate prohibitive workloads on hardware. Despite the remarkable success of backpropagation, the algorithm itself is not capable of considering such constraints unless additional algorithms are applied simultaneously. To address this issue, we propose the constrained backpropagation (CBP) algorithm based on the pseudo-Lagrange multiplier method to obtain the optimal set of weights that satisfy a given set of constraints. The defining characteristic of the proposed CBP algorithm is the utilization of a Lagrangian function (loss function plus constraint function) as its objective function. We considered various types of constraints: binary, ternary, one-bit shift, and two-bit shift weight constraints. As a post-training method, CBP was applied to AlexNet, ResNet-18, ResNet-50, and GoogLeNet on ImageNet, which had been pre-trained using conventional backpropagation. For most cases, the proposed algorithm outperforms the state-of-the-art methods on ImageNet, e.g., 66.6%, 74.4%, and 64.0% top-1 accuracy for ResNet-18, ResNet-50, and GoogLeNet with binary weights, respectively. This highlights CBP as a learning algorithm that addresses diverse constraints with minimal performance loss by employing appropriate constraint functions. The code for CBP is publicly available at https://github.com/dooseokjeong/CBP. | en_US |
| dc.description.sponsorship | This work was supported by the Ministry of Trade, Industry & Energy (grant no. 20012002) and the Korea Semiconductor Research Consortium program for the development of future semiconductor devices, and by the National R&D Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (2021M3F3A2A01037632). | en_US |
| dc.language | en | en_US |
| dc.publisher | Neural information processing systems foundation | en_US |
| dc.title | CBP: Backpropagation with constraint on weight precision using pseudo-Lagrange multiplier method | en_US |
| dc.type | Article | en_US |
| dc.relation.volume | 34 | - |
| dc.identifier.doi | 10.48550/arXiv.2110.02550 | en_US |
| dc.relation.page | 28274-28285 | - |
| dc.relation.journal | Advances in Neural Information Processing Systems | - |
| dc.contributor.googleauthor | Kim, Guhyun | - |
| dc.contributor.googleauthor | Jeong, Doo Seok | - |
| dc.sector.campus | S | - |
| dc.sector.daehak | 공과대학 (College of Engineering) | - |
| dc.sector.department | 신소재공학부 (Department of Materials Science and Engineering) | - |
| dc.identifier.pid | dooseokj | - |
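The abstract's central idea, minimizing a Lagrangian (task loss plus a constraint function that vanishes on the admissible weight set), can be sketched in a few lines. The following is a toy illustration, not the authors' implementation (see the linked GitHub repository for that): the quadratic binary-weight penalty, the stand-in loss, and the multiplier schedule are all illustrative assumptions.

```python
# Toy sketch of a Lagrangian objective L(w) = loss(w) + lam * C(w),
# where C(w) = sum((w_i^2 - 1)^2) is zero exactly when every weight
# is binary (w_i in {-1, +1}). All names here are illustrative.

def constraint_binary(w):
    """Penalty that vanishes iff every weight is in {-1, +1}."""
    return sum((wi * wi - 1.0) ** 2 for wi in w)

def toy_loss(w):
    """Stand-in task loss: squared distance from a target vector."""
    target = [0.7, -1.3, 0.2]
    return sum((wi - ti) ** 2 for wi, ti in zip(w, target))

def lagrangian(w, lam):
    return toy_loss(w) + lam * constraint_binary(w)

def grad(f, w, eps=1e-6):
    """Central-difference numerical gradient, adequate for this toy."""
    g = []
    for i in range(len(w)):
        wp, wm = list(w), list(w)
        wp[i] += eps
        wm[i] -= eps
        g.append((f(wp) - f(wm)) / (2 * eps))
    return g

# Gradient descent on the Lagrangian while gradually strengthening the
# multiplier, so weights are steered toward the binary constraint set.
w = [0.7, -1.3, 0.2]
lam = 0.1
for step in range(2000):
    g = grad(lambda v: lagrangian(v, lam), w)
    w = [wi - 0.01 * gi for wi, gi in zip(w, g)]
    lam = min(lam * 1.01, 10.0)  # capped ramp keeps the descent stable

print([round(wi, 2) for wi in w])  # weights end up near {-1, +1}
```

As the multiplier grows, the penalty term dominates and each weight settles near the admissible value closest to its unconstrained optimum, which is the qualitative behavior the abstract attributes to the constraint function.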
Appears in Collections:
COLLEGE OF ENGINEERING[S](공과대학) > MATERIALS SCIENCE AND ENGINEERING(신소재공학부) > Articles
