
Full metadata record

DC Field (Language): Value

dc.contributor.author: 노영균
dc.date.accessioned: 2021-10-29T00:41:09Z
dc.date.available: 2021-10-29T00:41:09Z
dc.date.issued: 2020-04
dc.identifier.citation (en_US): NEUROCOMPUTING, v. 413, pp. 294-304
dc.identifier.issn: 0925-2312
dc.identifier.issn: 1872-8286
dc.identifier.uri: https://www.sciencedirect.com/science/article/pii/S0925231220310961?via%3Dihub
dc.identifier.uri: https://repository.hanyang.ac.kr/handle/20.500.11754/165951
dc.description.abstract (en_US): Several practical difficulties arise when trying to apply deep learning to image-based industrial inspection tasks: training datasets are difficult to obtain, each image must be inspected in milliseconds, and defects must be detected with 99% or greater accuracy. In this paper we show how, for image-based industrial inspection tasks, transfer learning can be leveraged to address these challenges. Whereas transfer learning is known to work well only when the source and target domain images are similar, we show that performing transfer learning with ImageNet (whose images differ significantly from our target industrial domain) as the source domain works remarkably well. For one benchmark problem involving 5,520 training images, the resulting transfer-learned network achieves 99.90% accuracy, compared to only 70.87% for the same network trained from scratch. Further analysis reveals that the transfer-learned network produces a considerably more sparse and disentangled representation than the trained-from-scratch network. This sparsity can be exploited to compress the transfer-learned network down to 1/128 of the original number of convolution filters with only a 0.48% drop in accuracy, compared to a drop of nearly 5% when compressing a trained-from-scratch network. Our findings are validated by extensive systematic experiments and empirical analysis. (C) 2020 Elsevier B.V. All rights reserved.
dc.description.sponsorship (en_US): Seunghyeon Kim and Frank C. Park were supported in part by the Naver Labs Ambidex Project, MSIT-IITP (No. 2019-0-01367, BabyMind), SNU-IAMD, the SNU BK21+ Program in Mechanical Engineering, and the National Research Foundation of Korea under Grant NRF-2016R1A5A1938472. Yung-Kyun Noh was supported in part by the National Research Foundation of Korea Grant (NRF/MSIT2017R1E1A1A03070945) and Hanyang University (HY-2019).
dc.language.iso (en_US): en
dc.publisher (en_US): ELSEVIER
dc.subject (en_US): Deep learning
dc.subject (en_US): Industrial image inspection
dc.subject (en_US): Neural network compression
dc.subject (en_US): Transfer learning
dc.title (en_US): Efficient neural network compression via transfer learning for machine vision inspection
dc.type (en_US): Article
dc.relation.volume: 413
dc.identifier.doi: 10.1016/j.neucom.2020.06.107
dc.relation.page: 294-304
dc.relation.journal: NEUROCOMPUTING
dc.contributor.googleauthor: Kim, Seunghyeon
dc.contributor.googleauthor: Noh, Yung-Kyun
dc.contributor.googleauthor: Park, Frank C.
dc.relation.code: 2020048689
dc.sector.campus: S
dc.sector.daehak: COLLEGE OF ENGINEERING[S]
dc.sector.department: SCHOOL OF COMPUTER SCIENCE
dc.identifier.pid: nohyung
Appears in Collections:
COLLEGE OF ENGINEERING[S] (College of Engineering) > COMPUTER SCIENCE (School of Computer Software) > Articles
Files in This Item:
There are no files associated with this item.
