Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 노영균 | - |
dc.date.accessioned | 2021-10-29T00:41:09Z | - |
dc.date.available | 2021-10-29T00:41:09Z | - |
dc.date.issued | 2020-04 | - |
dc.identifier.citation | NEUROCOMPUTING, v. 413, pp. 294-304 | en_US |
dc.identifier.issn | 0925-2312 | - |
dc.identifier.issn | 1872-8286 | - |
dc.identifier.uri | https://www.sciencedirect.com/science/article/pii/S0925231220310961?via%3Dihub | - |
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/165951 | - |
dc.description.abstract | Several practical difficulties arise when applying deep learning to image-based industrial inspection tasks: training datasets are difficult to obtain, each image must be inspected in milliseconds, and defects must be detected with 99% or greater accuracy. In this paper we show how transfer learning can be leveraged to address these challenges for image-based industrial inspection tasks. Whereas transfer learning is known to work well only when the source and target domain images are similar, we show that transfer learning with ImageNet as the source domain, whose images differ significantly from our target industrial domain, works remarkably well. For one benchmark problem involving 5,520 training images, the resulting transfer-learned network achieves 99.90% accuracy, compared to only 70.87% for the same network trained from scratch. Further analysis reveals that the transfer-learned network produces a considerably sparser and more disentangled representation than the trained-from-scratch network. This sparsity can be exploited to compress the transfer-learned network to as little as 1/128 of the original number of convolution filters with only a 0.48% drop in accuracy, compared to a drop of nearly 5% when compressing a trained-from-scratch network. Our findings are validated by extensive systematic experiments and empirical analysis. (C) 2020 Elsevier B.V. All rights reserved. | en_US |
dc.description.sponsorship | Seunghyeon Kim and Frank C. Park were supported in part by Naver Labs Ambidex Project, MSIT-IITP (No. 2019-0-01367, BabyMind), SNU-IAMD, SNU BK21 + Program in Mechanical Engineering, and the National Research Foundation of Korea under Grant NRF-2016R1A5A1938472. Yung-Kyun Noh was supported in part by the National Research Foundation of Korea Grant (NRF/MSIT2017R1E1A1A03070945) and Hanyang University (HY-2019). | en_US |
dc.language.iso | en | en_US |
dc.publisher | ELSEVIER | en_US |
dc.subject | Deep learning | en_US |
dc.subject | Industrial image inspection | en_US |
dc.subject | Neural network compression | en_US |
dc.subject | Transfer learning | en_US |
dc.title | Efficient neural network compression via transfer learning for machine vision inspection | en_US |
dc.type | Article | en_US |
dc.relation.volume | 413 | - |
dc.identifier.doi | 10.1016/j.neucom.2020.06.107 | - |
dc.relation.page | 294-304 | - |
dc.relation.journal | NEUROCOMPUTING | - |
dc.contributor.googleauthor | Kim, Seunghyeon | - |
dc.contributor.googleauthor | Noh, Yung-Kyun | - |
dc.contributor.googleauthor | Park, Frank C. | - |
dc.relation.code | 2020048689 | - |
dc.sector.campus | S | - |
dc.sector.daehak | COLLEGE OF ENGINEERING[S] | - |
dc.sector.department | SCHOOL OF COMPUTER SCIENCE | - |
dc.identifier.pid | nohyung | - |
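The abstract describes compressing the transfer-learned network by exploiting filter sparsity, down to 1/128 of the original convolution filters. As a minimal sketch of that kind of filter-level compression, the snippet below keeps only the filters with the largest L1 norms; the ranking criterion and the `prune_filters` helper are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def prune_filters(weights: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Keep only the convolution filters (axis 0) with the largest L1 norms.

    weights: array of shape (out_channels, in_channels, kh, kw).
    Note: L1-norm ranking is an assumed criterion for this sketch.
    """
    n_keep = max(1, int(weights.shape[0] * keep_fraction))
    # Rank each output filter by the L1 norm of its weights.
    norms = np.abs(weights).sum(axis=(1, 2, 3))
    keep = np.argsort(norms)[::-1][:n_keep]
    # Preserve the original filter ordering among the survivors.
    return weights[np.sort(keep)]

rng = np.random.default_rng(0)
w = rng.normal(size=(128, 3, 3, 3))   # 128 filters, as in the 1/128 example
pruned = prune_filters(w, 1 / 128)    # keep 1/128 of the filters
print(pruned.shape[0])                # -> 1
```

In a real pipeline the corresponding input channels of the next layer would also have to be pruned, and the network fine-tuned afterward to recover accuracy.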
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.