
Full metadata record

DC Field | Value | Language
dc.contributor.author | 임종우 (Lim, Jongwoo) | -
dc.date.accessioned | 2022-10-19T07:13:48Z | -
dc.date.available | 2022-10-19T07:13:48Z | -
dc.date.issued | 2021-01 | -
dc.identifier.citation | NEURAL NETWORKS, v. 133, pp. 103-111 | en_US
dc.identifier.issn | 0893-6080; 1879-2782 | en_US
dc.identifier.uri | https://www.sciencedirect.com/science/article/pii/S089360802030366X?via%3Dihub | en_US
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/175569 | -
dc.description.abstract | In recent years, transfer learning has attracted much attention due to its ability to adapt a well-trained model from one domain to another. Fine-tuning is one of the most widely used methods; it exploits a small set of labeled data in the target domain to adapt the network. Most transfer learning methods, including the few that use labeled data in the source domain, require labeled datasets, which restricts the application of transfer learning to new domains. In this paper, we propose a fully unsupervised self-tuning algorithm for learning visual features in different domains. The proposed method updates a pre-trained model by minimizing the triplet loss function using only unlabeled data in the target domain. First, we propose a relevance measure for unlabeled data based on a bagged clustering method. Triplets of anchor, positive, and negative data points are then sampled based on ranking violations between the relevance scores and the Euclidean distances in the embedded feature space. This fully unsupervised self-tuning algorithm significantly improves the performance of the network. We extensively evaluate the proposed algorithm using various metrics, including classification accuracy, feature analysis, and clustering quality, on five benchmark datasets in different domains. In addition, we demonstrate that applying the self-tuning method to a fine-tuned network helps achieve better results. | en_US
dc.description.sponsorship | This work was partly supported by the Institute of Information & communications Technology Planning & Evaluation (IITP), Korea, grant funded by the Korea government (MSIT) (No. 2020-0-01373, Artificial Intelligence Graduate School Program (Hanyang University)), the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Korea (NRF-2017R1A6A3A11031193), and the NSF CAREER, United States of America, Grant #1149783. | en_US
dc.language.iso | en | en_US
dc.publisher | PERGAMON-ELSEVIER SCIENCE LTD | en_US
dc.subject | Self-tuning neural network; Unsupervised feature learning; Unsupervised transfer learning; Bagged clustering; Ranking violation for triplet sampling | en_US
dc.title | Unsupervised Feature Learning for Self-tuning Neural Networks | en_US
dc.type | Article | en_US
dc.relation.volume | 133 | -
dc.identifier.doi | 10.1016/j.neunet.2020.10.011 | en_US
dc.relation.page | 103-111 | -
dc.relation.journal | NEURAL NETWORKS | -
dc.contributor.googleauthor | Ryu, Jongbin | -
dc.contributor.googleauthor | Yang, Ming-Hsuan | -
dc.contributor.googleauthor | Lim, Jongwoo | -
dc.relation.code | 2021007670 | -
dc.sector.campus | S | -
dc.sector.daehak | COLLEGE OF ENGINEERING[S] | -
dc.sector.department | SCHOOL OF COMPUTER SCIENCE | -
dc.identifier.pid | jlim | -
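The abstract describes two ingredients: a standard triplet margin loss, and a sampler that picks triplets where the Euclidean-distance ranking in the embedded space contradicts the relevance-score ranking. A minimal NumPy sketch of those two ideas follows; it is an illustration only, not the paper's implementation, and the relevance scores here stand in for whatever the bagged clustering method would produce.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet margin loss on embedded feature vectors: pushes the
    anchor closer to the positive than to the negative by `margin`."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(d_pos - d_neg + margin, 0.0)

def ranking_violations(relevance, embeddings, anchor_idx):
    """Return (positive, negative) index pairs whose distance ranking
    w.r.t. the anchor contradicts their relevance ranking: sample p is
    more relevant to the anchor than n, yet lies farther away."""
    anchor = embeddings[anchor_idx]
    dists = np.linalg.norm(embeddings - anchor, axis=1)
    pairs = []
    for p in range(len(relevance)):
        for n in range(len(relevance)):
            if p == n or p == anchor_idx or n == anchor_idx:
                continue
            if relevance[p] > relevance[n] and dists[p] > dists[n]:
                pairs.append((p, n))
    return pairs
```

Triplets built from these violating pairs are exactly the cases where minimizing the loss above reorders the embedding to agree with the relevance scores.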
Appears in Collections:
COLLEGE OF ENGINEERING[S] (College of Engineering) > COMPUTER SCIENCE (School of Computer Software) > Articles
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
