Full metadata record

dc.contributor.advisor: 이기천
dc.contributor.author: 류민호
dc.date.accessioned: 2020-02-11T03:07:18Z
dc.date.available: 2020-02-11T03:07:18Z
dc.date.issued: 2020-02
dc.identifier.uri: https://repository.hanyang.ac.kr/handle/20.500.11754/123440
dc.identifier.uri: http://hanyang.dcollection.net/common/orgView/200000436936 (en_US)
dc.description.abstract: A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks. Because the model is trained on a large corpus covering diverse topics, it is relatively robust to domain shift problems, in which the data distributions at training time (source data) and test time (target data) differ while sharing similarities. Despite its large improvements over previous models, it still suffers performance degradation under domain shift. To mitigate this problem, we propose a simple but effective unsupervised domain adaptation method, adversarial adaptation with distillation (AAD), which combines the adversarial discriminative domain adaptation (ADDA) framework with knowledge distillation. We evaluate our approach on the task of cross-domain sentiment classification over 30 domain pairs, advancing the state-of-the-art performance for unsupervised domain adaptation in text sentiment classification.
dc.publisher: 한양대학교 (Hanyang University)
dc.title: Knowledge Distillation for BERT Unsupervised Domain Adaptation
dc.title.alternative: 비지도 도메인 적응을 위한 지식 증류 기법 (A Knowledge Distillation Method for Unsupervised Domain Adaptation)
dc.type: Theses
dc.contributor.googleauthor: Minho Ryu
dc.contributor.alternativeauthor: 류민호
dc.sector.campus: S
dc.sector.daehak: 대학원 (Graduate School)
dc.sector.department: 산업공학과 (Industrial Engineering)
dc.description.degree: Master
dc.contributor.affiliation: 데이터마이닝 (Data Mining)
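
The abstract above describes AAD as the ADDA adversarial training framework combined with a knowledge-distillation loss. Since no files are attached to this record, the following is only a minimal illustrative sketch of how such a combination is typically wired up in PyTorch, not the author's implementation: the tiny feed-forward Encoder stand-ins (the thesis adapts BERT encoders), the temperature T = 2.0, the equal weighting of the adversarial and distillation terms, and the choice to compute the distillation term on source batches are all assumptions made for this example.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in encoder so the sketch is self-contained; the thesis uses BERT encoders.
class Encoder(nn.Module):
    def __init__(self, in_dim=64, hid=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hid), nn.ReLU())

    def forward(self, x):
        return self.net(x)

source_enc = Encoder()                 # "teacher" encoder, frozen after source training
target_enc = Encoder()                 # "student" encoder, initialized from the teacher
target_enc.load_state_dict(source_enc.state_dict())
classifier = nn.Linear(32, 2)          # sentiment head trained on labeled source data
discriminator = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 1))

# Teacher encoder and classifier stay fixed during adaptation (ADDA-style).
for p in list(source_enc.parameters()) + list(classifier.parameters()):
    p.requires_grad_(False)

opt_g = torch.optim.Adam(target_enc.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()
T = 2.0                                # distillation temperature (assumed value)

def adaptation_step(x_src, x_tgt):
    # 1) Discriminator update: tell source features (label 1) from target features (label 0).
    with torch.no_grad():
        f_src = source_enc(x_src)
        f_tgt = target_enc(x_tgt)
    d_loss = bce(discriminator(f_src), torch.ones(len(x_src), 1)) + \
             bce(discriminator(f_tgt), torch.zeros(len(x_tgt), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Target-encoder update: fool the discriminator with inverted labels (adversarial term) ...
    adv_loss = bce(discriminator(target_enc(x_tgt)), torch.ones(len(x_tgt), 1))

    # ... plus a knowledge-distillation term (KL at temperature T) that keeps the student's
    # predictions close to the frozen teacher's softened predictions on the same source batch.
    teacher_logits = classifier(source_enc(x_src))
    student_logits = classifier(target_enc(x_src))
    kd_loss = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                       F.softmax(teacher_logits / T, dim=-1),
                       reduction="batchmean") * (T * T)

    g_loss = adv_loss + kd_loss        # equal weighting is an assumption of this sketch
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Usage with random stand-in batches (real inputs would be BERT-encoded reviews):
d_l, g_l = adaptation_step(torch.randn(8, 64), torch.randn(8, 64))

The intuition mirrors ADDA: the discriminator and target encoder play a minimax game so that target features become indistinguishable from source features, while the distillation term keeps the adapted encoder from drifting away from the behavior of the source-trained teacher.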
Appears in Collections:
GRADUATE SCHOOL[S](대학원) > INDUSTRIAL ENGINEERING(산업공학과) > Theses (Master)
Files in This Item:
There are no files associated with this item.