Knowledge Distillation for BERT Unsupervised Domain Adaptation

Title
Knowledge Distillation for BERT Unsupervised Domain Adaptation
Other Titles
비지도 도메인 적응을 위한 지식 증류 기법 (Knowledge Distillation for Unsupervised Domain Adaptation)
Author
류민호
Alternative Author(s)
류민호
Advisor(s)
이기천
Issue Date
2020-02
Publisher
한양대학교
Degree
Master
Abstract
A pre-trained language model, BERT, has brought significant performance improvements across a range of natural language processing tasks. Since the model is trained on a large corpus of diverse topics, it shows robust performance for domain shift problems in which data distributions at training (source data) and testing (target data) differ while sharing similarities. Despite its great improvements compared to previous models, it still suffers from performance degradation due to domain shifts. To mitigate such problems, we propose a simple but effective unsupervised domain adaptation method, adversarial adaptation with distillation (AAD), which combines the adversarial discriminative domain adaptation (ADDA) framework with knowledge distillation. We evaluate our approach in the task of cross-domain sentiment classification on 30 domain pairs, advancing the state-of-the-art performance for unsupervised domain adaptation in text sentiment classification.
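The abstract only names the ingredients of AAD (an ADDA-style adversarial step plus knowledge distillation), so the following is a minimal PyTorch sketch of how those two losses could be combined, not the thesis implementation: the Encoder stands in for BERT, and the network sizes, optimizers, temperature T, and equal weighting of the adversarial and distillation losses are illustrative assumptions.

```python
# Sketch only (assumptions throughout): ADDA-style adaptation of a target encoder,
# with a knowledge-distillation term that keeps its predictions close to the frozen
# source model's softened predictions on target data. Encoder is a stand-in for BERT.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):              # placeholder for a BERT encoder
    def __init__(self, in_dim=768, feat_dim=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):           # sentiment head (2 classes)
    def __init__(self, feat_dim=256, n_classes=2):
        super().__init__()
        self.fc = nn.Linear(feat_dim, n_classes)
    def forward(self, f):
        return self.fc(f)

class Discriminator(nn.Module):        # source-vs-target domain critic
    def __init__(self, feat_dim=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, 2))
    def forward(self, f):
        return self.net(f)

def adapt_step(src_enc, tgt_enc, clf, disc, opt_tgt, opt_disc, x_src, x_tgt, T=2.0):
    """One adaptation step: (1) train the discriminator to separate source from
    target features, (2) train the target encoder to fool it while distilling the
    frozen source model's soft predictions on the target batch."""
    # (1) discriminator update: source features labeled 0, target features labeled 1
    with torch.no_grad():
        f_src = src_enc(x_src)
        f_tgt = tgt_enc(x_tgt)
    logits_d = disc(torch.cat([f_src, f_tgt]))
    labels_d = torch.cat([torch.zeros(len(x_src)), torch.ones(len(x_tgt))]).long()
    loss_d = F.cross_entropy(logits_d, labels_d)
    opt_disc.zero_grad()
    loss_d.backward()
    opt_disc.step()

    # (2) target-encoder update: adversarial loss + distillation loss
    f_tgt = tgt_enc(x_tgt)
    # adversarial: target features should look like source (label 0) to the critic
    loss_adv = F.cross_entropy(disc(f_tgt), torch.zeros(len(x_tgt)).long())
    # distillation: match the source model's temperature-softened predictions
    with torch.no_grad():
        teacher = F.softmax(clf(src_enc(x_tgt)) / T, dim=-1)
    student = F.log_softmax(clf(f_tgt) / T, dim=-1)
    loss_kd = F.kl_div(student, teacher, reduction="batchmean") * (T * T)
    loss = loss_adv + loss_kd                      # equal weighting is an assumption
    opt_tgt.zero_grad()
    loss.backward()
    opt_tgt.step()
    return loss_d.item(), loss_adv.item(), loss_kd.item()

if __name__ == "__main__":
    src_enc, clf = Encoder(), Classifier()         # assumed pre-trained on source labels
    tgt_enc = copy.deepcopy(src_enc)               # ADDA: target encoder starts from source
    disc = Discriminator()
    opt_tgt = torch.optim.Adam(tgt_enc.parameters(), lr=1e-4)
    opt_disc = torch.optim.Adam(disc.parameters(), lr=1e-4)
    x_src, x_tgt = torch.randn(8, 768), torch.randn(8, 768)   # dummy feature batches
    print(adapt_step(src_enc, tgt_enc, clf, disc, opt_tgt, opt_disc, x_src, x_tgt))
```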
URI
https://repository.hanyang.ac.kr/handle/20.500.11754/123440
http://hanyang.dcollection.net/common/orgView/200000436936
Appears in Collections:
GRADUATE SCHOOL[S](대학원) > INDUSTRIAL ENGINEERING(산업공학과) > Theses (Master)
Files in This Item:
There are no files associated with this item.