Efficient Dual Attention Transformer for Image Super-Resolution

Author
박수빈
Advisor(s)
Yong Suk Choi
Issue Date
February 2024
Publisher
Hanyang University Graduate School
Degree
Master
Abstract
Research based on computationally efficient local-window self-attention has advanced rapidly in image super-resolution (SR), yielding significant performance gains. However, most recent work restricts local-window self-attention to the spatial dimension, without sufficient consideration of the channel dimension. In addition, extracting global information while preserving the efficiency of local-window self-attention remains a challenging task in image SR. To address these problems, we propose a novel efficient dual attention transformer (EDAT). EDAT introduces a dual attention block (DAB) that models interdependencies not only among features at different spatial locations but also among distinct channels. Moreover, we propose a global attention block (GAB) that achieves efficient global feature extraction by reducing the spatial size of the keys and values. Our extensive experiments demonstrate that DAB and GAB complement each other, exhibiting a synergistic effect. Furthermore, built on these two attention blocks, EDAT achieves state-of-the-art results on five benchmark datasets.
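
The full text is not attached to this record, so the following is only a minimal PyTorch sketch of the two mechanisms the abstract names: attention computed across channels (in the spirit of transposed attention, e.g., Restormer) and global attention made efficient by spatially downsampling the keys and values (as in spatial-reduction attention, e.g., PVT). Every name, hyperparameter (num_heads, reduction), and the strided-convolution downsampling below are illustrative assumptions, not the thesis's actual design.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    # Self-attention over the channel axis: the attention map is C' x C',
    # so its cost grows linearly, not quadratically, with the pixel count.
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.num_heads = num_heads
        self.qkv = nn.Conv2d(dim, dim * 3, 1)
        self.proj = nn.Conv2d(dim, dim, 1)

    def forward(self, x):  # x: (B, C, H, W)
        b, c, h, w = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=1)
        shape = (b, self.num_heads, c // self.num_heads, h * w)
        q, k, v = q.reshape(shape), k.reshape(shape), v.reshape(shape)
        q, k = F.normalize(q, dim=-1), F.normalize(k, dim=-1)
        attn = (q @ k.transpose(-2, -1)).softmax(dim=-1)  # (B, heads, C', C')
        return self.proj((attn @ v).reshape(b, c, h, w))

class GlobalAttention(nn.Module):
    # Full-image attention where keys/values come from a spatially
    # downsampled copy of the features, shrinking the attention map from
    # (HW x HW) to (HW x HW/r^2). H and W are assumed divisible by r.
    def __init__(self, dim, num_heads=4, reduction=8):
        super().__init__()
        self.num_heads = num_heads
        self.scale = (dim // num_heads) ** -0.5
        self.q = nn.Conv2d(dim, dim, 1)
        self.down = nn.Conv2d(dim, dim, reduction, stride=reduction)
        self.kv = nn.Conv2d(dim, dim * 2, 1)
        self.proj = nn.Conv2d(dim, dim, 1)

    def forward(self, x):  # x: (B, C, H, W)
        b, c, h, w = x.shape
        q = self.q(x).reshape(b, self.num_heads, c // self.num_heads, h * w)
        k, v = self.kv(self.down(x)).chunk(2, dim=1)
        n = k.shape[-2] * k.shape[-1]  # reduced token count: HW / r^2
        k = k.reshape(b, self.num_heads, c // self.num_heads, n)
        v = v.reshape(b, self.num_heads, c // self.num_heads, n)
        attn = (q.transpose(-2, -1) @ k * self.scale).softmax(dim=-1)
        out = (attn @ v.transpose(-2, -1)).transpose(-2, -1)
        return self.proj(out.reshape(b, c, h, w))

if __name__ == "__main__":
    x = torch.randn(1, 64, 48, 48)
    print(ChannelAttention(64)(x).shape)  # torch.Size([1, 64, 48, 48])
    print(GlobalAttention(64)(x).shape)   # torch.Size([1, 64, 48, 48])

With reduction=8 on a 48x48 feature map, the global block attends from 2304 query positions to only 36 key/value positions, which is the efficiency argument the abstract makes for GAB.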
URI
http://hanyang.dcollection.net/common/orgView/200000720418
https://repository.hanyang.ac.kr/handle/20.500.11754/188866
Appears in Collections:
GRADUATE SCHOOL[S](대학원) > ARTIFICIAL INTELLIGENCE(인공지능학과) > Theses(Master)
Files in This Item:
There are no files associated with this item.
