
Full metadata record

DC Field | Value | Language
dc.contributor.advisor | 김회율 | -
dc.contributor.author | 조동찬 | -
dc.date.accessioned | 2020-02-27T16:30:47Z | -
dc.date.available | 2020-02-27T16:30:47Z | -
dc.date.issued | 2014-02 | -
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/130806 | -
dc.identifier.uri | http://hanyang.dcollection.net/common/orgView/200000423608 | en_US
dc.description.abstract | Gaze information of the user, which is widely used in fields such as research on visual systems, human-computer interaction, and product design, can be estimated by vision-based gaze tracking systems. In vision-based remote gaze tracking, the most challenging tasks are to allow natural movement of the user and to increase the working volume as well as the working distance between the user and the system. Several gaze estimation methods that allow natural movement of the user have been proposed, but their working volumes are narrow and their working distances short. Existing gaze tracking systems that do provide a long working distance suffer from either a narrow working volume or low gaze estimation accuracy when the user moves within the working volume. In this dissertation, a novel gaze tracking system that allows both a larger working volume and a longer working distance is presented. The proposed gaze estimation method, which employs a novel movement compensation model, robustly estimates the gaze of the user by taking advantage of both the conventional 2-D mapping-based method and the conventional 3-D model-based method. In the conventional 2-D mapping-based method, a pupil center corneal reflection (PCCR) vector is defined as the spatial difference between the pupil center and the corneal reflection of the IR illuminator in the image. During the user calibration process, the user is asked to gaze at several calibration points on the screen in order, and the PCCR vector for each point is obtained by a 2-D feature extraction method. From these PCCR vectors, a 2-D mapping function between the PCCR vectors and the reference points on the screen is computed. The final gaze point on the screen is then calculated by applying this mapping function to the current PCCR vector.
However, the PCCR vectors obtained during user calibration are valid only while the user remains at, or near, the location where the calibration was performed. The 3-D model-based gaze estimation method, in contrast, does not suffer from user movement, because the gaze direction is estimated from the 3-D coordinates of the pupil center and the cornea center in the world coordinate system. It requires, however, accurate system calibration among the cameras, the illuminators, and the screen, and user-dependent parameters such as the radius of the cornea and the kappa angle must be known in advance. The proposed method estimates the gaze of the user within the 2-D mapping-based framework while exploiting the user's 3-D position. When the user changes position, the proposed movement compensation model estimates scale factors for the PCCR vectors from the 3-D position of the user and the reference points on the screen. Refined PCCR vectors are then computed, and the mapping function is recalculated to determine the gaze point on the screen. In the experiments, the user calibration process was performed in front of the center of the screen, 210 cm away from it. The user was then asked to gaze at nine evaluation points on the screen from eight different locations. The average angular error of the proposed method over the eight locations was 0.84°, and the angular error increased by only 0.09° while the user moved about 81 cm from the position where the calibration was performed.
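The 2-D mapping function described in the abstract is typically a low-order polynomial fitted by least squares from PCCR vectors to screen coordinates. The abstract does not specify the polynomial form, so the quadratic feature set below is an assumption for illustration only:

```python
import numpy as np

def pccr_features(v):
    """Quadratic feature vector of a PCCR vector (vx, vy). Assumed form."""
    vx, vy = v
    return np.array([1.0, vx, vy, vx * vy, vx * vx, vy * vy])

def calibrate(pccr_vectors, screen_points):
    """Fit the 2-D mapping from PCCR vectors to screen points by least squares.

    pccr_vectors: (N, 2) PCCR vectors measured while the user gazes at the
    N calibration points in order; screen_points: (N, 2) screen coordinates
    of those points. Needs N >= 6 for the quadratic feature set.
    """
    A = np.array([pccr_features(v) for v in pccr_vectors])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, float), rcond=None)
    return coeffs  # (6, 2) matrix mapping features -> (x, y) on screen

def estimate_gaze(coeffs, pccr_vector):
    """Map the current PCCR vector to a gaze point on the screen."""
    return pccr_features(pccr_vector) @ coeffs
```

On synthetic data with a purely linear eye-to-screen relation, a 3 x 3 grid of calibration points recovers the mapping exactly; a real calibration would use measured PCCR vectors, and the fit absorbs user-dependent parameters without an explicit 3-D eye model.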
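The movement compensation model itself is not given in closed form in the abstract. As a purely hypothetical illustration of per-point scale factors computed from the user's 3-D position and the screen reference points, the toy model below rescales each calibration PCCR vector by the change in gaze angle (tangent ratio) and in image magnification (inverse-distance ratio); re-running the calibration fit on the refined vectors would then yield the recalculated mapping function. The geometry (screen plane at z = 0, distances in cm) is an assumption, not the dissertation's model:

```python
import numpy as np

def scale_factor(point, screen_center, old_pos, new_pos):
    """Toy per-point scale factor for a calibration PCCR vector.

    point, screen_center: 3-D points on the screen plane; old_pos / new_pos:
    3-D eye positions at calibration time and after the user has moved.
    """
    def gaze_angle(eye):
        # angle at the eye between the reference point and the screen center
        a, b = point - eye, screen_center - eye
        c = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.arccos(np.clip(c, -1.0, 1.0))

    d_old = np.linalg.norm(screen_center - old_pos)
    d_new = np.linalg.norm(screen_center - new_pos)
    a_old, a_new = gaze_angle(old_pos), gaze_angle(new_pos)
    if a_old < 1e-9:  # point at the screen center: only magnification changes
        return d_old / d_new
    # eye rotation scales with tan(angle); image magnification with 1/distance
    return (np.tan(a_new) / np.tan(a_old)) * (d_old / d_new)

def refine_vectors(calib_vectors, calib_points, screen_center, old_pos, new_pos):
    """Rescale the calibration PCCR vectors to the user's new position."""
    return [np.asarray(v, float) * scale_factor(np.asarray(p, float),
                                                screen_center, old_pos, new_pos)
            for v, p in zip(calib_vectors, calib_points)]
```

Moving straight back from 210 cm to 290 cm, for example, shrinks every PCCR vector in this model, so the refined vectors and the refitted mapping compensate for the movement without asking the user to recalibrate.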
dc.publisher | 한양대학교 (Hanyang University) | -
dc.title | 넓은 움직임 범위를 지원하는 원거리 시선 추적 방법 (Long-range eye gaze tracking method supporting a wide movement range) | -
dc.title.alternative | Long Range Eye Gaze Tracking Method for Large Movements | -
dc.type | Theses | -
dc.contributor.googleauthor | 조동찬 | -
dc.contributor.alternativeauthor | Cho, Dong-Chan | -
dc.sector.campus | S | -
dc.sector.daehak | 대학원 (Graduate School) | -
dc.sector.department | 전자컴퓨터통신공학과 (Department of Electronics and Computer Engineering) | -
dc.description.degree | Doctor | -
dc.contributor.affiliation | 영상처리 (Image Processing) | -
Appears in Collections:
GRADUATE SCHOOL[S](대학원) > ELECTRONICS AND COMPUTER ENGINEERING(전자컴퓨터통신공학과) > Theses (Ph.D.)
Files in This Item:
There are no files associated with this item.



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
