
Full metadata record

DC Field                   Value                                                         Language
dc.contributor.advisor     임창환                                                         -
dc.contributor.author      최강민                                                         -
dc.date.accessioned        2020-02-11T03:07:24Z                                          -
dc.date.available          2020-02-11T03:07:24Z                                          -
dc.date.issued             2020-02                                                       -
dc.identifier.uri          https://repository.hanyang.ac.kr/handle/20.500.11754/123465   -
dc.identifier.uri          http://hanyang.dcollection.net/common/orgView/200000436740    en_US
dc.description.abstract:

With the increasing interest in virtual reality (VR), VR research is expanding into various fields. Among these, studies on human-computer interface (HCI)-based interactive VR applications have gained wide attention as a natural interface, replacing conventional manual controllers that limit the user's degrees of freedom. In particular, hands-free avatar control in VR environments has been studied in recent years under a variety of paradigms, owing to its potential and effectiveness. Meanwhile, beyond simple avatar control based on detecting user intention, avatar embodiment for detailed reproduction of the user has recently been studied with the aim of greater immersiveness in VR environments. In this thesis, two studies are presented. First, the author compares visual stimuli for SSVEP-based BCIs, analyzing the relation between visual fatigue and performance. Second, the author proposes a new eye-tracking method using an arbitrary electrode configuration around the eyes.

Recent studies on brain-computer interfaces (BCIs) based on steady-state visual evoked potential (SSVEP) have demonstrated their use to control objects or generate commands in VR environments. However, most SSVEP-based BCI studies performed in VR environments have adopted visual stimuli typically used in conventional LCD environments, without considering the differences in the rendering devices, namely the head-mounted displays (HMDs) used in VR. The proximity between the visual stimuli and the eyes in HMDs can readily cause eyestrain, degrading the overall performance of SSVEP-based BCIs. Therefore, in the present study, the author tested two different types of visual stimuli, a pattern-reversal checkerboard stimulus (PRCS) and a grow/shrink stimulus (GSS), on young healthy participants wearing HMDs. Preliminary experiments were conducted to investigate the visual comfort of each participant during the presentation of the visual stimuli. In subsequent online avatar control experiments, the author observed considerable differences in the classification accuracy of individual participants depending on the type of visual stimulus used to elicit the SSVEP. Interestingly, there was a close relationship between the subjective visual comfort score and the online performance of the SSVEP-based BCI: most participants showed better classification accuracy under the visual stimulus they were more comfortable with. The experimental results suggest the importance of an appropriate visual stimulus for enhancing the overall performance of SSVEP-based BCIs in VR environments. In addition, it is expected that the appropriate visual stimulus for a given user might be readily selected by surveying the user's visual comfort for different visual stimuli, without the need for actual BCI experiments.

Meanwhile, with the development of interactive VR applications, eye-tracking technology has been gaining attention as one of the promising interaction interfaces in VR environments. Most eye-tracking technologies in VR environments have so far been developed using infrared cameras. Although camera-based eye trackers have been reported to show superior performance in terms of accuracy and resolution, they are generally too expensive to be manufactured as consumer electronics. Hence, the electrooculogram (EOG) has recently been used as an alternative for tracking eyeball movement, offering much better cost and portability than camera-based eye trackers. In most conventional EOG-based eye-tracking studies, two pairs of electrodes are attached to record the vertical and horizontal EOG components (e.g., above and below one eye, and to the left and right of both eyes); however, this conventional electrode configuration may be impractical in the VR HMD environment because the electrodes must be correctly positioned relative to the user's eyes each time the eye-tracking system is used. In the present study, the author proposes a new eye-tracking method that estimates the vertical and horizontal eye-movement components from 8-channel EOG signals recorded at arbitrary locations around the eyes, using the reconstruction independent component analysis (rICA) algorithm. To assess eye-tracking performance, the author performed a correlation analysis of each estimated component against EOG signals simultaneously recorded from the conventional 4-channel electrode configuration. The proposed method was further evaluated with a real-time eye-writing system in terms of classification accuracy for the 10 Arabic numeral patterns. Experimental results showed that the pattern classification accuracies obtained from the conventional and proposed methods were almost equivalent (94.1% for the conventional electrode configuration and 94.4% for the proposed 8-channel configuration), with each component strongly correlated (averaged correlation coefficients above 0.93), demonstrating that reliable eye tracking is possible using the proposed arbitrary multichannel EOG configuration around the user's eyes.
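The abstract does not state which detection algorithm the SSVEP study used. A common baseline for SSVEP frequency detection is canonical correlation analysis (CCA) between the multichannel EEG and sinusoidal reference signals at each candidate stimulus frequency. The sketch below is a minimal NumPy illustration of that baseline only; the sampling rate, channel count, noise level, and candidate frequencies are all hypothetical and not taken from the thesis.

```python
import numpy as np

def cca_max_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    # Singular values of Qx^T Qy are the canonical correlations.
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def ssvep_references(freq, t, n_harmonics=2):
    """Sine/cosine reference signals at freq and its harmonics."""
    cols = []
    for k in range(1, n_harmonics + 1):
        cols += [np.sin(2 * np.pi * k * freq * t), np.cos(2 * np.pi * k * freq * t)]
    return np.column_stack(cols)

def detect_ssvep(eeg, t, candidate_freqs):
    """Pick the candidate frequency whose references correlate best with the EEG."""
    scores = {f: cca_max_corr(eeg, ssvep_references(f, t)) for f in candidate_freqs}
    return max(scores, key=scores.get)

# Hypothetical demo: 4-channel "EEG" carrying a 12 Hz SSVEP buried in noise.
fs = 250
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.column_stack([
    np.sin(2 * np.pi * 12.0 * t + phase) + 0.7 * rng.standard_normal(t.size)
    for phase in (0.0, 0.3, 0.6, 0.9)
])
print(detect_ssvep(eeg, t, [10.0, 12.0, 15.0]))  # prints 12.0
```

In an avatar-control setting like the one described above, each candidate frequency would be tagged to one command (e.g., one movement direction), and the detected frequency would drive the avatar.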
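The second study names reconstruction ICA (rICA) for unmixing the 8-channel EOG, validated by correlating components against the conventional 4-channel montage. As an illustrative stand-in (a full rICA implementation is beyond a sketch), the code below unmixes simulated arbitrary-site EOG with a minimal symmetric FastICA and then matches components to reference horizontal/vertical signals by absolute Pearson correlation. The source waveforms, mixing matrix, and channel counts are simulated assumptions, not the thesis's data.

```python
import numpy as np

def whiten(X, n_comp):
    """PCA-whiten channel-by-time data X down to n_comp components."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    d, E = d[-n_comp:], E[:, -n_comp:]          # keep the top eigenpairs
    return (E / np.sqrt(d)).T @ X

def fastica(Z, n_iter=300, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity) on whitened data Z."""
    rng = np.random.default_rng(seed)
    n, T = Z.shape
    W = rng.standard_normal((n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        W_new = (G @ Z.T) / T - np.diag((1.0 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W_new)          # symmetric decorrelation
        W = U @ Vt
    return W @ Z

# Simulated "EOG" sources: horizontal saccades (square wave), vertical
# drift (sawtooth), and a noise source, mixed into 8 arbitrary-site channels.
rng = np.random.default_rng(1)
t = np.arange(2000)
horiz = np.sign(np.sin(2 * np.pi * t / 400))
vert = 2 * (t % 300) / 300 - 1
sources = np.vstack([horiz, vert, rng.laplace(size=t.size)])
mixing = rng.standard_normal((8, 3))             # unknown electrode geometry
eog8 = mixing @ sources

comps = fastica(whiten(eog8, 3))

# Match unmixed components to the references (here: the true sources; in the
# thesis, the conventional 4-channel EOG) by absolute Pearson correlation.
for name, ref in (("horizontal", horiz), ("vertical", vert)):
    r = max(abs(np.corrcoef(c, ref)[0, 1]) for c in comps)
    print(name, round(r, 3))
```

The reported correlation-based validation works the same way regardless of whether the unmixing step is FastICA, as here, or rICA, as in the thesis: a high absolute correlation between an unmixed component and a montage-derived reference indicates that the component captures that eye-movement axis.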
dc.publisher                      한양대학교                                     -
dc.title                          Biosignal-based Human-Computer Interface for Hands-free Avatar Control in Virtual Reality   -
dc.title.alternative              가상현실 환경에서의 핸즈프리 아바타 컨트롤을 위한 생체신호 기반 인간-컴퓨터 인터페이스   -
dc.type                           Theses                                         -
dc.contributor.googleauthor       Kang-min Choi                                  -
dc.contributor.alternativeauthor  최강민                                          -
dc.sector.campus                  S                                              -
dc.sector.daehak                  대학원                                          -
dc.sector.department              생체공학과                                      -
dc.description.degree             Master                                         -
Appears in Collections:
GRADUATE SCHOOL[S](대학원) > BIOMEDICAL ENGINEERING(생체공학과) > Theses (Master)
Files in This Item:
There are no files associated with this item.



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
