
Full metadata record

DC Field | Value | Language
dc.contributor.author | 임창환 | -
dc.date.accessioned | 2019-12-08T20:23:01Z | -
dc.date.available | 2019-12-08T20:23:01Z | -
dc.date.issued | 2018-09 | -
dc.identifier.citation | SCIENTIFIC REPORTS, v. 8, Article no. 9505 | en_US
dc.identifier.issn | 2045-2322 | -
dc.identifier.uri | https://www.nature.com/articles/s41598-018-27865-5 | -
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/119907 | -
dc.description.abstract | Individuals who have lost normal pathways for communication need augmentative and alternative communication (AAC) devices. In this study, we propose a new electrooculogram (EOG)-based human-computer interface (HCI) paradigm for AAC that does not require a user's voluntary eye movement for binary yes/no communication by patients in locked-in state (LIS). The proposed HCI uses a horizontal EOG elicited by an involuntary auditory oculogyric reflex in response to a rotating sound source. In the proposed HCI paradigm, a user was asked to selectively attend to one of two sound sources rotating in directions opposite to each other, based on the user's intention. The user's intentions could then be recognised by quantifying EOGs. To validate its performance, a series of experiments was conducted with ten healthy subjects and two patients with amyotrophic lateral sclerosis (ALS). The online experimental results exhibited high classification accuracies of 94% in both healthy subjects and ALS patients in cases where decisions were made every six seconds. The ALS patients also participated in a practical yes/no communication experiment with 26 or 30 questions with known answers. The accuracy of the experiments with questionnaires was 94%, demonstrating that our paradigm could constitute an auxiliary AAC system for some LIS patients. | en_US
dc.description.sponsorship | This work was supported in part by the Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (2017-0-00432, Development of non-invasive integrated BCI SW platform to control home appliances and external devices by user's thought via AR/VR interface) and in part by the Brain Research Program through the National Research Foundation of Korea (NRF) funded by MSIT (NRF-2015M3C7A1031969). | en_US
dc.language.iso | en_US | en_US
dc.publisher | NATURE PUBLISHING GROUP | en_US
dc.subject | AMYOTROPHIC-LATERAL-SCLEROSIS | en_US
dc.subject | ALTERNATIVE COMMUNICATION | en_US
dc.subject | ATTENTION | en_US
dc.subject | STATE | en_US
dc.title | Development of an electrooculogram-based human-computer interface using involuntary eye movement by spatially rotating sound for communication of locked-in patients | en_US
dc.type | Article | en_US
dc.relation.no | 9505 | -
dc.relation.volume | 8 | -
dc.identifier.doi | 10.1038/s41598-018-27865-5 | -
dc.relation.page | 1-10 | -
dc.relation.journal | SCIENTIFIC REPORTS | -
dc.contributor.googleauthor | Kim, Do Yeon | -
dc.contributor.googleauthor | Han, Chang-Hee | -
dc.contributor.googleauthor | Im, Chang-Hwan | -
dc.relation.code | 2018003596 | -
dc.sector.campus | S | -
dc.sector.daehak | COLLEGE OF ENGINEERING[S] | -
dc.sector.department | DIVISION OF ELECTRICAL AND BIOMEDICAL ENGINEERING | -
dc.identifier.pid | ich | -
dc.identifier.orcid | http://orcid.org/0000-0003-3795-3318 | -



