
Full metadata record

DC Field | Value | Language
dc.contributor.author | 신재영 | -
dc.date.accessioned | 2019-12-08T03:19:52Z | -
dc.date.available | 2019-12-08T03:19:52Z | -
dc.date.issued | 2018-05 | -
dc.identifier.citation | PLOS ONE, v. 13, no. 5, Article no. e0196359 | en_US
dc.identifier.issn | 1932-6203 | -
dc.identifier.uri | https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0196359 | -
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/118704 | -
dc.description.abstract | Brain-computer interfaces (BCIs) have been studied extensively in order to establish a non-muscular communication channel, mainly for patients with impaired motor functions. However, many limitations remain for BCIs in clinical use. In this study, we propose a hybrid BCI that is based only on frontal brain areas and can be operated in an eyes-closed state for end users with impaired motor and declining visual functions. In our experiment, electroencephalography (EEG) and near-infrared spectroscopy (NIRS) were measured simultaneously while 12 participants performed mental arithmetic (MA) and remained relaxed (baseline state: BL). To evaluate the feasibility of the hybrid BCI, we classified MA-related from BL-related brain activation. We then compared the classification accuracies of the two unimodal BCIs (EEG and NIRS) and the hybrid BCI in an offline mode. The classification accuracy of the hybrid BCI (83.9 +/- 10.3%) was significantly higher than those of the unimodal EEG-based (77.3 +/- 15.9%) and NIRS-based (75.9 +/- 6.3%) BCIs. The analytical results confirmed the performance improvement achieved with the hybrid BCI, particularly when using only frontal brain areas. Our study shows that an eyes-closed hybrid BCI approach based on frontal areas could be applied to neurodegenerative patients who have lost their motor functions, including oculomotor functions. (An illustrative classification sketch follows this record.) | en_US
dc.description.sponsorship | This work was supported by an Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No. 2017-0-00451), by the National Research Foundation of Korea (NRF) funded by the Ministry of Education (No. 2017R1A6A3A01003543), and by an NRF grant funded by the Korea government (Ministry of Science, ICT & Future Planning) (No. 2017R1C1B5017909). Correspondence to KRM and HJH. | en_US
dc.language.iso | en_US | en_US
dc.publisher | PUBLIC LIBRARY SCIENCE | en_US
dc.subject | EEG | en_US
dc.subject | NIRS | en_US
dc.subject | BCI | en_US
dc.subject | CLASSIFICATION | en_US
dc.subject | COMMUNICATION | en_US
dc.subject | OPERATE | en_US
dc.subject | ALS | en_US
dc.title | Eyes-closed hybrid brain-computer interface employing frontal brain activation | en_US
dc.type | Article | en_US
dc.relation.no | 5 | -
dc.relation.volume | 13 | -
dc.identifier.doi | 10.1371/journal.pone.0196359 | -
dc.relation.page | 196359-196359 | -
dc.relation.journal | PLOS ONE | -
dc.contributor.googleauthor | Shin, Jaeyoung | -
dc.contributor.googleauthor | Mueller, Klaus-Robert | -
dc.contributor.googleauthor | Hwang, Han-Jeong | -
dc.relation.code | 2018006288 | -
dc.sector.campus | S | -
dc.sector.daehak | RESEARCH INSTITUTE[S] | -
dc.sector.department | INSTITUTE OF BIOMEDICAL ENGINEERING | -
dc.identifier.pid | naraeshigo | -
dc.identifier.researcherID | T-5173-2018 | -
dc.identifier.orcid | http://orcid.org/0000-0003-2899-6893 | -
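
Illustrative sketch: the abstract describes classifying mental arithmetic (MA) against baseline (BL) trials with unimodal EEG, unimodal NIRS, and a hybrid of the two, then comparing accuracies offline. The short Python sketch below is only a minimal illustration of that feature-concatenation idea, not the authors' pipeline: it uses synthetic placeholder features, a shrinkage-regularized LDA from scikit-learn (a common BCI classifier choice), and 5-fold cross-validation; all variable names, feature dimensions, and trial counts are hypothetical.

# Minimal sketch (not the authors' code): compare unimodal EEG, unimodal NIRS,
# and hybrid (concatenated) features for classifying MA vs. baseline trials.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials = 60                           # hypothetical number of trials (MA + BL)
y = np.repeat([0, 1], n_trials // 2)    # 0 = baseline (BL), 1 = mental arithmetic (MA)

# Placeholder feature matrices; in a real analysis these might be frontal EEG
# band-power features and NIRS HbO/HbR amplitude features per trial.
X_eeg = rng.normal(size=(n_trials, 20)) + 0.5 * y[:, None]
X_nirs = rng.normal(size=(n_trials, 10)) + 0.5 * y[:, None]
X_hybrid = np.hstack([X_eeg, X_nirs])   # simple feature-level fusion

clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")

for name, X in [("EEG", X_eeg), ("NIRS", X_nirs), ("hybrid", X_hybrid)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name:6s} mean CV accuracy: {acc:.3f}")

With real data, the hybrid condition would be expected to benefit only if the EEG and NIRS features carry complementary class information, which is the effect the abstract reports.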



