
Full metadata record

DC Field | Value | Language
dc.contributor.author | 조석현 | -
dc.date.accessioned | 2019-11-26T07:57:41Z | -
dc.date.available | 2019-11-26T07:57:41Z | -
dc.date.issued | 2017-07 | -
dc.identifier.citation | SENSORS, v. 17, no. 7, Article no. 1685 | en_US
dc.identifier.issn | 1424-8220 | -
dc.identifier.uri | https://www.mdpi.com/1424-8220/17/7/1685 | -
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/114797 | -
dc.description.abstract | Polysomnography (PSG) is considered the gold standard for determining sleep stages, but due to the obtrusiveness of its sensor attachments, sleep stage classification algorithms using noninvasive sensors have been developed over the years. However, previous studies have not yet proven reliable. In addition, most such products are designed for healthy customers rather than for patients with sleep disorders. We present a novel approach to classifying sleep stages via low-cost, noncontact multi-modal sensor fusion, which extracts sleep-related vital signals from radar signals and employs a sound-based context-awareness technique. This work is uniquely designed based on the PSG data of sleep disorder patients, received and certified by professionals at Hanyang University Hospital. The proposed algorithm further incorporates medical and statistical knowledge to determine personally adjusted thresholds and to devise post-processing. The efficiency of the proposed algorithm is highlighted by contrasting the sleep stage classification performance of single-sensor and sensor-fusion algorithms. To validate the commercialization potential of this work, the classification results of the algorithm were compared with those of a commercial sleep monitoring device, the ResMed S+. The proposed algorithm was evaluated on randomly selected patients undergoing PSG examination, and the results show a promising novel approach for determining sleep stages in a low-cost and unobtrusive manner. | en_US
dc.description.sponsorship | This work was supported by the ICT R&D program of MSIP/IITP (R7124-16-0004, Development of Intelligent Interaction Technology Based on Context Awareness and Human Intention Understanding). | en_US
dc.language.iso | en_US | en_US
dc.publisher | MDPI AG | en_US
dc.subject | radar | en_US
dc.subject | vital signal | en_US
dc.subject | sleep stage | en_US
dc.subject | medical device | en_US
dc.subject | sensor fusion | en_US
dc.subject | microphone | en_US
dc.title | Noncontact Sleep Study by Multi-Modal Sensor Fusion | en_US
dc.type | Article | en_US
dc.relation.no | 7 | -
dc.relation.volume | 17 | -
dc.identifier.doi | 10.3390/s17071685 | -
dc.relation.page | 1-17 | -
dc.relation.journal | SENSORS | -
dc.contributor.googleauthor | Chung, Ku-young | -
dc.contributor.googleauthor | Song, Kwangsub | -
dc.contributor.googleauthor | Shin, Kangsoo | -
dc.contributor.googleauthor | Sohn, Jinho | -
dc.contributor.googleauthor | Cho, Seok Hyun | -
dc.contributor.googleauthor | Chang, Joon-Hyuk | -
dc.relation.code | 2017007121 | -
dc.sector.campus | S | -
dc.sector.daehak | COLLEGE OF MEDICINE[S] | -
dc.sector.department | DEPARTMENT OF MEDICINE | -
dc.identifier.pid | shcho | -



