
Full metadata record

DC Field | Value | Language
dc.contributor.author | 조성호 | -
dc.date.accessioned | 2019-10-02T07:20:11Z | -
dc.date.available | 2019-10-02T07:20:11Z | -
dc.date.issued | 2019-04 | -
dc.identifier.citation | IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, Page. 1-31 | en_US
dc.identifier.issn | 0018-9456 | -
dc.identifier.issn | 1557-9662 | -
dc.identifier.uri | https://ieeexplore.ieee.org/document/8682073 | -
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/110835 | -
dc.description.abstract | In this study, we classify digits written in mid-air using hand gestures. Impulse radio ultra-wideband (IR-UWB) radar sensors are used for data acquisition, with three radar sensors placed in a triangular geometry. Conventional radar-based gesture recognition methods use whole raw data matrices or a group of features for gesture classification with convolutional neural networks (CNNs) or other machine learning algorithms. However, if the training and testing data differ in distance, orientation, hand shape, hand size, gesture speed, or the radar setup environment, these methods become less accurate. To develop a more robust gesture recognition method, we propose not feeding raw data to the CNN classifier, but instead classifying the hand's mid-air trajectory. The hand trajectory has a stereotypical shape for a given digit, regardless of the hand's orientation or speed, making its classification easy and robust. Our proposed method consists of three stages: signal preprocessing; hand motion localization and tracking; and transforming the trajectory data into an image so that it can be classified with a CNN. Our proposed method outperforms conventional approaches because it is robust to changes in orientation, distance, and hand shape and size. Moreover, this method does not require building a huge training database of digits drawn by different users in different orientations; rather, we can use training databases already available in the image processing field. Overall, the proposed mid-air handwritten digit recognition system provides a user-friendly and accurate mid-air handwriting modality that does not place restrictions on users. | en_US
dc.description.sponsorship | This research was supported by the Bio & Medical Technology Development Program (Next Generation Biotechnology) through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT & Future Planning (2017M3A9E2064626). | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | en_US
dc.subject | Gesture recognition | en_US
dc.subject | convolutional neural network (CNN) | en_US
dc.subject | mid-air hand writing | en_US
dc.subject | impulse radio ultra-wideband (IR-UWB) radar | en_US
dc.subject | sensor | en_US
dc.subject | image | en_US
dc.subject | human computer interaction | en_US
dc.subject | localization | en_US
dc.title | Detecting Mid-air Gestures for Digit Writing with Radio Sensors and a CNN | en_US
dc.type | Article | en_US
dc.relation.volume | DOI: 10.1109/TIM.2019.2909249 | -
dc.identifier.doi | 10.1109/TIM.2019.2909249 | -
dc.relation.page | 1-16 | -
dc.relation.journal | IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT | -
dc.contributor.googleauthor | Leem, Seong Kyu | -
dc.contributor.googleauthor | Khan, Faheem | -
dc.contributor.googleauthor | Cho, Sung Ho | -
dc.relation.code | 2019002097 | -
dc.sector.campus | S | -
dc.sector.daehak | COLLEGE OF ENGINEERING[S] | -
dc.sector.department | DEPARTMENT OF ELECTRONIC ENGINEERING | -
dc.identifier.pid | dragon | -
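The abstract describes a three-stage pipeline: three IR-UWB radars in a triangular layout measure range to the hand, the ranges are localized into a trajectory, and the trajectory is rasterized into an image for a CNN trained on an existing digit database. The following is a minimal 2D sketch of that idea, not the authors' implementation: the sensor layout, least-squares trilateration, and the 28x28 raster size are all assumptions for illustration.

```python
import numpy as np

# Assumed triangular sensor layout in metres (hypothetical, for illustration).
SENSORS = np.array([[0.0, 0.0], [0.6, 0.0], [0.3, 0.5]])

def trilaterate(ranges):
    """Least-squares 2D position from three range measurements.

    Subtracting the first circle equation (x-xi)^2 + (y-yi)^2 = ri^2
    from the other two linearizes the problem into A @ [x, y] = b.
    """
    (x1, y1), (x2, y2), (x3, y3) = SENSORS
    r1, r2, r3 = ranges
    A = 2 * np.array([[x2 - x1, y2 - y1],
                      [x3 - x1, y3 - y1]])
    b = np.array([r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2,
                  r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2])
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

def rasterize(trajectory, size=28):
    """Scale a point trajectory into a size x size binary image.

    A 28x28 raster matches common digit-image databases, which is what
    lets an off-the-shelf image classifier be reused for training.
    """
    pts = np.asarray(trajectory, dtype=float)
    pts -= pts.min(axis=0)
    span = pts.max(axis=0)
    span[span == 0] = 1.0          # degenerate (straight) strokes
    pts = pts / span * (size - 1)
    img = np.zeros((size, size))
    for x, y in pts:
        img[size - 1 - int(round(y)), int(round(x))] = 1.0  # row 0 at top
    return img

# Recover a known position from its exact ranges, then rasterize a
# vertical "1"-like stroke sampled along the trajectory.
true_pos = np.array([0.25, 0.2])
est = trilaterate(np.linalg.norm(SENSORS - true_pos, axis=1))

stroke = [trilaterate(np.linalg.norm(SENSORS - np.array([0.3, 0.1 + 0.01 * t]), axis=1))
          for t in range(30)]
img = rasterize(stroke)
```

In practice the paper's preprocessing and tracking stages would sit in front of this: raw radar frames must be filtered and the hand's range bin detected before any range value exists to trilaterate.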
Appears in Collections:
COLLEGE OF ENGINEERING[S](공과대학) > ELECTRONIC ENGINEERING(융합전자공학부) > Articles
Files in This Item:
There are no files associated with this item.
