Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Cho, Sung Ho (조성호) | - |
dc.date.accessioned | 2019-10-02T07:20:11Z | - |
dc.date.available | 2019-10-02T07:20:11Z | - |
dc.date.issued | 2019-04 | - |
dc.identifier.citation | IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, pp. 1-31 | en_US |
dc.identifier.issn | 0018-9456 | - |
dc.identifier.issn | 1557-9662 | - |
dc.identifier.uri | https://ieeexplore.ieee.org/document/8682073 | - |
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/110835 | - |
dc.description.abstract | In this study, we classify digits written in mid-air using hand gestures. Impulse radio ultra-wideband (IR-UWB) radar sensors are used for data acquisition, with three radar sensors placed in a triangular geometry. Conventional radar-based gesture recognition methods feed whole raw data matrices or a group of features into convolutional neural networks (CNNs) or other machine learning algorithms for gesture classification. However, if the training and testing data differ in distance, orientation, hand shape, hand size, gesture speed, or the radar setup environment, these methods become less accurate. To develop a more robust gesture recognition method, we propose not feeding raw data to the CNN classifier, but instead employing the hand's mid-air trajectory for classification. The hand trajectory has a stereotypical shape for a given digit, regardless of the hand's orientation or speed, making its classification easy and robust. Our proposed method consists of three stages: signal preprocessing; hand motion localization and tracking; and transformation of the trajectory data into an image, which is then classified using a CNN. The proposed method outperforms conventional approaches because it is robust to changes in orientation, distance, and hand shape and size. Moreover, this method does not require building a large training database of digits drawn by different users in different orientations; rather, we can use training databases already available in the image processing field. Overall, the proposed mid-air handwritten digit recognition system provides a user-friendly and accurate mid-air handwriting modality that does not place restrictions on users. | en_US |
dc.description.sponsorship | This research was supported by Bio & Medical Technology Development Program (Next Generation Biotechnology) through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2017M3A9E2064626) | en_US |
dc.language.iso | en | en_US |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | en_US |
dc.subject | Gesture recognition | en_US |
dc.subject | convolutional neural network (CNN) | en_US |
dc.subject | mid-air handwriting | en_US |
dc.subject | impulse radio ultra-wideband (IR-UWB) radar | en_US |
dc.subject | sensor | en_US |
dc.subject | image | en_US |
dc.subject | human computer interaction | en_US |
dc.subject | localization | en_US |
dc.title | Detecting Mid-air Gestures for Digit Writing with Radio Sensors and a CNN | en_US |
dc.type | Article | en_US |
dc.identifier.doi | 10.1109/TIM.2019.2909249 | - |
dc.relation.page | 1-16 | - |
dc.relation.journal | IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT | - |
dc.contributor.googleauthor | Leem, Seong Kyu | - |
dc.contributor.googleauthor | Khan, Faheem | - |
dc.contributor.googleauthor | Cho, Sung Ho | - |
dc.relation.code | 2019002097 | - |
dc.sector.campus | S | - |
dc.sector.daehak | COLLEGE OF ENGINEERING[S] | - |
dc.sector.department | DEPARTMENT OF ELECTRONIC ENGINEERING | - |
dc.identifier.pid | dragon | - |
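The abstract describes a pipeline that localizes the hand from three IR-UWB radars in a triangular geometry, tracks its trajectory, and rasterizes the trajectory into an image for CNN classification. The following is a minimal sketch of the two geometric steps, not the authors' implementation: the sensor coordinates, the least-squares trilateration, and the 28x28 image size (chosen to match common digit-image training sets) are all assumptions for illustration.

```python
import numpy as np

# Hypothetical sensor layout: three IR-UWB radars at the corners of a
# triangle (coordinates in metres). The paper specifies a triangular
# geometry; these exact positions are an assumption.
SENSORS = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.866]])

def trilaterate(dists, sensors=SENSORS):
    """Least-squares 2-D hand position from three range measurements.

    Subtracting the first circle equation |x - p_0|^2 = d_0^2 from the
    others linearizes the problem:
        2 (p_i - p_0)^T x = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2
    """
    p0 = sensors[0]
    A = 2.0 * (sensors[1:] - p0)
    b = (dists[0] ** 2 - dists[1:] ** 2
         + np.sum(sensors[1:] ** 2, axis=1) - np.sum(p0 ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

def trajectory_to_image(points, size=28):
    """Rasterize a tracked trajectory into a size x size binary image,
    scaled to fill the frame so the result is invariant to the
    distance and extent of the original hand motion."""
    pts = np.asarray(points, dtype=float)
    mn, mx = pts.min(axis=0), pts.max(axis=0)
    span = np.maximum(mx - mn, 1e-9)          # avoid division by zero
    scaled = (pts - mn) / span * (size - 1)
    img = np.zeros((size, size), dtype=np.uint8)
    cols = scaled[:, 0].round().astype(int)
    rows = (size - 1) - scaled[:, 1].round().astype(int)  # flip y axis
    img[rows, cols] = 1
    return img
```

Because the trajectory is normalized before rasterization, the resulting digit image does not depend on where or how fast the digit was drawn, which is the robustness property the abstract claims over raw-data classifiers.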
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.