Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 서경민 | - |
dc.date.accessioned | 2024-05-30T06:56:50Z | - |
dc.date.available | 2024-05-30T06:56:50Z | - |
dc.date.issued | 2024-05-06 | - |
dc.identifier.citation | JOURNAL OF COMPUTATIONAL DESIGN AND ENGINEERING, v. 11, no. 3, pp. 158-173 | en_US |
dc.identifier.issn | 2288-5048 | en_US |
dc.identifier.uri | https://academic.oup.com/jcde/article/11/3/158/7665756 | en_US |
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/190445 | - |
dc.description.abstract | During quality inspection in manufacturing, the gaze of a worker provides pivotal information for identifying surface defects of a product. However, it is challenging to digitize the gaze information of workers in a dynamic environment where the positions and postures of the products and workers are not fixed. A robust, deep learning-based system, ISGOD (Integrated System with worker's Gaze and Object Detection), is proposed, which analyzes data to determine which part of the object is observed by integrating object detection and eye-tracking information in dynamic environments. The ISGOD employs a six-dimensional pose estimation algorithm for object detection, considering the location, orientation, and rotation of the object. Eye-tracking data were obtained from Tobii Glasses, which enable real-time video transmission and eye-movement tracking. A latency reduction method is proposed to overcome the time delays between object detection and eye-tracking information. Three evaluation indices, namely, gaze score, accuracy score, and concentration index, are suggested for comprehensive analysis. Two experiments were conducted: a robustness test to confirm the suitability for real-time object detection and eye-tracking, and a trend test to analyze the difference in gaze movement between experts and novices. In the future, the proposed method and system can transfer the expertise of experts to enhance defect detection efficiency significantly. | en_US |
dc.description.sponsorship | This work was supported by the Industrial Technology Innovation Program (No. 20023014, Development of an Agricultural Robot Platform Capable of Continuously Harvesting more than 3 Fruits per minute and Controlling Multiple Transport Robots in an Outdoor Orchard Environment) funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea). | en_US |
dc.language | en_US | en_US |
dc.publisher | OXFORD UNIV PRESS | en_US |
dc.relation.ispartofseries | v. 11, no. 3;158-173 | - |
dc.subject | quality inspection | en_US |
dc.subject | eye-tracking | en_US |
dc.subject | object detection | en_US |
dc.subject | deep learning | en_US |
dc.subject | system integration | en_US |
dc.title | Integration of eye-tracking and object detection in a deep learning system for quality inspection analysis | en_US |
dc.type | Article | en_US |
dc.relation.no | 3 | - |
dc.relation.volume | 11 | - |
dc.identifier.doi | 10.1093/jcde/qwae042 | en_US |
dc.relation.page | 158-173 | - |
dc.relation.journal | JOURNAL OF COMPUTATIONAL DESIGN AND ENGINEERING | - |
dc.contributor.googleauthor | Cho, Seung-Wan | - |
dc.contributor.googleauthor | Lim, Yeong-Hyun | - |
dc.contributor.googleauthor | Seo, Kyung-Min | - |
dc.contributor.googleauthor | Kim, Jungin | - |
dc.relation.code | 2024001421 | - |
dc.sector.campus | E | - |
dc.sector.daehak | COLLEGE OF ENGINEERING SCIENCES[E] | - |
dc.sector.department | DEPARTMENT OF INDUSTRIAL AND MANAGEMENT ENGINEERING | - |
dc.identifier.pid | kmseo | - |