
Full metadata record

DC Field | Value | Language
dc.contributor.author | 김기범 | -
dc.date.accessioned | 2023-12-22T02:39:38Z | -
dc.date.available | 2023-12-22T02:39:38Z | -
dc.date.issued | 2021-06 | -
dc.identifier.citation | Applied Sciences-basel, v. 11, no. 12, article no. 5503, pp. 1-21 | -
dc.identifier.issn | 2076-3417 | -
dc.identifier.uri | https://www.mdpi.com/2076-3417/11/12/5503 | en_US
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/187978 | -
dc.description.abstract | Featured Application: The proposed technique detects and counts people and is evaluated on several challenging benchmark datasets. It can be applied in crowd-assistance systems to locate targeted persons, track functional movements, and improve the performance of surveillance security. Automatic head tracking and counting from depth imagery has practical applications in security, logistics, queue management, space utilization, and visitor counting. However, no currently available system distinguishes a human head from other objects reliably enough to track and count people accurately. We therefore propose a system that tracks people by monitoring their heads and shoulders in complex environments and counts the number of people entering and exiting the scene. The system comprises six phases. First, preprocessing converts scene videos into frames and removes the background from each frame. Second, heads are detected using the Hough Circular Gradient Transform, and shoulders are detected using HOG-based symmetry methods. Third, three robust features are extracted: fused joint HOG-LBP, energy-based point clouds, and fused intra-inter trajectories. Fourth, Apriori-Association is applied to select the best features. Fifth, deep learning is used for accurate people tracking. Finally, heads are counted using cross-line judgment. The system was tested on three benchmark datasets (PCDS, the MICC people counting dataset, and GOTPD), achieving counting accuracies of 98.40%, 98%, and 99%, respectively. (A minimal sketch of the head-detection and cross-line counting steps follows this record.) | -
dc.description.sponsorship | This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (No. 2018R1D1A1A02085645). This work was also supported by the Korea Medical Device Development Fund grant funded by the Korean government (the Ministry of Science and ICT; the Ministry of Trade, Industry and Energy; the Ministry of Health & Welfare; and the Ministry of Food and Drug Safety) (Project Number: 202012D05-02). | -
dc.language | en | -
dc.publisher | MDPI | -
dc.subject | Apriori-Association | -
dc.subject | Cross-line judgment | -
dc.subject | deep learning | -
dc.subject | head tracking | -
dc.subject | Hough Circular Gradient Transform | -
dc.subject | Fused intra-inter trajectories | -
dc.title | A Systematic Deep Learning Based Overhead Tracking and Counting System Using RGB-D Remote Cameras | -
dc.type | Article | -
dc.relation.no | 12 | -
dc.relation.volume | 11 | -
dc.identifier.doi | 10.3390/app11125503 | -
dc.relation.page | 1-21 | -
dc.relation.journal | Applied Sciences-basel | -
dc.contributor.googleauthor | Gochoo, Munkhjargal | -
dc.contributor.googleauthor | Rizwan, Syeda Amna | -
dc.contributor.googleauthor | Ghadi, Yazeed Yasin | -
dc.contributor.googleauthor | Jalal, Ahmad | -
dc.contributor.googleauthor | Kim, Kibum | -
dc.sector.campus | E | -
dc.sector.daehak | 소프트웨어융합대학 | -
dc.sector.department | ICT융합학부 | -
dc.identifier.pid | kibum | -
dc.identifier.article | 5503 | -
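
The sketch below is a minimal illustration, not the authors' implementation, of two phases named in the abstract: overhead head detection with OpenCV's Hough circle transform (gradient method) and counting by cross-line judgment. The video file name "overhead_scene.mp4", the counting-line position, the Hough thresholds, and the toy single-target tracking loop are all assumptions made for illustration; the paper's full pipeline additionally uses HOG-based shoulder detection, fused HOG-LBP, point-cloud, and trajectory features, Apriori-based feature selection, and deep-learning-based tracking.

```python
# Minimal sketch (not the paper's implementation): head candidates from the
# Hough circle transform plus a simple cross-line judgment counter.
# File name, line position, and all thresholds are illustrative assumptions.
import cv2
import numpy as np

COUNT_LINE_Y = 240  # assumed y-coordinate of the horizontal counting line (pixels)


def detect_heads(gray_frame):
    """Return an array of (x, y, r) circles that may correspond to heads."""
    blurred = cv2.medianBlur(gray_frame, 5)  # suppress speckle noise
    circles = cv2.HoughCircles(
        blurred,
        cv2.HOUGH_GRADIENT,
        dp=1.2,        # inverse ratio of accumulator resolution
        minDist=40,    # minimum distance between detected head centres
        param1=100,    # upper Canny edge threshold
        param2=30,     # accumulator threshold (lower -> more candidate circles)
        minRadius=15,
        maxRadius=60,
    )
    if circles is None:
        return np.empty((0, 3), dtype=int)
    return np.round(circles[0]).astype(int)


def cross_line_judgment(prev_y, curr_y, line_y=COUNT_LINE_Y):
    """+1 when a head crosses the line downwards (entering), -1 upwards (exiting)."""
    if prev_y < line_y <= curr_y:
        return 1
    if prev_y >= line_y > curr_y:
        return -1
    return 0


if __name__ == "__main__":
    cap = cv2.VideoCapture("overhead_scene.mp4")  # hypothetical overhead video
    entered = exited = 0
    prev_y = None  # toy single-target memory; the paper uses a learned tracker
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        heads = detect_heads(gray)
        if len(heads) > 0:
            _, y, _ = heads[0]
            if prev_y is not None:
                step = cross_line_judgment(prev_y, y)
                entered += int(step == 1)
                exited += int(step == -1)
            prev_y = y
    cap.release()
    print(f"entered={entered}, exited={exited}")
```

In the paper, the association of head detections across frames comes from the deep-learning tracker; the single-target memory here only shows how the crossing direction maps to entry and exit counts.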



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
