
Full metadata record

DC Field | Value | Language
dc.contributor.author | 김기범 | -
dc.date.accessioned | 2021-11-30T02:23:30Z | -
dc.date.available | 2021-11-30T02:23:30Z | -
dc.date.issued | 2021-06 | -
dc.identifier.citation | APPLIED SCIENCES-BASEL, v. 11, no. 12, Article no. 5503, 21pp | en_US
dc.identifier.issn | 2076-3417 | -
dc.identifier.uri | https://www.proquest.com/docview/2544958318?accountid=11283 | -
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/166594 | -
dc.description.abstract | Featured Application: The proposed technique is an application for people detection and counting, evaluated over several challenging benchmark datasets. The technique can be applied in heavy-crowd assistance systems that help to find targeted persons, to track functional movements, and to maximize the performance of surveillance security. Automatic head tracking and counting using depth imagery has various practical applications in security, logistics, queue management, space utilization, and visitor counting. However, no currently available system can clearly distinguish between a human head and other objects in order to track and count people accurately. For this reason, we propose a novel system that can track people by monitoring their heads and shoulders in complex environments and can also count the number of people entering and exiting the scene. Our system is split into six phases. First, preprocessing is performed by converting videos of a scene into frames and removing the background from the video frames. Second, heads are detected using the Hough Circular Gradient Transform, and shoulders are detected by HOG-based symmetry methods. Third, three robust features are extracted: fused joint HOG-LBP, Energy-based Point clouds, and Fused intra-inter trajectories. Fourth, Apriori-Association is applied to select the best features. Fifth, deep learning is used for accurate people tracking. Finally, heads are counted using Cross-line judgment. The system was tested on three benchmark datasets (the PCDS dataset, the MICC people counting dataset, and the GOTPD dataset) and achieved counting accuracies of 98.40%, 98%, and 99%, respectively. Our system obtained remarkable results. | en_US
dc.description.sponsorship | This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (No. 2018R1D1A1A02085645). Also, this work was supported by the Korea Medical Device Development Fund grant funded by the Korea government (the Ministry of Science and ICT, the Ministry of Trade, Industry and Energy, the Ministry of Health & Welfare, the Ministry of Food and Drug Safety) (Project Number: 202012D05-02). | en_US
dc.language.iso | en_US | en_US
dc.publisher | MDPI | en_US
dc.subject | Apriori-Association | en_US
dc.subject | Cross-line judgment | en_US
dc.subject | deep learning | en_US
dc.subject | head tracking | en_US
dc.subject | Hough Circular Gradient Transform | en_US
dc.subject | Fused intra-inter trajectories | en_US
dc.title | A Systematic Deep Learning Based Overhead Tracking and Counting System Using RGB-D Remote Cameras | en_US
dc.type | Article | en_US
dc.relation.no | 12 | -
dc.relation.volume | 11 | -
dc.identifier.doi | 10.3390/app11125503 | -
dc.relation.page | 1-23 | -
dc.relation.journal | APPLIED SCIENCES-BASEL | -
dc.contributor.googleauthor | Gochoo, Munkhjargal | -
dc.contributor.googleauthor | Rizwan, Syeda Amna | -
dc.contributor.googleauthor | Ghadi, Yazeed Yasin | -
dc.contributor.googleauthor | Jalal, Ahmad | -
dc.contributor.googleauthor | Kim, Kibum | -
dc.relation.code | 2021004533 | -
dc.sector.campus | E | -
dc.sector.daehak | COLLEGE OF COMPUTING[E] | -
dc.sector.department | DIVISION OF MEDIA, CULTURE, AND DESIGN TECHNOLOGY | -
dc.identifier.pid | kibum | -
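
Note: the abstract above describes a six-phase pipeline (background removal, Hough Circular Gradient Transform head detection, HOG-based shoulder detection, feature extraction, Apriori-Association feature selection, deep-learning tracking, and Cross-line judgment counting). The sketch below is a minimal illustration of only two of those phases, assuming OpenCV's HOUGH_GRADIENT circle detector as a stand-in for the head-detection step and a simple nearest-neighbour match for the cross-line judgment. It is not the authors' implementation: the HOG-LBP features, Energy-based Point clouds, Apriori-Association selection, and the deep tracker are omitted, and COUNT_LINE_Y, MAX_MATCH_DIST, the Hough parameters, and the input file name are all illustrative assumptions.

```python
"""Illustrative sketch only: circular-Hough head detection on background-
subtracted frames plus a simple cross-line in/out count. All parameter
values and the input file name are assumptions, not values from the paper."""
import cv2

COUNT_LINE_Y = 240       # assumed y-coordinate of the virtual counting line (px)
MAX_MATCH_DIST = 40.0    # assumed max centre displacement between frames (px)


def detect_heads(frame, backsub):
    """Return (x, y, r) circles found in the foreground of one frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    fg = backsub.apply(frame)                                # background removal
    _, fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)   # drop shadow pixels
    gray = cv2.bitwise_and(gray, gray, mask=fg)
    gray = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                               param1=100, param2=30,
                               minRadius=15, maxRadius=60)
    return [] if circles is None else circles[0].tolist()


def cross_line_update(prev_centres, centres, counts):
    """Nearest-neighbour match of head centres, then cross-line judgment."""
    for (x, y, _r) in centres:
        if not prev_centres:
            continue
        px, py = min(prev_centres,
                     key=lambda c: (c[0] - x) ** 2 + (c[1] - y) ** 2)
        if (px - x) ** 2 + (py - y) ** 2 > MAX_MATCH_DIST ** 2:
            continue                                  # no plausible match
        if py < COUNT_LINE_Y <= y:                    # crossed the line downwards
            counts["in"] += 1
        elif py >= COUNT_LINE_Y > y:                  # crossed the line upwards
            counts["out"] += 1
    return [(x, y) for (x, y, _r) in centres]


def count_people(video_path):
    cap = cv2.VideoCapture(video_path)
    backsub = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    counts = {"in": 0, "out": 0}
    prev_centres = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        heads = detect_heads(frame, backsub)
        prev_centres = cross_line_update(prev_centres, heads, counts)
    cap.release()
    return counts


if __name__ == "__main__":
    print(count_people("overhead_scene.mp4"))   # hypothetical input video
```

In the paper, per-head association is handled by the deep-learning tracker rather than nearest-neighbour matching, so the counts produced by this sketch are only indicative of how the cross-line judgment step operates.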
Appears in Collections:
ETC[S] > Research Information (연구정보)
Files in This Item:
There are no files associated with this item.