
Full metadata record

DC Field | Value | Language
dc.contributor.author | 임종우 (Lim, Jongwoo) | -
dc.date.accessioned | 2021-09-08T01:17:51Z | -
dc.date.available | 2021-09-08T01:17:51Z | -
dc.date.issued | 2020-03 | -
dc.identifier.citation | 2020 IEEE Winter Conference on Applications of Computer Vision (WACV), pp. 1658-1667 | en_US
dc.identifier.isbn | 978-1-7281-6553-0 | -
dc.identifier.issn | 2642-9381 | -
dc.identifier.issn | 2472-6737 | -
dc.identifier.uri | https://ieeexplore.ieee.org/document/9093607 | -
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/164952 | -
dc.description.abstract | Unlike conventional cameras, event cameras capture the intensity changes at each pixel with very little delay. Such changes are recorded continuously as an event stream with their positions, timestamps, and polarities, so there is no notion of a 'frame' as in conventional cameras. As many applications, including 3D pose estimation, use 2D trajectories of feature points, it is necessary to detect and track feature points robustly and accurately in a continuous event stream. In conventional feature-tracking algorithms for event streams, the events in fixed time intervals are converted into event images by stacking the events at their pixel locations, and the features are tracked in the event images. Such simple stacking of events yields blurry event images due to camera motion, which can significantly degrade the tracking quality. We propose to align the events in the time intervals along Bézier curves to minimize the misalignment. Since the camera motion is unknown, the Bézier curve is estimated to maximize the variance of the warped event pixels. Instead of the initial patches for tracking, we use temporally integrated template patches, as they capture rich texture information from accurately aligned events. Extensive experimental evaluations in 2D feature tracking as well as 3D pose estimation show that our method significantly outperforms conventional approaches. | en_US
dc.description.sponsorship | This research was supported by the Next-Generation Information Computing Development Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (NRF-2017M3C4A7069369), and by the NRF grant funded by the Korea government (MSIT) (NRF-2019R1A4A1029800). | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE | en_US
dc.subject | Cameras | en_US
dc.subject | Tracking | en_US
dc.subject | Streaming media | en_US
dc.subject | Stacking | en_US
dc.subject | Two dimensional displays | en_US
dc.subject | Microsoft Windows | en_US
dc.subject | Robustness | en_US
dc.title | Robust Feature Tracking in DVS Event Stream using Bezier Mapping | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1109/WACV45572.2020.9093607 | -
dc.relation.page | 1658-1667 | -
dc.contributor.googleauthor | Seok, Hochang | -
dc.contributor.googleauthor | Lim, Jongwoo | -
dc.sector.campus | S | -
dc.sector.daehak | COLLEGE OF ENGINEERING[S] | -
dc.sector.department | DEPARTMENT OF COMPUTER SCIENCE | -
dc.identifier.pid | jlim | -
dc.identifier.orcid | https://orcid.org/0000-0002-2814-4765 | -
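The abstract above describes aligning events along a Bézier curve whose parameters are chosen to maximize the variance (sharpness) of the warped event image. The idea can be sketched as follows; this is a minimal illustration of the variance-maximization principle, not the authors' implementation, and the quadratic Bézier displacement model, function names, and image size are assumptions:

```python
import numpy as np

def bezier_displacement(t, p1, p2):
    """Quadratic Bezier displacement B(t) with B(0) = 0.
    p1, p2 are (2,) control points; t is an array of normalized times in [0, 1]."""
    t = t[:, None]
    return 2 * (1 - t) * t * p1 + t**2 * p2

def warped_event_image(events, params, shape):
    """Stack events (columns: x, y, timestamp) into an image after
    compensating the motion modeled by the Bezier curve."""
    xs, ys, ts = events[:, 0], events[:, 1], events[:, 2]
    t = (ts - ts.min()) / max(ts.max() - ts.min(), 1e-9)  # normalize time to [0, 1]
    p1, p2 = params[:2], params[2:]
    disp = bezier_displacement(t, p1, p2)
    wx = np.clip(np.round(xs - disp[:, 0]).astype(int), 0, shape[1] - 1)
    wy = np.clip(np.round(ys - disp[:, 1]).astype(int), 0, shape[0] - 1)
    img = np.zeros(shape)
    np.add.at(img, (wy, wx), 1.0)  # accumulate event counts per pixel
    return img

def sharpness(params, events, shape):
    """Variance of the warped event image; higher means better alignment."""
    return warped_event_image(events, np.asarray(params, dtype=float), shape).var()
```

Maximizing `sharpness` over the control points (e.g. with a generic optimizer) recovers the motion compensation: for a feature moving linearly, the correctly compensated events collapse onto a single sharp peak, while naive stacking (zero displacement) smears them along the motion.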
Appears in Collections:
COLLEGE OF ENGINEERING[S] (College of Engineering) > COMPUTER SCIENCE (School of Computer Software) > Articles
Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
