
Full metadata record

DC Field: Value [Language]
dc.contributor.author: 안용한
dc.date.accessioned: 2021-09-28T00:57:48Z
dc.date.available: 2021-09-28T00:57:48Z
dc.date.issued: 2020-10
dc.identifier.citation: ISARC. Proceedings of the International Symposium on Automation and Robotics in Construction, v. 37, pp. 781-788 [en_US]
dc.identifier.uri: https://www.proquest.com/docview/2526370065/abstract/4CC0CEC6478F488APQ/1?accountid=11283
dc.identifier.uri: https://repository.hanyang.ac.kr/handle/20.500.11754/165297
dc.description.abstract: To teleoperate excavators, human operators need accurate information about the robot workspace in order to carry out manipulation tasks accurately and efficiently. Current visualization methods allow only limited depth perception and situational awareness, leading to high cognitive load when operating the robot in confined spaces or cluttered environments. This research proposes an advanced 3D workspace modeling method for remotely operated construction equipment in which the environment is captured in real time by laser scanning. A real-time 3D workspace state, which contains information such as the pose of end effectors, the poses of salient objects, and the distances between them, provides feedback to the remote operator on the progress of manipulation tasks. The proposed method was validated at a mock urban disaster site where two excavators were teleoperated to pick up and move various debris. A 3D workspace model constructed by laser scanning was able to estimate the positions of the excavator and target assets within 0.1-0.2 m accuracy. [en_US]
dc.description.sponsorship: This material is based upon work supported by the Air Force Office of Scientific Research (Award No. FA2386-17-1-4655) and by the Technology Innovation Program (No. 2017-10069072) funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the United States Air Force or MOTIE. [en_US]
dc.language.iso: en_US [en_US]
dc.publisher: International Symposium on Automation and Robotics in Construction [en_US]
dc.subject: Pose estimation [en_US]
dc.subject: laser scanning [en_US]
dc.subject: excavator [en_US]
dc.title: Workspace Modeling: Visualization and Pose Estimation of Teleoperated Construction Equipment from Point Clouds [en_US]
dc.type: Article [en_US]
dc.relation.page: 781-788
dc.contributor.googleauthor: Chen, Jingdao
dc.contributor.googleauthor: Kim, Pileun
dc.contributor.googleauthor: Sun, Dong-Ik
dc.contributor.googleauthor: Han, Chang-Soo
dc.contributor.googleauthor: Ahn, Yong Han
dc.contributor.googleauthor: Ueda, Jun
dc.contributor.googleauthor: Cho, Yong K.
dc.sector.campus: E
dc.sector.daehak: COLLEGE OF ENGINEERING SCIENCES[E]
dc.sector.department: DIVISION OF ARCHITECTURE
dc.identifier.pid: yhahn
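
The abstract above describes a real-time 3D workspace state holding the pose of the end effector, the poses of salient objects, and the distances between them. As a rough illustration only, the sketch below shows one possible shape for such a state in Python; the class name WorkspaceState, the helper pose_from_position_yaw, the 4x4 homogeneous-transform representation, and all example values are assumptions made for this sketch and do not come from the paper.

```python
import numpy as np

def pose_from_position_yaw(x, y, z, yaw):
    """Build a 4x4 homogeneous transform from a position (m) and heading (rad)."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0.0],
                          [s,  c, 0.0],
                          [0.0, 0.0, 1.0]])
    T[:3, 3] = [x, y, z]
    return T

class WorkspaceState:
    """Minimal container for one snapshot of the teleoperation workspace.

    Hypothetical structure: poses would in practice be estimated from the
    laser-scanned point cloud rather than supplied by hand.
    """

    def __init__(self, end_effector_pose, object_poses):
        self.end_effector_pose = end_effector_pose   # 4x4 transform of the end effector
        self.object_poses = dict(object_poses)       # object name -> 4x4 transform

    def distances_to_objects(self):
        """Euclidean distance from the end effector to each salient object."""
        p_ee = self.end_effector_pose[:3, 3]
        return {name: float(np.linalg.norm(T[:3, 3] - p_ee))
                for name, T in self.object_poses.items()}

# Example snapshot with made-up poses (positions in metres, yaw in radians).
state = WorkspaceState(
    end_effector_pose=pose_from_position_yaw(2.0, 1.0, 1.5, 0.3),
    object_poses={"debris_1": pose_from_position_yaw(3.5, 1.2, 0.4, 0.0),
                  "debris_2": pose_from_position_yaw(5.0, -0.8, 0.2, 1.2)},
)
print(state.distances_to_objects())
```

In this sketch the distances returned by distances_to_objects are the kind of quantity that would be streamed back to the remote operator as feedback on manipulation progress.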
Appears in Collections:
COLLEGE OF ENGINEERING SCIENCES[E](공학대학) > ARCHITECTURE(건축학부) > Articles
Files in This Item:
There are no files associated with this item.

