Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 서종원 | - |
dc.date.accessioned | 2018-05-21T07:30:51Z | - |
dc.date.available | 2018-05-21T07:30:51Z | - |
dc.date.issued | 2016-05 | - |
dc.identifier.citation | AUTOMATION IN CONSTRUCTION, v. 65, Page. 51-64 | en_US |
dc.identifier.issn | 0926-5805 | - |
dc.identifier.issn | 1872-7891 | - |
dc.identifier.uri | https://www.sciencedirect.com/science/article/pii/S0926580516300255?via%3Dihub | - |
dc.identifier.uri | https://repository.hanyang.ac.kr/handle/20.500.11754/71430 | - |
dc.description.abstract | Marker-based pose estimation, in which optical cameras monitor fiducial markers to determine the three-dimensional position and orientation of an articulated machine's end effector, has been identified as a potential low-cost alternative to currently available machine control and guidance systems. In an effort to develop such a marker-based pose estimation system for excavators, several iterations of prototypes were designed, fabricated, and tested. Performance was measured in terms of the system's ability to estimate bucket tooth position, with an acceptance criterion of 2.5 cm (1 in.) of absolute error. Although initial prototypes were found to possess practicality and performance issues, a fourth prototype offered encouraging experimental results suggesting the feasibility of marker-based sensor technology for excavator pose estimation. Further work needed to refine the technology for large-scale practical implementation was also identified. © 2016 Elsevier B.V. All rights reserved. | en_US |
dc.description.sponsorship | This research was funded by the United States National Science Foundation (NSF) via Grants CMMI-1160937, CMMI-1265733, and IIP-1343124. The authors gratefully acknowledge NSF's support. The authors also thank Walbridge Construction Company, Eagle Excavation Company, and the University of Michigan Architecture, Engineering, and Construction (AEC) division for their support in providing access to construction equipment and jobsites for experimentation and validation. The authors also thank undergraduate researchers Gabriel Bartosiewicz, Bradley Hecht, Jack Kosaian, Tracey Lo, Ritika Mehta, Andrea Mercier, Joshua Rios, and Matthew Stone for their assistance in conducting this research. Any opinions, findings, conclusions, and recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the NSF, Walbridge, Eagle Excavation, or the University of Michigan. Suyang Dong and Vineet R. Kamat have a significant financial and leadership interest in a start-up company named Perception Analytics & Robotics LLC (PeARL), and are the inventors of technology involved in or enhanced by its use in this research project. | en_US |
dc.language.iso | en | en_US |
dc.publisher | ELSEVIER SCIENCE BV | en_US |
dc.subject | Camera-marker network | en_US |
dc.subject | Pose estimation | en_US |
dc.subject | Machine control | en_US |
dc.subject | Equipment monitoring | en_US |
dc.subject | Construction equipment | en_US |
dc.subject | Articulated machines | en_US |
dc.subject | Excavator guidance | en_US |
dc.subject | Grade control | en_US |
dc.title | Optical marker-based end effector pose estimation for articulated excavators | en_US |
dc.type | Article | en_US |
dc.relation.volume | 65 | - |
dc.identifier.doi | 10.1016/j.autcon.2016.02.003 | - |
dc.relation.page | 51-64 | - |
dc.relation.journal | AUTOMATION IN CONSTRUCTION | - |
dc.contributor.googleauthor | Lundeen, Kurt M. | - |
dc.contributor.googleauthor | Dong, Suyang | - |
dc.contributor.googleauthor | Fredricks, Nicholas | - |
dc.contributor.googleauthor | Akula, Manu | - |
dc.contributor.googleauthor | Seo, Jongwon | - |
dc.contributor.googleauthor | Kamat, Vineet R. | - |
dc.relation.code | 2016003867 | - |
dc.sector.campus | S | - |
dc.sector.daehak | COLLEGE OF ENGINEERING[S] | - |
dc.sector.department | DEPARTMENT OF CIVIL AND ENVIRONMENTAL ENGINEERING | - |
dc.identifier.pid | jseo | - |