Title
Learning cooperative dynamic manipulation skills from human demonstration videos
Author
김완수 (Wansoo Kim)
Keywords
Transfer Learning; Multi-Agent Systems; 3D Pose Estimation; Visual Imitation; Human Action
Issue Date
2022-04
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
Citation
MECHATRONICS, Article no. 102807
Abstract
This article proposes a method for learning and robotic replication of dynamic collaborative tasks from offline videos. The objective is to extend the concept of learning from demonstration (LfD) to dynamic scenarios, benefiting from widely available or easily producible offline videos. To achieve this goal, we decode important dynamic information, such as the Configuration Dependent Stiffness (CDS), which reveals the contribution of the arm pose to the arm endpoint stiffness, from a three-dimensional human skeleton model. Next, by encoding the CDS via a Gaussian Mixture Model (GMM) and decoding it via Gaussian Mixture Regression (GMR), the robot's Cartesian impedance profile is estimated and replicated. We demonstrate the proposed method in a collaborative sawing task with a leader-follower structure, considering environmental constraints and dynamic uncertainties. The experimental setup includes two Panda robots, which replicate the leader-follower roles and the impedance profiles extracted from a two-person sawing video.
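
The GMM-encode / GMR-decode step mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a synthetic stiffness profile and scikit-learn's GaussianMixture, with all variable names and values chosen here for illustration. A demonstrated stiffness profile is encoded jointly over (time, stiffness), and Gaussian Mixture Regression then reproduces the conditional mean E[k | t] as a reference for a Cartesian impedance controller.

import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for a CDS-derived endpoint stiffness profile over one cycle (toy data).
t = np.linspace(0.0, 1.0, 200)
k_demo = 400.0 + 150.0 * np.sin(2.0 * np.pi * t)   # N/m, illustrative profile
X = np.column_stack([t, k_demo])                   # joint samples (t, k)

# Encode: fit a GMM on the joint (input, output) space.
gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0).fit(X)

def gmr(gmm, t_query):
    """Decode via GMR: conditional mean E[k | t] from the fitted joint GMM (1-D in, 1-D out)."""
    t_query = np.atleast_1d(t_query).astype(float)
    w, mu, cov = gmm.weights_, gmm.means_, gmm.covariances_
    k_hat = np.zeros_like(t_query)
    for i, tq in enumerate(t_query):
        var_t = cov[:, 0, 0]
        # responsibility of each Gaussian component for this input value
        resp = w * np.exp(-0.5 * (tq - mu[:, 0]) ** 2 / var_t) / np.sqrt(2.0 * np.pi * var_t)
        resp /= resp.sum()
        # per-component conditional mean of k given t
        cond = mu[:, 1] + cov[:, 1, 0] / var_t * (tq - mu[:, 0])
        k_hat[i] = np.dot(resp, cond)
    return k_hat

# Reproduced stiffness reference, e.g. to feed a Cartesian impedance controller.
k_ref = gmr(gmm, t)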
URI
https://arxiv.org/abs/2204.04003
https://repository.hanyang.ac.kr/handle/20.500.11754/170779
Appears in Collections:
COLLEGE OF ENGINEERING SCIENCES[E](공학대학) > ETC
Files in This Item:
There are no files associated with this item.