Progressive 3D Reconstruction and Visualization for Intuitive Tangible Augmented Reality

Other Titles
직관적인 탠저블 증강현실을 위한 점진적 3차원 재구성 및 시각화
Author
정경부
Alternative Author(s)
Kyungboo Jung
Advisor(s)
최병욱
Issue Date
2012-08
Publisher
한양대학교 (Hanyang University)
Degree
Doctor
Abstract
Among the many augmented reality (AR) techniques reported to date, tangible AR is one of the most effective means of delivering information to users: it can display information seamlessly on real-world target objects, and the user can easily modify and handle the displayed information by directly manipulating the object. To realize tangible AR, the essential components are seamless augmentation of virtual objects, real-time three-dimensional (3D) reconstruction to enable interaction between the user and objects, and a natural and effective user interface. A 3D display is also needed to visualize 3D virtual objects naturally. This dissertation deals with three aspects of tangible AR. First, methods of progressive 3D reconstruction of the real-world objects needed to display 3D content on a 3D display are discussed, taking the geometric relations into consideration. Second, viewpoint-dependent display of the reconstructed 3D virtual objects on a 3D display device is discussed. Finally, intuitive manipulation of virtual objects using hand gestures is considered. Many previously described 3D reconstruction methods used special depth sensors or structure from motion computed from video input. However, depth sensors are expensive and have restrictions on space and capture range, while structure-from-motion techniques are not very robust and are difficult to use. To solve these problems, two 3D reconstruction methods are proposed. In the first, the user directly acquires 3D surfaces, points, planes, etc., using marker-based authoring pens. This method can reconstruct complex objects because it obtains 3D information about the objects through user interaction and with correct marker positions. In the second, the user indirectly reconstructs an object model using structure from motion and plane classification. Unlike the first method, this one requires minimal user intervention.
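The abstract does not detail the structure-from-motion pipeline, but its geometric core is triangulation: recovering a 3D point from its projections in two calibrated views. Below is a minimal sketch of standard linear (DLT) triangulation, not the dissertation's specific method; the camera matrices and point values are illustrative assumptions.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) pixel coordinates of the same point in each view.
    Returns the 3D point in non-homogeneous coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The point is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point with camera P, returning (u, v) pixels."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Illustrative setup: identical intrinsics, second camera shifted 1 unit.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, -0.1, 4.0])
X_rec = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With noise-free correspondences, `X_rec` matches `X_true`; in practice such estimates are refined, e.g., by bundle adjustment.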
These methods focus on intuitive extraction of the geometric information of target objects for tangible AR. As they are executed incrementally according to user interactions, the target objects are progressively reconstructed in 3D; in other words, the methods amount to intuitive, natural multi-marker generation. In general, glasses-based or glasses-free 3D display devices are used to display reconstructed 3D objects or virtual information. Previous 3D display devices assume fixed geometric relationships between the user and the display, e.g., distance or direction. This dissertation proposes two novel 3D visualization methods that adjust the actual disparity according to the distance from the display to the user, taking the geometric relationship between the user and the display device into account. Unlike the hardware-oriented approaches of previous research, the proposed methods are software-oriented approaches that generate multi-view images optimized for the user's position. In addition, a hand-gesture recognition algorithm for natural 3D human-computer interaction is proposed, which uses a stereo camera and a personalized skin color model acquired automatically through special hand gestures. Many previously described user-interaction methods rely on static hand poses; these are not intuitive and require a great deal of time to learn the mapping between hand poses and commands. Because the proposed method uses hand gestures that are common in daily life, the user can interact with objects intuitively and no special learning period is needed. To verify the efficiency of each proposed method, real-world objects were reconstructed and the distances in the reconstructed results were compared with the real objects. In addition, the multi-view images generated by the two proposed methods were compared with those obtained by previously described methods, as multi-view images are what a 3D display device presents.
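The abstract's idea of adjusting disparity to the viewer's distance can be illustrated with the standard similar-triangles model of stereoscopic parallax; this is a generic textbook relation, not the dissertation's actual algorithm, and the function name and default values (65 mm eye separation, 0.25 mm pixel pitch) are assumptions.

```python
def screen_disparity_px(depth_mm, viewer_mm, eye_sep_mm=65.0, px_pitch_mm=0.25):
    """Screen disparity (in pixels) that places a point at `depth_mm`
    from a viewer sitting `viewer_mm` from the display.

    Similar triangles give parallax p with p / eye_sep = (depth - viewer) / depth:
    positive behind the screen plane, negative in front of it, zero on it.
    """
    p_mm = eye_sep_mm * (depth_mm - viewer_mm) / depth_mm
    return p_mm / px_pitch_mm
```

Note that the same virtual depth requires a different pixel disparity as the viewer moves, which is why a software method regenerating multi-view images per user position can outperform a fixed-geometry display.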
The effectiveness and extensibility of the hand-gesture interface are demonstrated by applying it to a 3D stream application, i.e., a 3D window manager. The results show that the proposed methods provide intuitive user interaction for tangible AR.
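A "personalized skin color model acquired automatically by special hand gestures" can be sketched as a single Gaussian fitted to chroma samples collected during a calibration gesture; the class name, chroma space (Cb, Cr), and threshold below are illustrative assumptions, not the dissertation's implementation.

```python
import numpy as np

class SkinColorModel:
    """Personalized skin model: one Gaussian over (Cb, Cr) chroma, fitted
    from pixels sampled while the user performs a calibration gesture."""

    def fit(self, chroma_samples):
        # chroma_samples: (N, 2) array of (Cb, Cr) values from the user's hand.
        self.mean = chroma_samples.mean(axis=0)
        self.cov_inv = np.linalg.inv(np.cov(chroma_samples, rowvar=False))
        return self

    def mask(self, chroma, thresh=9.0):
        # Squared Mahalanobis distance below ~3 sigma (9.0) counts as skin.
        d = chroma - self.mean
        m2 = np.einsum('...i,ij,...j->...', d, self.cov_inv, d)
        return m2 < thresh

# Illustrative calibration: synthetic chroma samples around a skin tone.
rng = np.random.default_rng(0)
samples = rng.normal(loc=[110.0, 150.0], scale=5.0, size=(500, 2))
model = SkinColorModel().fit(samples)
```

The resulting binary mask would feed the stereo-camera gesture tracker; fitting per user is what makes the model "personalized" and robust to individual skin tone and lighting.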
URI
https://repository.hanyang.ac.kr/handle/20.500.11754/135952
http://hanyang.dcollection.net/common/orgView/200000419931
Appears in Collections:
GRADUATE SCHOOL[S](대학원) > ELECTRONICS AND COMPUTER ENGINEERING(전자컴퓨터통신공학과) > Theses (Ph.D.)