Teleyes: a Telepresence System based on Stereoscopic Vision and Head Motion Tracking
Date Issued
2016
Author(s)
Wen, Ming-Chang
Abstract
This research develops a telepresence system that aims to reduce visual distortion of the remote environment. Most teleoperation tasks take place in remote environments that are unstructured and not precisely known a priori, so fully autonomous control of an unmanned system is infeasible in such cases. To bring human intelligence to bear on the uncertainties of the task, an improved telepresence system that closes the gap between machine and operator is essential. Current methods mainly rely on a two-dimensional (2D) image covering a restricted field of view (FOV) as the visual aid and on numeric onboard sensor readings as references for the operator. As a result, sensory cues, including ambient visual information, depth information, and kinesthetic input, are largely lost. Moreover, when an unmanned vehicle system (UVS) is used to inspect entirely unknown areas, sensor information is not usable until it has been analyzed or processed. Instead of sending multiple sensors into the field with unmanned vehicles and retrieving large volumes of data for time-consuming post analysis, an intuitive, sensory method for quickly understanding and reconstructing the onsite situation is needed. The main effort of this research is to take advantage of state-of-the-art three-dimensional (3D) input/output technologies and to develop an avatar-like mechanism that synchronizes with the operator's physical behavior. Two 3D input/output methods are used in this research: stereoscopic vision and motion tracking. The system operates under closed-loop control with the human operator in the loop. Two cameras, optimized to match the angle of view (AOV) of the human eyes so as to reproduce the human perspective, serve as the stereoscopic vision input on the unmanned vehicle. The stereoscopic image is streamed back to the operator over a radio downlink in real time. On the operator side, a head-mounted display (HMD) presents the stereoscopic image and tracks the operator's head movement with embedded sensors. The head-tracking data are interpreted into control signals and returned to the unmanned vehicle to drive a three-axis gimbal mechanism on which the two cameras are mounted; the loop is thus closed by synchronizing the operator's and the cameras' FOV. The Teleyes system has been validated in a designed experimental application scenario against current methods with five operators. The results show that Teleyes significantly reduces the distance error of four test operators by 66.84%, 47.75%, 33.41%, and 40.58%, and significantly reduces the time usage of four test operators by 19.24%, 23.86%, 22.63%, and 0.93%. In conclusion, the Teleyes system improves visual experience and operating efficiency, with potential for saving resources and expanding applications. The developed system gives the operator an immersive first-person view (FPV) and a visual experience of being onboard.
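The abstract describes interpreting HMD head-tracking data into control signals for the three-axis camera gimbal. The sketch below illustrates one plausible form of that mapping, assuming the HMD reports head orientation as Euler angles; the function names and gimbal limits are hypothetical placeholders, not values from the thesis.

import math

# Hypothetical mechanical limits of the three-axis gimbal, in degrees.
# (The thesis does not state the actual limits; these are placeholders.)
GIMBAL_LIMITS_DEG = {"yaw": 170.0, "pitch": 90.0, "roll": 45.0}


def clamp(value, limit):
    """Restrict a commanded angle to the gimbal's mechanical range."""
    return max(-limit, min(limit, value))


def head_pose_to_gimbal_command(yaw_rad, pitch_rad, roll_rad):
    """Map HMD head orientation (radians) to a three-axis gimbal command (degrees).

    The HMD's embedded sensors report the operator's head orientation; the
    command keeps the camera pair's field of view synchronized with the
    operator's head movement.
    """
    command = {}
    for axis, angle_rad in (("yaw", yaw_rad), ("pitch", pitch_rad), ("roll", roll_rad)):
        angle_deg = math.degrees(angle_rad)
        command[axis] = clamp(angle_deg, GIMBAL_LIMITS_DEG[axis])
    return command


if __name__ == "__main__":
    # Example: operator looks 30 degrees left and 10 degrees up.
    cmd = head_pose_to_gimbal_command(math.radians(-30), math.radians(10), 0.0)
    print(cmd)

In the actual system such a command would be sent back over the radio uplink each control cycle, closing the loop between the operator's head pose and the cameras' orientation.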
Subjects
telepresence
stereoscopic vision
motion tracking
teleoperation
human machine interface
Type
thesis
File(s)
Name
ntu-105-R02521609-1.pdf
Size
23.32 KB
Format
Adobe PDF
Checksum
(MD5): ca46e42c4a5bb14c1e3f26c8078d6d20
