
Description

We present a method that uses the depth information provided by RGB-D sensors for robust real-time visual simultaneous localisation and mapping (SLAM), augmenting monocular visual SLAM to take depth data into account. Our method is implemented on top of the freely available software “Parallel Tracking and Mapping” (PTAM) by Georg Klein, which was originally developed for augmented-reality applications. Our modifications allow PTAM to be used as a 6D visual SLAM system without requiring any additional information from odometry or an inertial measurement unit.
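The key advantage of an RGB-D sensor over a monocular camera is that each pixel comes with a metric depth reading, so a feature's 3D position can be recovered directly instead of being triangulated over multiple frames up to an unknown scale. A minimal sketch of this back-projection step is shown below; the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are illustrative values for a typical RGB-D camera, not taken from the paper.

```python
import numpy as np

def back_project(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into a 3D point
    in the camera frame, using the standard pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Illustrative intrinsics for a 640x480 RGB-D camera (hypothetical values).
fx, fy, cx, cy = 525.0, 525.0, 320.0, 240.0

# A pixel at the principal point with 2 m depth lies on the optical axis.
p = back_project(320.0, 240.0, 2.0, fx, fy, cx, cy)
```

With depth available per feature, map points gain metric scale immediately, which is what lets the augmented system operate without odometry or inertial measurements.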
