We present a method for exploiting the depth information provided by RGBD sensors to achieve robust real-time visual simultaneous localisation and mapping (SLAM), by augmenting monocular visual SLAM to take depth data into account. Our implementation builds on the freely available software "Parallel Tracking and Mapping" (PTAM) by Georg Klein, which was originally developed for augmented reality applications. Our modifications allow PTAM to be used as a 6D visual SLAM system even without any additional information from odometry or an inertial measurement unit.
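To illustrate how per-pixel depth augments a monocular system, the sketch below back-projects an image feature into a metric 3D point in the camera frame using a standard pinhole model. This is a generic illustration, not PTAM's actual code; the intrinsic parameters (fx, fy, cx, cy) are hypothetical placeholder values.

```python
import numpy as np

# Hypothetical pinhole intrinsics (illustrative values, not calibrated ones).
fx, fy = 525.0, 525.0   # focal lengths in pixels
cx, cy = 319.5, 239.5   # principal point in pixels

def backproject(u, v, depth):
    """Back-project pixel (u, v) with metric depth into a 3D camera-frame point.

    With an RGBD sensor, depth comes directly from the sensor, so the
    feature's 3D position is known immediately; a purely monocular system
    must instead recover it by triangulating across multiple views.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# A feature at the principal point with 2 m depth lies on the optical axis.
p = backproject(319.5, 239.5, 2.0)
print(p)
```

Because each feature's depth is metric, a map built this way is at true scale, avoiding the scale ambiguity inherent in monocular SLAM.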