RT-SLAM

RT-SLAM stands for Real Time SLAM (Simultaneous Localization And Mapping).

Disclaimer: RT-SLAM is currently in development and has no official release. It is provided as-is, there is no standalone installation, and it is generally not advised (yet) to people who do not want to get their hands dirty.

Presentation

RT-SLAM is a fast, EKF-based SLAM library and test framework. Its main qualities are:

  • genericity: for sensor models, landmark types and models, landmark reparametrization, and bias estimation;

  • speed: real time at 60 fps (VGA, gray level) on a decent machine (at least one Core 2 core at 2.2 GHz);

  • flexibility: different estimation/image-processing sequencing strategies (active search), and an independent building block for a hierarchical multimap and multi-robot architecture;

  • robustness: near-optimal distribution of landmarks, detection of data association errors (gating, RANSAC; see the gating sketch after this list);

  • developer-friendly: visualization tools (2D and 3D), step-by-step offline replay, logs, simulation;

  • open source: written in C++, for Linux and MacOS.
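
The gating mentioned above is the standard individual compatibility test used with EKF data association: a predicted measurement is paired with an observation only if the Mahalanobis distance of the innovation stays below a chi-square threshold. Below is a minimal, self-contained C++ sketch of that test for a 2D pixel measurement; the type and function names are illustrative only and do not come from the RT-SLAM code base.

    #include <cstdio>

    // 2D innovation (measured pixel minus predicted pixel) and its covariance.
    struct Innovation {
        double y[2];     // innovation vector
        double S[2][2];  // innovation covariance (symmetric, positive definite)
    };

    // Individual compatibility (gating) test: accept the pairing only if the
    // squared Mahalanobis distance y^T * S^-1 * y is below a chi-square
    // threshold (5.99 is roughly the 95% quantile for 2 degrees of freedom).
    bool passesGate(const Innovation& in, double chi2Threshold = 5.99) {
        const double det = in.S[0][0] * in.S[1][1] - in.S[0][1] * in.S[1][0];
        if (det <= 0.0) return false;  // degenerate covariance: reject
        const double i00 =  in.S[1][1] / det;  // explicit 2x2 inverse
        const double i01 = -in.S[0][1] / det;
        const double i11 =  in.S[0][0] / det;
        const double d2 = in.y[0] * (i00 * in.y[0] + i01 * in.y[1])
                        + in.y[1] * (i01 * in.y[0] + i11 * in.y[1]);
        return d2 < chi2Threshold;
    }

    int main() {
        Innovation in{{1.5, -0.8}, {{4.0, 0.2}, {0.2, 3.0}}};
        std::printf("match accepted: %s\n", passesGate(in) ? "yes" : "no");
        return 0;
    }

1-point RANSAC builds on the same innovation: a single randomly chosen match is used to update the state, and the remaining matches are then accepted or rejected with a gate of this form.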

For now it provides:

  • Landmarks: Anchored Homogeneous Points (inverse depth) that can be reparametrized into Euclidean points (see the sketch after this list);

  • Sensors: Pinhole cameras;

  • Prediction: Constant velocity model, inertial sensor;

  • Data association: Active search, 1-point RANSAC, and mixed strategies.
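
For readers unfamiliar with the inverse-depth representation: an Anchored Homogeneous Point stores the camera position at the first observation (the anchor), the direction of the observation ray, and an inverse-depth parameter; once the depth is well observed, the landmark can be reparametrized into an ordinary Euclidean point as p = p0 + m / rho. The snippet below is a minimal sketch of that conversion only; the struct and function names are illustrative, not RT-SLAM's actual classes (in the filter the same conversion must also propagate the covariance through its Jacobian).

    #include <cstdio>

    struct Vec3 { double x, y, z; };

    // Anchored Homogeneous Point: anchor p0, observation ray m, inverse depth rho.
    struct AnchoredHomogeneousPoint {
        Vec3 p0;     // camera position when the landmark was first observed
        Vec3 m;      // direction of the first observation ray
        double rho;  // inverse depth: 1/distance along m when m has unit norm
    };

    // Reparametrization into a Euclidean point: p = p0 + m / rho.
    Vec3 toEuclidean(const AnchoredHomogeneousPoint& ahp) {
        return { ahp.p0.x + ahp.m.x / ahp.rho,
                 ahp.p0.y + ahp.m.y / ahp.rho,
                 ahp.p0.z + ahp.m.z / ahp.rho };
    }

    int main() {
        // Landmark first seen from the origin, along the z axis, 4 m away (rho = 0.25).
        AnchoredHomogeneousPoint ahp{{0.0, 0.0, 0.0}, {0.0, 0.0, 1.0}, 0.25};
        Vec3 p = toEuclidean(ahp);
        std::printf("euclidean point: (%g, %g, %g)\n", p.x, p.y, p.z);  // (0, 0, 4)
        return 0;
    }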

Documentation

Screenshots and videos

[Screenshots: 3D view, 2D view]

  • Demo 1 (v.a, 1'33, 23 MB, H264 AVI): hand-held camera alone at 60 fps indoors, with several loop closures.

  • Demo 2 (v.b, 1'06, 20 MB, H264 AVI): hand-held IMU+camera at 50 fps indoors, with very high dynamics.

  • Demo 3 (v.a, 1'25, 27 MB, H264 AVI): IMU+camera on a rover robot on grass, with short-term memory (lost landmarks are removed, no loop closure).

  • Additional videos referenced in papers

Roadmap

  • Stabilize and make a release
  • Odometry SLAM
  • Multimap SLAM
  • Segment SLAM

Authors
