Using RT-SLAM
Requirements and pieces of advice
camera:
It must be a global shutter camera! Rolling shutter cameras, such as most webcams, acquire pixels line by line: different lines do not have the same timestamp, which creates peculiar distortions during motion that cannot be calibrated with standard calibration models. All CCD sensors are global shutter; most CMOS sensors are rolling shutter (only high-end CMOS sensors are global shutter).
- Only gray-level images are required
- VGA resolution (640x480) is recommended: it may work with lower resolutions but precision will be reduced, and higher resolutions come at the expense of a lower framerate, which may not be desirable.
- Framerate should be at least 30 fps (say between 30 and 60 fps), at least for constant velocity SLAM (without an IMU): the higher the framerate, the better the robustness and accuracy. With an IMU, depending on its quality, it is possible to decrease it.
- Firewire cameras have very good timestamps, which is especially important for inertial SLAM to avoid having to hard-synchronize the IMU and the camera.
- Being able to control the shutter time is very useful for highly dynamic movements in low light, to avoid blurry images: for data association it is better to have noisy images than blurry ones. Shutter control can be done either in software or with a hardware trigger, depending on the camera. If you can control the lens aperture, open it as much as you can as long as the image remains in focus at reasonable depths (a larger aperture reduces depth of field).
For live use of RT-SLAM, only Firewire and uEye USB cameras are currently supported. We have successfully tested RT-SLAM with PointGrey Flea2 and Flea3, AVT Marlin, and uEye cameras, but it should work correctly with cheaper cameras such as Unibrain Fire-i cameras (around $100).
It must be carefully calibrated, for instance with the Matlab calibration toolbox or the OpenCV calibration sample. Tangential distortion should be disabled during calibration as it is not supported by RT-SLAM (est_dist=[1;1;0;0;1] for Matlab, or the -zt option for OpenCV). The third-order radial distortion parameter should be enabled if distortion is high.
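As an illustration, here is a minimal C++ calibration sketch (assuming the OpenCV 3+ C++ API; this is not RT-SLAM code, and the function name and the way the chessboard detections are obtained are hypothetical) that disables tangential distortion the same way the -zt option does:

    #include <opencv2/calib3d.hpp>
    #include <opencv2/core.hpp>
    #include <vector>

    // Calibrate while forcing the tangential coefficients p1, p2 to zero,
    // which matches the distortion model supported by RT-SLAM.
    double calibrateForRtslam(const std::vector<std::vector<cv::Point3f> >& objectPoints,
                              const std::vector<std::vector<cv::Point2f> >& imagePoints,
                              cv::Size imageSize,
                              cv::Mat& cameraMatrix, cv::Mat& distCoeffs)
    {
        std::vector<cv::Mat> rvecs, tvecs;       // per-view extrinsics (unused here)
        int flags = cv::CALIB_ZERO_TANGENT_DIST; // force p1 = p2 = 0
        // add "flags |= cv::CALIB_FIX_K3;" to drop the third-order radial term
        // when the lens distortion is low
        // the return value is the RMS reprojection error, a quick sanity check;
        // distCoeffs should come out as (k1, k2, 0, 0, k3)
        return cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                   cameraMatrix, distCoeffs, rvecs, tvecs, flags);
    }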
IMU (optional):
- It must provide both angular velocities and linear accelerations
- Its frequency should be higher than the image frequency (otherwise the data will be interpolated and it will still work)
- There is no particular requirement on quality, but the noise model has to be correctly set up in the configuration according to the datasheet. The more precise the IMU is, the better RT-SLAM's precision and robustness will be, and the longer the system will be able to run without images.
- For live use of RT-SLAM, only XSens MTi IMUs are supported. If you want to use other hardware you have to write an adapter (called hardwareSensor in RT-SLAM).
The transformation camera<->IMU needs to be carefully computed. Check the datasheets of the devices to know exactly where their origin is (for an IMU it is the physical location of the accelerometers; for a camera it is the focal point, located one focal length in front of the image plane). The transformation is the coordinates of the camera's origin expressed in the IMU's frame.
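To make this convention concrete, here is a minimal illustrative C++ sketch (not RT-SLAM code; all names are hypothetical): a point expressed in the camera frame is mapped to the IMU frame with the rotation of the camera axes and the translation to the camera origin, both expressed in the IMU frame.

    #include <array>

    struct Vec3 { double x, y, z; };
    typedef std::array<std::array<double, 3>, 3> Mat3;

    // t_imu_cam: coordinates of the camera's origin in the IMU frame (meters)
    // R_imu_cam: rotation turning camera-frame vectors into IMU-frame vectors
    // p_imu = R_imu_cam * p_cam + t_imu_cam
    Vec3 cameraToImu(const Vec3& p_cam, const Mat3& R_imu_cam, const Vec3& t_imu_cam)
    {
        Vec3 p_imu;
        p_imu.x = R_imu_cam[0][0]*p_cam.x + R_imu_cam[0][1]*p_cam.y + R_imu_cam[0][2]*p_cam.z + t_imu_cam.x;
        p_imu.y = R_imu_cam[1][0]*p_cam.x + R_imu_cam[1][1]*p_cam.y + R_imu_cam[1][2]*p_cam.z + t_imu_cam.y;
        p_imu.z = R_imu_cam[2][0]*p_cam.x + R_imu_cam[2][1]*p_cam.y + R_imu_cam[2][2]*p_cam.z + t_imu_cam.z;
        return p_imu;
    }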
Configuring the demo
There are two configuration files:
--config-setup=data/setup.cfg contains everything related to the hardware setup (camera, IMU, calibrations, hardware noise specs, ...). If none is specified, the default file is data/setup.cfg in live mode, and <data-path>/setup.cfg in replay mode.
--config-estimation=data/estimation.cfg contains everything related to estimation and SLAM (size of map, number of landmarks, strategies, ...). If none is specified, the default file is data/estimation.cfg.
You can use the '@' character to refer to <data-path> when specifying the location of the config file to use: --config-estimation=@/estimation.cfg
The parameters are documented in the demo_slam.cpp file, with the definition of the data structures ConfigSetup and ConfigEstimation.
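For example, a replay run pointing explicitly to both configuration files could look like this (a hypothetical command built only from the options described on this page; adapt the paths to your data):

    demo_suite/x86_64-linux-gnu/demo_slam --replay=1 --disp-2d=1 --disp-3d=1 --data-path=~/my-sequence/ --config-setup=@/setup.cfg --config-estimation=data/estimation.cfg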
Offline processing
If you already have some data and you want to process them with RT-SLAM:
convert the images to a sequence named with the format "image_%07d.png", with a corresponding date file "image_%07d.time" for each image that contains the timestamp in seconds as a floating point number.
Reminder: the images must come from a global shutter camera! See requirements section above for explanations.
if applicable, convert the IMU data to a file named MTI.log with the image files, with the format "[10](<date>, <acc_x>, <acc_y>, <acc_z>, <gyr_x>, <gyr_y>, <gyr_z>, <mag_x>, <mag_y>, <mag_z>)". The date has the same format as for the images, and must be synchronized with the image timestamps (but you can shift it with the option IMU_TIMESTAMP_CORRECTION in setup.cfg). The gyrometer values correspond to the rotation velocity around the specified axis, in rad/s. The magnetometer values are ignored for now. A small conversion sketch is given after this list.
don't forget to change the file setup.cfg for the camera calibration, IMU noise specs, transformation camera<->IMU etc., and to use the right robot model with the --robot option.
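As an illustration of the file formats above, here is a minimal C++ sketch (not part of RT-SLAM; the function names, directory layout and number of decimals are assumptions) that writes the timestamp file for one image and appends one sample to MTI.log:

    #include <cstdio>

    // Write the "image_%07d.time" file associated with image number 'index',
    // containing the timestamp in seconds as a floating point number.
    void writeImageTimestamp(const char* dir, int index, double timestamp)
    {
        char path[256];
        std::snprintf(path, sizeof(path), "%s/image_%07d.time", dir, index);
        std::FILE* f = std::fopen(path, "w");
        if (f) { std::fprintf(f, "%.6f\n", timestamp); std::fclose(f); }
    }

    // Append one IMU sample to MTI.log using the documented format:
    // [10](<date>, <acc_x>, <acc_y>, <acc_z>, <gyr_x>, <gyr_y>, <gyr_z>, <mag_x>, <mag_y>, <mag_z>)
    void appendImuSample(std::FILE* mtiLog, double date,
                         const double acc[3], const double gyr[3], const double mag[3])
    {
        std::fprintf(mtiLog, "[10](%.6f, %f, %f, %f, %f, %f, %f, %f, %f, %f)\n",
                     date, acc[0], acc[1], acc[2], gyr[0], gyr[1], gyr[2],
                     mag[0], mag[1], mag[2]);
    }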
Sample demo
You can download here a sample image and IMU sequence to quickly test RT-SLAM offline. Give the path where the archive has been unzipped with the option --data-path.
Note that this sequence was meant to illustrate the highly dynamic movements that the system can withstand thanks to IMU-vision fusion, so it is recommended to enable the use of IMU data with the --robot=1 option. It will still diverge at the very end because the motion goes beyond the IMU limits (300 deg/s).
As an example you can use a command similar to this one to run the demo:
demo_suite/x86_64-linux-gnu/demo_slam --disp-2d=1 --disp-3d=1 --robot=1 --camera=1 --map=1 --render-all=0 --replay=1 --rand-seed=1 --pause=1 --data-path=~/2011-02-15_inertial-high-dyn/
Running the demo
Control options:
--disp-2d=0/1 use 2D display
--disp-3d=0/1 use 3D display
--render-all=0/1 (needs --replay 1)
--replay=0/1/2 (off/on/off no slam) (needs --data-path, and requires an online demo to have been dumped there first)
--dump=0/1 dump the images (--replay=0) or the rendered views (--replay=1) (needs --data-path). If --replay=0, --data-path should be in a ram disk.
--export=0/1/2. 0=Off, 1=socket, 2=poster
--log=0/1/filename log result in text file
--rand-seed=0/1/n. 0=generate new one, 1=in replay use the saved one, n=use seed n
--pause=0/n. 0=don't, n=pause for frames>n (needs --replay 1)
--data-path=/mnt/ram/rtslam
--config-setup=data/setup.cfg
--config-estimation=data/estimation.cfg
--verbose=0/1/2/3/4/5. Off/Trace/Warning/Debug/VerboseDebug/VeryVerboseDebug (only if compiled in debug)
--help
--usage
Slam options:
--robot=0/1. 0=constant velocity, 1=inertial
--map=0/1/2. 0=odometry, 1=global, 2=local/multimap
--gps=0/1/2/3 (off/pos/pos+vel/pos+ori) use GPS
--heading=<double,rad>. initial absolute heading
Raw data options:
--simu=0/<environment id>*10+<trajectory id>
--trigger=0/1/2. 0=internal, 1=external with MTI control, 2=external without control
--freq=<double,Hz>. camera frequency (only with trigger==0/1)
--shutter=<double,seconds>. shutter time (only with trigger==1)
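For example, a live inertial run with a hardware-triggered camera could look like this (a hypothetical combination of the options above; adapt the frequency, shutter time and data path to your hardware):

    demo_suite/x86_64-linux-gnu/demo_slam --disp-2d=1 --disp-3d=1 --robot=1 --camera=1 --map=1 --trigger=1 --freq=50.0 --shutter=0.002 --dump=1 --data-path=/mnt/ram/rtslam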
Controlling the demo
When doing a replay, you can control how slam is running by interacting with the 2D viewer:
Press "[space]" to switch between play and pause mode.
Press "N" to process one image when in pause mode.
Press "F" to switch between render-all mode and fast mode.
Press "Q" to quit.
Click on an observation to get textual and visual information about the observation and associated landmark.
Click on an empty part of the image to get information about the sensor and the associated robot.
Interpreting the demo
Color code:
magenta: InitialParametrization landmarks that were observed this frame
dark red: InitialParametrization landmarks that were not observed this frame
cyan: ConvergedParametrization landmarks that were observed this frame
dark blue: ConvergedParametrization landmarks that were not observed this frame
Another way to remember the color code:
bright colors represent observed landmarks
dark colors represent non observed landmarks
reddish hues represent InitialParametrization landmarks (like inverse depth for points)
bluish hues represent ConvergedParametrization landmarks (like euclidean for points)
2D landmark representation:
a colored + cross: the predicted position of the landmark
a colored ellipse: the 3-sigma prediction uncertainty
an orange x cross: the measured position of the landmark
a colored label: the id of the landmark, followed by the matching score (/100)
Reasons for a landmark not being observed:
- outside of the sensor frame (you can still see them in the 3D view)
- matching failed (you can see the matching score that is below the threshold)
- the observation was inconsistent (the measurement is outside the uncertainty ellipse)
- too many landmarks are in the sensor frame and we did not try to observe some of them (no matching score is present). In that case the landmark is nevertheless drawn with the "observed" color, because the absence of observation was not due to a problem with the landmark.
Contributions
If you would like to submit a contribution, you can either:
- commit, create a patch with the command "git format-patch -M origin/master" and send it to me, so that I can apply it to the main tree with credit to you.
- commit, and send me the address of your git repository if it is externally accessible so that I can pull from it.