@apirrone
Last active October 29, 2017 19:36
M2, UE Initiation à la Recherche (Introduction to Research course unit)

https://link.springer.com/article/10.1007/s40903-015-0032-7

Motivation:

Autonomous robots need to be able to localize themselves in their environment

Notes:

“Classic” odometry: the process of estimating the robot’s motion using data from motion sensors in conjunction with a robot motion model. The estimate is obtained by integration over time, so errors accumulate (for example, in a wheeled robot: wheel slippage, uneven terrain, …).
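The accumulation of error can be sketched as follows: a minimal dead-reckoning example for a differential-drive robot, with hypothetical values, where a small systematic wheel-slip bias grows into an unbounded position error as the pose is integrated over time.

```python
import numpy as np

def integrate_odometry(steps, v=1.0, omega=0.0, dt=0.1, slip=0.0):
    """Integrate a simple unicycle motion model over time.

    slip is a hypothetical systematic bias on the measured wheel speed,
    standing in for wheel slippage / uneven terrain.
    """
    x, y, theta = 0.0, 0.0, 0.0
    for _ in range(steps):
        v_meas = v * (1.0 + slip)           # slip biases the measured speed
        x += v_meas * np.cos(theta) * dt    # integrate the motion model...
        y += v_meas * np.sin(theta) * dt
        theta += omega * dt
    return x, y, theta

true_pose = integrate_odometry(1000)             # ground truth: 100 m forward
est_pose = integrate_odometry(1000, slip=0.01)   # same run with 1% wheel slip
drift = abs(est_pose[0] - true_pose[0])
print(drift)  # ~1.0 m: the error grows with distance travelled, never resets
```

Because each step re-uses the previous (already biased) pose, the error never averages out; this is the drift that loop closure (in SLAM) or visual odometry corrections try to bound.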

Main difference between VO and SLAM: VO aims for local consistency, SLAM aims for global consistency.

SLAM: Simultaneous Localisation And Mapping

Localisation and building a map of a previously unknown environment on the go, without information from external sensors. Produces a global and consistent estimate of the robot’s path (using loop closure: if we detect that we are revisiting an already-visited area, we can try to correct the accumulated drift).

VO: Visual Odometry

Particular case of SFM (Structure From Motion) with a focus on “local consistency”: the path of the robot is estimated incrementally using information from camera(s). Can be used as a complement to wheel odometry, GPS, IMUs, and laser odometry [101]. Computes the camera path incrementally and in real time [101]. The main task is to compute the relative transformation between two sequential frames, then chain these transformations to recover the full trajectory of the camera in 3D space [101].
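The chaining step above can be sketched directly: given the relative transform between frames k−1 and k (however it was estimated), the global camera pose is recovered by concatenating 4×4 homogeneous transforms. The relative motion used here is a hypothetical toy value, not output of a real estimator.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical frame-to-frame motion: 1 m forward along +z, no rotation.
T_rel = make_T(np.eye(3), [0.0, 0.0, 1.0])

C = np.eye(4)                       # camera pose, starting at the origin
trajectory = [C[:3, 3].copy()]      # recovered positions, one per frame
for _ in range(5):                  # five identical frame-to-frame motions
    C = C @ T_rel                   # C_k = C_{k-1} * T_rel
    trajectory.append(C[:3, 3].copy())

print(trajectory[-1])  # final camera position: [0. 0. 5.]
```

Note that any error in a single T_rel propagates into every later pose, which is why VO drifts and only aims for local consistency.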

How to estimate motion?

Different techniques depending on the setup (stereo or monocular): feature matching, feature tracking, and optical flow techniques.

References seen:

[!]: Important, to be read in full

[101] Tutorial on visual odometry:

http://rpg.ifi.uzh.ch/docs/Visual_Odometry_Tutorial.pdf

http://rpg.ifi.uzh.ch/docs/VO_Part_II_Scaramuzza.pdf

Monocular and stereo VO:

[19] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.63.6895&rep=rep1&type=pdf

[79] https://www.ri.cmu.edu/pub_files/pub3/matthies_l_1987_1/matthies_l_1987_1.pdf

Different techniques for motion estimation (in real time)

[!] [108] https://www-robotics.jpl.nasa.gov/publications/Adnan_Ansar/IROS-2003-Atalukder-JPL.pdf Feature matching, based on the Sum of Absolute Differences (SAD) metric.
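The SAD metric itself is simple to sketch: for a reference patch in frame 1, slide a window along a search region in frame 2 and keep the position with the smallest sum of absolute pixel differences. The toy frames below are synthetic (a random image shifted by a known amount), not data from [108].

```python
import numpy as np

def sad(a, b):
    """Sum of Absolute Differences between two equally-sized patches."""
    return np.abs(a.astype(np.int32) - b.astype(np.int32)).sum()

rng = np.random.default_rng(0)
frame1 = rng.integers(0, 256, size=(20, 40), dtype=np.uint8)
frame2 = np.roll(frame1, 3, axis=1)   # frame 2 = frame 1 shifted 3 px right

patch = frame1[5:12, 10:17]           # 7x7 reference patch at column 10
# Score every horizontal candidate position along the same row band.
scores = [sad(patch, frame2[5:12, x:x + 7])
          for x in range(frame2.shape[1] - 7)]
best_x = int(np.argmin(scores))
print(best_x)  # 13: the original column 10 plus the 3-pixel shift
```

In a real-time VO pipeline the search is restricted to a small window around the predicted position, since SAD over the whole image is too expensive per feature.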

[!] [31] https://www.diva-portal.org/smash/get/diva2:459981/FULLTEXT01.pdf Feature tracking (used in the RoboCup Rescue league) with the KLT feature tracker (https://cecas.clemson.edu/~stb/klt/tomasi-kanade-techreport-1991.pdf). Features are pre-selected: high-contrast features have a higher probability of being chosen by the KLT, because they are easier to track.
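The pre-selection idea can be sketched with the classic KLT/Shi-Tomasi "goodness" score: rank pixels by the smaller eigenvalue of the local gradient (structure-tensor) matrix, so that textured, high-contrast patches score high and flat regions score zero. This is an assumed minimal implementation on a toy image, not the code from [31].

```python
import numpy as np

def klt_scores(img, win=3):
    """Shi-Tomasi cornerness: min eigenvalue of the windowed gradient matrix."""
    img = img.astype(np.float64)
    gy, gx = np.gradient(img)          # image gradients along y and x
    h, w = img.shape
    scores = np.zeros((h, w))
    for y in range(win, h - win):
        for x in range(win, w - win):
            wx = gx[y - win:y + win + 1, x - win:x + win + 1]
            wy = gy[y - win:y + win + 1, x - win:x + win + 1]
            G = np.array([[np.sum(wx * wx), np.sum(wx * wy)],
                          [np.sum(wx * wy), np.sum(wy * wy)]])
            scores[y, x] = np.linalg.eigvalsh(G)[0]  # smaller eigenvalue
    return scores

# Toy image: flat background with one high-contrast corner at (8, 8).
img = np.zeros((20, 20))
img[8:, 8:] = 255.0
s = klt_scores(img)
best = np.unravel_index(np.argmax(s), s.shape)
print(best)  # near the corner: gradients exist in both directions there
```

Flat regions and straight edges have a (near-)zero smaller eigenvalue, so only corner-like, easy-to-track features survive the pre-selection.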

[!] [118] https://www.researchgate.net/profile/Kolja_Kuehnlenz/publication/221063668_Visual_Odometry_for_the_Autonomous_City_Explorer/links/02e7e53298ae35e963000000/Visual-Odometry-for-the-Autonomous-City-Explorer.pdf Optical flow based technique.
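The basic building block of such optical-flow techniques can be sketched as a single-window Lucas-Kanade solve: the spatial gradients Ix, Iy and the temporal difference It give the flow (u, v) from the least-squares normal equations G [u, v]ᵀ = b. The image pair below is synthetic (a smooth pattern shifted by a known sub-pixel amount), not data from [118].

```python
import numpy as np

def lucas_kanade_window(I0, I1):
    """Solve for one (u, v) flow vector over the whole window, least squares."""
    I0 = I0.astype(np.float64)
    I1 = I1.astype(np.float64)
    Iy, Ix = np.gradient(I0)            # spatial gradients of the first image
    It = I1 - I0                        # temporal difference
    G = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(G, b)        # flow (u, v) in pixels

# Toy pair: a smooth 2D pattern translated 0.5 px to the right.
yy, xx = np.mgrid[0:40, 0:40].astype(np.float64)
pattern = lambda X, Y: np.sin(X / 4.0) + np.cos(Y / 5.0)
I0 = pattern(xx, yy)
I1 = pattern(xx - 0.5, yy)
u, v = lucas_kanade_window(I0, I1)
print(round(u, 2), round(v, 2))  # u ≈ 0.5, v ≈ 0
```

Real optical-flow VO applies this per feature window (often in an image pyramid to handle larger motions), then feeds the flow vectors into the motion-estimation step.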

[88] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.5767&rep=rep1&type=pdf
