Running SLAM

SLAM (Simultaneous Localization and Mapping) allows your stereo webcam to track its own position and build a map of its surroundings as it moves.

In the previous section we obtained our stereo camera's intrinsic, extrinsic, and distortion parameters. Our SLAM library can use these parameters for accurate position tracking.
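For example, one common use of these parameters is rectifying the stereo pair before feature matching. Here is a minimal sketch with OpenCV; the key names inside calib.npz (K1, D1, K2, D2, R, T) and the image size are assumptions, so adjust them to match your own file.

import cv2
import numpy as np

calib = np.load("calib.npz")   # assumed keys: K1, D1, K2, D2, R, T
K1, D1 = calib["K1"], calib["D1"]
K2, D2 = calib["K2"], calib["D2"]
R, T = calib["R"], calib["T"]
size = (1280, 720)             # (width, height) of one camera image -- set to yours

# Rectification makes epipolar lines horizontal, so stereo matching only
# has to search along a single image row.
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
mapLx, mapLy = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
mapRx, mapRy = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)

# For each incoming frame pair:
#   left_rect  = cv2.remap(left,  mapLx, mapLy, cv2.INTER_LINEAR)
#   right_rect = cv2.remap(right, mapRx, mapRy, cv2.INTER_LINEAR)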

Running Standalone SLAM Demo

Once you've cloned the project above, run pixi install to sync the dependencies.

Next, replace calib.npz with the calibration file you gathered earlier.
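If you want to sanity-check the file before running the demo, you can list its contents. The key names and shapes shown in the comment are only an example; they depend on how you saved your calibration.

import numpy as np

calib = np.load("calib.npz")
for key in calib.files:
    print(key, calib[key].shape)   # e.g. K1 (3, 3), D1 (1, 5), R (3, 3), T (3, 1)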

First, we need to start Rerun to see the visualization:

pixi run rerun

Then, from the same directory but in a different shell, run:

pixi run slam-live --rerun-tcp rerun+http://127.0.0.1:9876/proxy --calib calib.npz --camera 0 --log-loop-closures

This will start the live SLAM demo.

This is a good jumping-off point for Visual SLAM, and the source code can be found here: https://github.com/CoreVisionX/slam/blob/main/tests/live_slam.py

While there is a lot of supporting code for visualization and CLI argument parsing, the SLAM core looks something like this:
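Below is a condensed, illustrative sketch of such a stereo visual-odometry core, not the project's verbatim code: ORB features are matched left-to-right, triangulated into 3D landmarks, then matched frame-to-frame so PnP + RANSAC can recover the relative pose. The calib.npz key names and the side-by-side frame layout are assumptions.

import cv2
import numpy as np

# Load calibration (key names are assumptions -- match them to your calib.npz).
calib = np.load("calib.npz")
K1, D1 = calib["K1"], calib["D1"]
K2, D2 = calib["K2"], calib["D2"]
R, T = calib["R"], calib["T"]

# Projection matrices for triangulation, with the left camera at the origin.
# For brevity this sketch ignores lens distortion during triangulation;
# undistort or rectify the images first if your lenses are far from pinhole.
P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K2 @ np.hstack([R, T.reshape(3, 1)])

orb = cv2.ORB_create(2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

pose = np.eye(4)                    # accumulated camera-to-world pose
prev_pts3d, prev_desc = None, None

cap = cv2.VideoCapture(0)           # assumes a side-by-side stereo stream
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    left, right = gray[:, : w // 2], gray[:, w // 2 :]

    # 1. Detect features and match them between the left and right images.
    kpL, descL = orb.detectAndCompute(left, None)
    kpR, descR = orb.detectAndCompute(right, None)
    if descL is None or descR is None:
        continue
    stereo = matcher.match(descL, descR)
    if len(stereo) < 8:
        continue

    # 2. Triangulate the stereo matches into 3D landmarks (left-camera frame).
    ptsL = np.float32([kpL[m.queryIdx].pt for m in stereo]).T
    ptsR = np.float32([kpR[m.trainIdx].pt for m in stereo]).T
    pts4d = cv2.triangulatePoints(P1, P2, ptsL, ptsR)
    pts3d = (pts4d[:3] / pts4d[3]).T
    desc = descL[[m.queryIdx for m in stereo]]

    # 3. Match against the previous frame's landmarks and estimate the
    #    relative pose with PnP + RANSAC.
    if prev_desc is not None:
        temporal = matcher.match(prev_desc, descL)
        if len(temporal) >= 6:
            obj = np.float32([prev_pts3d[m.queryIdx] for m in temporal])
            img = np.float32([kpL[m.trainIdx].pt for m in temporal])
            ok_pnp, rvec, tvec, inliers = cv2.solvePnPRansac(obj, img, K1, D1)
            if ok_pnp:
                R_rel, _ = cv2.Rodrigues(rvec)
                T_rel = np.eye(4)
                T_rel[:3, :3], T_rel[:3, 3] = R_rel, tvec.ravel()
                # PnP maps previous-frame points into the current camera, so
                # invert it to move our camera-to-world pose forward.
                pose = pose @ np.linalg.inv(T_rel)
                print("position:", pose[:3, 3])

    prev_pts3d, prev_desc = pts3d, desc

The project's demo layers more on top of this odometry loop, including loop-closure handling (see the --log-loop-closures flag above) and the Rerun visualization.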

Adding Odometry Priors

todo

Useful API references
