We forked the DROID-SLAM GitHub repo to fix some problems we had with visualization. DROID-SLAM uses Open3D for visualization, and every time geometry was added to or removed from the renderer, the viewer reset the camera view, making it impossible to inspect the SLAM process while it was running. Our fork fixes this. Start by:
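The gist of such a fix can be sketched as follows. This is a minimal illustration of the general Open3D pattern, not the exact patch from our fork; `pcd` stands in for whatever geometry the SLAM process streams in. The key ideas are `reset_bounding_box=False` when adding geometry, and saving/restoring the pinhole camera parameters around updates:

```python
import numpy as np
import open3d as o3d

vis = o3d.visualization.Visualizer()
vis.create_window()

# Initial geometry; the first add may fit the view once, which is fine.
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(np.random.rand(100, 3))
vis.add_geometry(pcd)

ctr = vis.get_view_control()

# Later, when new data arrives: save the camera, update the geometry,
# then restore the camera so the user's viewpoint is preserved.
cam = ctr.convert_to_pinhole_camera_parameters()
pcd.points = o3d.utility.Vector3dVector(np.random.rand(200, 3))
vis.update_geometry(pcd)
# For geometry added mid-run, reset_bounding_box=False avoids the view reset:
# vis.add_geometry(new_pcd, reset_bounding_box=False)
ctr.convert_from_pinhole_camera_parameters(cam)
vis.poll_events()
vis.update_renderer()
```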
git clone --recursive https://github.com/MayFly-AI/DROID-SLAM.git
You can choose to use Anaconda by following the install instructions in the README.md (Getting Started section), but it is not necessary. We had problems resolving the conda environment, so we proceeded without Anaconda:
pip install evo --upgrade --no-binary evo
pip install gdown
and then:
python setup.py install
To run DROID-SLAM with a MayFly AI sensor, you first need to calibrate the camera and put the intrinsic camera model into a txt file with the following content:
fx fy cx cy [k1 k2 p1 p2 [ k3 [ k4 k5 k6 ]]]
See our calibration file: ./calib/sensorleap.txt. Follow this guide for camera calibration.
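The calibration file is just whitespace-separated numbers, so it can be loaded with a few lines of Python. This is a sketch (the helper name `load_calib` is ours); the distortion fields follow the OpenCV ordering shown above:

```python
def load_calib(path):
    """Read a calibration file of the form: fx fy cx cy [k1 k2 p1 p2 [k3 [k4 k5 k6]]]."""
    with open(path) as f:
        values = [float(v) for v in f.read().split()]
    fx, fy, cx, cy = values[:4]
    dist = values[4:]  # distortion coefficients; may be empty
    return (fx, fy, cx, cy), dist
```

Usage: `intrinsics, dist = load_calib("calib/sensorleap.txt")`.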
To run DROID-SLAM live on stream from MayFly AI sensor, run:
python sensorleap.py --calib=calib/sensorleap.txt --stride=1
Note that sensorleap.py runs only the frontend, with tracking and local bundle adjustment. To do global bundle adjustment on a recording, use demo.py.
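For reference, the upstream DROID-SLAM README invokes demo.py on a directory of images along these lines (the image directory path here is a placeholder; check the upstream README for the exact datasets and flags):

```shell
python demo.py --imagedir=path/to/recording --calib=calib/sensorleap.txt --stride=1
```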