
Commit 6463912 (parent af781f8): monocular release


85 files changed: +37428 −1 lines

.gitignore

Lines changed: 158 additions & 0 deletions
```
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/


__pycache__
build
dist
*.egg-info
*.vscode/
*.pth
tests
checkpoints
datasets
runs
cache
*.out
*.o
data
figures/*.pdf
```

.gitmodules

Lines changed: 6 additions & 0 deletions
```
[submodule "thirdparty/lietorch"]
	path = thirdparty/lietorch
	url = https://github.com/princeton-vl/lietorch
[submodule "thirdparty/eigen"]
	path = thirdparty/eigen
	url = https://gitlab.com/libeigen/eigen.git
```

README.md

Lines changed: 118 additions & 1 deletion
Unchanged context:

[DROID-SLAM: Deep Visual SLAM for Monocular, Stereo, and RGB-D Cameras](https://arxiv.org/abs/2108.10869)
Zachary Teed and Jia Deng

Removed:

* Code for our paper DROID-SLAM will be added on September 1st.

Added:

```
@article{teed2021droid,
  title={{DROID-SLAM: Deep Visual SLAM for Monocular, Stereo, and RGB-D Cameras}},
  author={Teed, Zachary and Deng, Jia},
  journal={arXiv preprint arXiv:2108.10869},
  year={2021}
}
```
**Initial Code Release:** This repo currently provides a single-GPU implementation of our monocular SLAM system. It also contains demo, training, and evaluation scripts. Stereo, RGB-D, and multi-GPU code will be added on **September 7**.

## Requirements

To run the code you will need:
* **Inference:** Running the demos requires a GPU with at least 11 GB of memory.
* **Training:** Training requires a GPU with at least 24 GB of memory. We train on 4 x RTX-3090 GPUs.

## Getting Started

1. Clone the repo using the `--recursive` flag:
```Bash
git clone --recursive https://github.com/princeton-vl/DROID-SLAM.git
```

2. Create a new anaconda environment using the provided .yaml file. Use `environment_novis.yaml` if you do not want the visualization:
```Bash
conda env create -f environment.yml
pip install evo --upgrade --no-binary evo
pip install gdown
```

3. Compile the extensions (takes about 10 minutes):
```Bash
python setup.py install
```

## Demos

1. Download the model from Google Drive: [droid.pth](https://drive.google.com/file/d/1PpqVt1H4maBa_GbPJp4NwxRsd9jk-elh/view?usp=sharing)

2. Download some sample videos using the provided script:
```Bash
./tools/download_sample_data.sh
```

Run the demo on any of the samples (all demos can be run on a GPU with 11 GB of memory). While running, press the "s" key to increase the filtering threshold (more points) and "a" to decrease it (fewer points).

```Bash
python demo.py --imagedir=data/abandonedfactory --calib=calib/tartan.txt --stride=2
```

```Bash
python demo.py --imagedir=data/sfm_bench/rgb --calib=calib/eth.txt
```

```Bash
python demo.py --imagedir=data/Barn --calib=calib/barn.txt --stride=1 --backend_nms=4
```

```Bash
python demo.py --imagedir=data/mav0/cam0/data --calib=calib/euroc.txt --t0=150
```

```Bash
python demo.py --imagedir=data/rgbd_dataset_freiburg3_cabinet/rgb --calib=calib/tum3.txt
```

**Running on your own data:** All you need is a calibration file. Calibration files have the form
```
fx fy cx cy [k1 k2 p1 p2 [ k3 [ k4 k5 k6 ]]]
```
with the parameters in brackets optional.
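A calibration line like this can be read into a pinhole intrinsics matrix plus an optional distortion vector. A minimal sketch (the `parse_calib` helper below is hypothetical, not part of this repo):

```python
import numpy as np

def parse_calib(line):
    """Parse "fx fy cx cy [k1 k2 p1 p2 [k3 [k4 k5 k6]]]" (bracketed groups optional)."""
    vals = [float(v) for v in line.split()]
    if len(vals) not in (4, 8, 9, 12):
        raise ValueError(f"expected 4, 8, 9, or 12 values, got {len(vals)}")
    fx, fy, cx, cy = vals[:4]
    # 3x3 pinhole intrinsics matrix
    K = np.array([[fx, 0.0, cx],
                  [0.0, fy, cy],
                  [0.0, 0.0, 1.0]])
    dist = np.array(vals[4:]) if len(vals) > 4 else None  # None => no distortion
    return K, dist

# e.g. the EuRoC calibration shipped in calib/euroc.txt:
K, dist = parse_calib("458.654 457.296 367.215 248.375 "
                      "-0.28340811 0.07395907 0.00019359 1.76187114e-05")
```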
## Evaluation (Monocular)

We provide evaluation scripts for TartanAir, EuRoC, and TUM. EuRoC and TUM can be run on a 1080Ti. The TartanAir validation script requires 24 GB of memory.

### EuRoC
Download the [EuRoC](https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets) sequences (ASL format) and put them in `datasets/EuRoC`:
```Bash
./tools/evaluate_euroc.sh
```

### TUM-RGBD
Download the fr1 sequences from [TUM-RGBD](https://vision.in.tum.de/data/datasets/rgbd-dataset/download) and put them in `datasets/TUM-RGBD`:
```Bash
./tools/evaluate_tum.sh
```

### TartanAir
Download the [TartanAir](https://theairlab.org/tartanair-dataset/) dataset using the script `thirdparty/tartanair_tools/download_training.py` and put it in `datasets/TartanAir`:
```Bash
./tools/validate_tartanair.sh
```

## Training

First download the TartanAir dataset. The download script can be found in `thirdparty/tartanair_tools/download_training.py`. You will only need the `rgb` and `depth` data.

```Bash
python download_training.py --rgb --depth
```

You can then run the training script. We use 4 x RTX-3090 GPUs for training, which takes approximately 1 week. If you use a different number of GPUs, adjust the learning rate accordingly.
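One common convention for "adjust the learning rate accordingly" is to scale it linearly with the number of GPUs, since the effective batch size grows with the GPU count. This is only a heuristic assumption, not a rule prescribed by the authors:

```python
def scaled_lr(gpus, base_lr=0.00025, base_gpus=4):
    """Linear-scaling heuristic: scale lr with effective batch size (an assumption)."""
    return base_lr * gpus / base_gpus

# Half the 4-GPU reference rate when training on 2 GPUs:
lr = scaled_lr(2)
```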
**Note:** On the first training run, covisibility is computed between all pairs of frames. This can take several hours, but the results are cached so that future training runs start immediately.

```Bash
python train.py --datapath=<path to tartanair> --gpus=4 --lr=0.00025
```

## Acknowledgements

Data from [TartanAir](https://theairlab.org/tartanair-dataset/) was used to train our model. We additionally use evaluation tools from [evo](https://github.com/MichaelGrupp/evo) and [tartanair_tools](https://github.com/castacks/tartanair_tools).

calib/barn.txt

Lines changed: 1 addition & 0 deletions
```
1161.545689 1161.545689 960.000000 540.000000 -0.025158 0.0 0.0 0.0
```

calib/eth.txt

Lines changed: 1 addition & 0 deletions
```
726.21081542969 726.21081542969 359.2048034668 202.47247314453
```

calib/euroc.txt

Lines changed: 1 addition & 0 deletions
```
458.654 457.296 367.215 248.375 -0.28340811 0.07395907 0.00019359 1.76187114e-05
```

calib/tartan.txt

Lines changed: 1 addition & 0 deletions
```
320.0 320.0 320.0 240.0
```

calib/tum3.txt

Lines changed: 1 addition & 0 deletions
```
535.4 539.2 320.1 247.6
```
