
# Tutorial

First, download the tiny dataset:

```bash
curl -s https://tri-ml-public.s3.amazonaws.com/github/packnet-sfm/datasets/KITTI_tiny.tar | tar -xv -C /data/datasets/
```
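If the download fails silently, a quick sanity check is to confirm that the target directory was created and is non-empty. A minimal sketch, assuming the archive extracts into a `KITTI_tiny` folder (the folder name is an assumption; `/data/datasets/` is the path from the command above):

```shell
# Sanity check after extraction (sketch; the KITTI_tiny folder name is an assumption).
check_extracted() {
  # Succeeds only if the directory exists and contains at least one entry.
  [ -d "$1" ] && [ -n "$(ls -A "$1" 2>/dev/null)" ]
}

if check_extracted /data/datasets/KITTI_tiny; then
  echo "dataset ready"
else
  echo "dataset missing or empty"
fi
```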

## Visualization (left GIF)

The visualization demo uses vidar and CamViz:

```bash
python thirdparty/vidar/scripts/launch.py demo_configs/camviz_demo.yaml
```

## Self-supervised Prior Learning

Using vidar, pre-train the depth, ego-motion, and intrinsics networks:

```bash
python thirdparty/vidar/scripts/launch.py demo_configs/selfsup_resnet18_vo_calib.yaml
```

The checkpoint file will then be stored at `/data/checkpoints/vo_demo/<DATE>/models/###.ckpt`.
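Since the exact `<DATE>` and checkpoint number depend on your run, it can be convenient to resolve the newest `.ckpt` file programmatically. A small sketch, assuming checkpoint filenames are zero-padded so that lexicographic order matches training order (an assumption about vidar's naming):

```shell
# Hypothetical helper: pick the most recent .ckpt under a checkpoint root.
# The /data/checkpoints/vo_demo layout is from the tutorial text.
CKPT_ROOT="${CKPT_ROOT:-/data/checkpoints/vo_demo}"

latest_ckpt() {
  # Lexicographic sort; assumes zero-padded checkpoint numbers (e.g. 001.ckpt).
  find "$1" -name '*.ckpt' 2>/dev/null | sort | tail -n 1
}

latest_ckpt "$CKPT_ROOT"
```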

## Visual Odometry

Set the above checkpoint path as `SELFSUP_CKPT_OVERRIDE=` in `demo_selfsup_vo_integration.sh`, then run:

```bash
./shells/demo_selfsup_vo_integration.sh
```
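If you prefer not to open an editor, the variable can be filled in with `sed`. This is only a sketch: it assumes `demo_selfsup_vo_integration.sh` contains a line beginning with `SELFSUP_CKPT_OVERRIDE=`, and the `<DATE>`/`###` placeholders must be replaced with the values from your own training run.

```shell
# Sketch: set SELFSUP_CKPT_OVERRIDE= inside the demo script in place.
# Replace <DATE> and ### with the actual values from your checkpoint directory.
CKPT='/data/checkpoints/vo_demo/<DATE>/models/###.ckpt'

# Assumes the script has a line starting with SELFSUP_CKPT_OVERRIDE= (commented out
# here so the sketch is safe to run outside the repository):
# sed -i "s|^SELFSUP_CKPT_OVERRIDE=.*|SELFSUP_CKPT_OVERRIDE=${CKPT}|" shells/demo_selfsup_vo_integration.sh

echo "$CKPT"
```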