# Hybrid_Relative_Pose

Implementation of the paper "Extreme Relative Pose Network under Hybrid Representations".
## Dependencies

- PyTorch (>0.4)
- Open3D
- SciPy, scikit-learn
- torchvision
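A minimal install sketch for the dependencies above (the PyPI package names, e.g. `torch` for PyTorch and `scikit-learn` for sklearn, are assumptions; pin exact versions as needed):

```shell
# Assumed PyPI names for the listed dependencies.
pip install "torch>0.4" torchvision open3d scipy scikit-learn
```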
## Folder structure

Please make sure you have the following folder structure:

```
Hybrid_Relative_Pose/
  data/
    dataList/
    eval/
      plane_data/
      topdown_data/
      test_data/
    pretrained_model/
      global/
        360_image/
        plane/
        topdown/
      local/
  experiments/
```
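The skeleton can be created in one shot; a sketch, with the nesting assumed from the paths referenced below (`data/dataList/`, `data/eval/`, `data/pretrained_model/`):

```shell
# Create the expected directory skeleton (nesting is an assumption
# reconstructed from the paths used elsewhere in this README).
mkdir -p Hybrid_Relative_Pose/data/dataList
mkdir -p Hybrid_Relative_Pose/data/eval/plane_data
mkdir -p Hybrid_Relative_Pose/data/eval/topdown_data
mkdir -p Hybrid_Relative_Pose/data/eval/test_data
mkdir -p Hybrid_Relative_Pose/data/pretrained_model/global/360_image
mkdir -p Hybrid_Relative_Pose/data/pretrained_model/global/plane
mkdir -p Hybrid_Relative_Pose/data/pretrained_model/global/topdown
mkdir -p Hybrid_Relative_Pose/data/pretrained_model/local
mkdir -p Hybrid_Relative_Pose/experiments
```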
## Data

- Images: suncg, matterport, scannet.
- Data lists: download the data lists and put them under `data/dataList/`.
- Pretrained models: download the pretrained models and put them under `data/pretrained_model/`.
- Relative-pose test data: download the relative-pose test data and put it under `data/eval/`.
## Generate test data

```shell
python generate_testdata.py --exp fd_param --dataset={suncg,scannet,matterport} --snumclass={15,21,21} --split=test
```
## Evaluate global module

```shell
python eval_spectral.py --dataList={suncg,scannet,matterport} --method=ours --exp=test_code --fitmethod irls_sm_v2 --d 1 --hybrid 1 --hybrid_method 360+plane+topdown --numMatches {1,3,5} --w_plane_1 {1.44,2.05,2.00} --w_topdown {0.25,0.29,0.30}
```
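Here too the braced lists are read as positionally paired per dataset (including `--numMatches`; whether that flag pairs with the dataset or is swept independently is an assumption). A sketch producing the per-dataset command lines:

```python
# Per-dataset options for eval_spectral.py, pairing the braced lists
# positionally (the pairing, including numMatches, is an assumption).
OPTS = {
    "suncg":      {"numMatches": 1, "w_plane_1": 1.44, "w_topdown": 0.25},
    "scannet":    {"numMatches": 3, "w_plane_1": 2.05, "w_topdown": 0.29},
    "matterport": {"numMatches": 5, "w_plane_1": 2.00, "w_topdown": 0.30},
}

def eval_cmd(dataset):
    """Build one eval_spectral.py invocation for a given dataset."""
    o = OPTS[dataset]
    return (f"python eval_spectral.py --dataList={dataset} --method=ours "
            "--exp=test_code --fitmethod irls_sm_v2 --d 1 --hybrid 1 "
            "--hybrid_method 360+plane+topdown "
            f"--numMatches {o['numMatches']} --w_plane_1 {o['w_plane_1']} "
            f"--w_topdown {o['w_topdown']}")

print(eval_cmd("suncg"))
```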
## Evaluate local module

```shell
python local_inference.py --model=./data/pretrained_model/local/{suncg,scannet,matterport}.tar --batch_size=12 --resume --enable_training=0 --eval_local=1 --local_eval_list=data/dataList/{suncg,scannet,matterport}_local.npy --dataList={suncg,scannet,matterport}
```
## Baselines

4PCS and Global Registration models:

```shell
python run_baseline.py --dataset={scannet,suncg,matterport} --method={4pcs,gr}
```
Baseline local model:

```shell
python baseline_icp.py --dataset {suncg,scannet,matterport} --global_method {ours,gr}
```
## Authors

Zhenpei Yang and Siming Yan
## Citation

If you use our code or method in your work, please cite the following:

```bibtex
@misc{yang2019extreme,
    title={Extreme Relative Pose Network under Hybrid Representations},
    author={Zhenpei Yang and Siming Yan and Qixing Huang},
    year={2019},
    eprint={1912.11695},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```