This repo contains the code for our RA-L 2021 paper, Keypoint Matching for Point Cloud Registration Using Multiplex Dynamic Graph Attention Networks.
In this paper, we propose a novel and flexible graph network architecture to tackle the keypoint matching problem in an end-to-end fashion. This repo includes PyTorch code for training and testing our MDGAT-matcher network on top of USIP keypoints and FPFH descriptors.
If you use our implementation in your academic work, please cite the corresponding paper:
@ARTICLE{shi2021ral,
  author={Shi, Chenghao and Chen, Xieyuanli and Huang, Kaihong and Xiao, Junhao and Lu, Huimin and Stachniss, Cyrill},
  journal={IEEE Robotics and Automation Letters},
  title={Keypoint Matching for Point Cloud Registration Using Multiplex Dynamic Graph Attention Networks},
  year={2021},
  volume={6},
  number={4},
  pages={8221-8228},
  doi={10.1109/LRA.2021.3097275}}
This repo is implemented in PyTorch.
- PyTorch installation guide: link
- Python dependencies (requirements.txt lists the versions used in our experiments; other versions may also work):
sudo -H pip3 install -r requirements.txt
Our method runs on top of detected keypoints and initial descriptors. In our experiments, we use USIP keypoints and FPFH descriptors. Before training and testing, one needs to pre-extract the features from the point clouds and generate the ground truth. One can follow USIP to generate your own keypoints and descriptors.
One can also download the features we generated for our experiments.
For users in mainland China, a faster link is available (extraction code: 08xs).
Put the files in the ./KITTI directory, and you are ready to use our model.
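As context for what the learned matcher improves on, the classical baseline is mutual nearest-neighbor matching in descriptor space. The sketch below is a plain NumPy illustration with random stand-in descriptors, not code from this repo:

```python
import numpy as np

def mutual_nn_matches(desc_a, desc_b):
    """Return index pairs (i, j) where a[i] and b[j] are mutual nearest neighbors."""
    # Pairwise Euclidean distances between the two descriptor sets.
    dists = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=-1)
    nn_ab = dists.argmin(axis=1)  # best match in B for each A descriptor
    nn_ba = dists.argmin(axis=0)  # best match in A for each B descriptor
    # Keep only pairs that agree in both directions.
    return [(i, int(j)) for i, j in enumerate(nn_ab) if nn_ba[j] == i]

# Toy example with random 33-dimensional descriptors (FPFH uses 33 bins).
rng = np.random.default_rng(0)
desc_a = rng.normal(size=(5, 33))
desc_b = desc_a[[2, 0, 4]] + 0.01 * rng.normal(size=(3, 33))  # perturbed subset
print(mutual_nn_matches(desc_a, desc_b))  # → [(0, 1), (2, 0), (4, 2)]
```

Such hand-crafted matching fails under noise and repetitive structure, which is exactly what the graph attention matcher is trained to handle.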
To train the network with default parameters, run the following command:
python3 train.py
For a quick test, we provide a pre-trained model in ./pre-trained. Run the following command:
python3 test.py
True matches are colored green, false matches are colored red, and keypoints are drawn as red dots.
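The green/red coloring follows the usual criterion: a match counts as true if, after applying the ground-truth transform to the source keypoint, it lands within a distance threshold of its matched target keypoint. A minimal NumPy sketch of this check (the threshold and toy data below are illustrative, not the repo's settings):

```python
import numpy as np

def classify_matches(src_kpts, tgt_kpts, matches, T_gt, tau=0.5):
    """Label each (i, j) match as true/false using the ground-truth 4x4 transform T_gt."""
    # Transform source keypoints into the target frame (homogeneous coordinates).
    src_h = np.hstack([src_kpts, np.ones((len(src_kpts), 1))])
    src_in_tgt = (T_gt @ src_h.T).T[:, :3]
    # A match is true if the transformed source keypoint lies within tau of its target.
    return [bool(np.linalg.norm(src_in_tgt[i] - tgt_kpts[j]) < tau)
            for i, j in matches]

# Toy example: the target cloud is the source shifted by 1 m along x.
src = np.array([[0.0, 0, 0], [1, 0, 0], [2, 0, 0]])
T_gt = np.eye(4)
T_gt[0, 3] = 1.0
tgt = src + np.array([1.0, 0, 0])
matches = [(0, 0), (1, 1), (2, 0)]  # last pair is deliberately wrong
print(classify_matches(src, tgt, matches, T_gt))  # → [True, True, False]
```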
This project is free software made available under the MIT License. For details see the LICENSE file.