This implementation is based on the CVPR 2020 (oral) paper *Learning a Neural Solver for Multiple Object Tracking* (Guillem Brasó, Laura Leal-Taixé) [Paper][Youtube][CVPR Daily]
The address of the old implementation is here.
Our two new mechanisms are described below.
If you want to see the original setup, please read old_README.md.
- Clone and enter this repository:

```
git clone --recursive https://github.com/anslt/ADL4CV.git
cd mot_neural_solver
```
- (OPTIONAL) Download Anaconda if you work on Colab:

```
wget -c https://repo.anaconda.com/archive/Anaconda3-2020.11-Linux-x86_64.sh
chmod +x Anaconda3-2020.11-Linux-x86_64.sh
bash ./Anaconda3-2020.11-Linux-x86_64.sh -b -f -p /usr/local
rm Anaconda3-2020.11-Linux-x86_64.sh
```
- Create an Anaconda environment for this project:

```
conda env create -f environment.yaml
conda activate mot_neural_solver
pip install -e tracking_wo_bnw
pip install -e .
```
- Download the MOTChallenge data, the reid network, and the preprocessed detections:

```
bash scripts/setup/download_motcha.sh
bash scripts/setup/download_models.sh
bash scripts/setup/download_prepr_dets.sh
```
- (OPTIONAL) Install other missing packages if you work on Colab:

```
conda install -y ipykernel
```
The training parameters not covered in old_README.md are introduced below:
```
graph_model_params:
  time_aware: whether node updates are time-aware (default: False)
  attention:
    use_attention: whether to use attention (default: False)
    alpha: LeakyReLU negative-slope parameter (default: 0.2)
    attention_head_num: number of attention heads applied in the MPN (default: 2)
    att_regu: whether to apply attention regularization in the MPN (default: False)
    att_regu_strength: weight of the regularization term (default: 0.5)
    new_softmax: whether to use the new softmax formulation for regularization (default: False)
  dynamical_graph:
    graph_pruning: whether to use graph pruning (default: False)
    first_prune_step: iteration at which pruning starts (default: 4)
    prune_factor: fraction of edges pruned at each pruning step (default: 0.05)
    prune_frequency: how often (in iterations) the graph is pruned (default: 1)
    mode: method used to generate pruning scores (default: "classifier node wise");
          one of ["classifier node wise", "classifier naive"]
    prune_min_edge: minimum number of incident edges below which a node is excluded from pruning (default: 5)
```
We use cross-validation split 2. To train a model with regularized attention, run:
```
python scripts/train.py with \
    cross_val_split=2 \
    train_params.save_every_epoch=True \
    train_params.num_epochs=25 \
    train_params.num_workers=4 \
    graph_model_params.attention.use_attention=True \
    graph_model_params.attention.att_regu=True \
    graph_model_params.attention.new_softmax=True
```
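To give an intuition for the attention parameters above, here is a minimal, hypothetical sketch (not the repository's actual code) of attention-weighted neighbor aggregation: each head scores every (node, neighbor) pair, scores pass through LeakyReLU with negative slope `alpha` and a softmax over the node's neighbors, and messages are averaged over `attention_head_num` heads. The regularized/new-softmax variants are omitted here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionAggregation(nn.Module):
    """Hypothetical sketch of multi-head attention over a node's neighbors.

    Each head scores every (node, neighbor) pair with a small linear layer,
    turns the scores into weights via LeakyReLU(alpha) + softmax, and the
    weighted neighbor messages are averaged over all heads.
    """

    def __init__(self, dim, num_heads=2, alpha=0.2):
        super().__init__()
        self.heads = nn.ModuleList(nn.Linear(2 * dim, 1) for _ in range(num_heads))
        self.leaky_relu = nn.LeakyReLU(alpha)

    def forward(self, node_feats, neighbor_feats):
        # node_feats: (N, dim); neighbor_feats: (N, K, dim), K neighbors per node
        n, k, d = neighbor_feats.shape
        pairs = torch.cat(
            [node_feats.unsqueeze(1).expand(n, k, d), neighbor_feats], dim=-1
        )  # (N, K, 2*dim): each node paired with each of its neighbors
        out = torch.zeros_like(node_feats)
        for head in self.heads:
            scores = self.leaky_relu(head(pairs)).squeeze(-1)  # (N, K)
            weights = F.softmax(scores, dim=-1)                # attention per neighbor
            out = out + (weights.unsqueeze(-1) * neighbor_feats).sum(dim=1)
        return out / len(self.heads)  # (N, dim)
```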
To train a model with edge pruning, run:
```
python scripts/train.py with \
    cross_val_split=2 \
    train_params.save_every_epoch=True \
    train_params.num_epochs=25 \
    train_params.num_workers=4 \
    graph_model_params.dynamical_graph.graph_pruning=True
```
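The pruning schedule implied by the `dynamical_graph` parameters can be sketched roughly as follows. This is a hypothetical illustration, not the repository's implementation: starting at `first_prune_step`, every `prune_frequency` iterations a `prune_factor` fraction of the lowest-scoring edges is removed, skipping edges whose endpoints are already down to `prune_min_edge` incident edges.

```python
def should_prune(step, first_prune_step=4, prune_frequency=1):
    """Prune at `first_prune_step` and every `prune_frequency` steps after it."""
    return step >= first_prune_step and (step - first_prune_step) % prune_frequency == 0


def prune_edges(edges, scores, degrees, prune_factor=0.05, prune_min_edge=5):
    """Remove the lowest-scoring fraction of edges while keeping every node
    with at least `prune_min_edge` incident edges.

    edges:   list of (u, v) pairs
    scores:  per-edge scores (e.g. from the edge classifier)
    degrees: dict mapping node -> current number of incident edges
    """
    n_prune = int(len(edges) * prune_factor)
    # Visit edges from lowest to highest score; low scores are pruning candidates.
    order = sorted(range(len(edges)), key=lambda i: scores[i])
    removed = set()
    for i in order:
        if len(removed) == n_prune:
            break
        u, v = edges[i]
        # Skip edges whose endpoints are already at the minimum degree.
        if degrees[u] <= prune_min_edge or degrees[v] <= prune_min_edge:
            continue
        removed.add(i)
        degrees[u] -= 1
        degrees[v] -= 1
    return [e for i, e in enumerate(edges) if i not in removed]
```

For example, with `prune_factor=0.25` a graph of four edges loses its single lowest-scoring edge per pruning step, provided neither endpoint would drop below `prune_min_edge`.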
A small demo generated with our attention + edge pruning can be viewed by clicking here.
If you use our work in your research, please cite the original publication:
```
@InProceedings{braso_2020_CVPR,
    author = {Guillem Bras{\'o} and Laura Leal-Taix{\'e}},
    title = {Learning a Neural Solver for Multiple Object Tracking},
    booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
    month = {June},
    year = {2020}
}
```
Please also consider citing Tracktor if you use it for preprocessing detections:
```
@InProceedings{tracktor_2019_ICCV,
    author = {Bergmann, Philipp and Meinhardt, Tim and Leal{-}Taix{\'e}, Laura},
    title = {Tracking Without Bells and Whistles},
    booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
    month = {October},
    year = {2019}
}
```