The datasets are configured in ./etc/datasets/. Below, we list each dataset, how to download it, and how to update its configuration to point to your local files.
HPatches

Download: https://hpatches.github.io/ (test)

Once downloaded, update the hpatches_path field in the configuration file ./etc/datasets/hpatches/defaults.yaml.
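As an illustration, the relevant field might look as follows (a sketch only: the path is a placeholder and any other fields in the file are omitted):

```yaml
# ./etc/datasets/hpatches/defaults.yaml (illustrative excerpt)
hpatches_path: /data/hpatches   # <- point this to your local HPatches copy
```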
COCO

Download: https://cocodataset.org/#download (train, val, test)

Once downloaded, update the root field in the configuration files ./etc/datasets/coco/{training,validation,test}.yaml. The annFile field should also point to the proper annotation file (even though annotations aren't used); otherwise the resulting dataset will be considered empty.
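For example, a training configuration might contain fields like these (a sketch: the paths and annotation file name are placeholders, and other fields are omitted):

```yaml
# ./etc/datasets/coco/training.yaml (illustrative excerpt)
root: /data/coco/train2017
# must point to a valid annotation file, even though annotations aren't used
annFile: /data/coco/annotations/instances_train2017.json
```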
ImageNet

Download: https://www.image-net.org/download.php

Once downloaded, update the root field in the configuration files ./etc/datasets/image-net/{training,validation}.yaml.
ScanNet

Download: https://github.com/ScanNet/ScanNet#scannet-data

Once downloaded, update the path field in the configuration files ./etc/datasets/scannet-frames/{training-all,test}.yaml.

Remark #1: We've designed our own dataset class for ScanNet that loads directly from the original raw source, so there is no need to run a frame extractor (as suggested by ScanNet).

Remark #2: Our class additionally caches a few things, such as frame offsets, to speed up random access to frames. As a consequence, the first access to a scan will be slow, but subsequent accesses should be faster. The cache location can be changed via the cache_path field.
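Putting the two fields together, a ScanNet configuration might look like this (a sketch: both paths are placeholders and other fields are omitted):

```yaml
# ./etc/datasets/scannet-frames/training-all.yaml (illustrative excerpt)
path: /data/scannet             # root of the raw ScanNet download
cache_path: /tmp/scannet-cache  # where frame offsets etc. are cached
```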
MegaDepth

Download: https://www.cs.cornell.edu/projects/megadepth/ (all)

Once downloaded, update the root field in the configuration file ./etc/datasets/megadepth/defaults.yaml.