# Reducing Semantic Ambiguity In Domain Adaptive Semantic Segmentation Via Probabilistic Prototypical Pixel Contrast
This is the official implementation of the paper "Reducing Semantic Ambiguity In Domain Adaptive Semantic Segmentation Via Probabilistic Prototypical Pixel Contrast"
```shell
# create conda environment
conda create --name PPPC -y python=3.8
conda activate PPPC
conda install -y ipython pip

# upgrade pip, otherwise the installation of mmcv-full will be slow
pip install --upgrade pip
pip install -r requirements.txt
```
- GTAV: Download GTAV from here and extract it to `data/gta`.
- Synthia: Download Synthia from here and extract it to `data/synthia`.
- Cityscapes: Download Cityscapes from here and extract it to `data/cityscapes`.
- Dark Zurich: Download Dark Zurich from here and extract it to `data/dark_zurich`.
The folder structure should look like this:
```
PPPC
├── ...
├── data
│   ├── cityscapes
│   │   ├── leftImg8bit
│   │   │   ├── train
│   │   │   ├── val
│   │   ├── gtFine
│   │   │   ├── train
│   │   │   ├── val
│   ├── dark_zurich
│   │   ├── gt
│   │   │   ├── val
│   │   ├── rgb_anon
│   │   │   ├── train
│   │   │   ├── val
│   ├── gta
│   │   ├── images
│   │   ├── labels
│   ├── synthia
│   │   ├── RGB
│   │   ├── GT
│   │   │   ├── LABELS
├── ...
```
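To verify the layout before training, a quick check like the following can be used. This is a convenience sketch, not part of the official tooling; `missing_dirs` and `EXPECTED` are names introduced here for illustration.

```python
from pathlib import Path

# Expected dataset directories from the layout above.
EXPECTED = [
    "data/cityscapes/leftImg8bit/train",
    "data/cityscapes/leftImg8bit/val",
    "data/cityscapes/gtFine/train",
    "data/cityscapes/gtFine/val",
    "data/dark_zurich/gt/val",
    "data/dark_zurich/rgb_anon/train",
    "data/dark_zurich/rgb_anon/val",
    "data/gta/images",
    "data/gta/labels",
    "data/synthia/RGB",
    "data/synthia/GT/LABELS",
]

def missing_dirs(root="."):
    """Return the expected dataset directories that are absent under root."""
    root = Path(root)
    return [p for p in EXPECTED if not (root / p).is_dir()]
```

Running `missing_dirs()` from the repository root should return an empty list once all four datasets are extracted.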
Perform preprocessing to convert label IDs to the train IDs and gather dataset statistics:
```shell
python tools/convert_datasets/gta.py data/gta --nproc 20
python tools/convert_datasets/cityscapes.py data/cityscapes --nproc 20
```
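For intuition, the conversion remaps raw Cityscapes label IDs to the standard 19 train IDs, with everything else mapped to the ignore index 255. The sketch below illustrates this with a lookup table for a handful of classes; the repo's conversion scripts handle the full mapping and also gather dataset statistics.

```python
import numpy as np

# A few entries of the standard Cityscapes label-ID -> train-ID mapping
# (road=7->0, sidewalk=8->1, building=11->2, car=26->13, bicycle=33->18).
# Unlisted IDs are treated as ignored (255). Illustration only.
ID_TO_TRAINID = {7: 0, 8: 1, 11: 2, 26: 13, 33: 18}

def convert_to_train_ids(label_map):
    """Remap a label-ID array to train IDs via a 256-entry lookup table."""
    lut = np.full(256, 255, dtype=np.uint8)  # default: ignore index
    for label_id, train_id in ID_TO_TRAINID.items():
        lut[label_id] = train_id
    return lut[label_map]

label = np.array([[7, 8], [26, 0]], dtype=np.uint8)
# road -> 0, sidewalk -> 1, car -> 13, unlabeled (0) -> 255
converted = convert_to_train_ids(label)
```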
To evaluate the model on Cityscapes, run:
```shell
python -m tools.test /path/to/config /path/to/checkpoint --eval mIoU
```
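For reference, `--eval mIoU` reports the mean intersection-over-union over classes. The following is a minimal sketch of the metric on flattened label arrays, not mmseg's actual implementation:

```python
import numpy as np

def mean_iou(pred, gt, num_classes, ignore_index=255):
    """Mean IoU over classes that appear in pred or gt, skipping ignored pixels.
    Sketch of the metric only; the real evaluation is done by tools.test."""
    mask = gt != ignore_index
    pred, gt = pred[mask], gt[mask]
    ious = []
    for c in range(num_classes):
        inter = np.sum((pred == c) & (gt == c))
        union = np.sum((pred == c) | (gt == c))
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))
```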
Our trained models and configs are available for GTAV → Cityscapes and SYNTHIA → Cityscapes.
- Please follow the instructions in SePiCo and submit the predictions to the official test server.
- Our trained model and config are available for Cityscapes → Dark Zurich.
- Our submission and score log are available here.
The details of the training configuration are in `experiments.py`.
```shell
python run_experiments.py --exp <exp_id>
```
| `<exp_id>` | task |
| --- | --- |
| 1 | GTAV → Cityscapes |
| 2 | SYNTHIA → Cityscapes |
| 3 | Cityscapes → Dark Zurich |
This project is based on the following open-source projects. We thank their authors for making the source code publicly available.