
# WaterMono

**Teacher-Guided Anomaly Masking and Enhancement Boosting for Robust Underwater Self-Supervised Monocular Depth Estimation** [paper link]

Yilin Ding, Kunqian Li*, Han Mei, Shuaixin Liu and Guojia Hou

*(Teaser: WaterMono qualitative results at 448×288)*

## ⚙️ Setup

You can install the dependencies with:

```shell
pip install -r requirements.txt
pip install 'git+https://github.com/saadnaeem-dev/pytorch-linear-warmup-cosine-annealing-warm-restarts-weight-decay'
```

We ran our experiments with PyTorch 1.12.1, CUDA 11.7, Python 3.9.18, and Ubuntu 18.04. If you encounter problems during environment setup, refer to Lite-Mono's README file and issue tracker.

## 💾 Data Preparation

### Dataset

We mainly used the FLSea dataset. You can download the FLSea-VI dataset from here and the FLSea-stereo dataset from here. Our manually created challenge dataset for measuring rotation robustness can be downloaded from here.

Our default settings expect that you have converted the TIFF images to JPEG to save memory during training. You can convert them with the following command, which requires ImageMagick and GNU Parallel and also deletes the raw FLSea .tiff files:

```shell
find archive/ -name '*.tiff' | parallel 'convert -quality 92 -sampling-factor 2x2,1x1,1x1 {.}.tiff {.}.jpg && rm {}'
```
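If ImageMagick or GNU Parallel is unavailable, a minimal Python alternative using Pillow could look like the sketch below. This is an assumption, not the repository's code, and it does not replicate the `-sampling-factor` chroma settings of the one-liner above:

```python
from pathlib import Path

from PIL import Image


def convert_tiffs(root: str, quality: int = 92, delete_original: bool = True) -> int:
    """Convert every .tiff under `root` to .jpg, optionally deleting the originals."""
    count = 0
    for tiff_path in Path(root).rglob("*.tiff"):
        jpg_path = tiff_path.with_suffix(".jpg")
        with Image.open(tiff_path) as img:
            # JPEG has no alpha channel, so force RGB before saving.
            img.convert("RGB").save(jpg_path, "JPEG", quality=quality)
        if delete_original:
            tiff_path.unlink()
        count += 1
    return count
```

Calling `convert_tiffs("archive/")` then mirrors the shell one-liner.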

### Splits

The train/test/validation splits are defined in the splits/ folder. Our proposed split is referred to as the OUC_split. You can also define your own splits and select them with the --split flag.
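Lite-Mono inherits Monodepth2's split-file convention, where each line names a sequence folder, a frame index, and a camera side. A hedged sketch of a parser is shown below; the sample lines and field layout are illustrative guesses, so check the actual files under splits/ for the exact format:

```python
import tempfile
from pathlib import Path


def read_split(path: str) -> list[tuple[str, int, str]]:
    """Parse a Monodepth2-style split file: '<folder> <frame_index> <side>' per line."""
    entries = []
    for line in Path(path).read_text().splitlines():
        if not line.strip():
            continue
        folder, frame, side = line.split()
        entries.append((folder, int(frame), side))
    return entries


# Hypothetical example lines; the real OUC_split files live under splits/.
sample = "canyons/flatiron 120 l\ncanyons/flatiron 121 l\n"
tmp = Path(tempfile.mkstemp(suffix=".txt")[1])
tmp.write_text(sample)
print(read_split(str(tmp)))
```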

## 📦 Models

| Name | Input size | OUC disparities |
| --- | --- | --- |
| Lite-Mono (Teacher Network) | 448 x 288 | Download 🔗 |
| WaterMono (Student Network) | 448 x 288 | Download 🔗 |

## 📊 Test and Evaluation

### Test

You can predict disparity for a single image with:

```shell
python test_simple.py --load_weights_folder path/to/your/weights/folder --image_path path/to/your/test/image
```

### Evaluation

If you want to evaluate the model on the test set defined by OUC_split, first prepare the ground truth depth maps by running:

```shell
python export_gt_depth.py
```

Then evaluate the model by running:

```shell
python evaluate_depth.py --load_weights_folder path/to/your/weights/folder --data_path path/to/FLSea_data/ --model lite-mono
```

To test generalization on the FLSea-stereo dataset, add the --eval_stereo flag.
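The depth metrics reported by evaluate_depth.py follow the standard protocol of Eigen et al. used by Monodepth2 and Lite-Mono (abs rel, sq rel, RMSE, RMSE log, and the δ < 1.25ⁿ accuracies). A self-contained NumPy sketch of these metrics, not the repository's exact code, is:

```python
import numpy as np


def compute_errors(gt: np.ndarray, pred: np.ndarray) -> dict:
    """Standard monocular-depth metrics over matched valid depth values."""
    # Threshold accuracies: fraction of pixels whose ratio error is under 1.25^n.
    thresh = np.maximum(gt / pred, pred / gt)
    a1 = (thresh < 1.25).mean()
    a2 = (thresh < 1.25 ** 2).mean()
    a3 = (thresh < 1.25 ** 3).mean()

    rmse = np.sqrt(((gt - pred) ** 2).mean())
    rmse_log = np.sqrt(((np.log(gt) - np.log(pred)) ** 2).mean())
    abs_rel = (np.abs(gt - pred) / gt).mean()
    sq_rel = (((gt - pred) ** 2) / gt).mean()

    return {"abs_rel": abs_rel, "sq_rel": sq_rel, "rmse": rmse,
            "rmse_log": rmse_log, "a1": a1, "a2": a2, "a3": a3}
```

For monocular models, predictions are usually median-scaled to the ground truth before these metrics are computed, since self-supervised depth is only defined up to scale.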

## 🕒 Training

The training code will be released after the paper is accepted.

### Start training

```shell
python train.py --data_path path/to/your/data --model_name mytrain --num_epochs 30 --batch_size 12
```

### TensorBoard visualization

```shell
tensorboard --logdir ./tmp/mytrain
```

## 💕 Thanks

Our code is based on Monodepth2, Lite-Mono and Sea-thru. You can refer to their README files and source code for more implementation details.

## 🖇️ Citation
