All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- Added `input_channels` argument to `UNet` (#297)
- Added data monitor callbacks `ModuleDataMonitor` and `TrainingDataMonitor` (#285)
- Added `VisionDataModule` as parent class for `BinaryMNISTDataModule`, `CIFAR10DataModule`, `FashionMNISTDataModule`, and `MNISTDataModule` (#400)
- Added GIoU loss (#347)
- Added IoU loss (#469)
- Set PyTorch Lightning 1.0 as the minimum requirement (#274)
- Moved `pl_bolts.callbacks.self_supervised.BYOLMAWeightUpdate` to `pl_bolts.callbacks.byol_updates.BYOLMAWeightUpdate` (#288)
- Moved `pl_bolts.callbacks.self_supervised.SSLOnlineEvaluator` to `pl_bolts.callbacks.ssl_online.SSLOnlineEvaluator` (#288)
- Moved `pl_bolts.datamodules.*_dataset` to `pl_bolts.datasets.*_dataset` (#275)
- Fixed duplicate warnings when optional packages are unavailable (#341)
- Fixed `ModuleNotFoundError` when importing datamodules (#303)
- Fixed cyclic imports in `pl_bolts.utils.self_supervised` (#350)
- Fixed VAE loss to use the KL term of the ELBO (#330)
- Fixed dataloaders of `MNISTDataModule` to use `self.batch_size` (#331)
- Fixed missing `outputs` in SSL hooks for PyTorch Lightning 1.0 (#277)
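For reference, the KL term of the ELBO mentioned in the VAE loss fix (#330) has a closed form when the approximate posterior is a diagonal Gaussian and the prior is standard normal. The sketch below only illustrates that formula; it is not pl_bolts code, and `gaussian_kl` is a hypothetical helper name:

```python
import math

def gaussian_kl(mu, log_var):
    """KL(N(mu, sigma^2) || N(0, 1)) summed over latent dimensions:
    0.5 * sum(mu^2 + sigma^2 - log(sigma^2) - 1).
    Illustrative only -- not the library's implementation."""
    return sum(
        0.5 * (m * m + math.exp(lv) - lv - 1.0)
        for m, lv in zip(mu, log_var)
    )

# A posterior identical to the prior has zero KL divergence.
print(gaussian_kl([0.0, 0.0], [0.0, 0.0]))  # → 0.0
```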
- Enabled PyTorch Lightning 1.0 compatibility
- Enabled manual returns (#267)
- Enabled PyTorch Lightning 0.10 compatibility (#264)
- Added dummy datasets (#266)
- Added semantic segmentation model `SemSegment` with `UNet` backend (#259)
- Added `KittiDataModule` (#248)
- Added `UNet` (#247)
- Added reinforcement learning models, losses and datamodules (#257)
- Fixed confused logit (#222)
- Added pretrained VAE with resnet encoders and decoders
- Added pretrained AE with resnet encoders and decoders
- Added CPC pretrained on CIFAR10 and STL10
- Verified BYOL implementation
- Dropped all dependencies except PyTorch Lightning and PyTorch
- Decoupled datamodules from GAN (#206)
- Modularized AE & VAE (#196)
- Fixed gym (#221)
- Fixed L1/L2 regularization (#216)
- Fixed `max_depth` recursion crash in `AsynchronousLoader` (#191)
- Enabled Apache License, Version 2.0
- Moved unnecessary dependencies to `__main__` section in BYOL (#176)
- Fixed CPC STL10 finetune (#173)
- Added Faster RCNN + Pascal VOC DataModule (#157)
- Added a better LARS scheduling `LARSWrapper` (#162)
- Added CPC finetuner (#158)
- Added `BinaryMNISTDataModule` (#153)
- Added learning rate scheduler to BYOL (#148)
- Added Cityscapes DataModule (#136)
- Added learning rate scheduler `LinearWarmupCosineAnnealingLR` (#138)
- Added BYOL (#144)
- Added `ConfusedLogitCallback` (#118)
- Added an asynchronous single GPU dataloader (#1521)
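The `LinearWarmupCosineAnnealingLR` scheduler listed above combines a linear warmup ramp with cosine annealing. A minimal sketch of that schedule's shape, assuming conventional parameter names; this illustrates the formula only, not the library's actual scheduler class:

```python
import math

def warmup_cosine_lr(epoch, warmup_epochs, max_epochs, base_lr,
                     warmup_start_lr=0.0, eta_min=0.0):
    """Learning rate at a given epoch: linear warmup from warmup_start_lr
    to base_lr, then cosine decay down to eta_min. Illustrative sketch,
    not pl_bolts code."""
    if epoch < warmup_epochs:
        # Linear ramp during warmup.
        return warmup_start_lr + (base_lr - warmup_start_lr) * epoch / warmup_epochs
    # Cosine annealing over the remaining epochs.
    progress = (epoch - warmup_epochs) / (max_epochs - warmup_epochs)
    return eta_min + 0.5 * (base_lr - eta_min) * (1 + math.cos(math.pi * progress))

print(warmup_cosine_lr(0, 10, 100, 0.1))    # start of warmup → 0.0
print(warmup_cosine_lr(10, 10, 100, 0.1))   # warmup done → 0.1
print(warmup_cosine_lr(100, 10, 100, 0.1))  # end of training → 0.0
```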
- Fixed simclr finetuner (#165)
- Fixed STL10 finetuner (#164)
- Fixed Image GPT (#108)
- Fixed unused MNIST transforms in train/val/test (#109)
- Enhanced train batch function (#107)
- Added setup and repo structure
- Added requirements
- Added docs
- Added Manifest
- Added coverage
- Added MNIST template
- Added VAE template
- Added GAN + AE + MNIST
- Added Linear Regression
- Added Moco2g
- Added simclr
- Added RL module
- Added Loggers
- Added Transforms
- Added Tiny Datasets
- Added regularization to linear + logistic models
- Added Linear and Logistic Regression tests
- Added Image GPT
- Added Recommenders module
- Device is no longer set in the DQN model init
- Moved RL loss function to the losses module
- Moved rl.common.experience to datamodules
- Added `train_batch` function to the VPG model to generate a batch of data at each step (POC)
- Experience source no longer gets initialized with a device; instead, the device is passed at each `step()`
- Refactored `ExperienceSource` classes to handle multiple environments
- Removed N-Step DQN, as the latest version of the DQN supports N-Step by setting the `n_step` arg to n
- Deprecated `common.experience`
- Documentation
- Doc tests
- CI pipeline
- Imports and pkg
- CPC fixes