This repository provides a minimal example of distributed training using PyTorch's DistributedDataParallel
(DDP) on a Slurm-managed cluster.
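
As an illustration of what such a setup typically involves, the sketch below initializes the process group from the environment variables Slurm sets for each task (`SLURM_PROCID`, `SLURM_NTASKS`, `SLURM_LOCALID`) and wraps a toy model in DDP. The model, hyperparameters, and variable names are illustrative assumptions, not taken from this repository.

```python
# Minimal DDP-on-Slurm sketch: one process per GPU, launched with srun.
# Assumes MASTER_ADDR and MASTER_PORT are exported in the sbatch script
# (e.g. set to the first node in the allocation); the toy model is illustrative.
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # Slurm exposes the global rank, world size, and per-node local rank.
    rank = int(os.environ["SLURM_PROCID"])
    world_size = int(os.environ["SLURM_NTASKS"])
    local_rank = int(os.environ["SLURM_LOCALID"])

    # Join the process group; NCCL is the usual backend for multi-GPU training.
    dist.init_process_group(backend="nccl", rank=rank, world_size=world_size)
    torch.cuda.set_device(local_rank)

    # Wrap a toy model in DDP so gradients are averaged across all ranks.
    model = nn.Linear(10, 1).cuda(local_rank)
    ddp_model = DDP(model, device_ids=[local_rank])

    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()

    for step in range(10):
        inputs = torch.randn(32, 10, device=local_rank)
        targets = torch.randn(32, 1, device=local_rank)
        optimizer.zero_grad()
        loss = loss_fn(ddp_model(inputs), targets)
        loss.backward()   # DDP synchronizes gradients here
        optimizer.step()
        if rank == 0:
            print(f"step {step}: loss {loss.item():.4f}")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

In a typical workflow this script would be launched with `srun` from inside an `sbatch` job script so that Slurm starts one process per allocated GPU; the exact launch commands in this repository may differ.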