Replies: 1 comment
-
The max parameters provide a ceiling on the amount of resources a process can use. If you set a step to request 10 cores but `max_cpus` is set to 8, the step will only get 8. However, adding more resources is mostly helpful for making sure that larger datasets (like UK Biobank) will produce results at all; it doesn't mean the workflow will complete faster. There's a Nextflow execution report in the `pipeline_info` folder of your results directory that shows how much CPU and memory each step actually used.
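To make that concrete, here's a sketch of the nf-core-style pattern that pipelines like this one use to apply the ceiling (paraphrased, not pgsc_calc's exact code; the `process_high` label and all of the values are illustrative):

```groovy
// sketch of the ceiling mechanism (illustrative, not pgsc_calc's exact config)

params {
    max_cpus   = 8
    max_memory = '30.GB'
}

// cap a request at the corresponding max_* parameter
def check_max(obj, type) {
    if (type == 'cpus') {
        return Math.min(obj as int, params.max_cpus as int)
    }
    if (type == 'memory') {
        def ceiling = params.max_memory as nextflow.util.MemoryUnit
        return obj.compareTo(ceiling) > 0 ? ceiling : obj
    }
    return obj
}

process {
    // the label name here is illustrative
    withLabel: 'process_high' {
        // this step asks for 10 cores and 64 GB, but with the ceilings above
        // it actually runs with 8 cores and 30 GB
        cpus   = { check_max(10, 'cpus') }
        memory = { check_max(64.GB, 'memory') }
    }
}
```

Raising `max_cpus` / `max_memory` only lifts that ceiling for steps that ask for more than the old limit; it won't make every step use more memory, which is why the execution report is worth checking.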
-
I've read this, https://pgsc-calc.readthedocs.io/en/latest/how-to/bigjob.html, but I am a bit confused. I see that the HPC setup has each step configured with its own resources. I am running pgsc_calc locally, with a lot of RAM and many scores. All steps are using only a very small portion (usually under 8 GB) of the total available RAM. Do I need to specify RAM usage for individual steps? For example, for a 32 GB computer, I have a custom .config file passed in as a parameter, set to something like this:
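(The numbers below are illustrative rather than copied exactly; the file just raises the standard `max_*` resource limits.)

```groovy
// custom.config for a 32 GB machine (illustrative values)
params {
    max_cpus   = 8
    max_memory = '30.GB'   // leave some headroom under the 32 GB total
    max_time   = '48.h'
}
```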
However, this does not seem to be sufficient to allow each step to use more resources. The pipeline takes a long time to finish, so I'm hoping that being able to use more RAM will speed it up.