Hi Asli,
Analyzing 2464 matrices of size 400 x 400 can be demanding in terms of memory: the raw data alone occupy roughly 2464 x 400 x 400 x 8 bytes, about 3 GB in double precision, and the working arrays created for the permutation test can grow considerably larger.
One way to save memory is to avoid pre-computation of the permutations; this will increase computation time but reduce memory use. To do this, open NBSrun.m, find the line "Limit=10^8/3;" and change it to "Limit=1;".
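In case it helps, this is roughly how the change looks in context (a sketch only; the comments are mine and the surrounding code may differ slightly between NBS versions):

    % In NBSrun.m: cap on how much NBS pre-computes and holds in memory
    % for the permutation test. The default favours speed over memory.
    %Limit=10^8/3;   % default: pre-compute permutations (fast, memory-hungry)
    Limit=1;         % minimal pre-computation: slower, but much lighter on memory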
Other than that, I would recommend splitting the sample into test and validation cohorts (a rough sketch of one way to do this follows below). This would enable you to test the replicability of your findings in the validation cohort.
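A minimal MATLAB sketch of such a split, assuming the connectivity matrices are stacked in a 400 x 400 x 2464 array called mats and group membership is coded in a 2464 x 1 vector called group (both names are hypothetical; adapt to however your data are stored):

    rng(1);                                   % fix the seed so the split is reproducible
    n    = size(mats,3);
    half = false(n,1);
    for g = 1:2                               % stratify the split by group
        idx  = find(group == g);
        idx  = idx(randperm(numel(idx)));
        half(idx(1:floor(numel(idx)/2))) = true;
    end
    mats_discovery  = mats(:,:,half);         % run NBS on this half first
    mats_validation = mats(:,:,~half);        % then test replication of significant components here
    design_discovery  = double([group(half)==1,  group(half)==2]);    % two-column design for contrast [1 -1]
    design_validation = double([group(~half)==1, group(~half)==2]);

Each half can then be submitted as its own NBS job, which also roughly halves the memory footprint of any single run.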
Kind regards,
Andrew
Originally posted by asli akdeniz:
Dear Andrew,
I'm running into persistent memory issues while using the NBS (Network-Based Statistic) toolbox in MATLAB on our Slurm cluster. In my last attempt, I requested 300 GB of memory, but the job was still killed with an out-of-memory (OOM) error.
The message was: "Killed matlab ... error: Detected 1 oom-kill event(s)".
My analysis involves 2464 connectivity matrices, each of size 400 by 400. The design contrast is [1 -1], and I'm running 5000 permutations as part of a standard two-group comparison. I'm using MATLAB R2021a and the NBS v1.2 toolbox. The job was submitted with 8 cores and 300 GB of memory, but it still failed due to memory issues. I am not entirely sure whether NBS supports parallel processing, so I’m uncertain if the additional cores I requested are actually being utilized.
I would really appreciate your suggestions on how to handle this better. Would you recommend reducing the number of permutations or splitting the analysis into batches? Or is this setup exceeding realistic memory limits for the nodes available on our system? Any guidance on how to make this analysis more efficient or more stable would be very helpful.
Thanks in advance,
Aslı
Threaded View

| Author | Date |
|---|---|
| asli akdeniz | Apr 10, 2025 |
| Andrew Zalesky | Apr 11, 2025 |
| asli akdeniz | Apr 14, 2025 |