Dear Andrew,
I'm running into persistent memory issues while using the NBS (Network-Based Statistic) toolbox in MATLAB on our Slurm cluster. In my last attempt, I requested 300 GB of memory, but the job was still killed with an out-of-memory (OOM) error.
The message was: "Killed matlab ... error: Detected 1 oom-kill event(s)".
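For reference, this is roughly how I submit the job (the module name, walltime, and script name below are placeholders rather than my exact setup):

#!/bin/bash
#SBATCH --job-name=nbs_run
#SBATCH --cpus-per-task=8
#SBATCH --mem=300G
#SBATCH --time=48:00:00

# Run the NBS driver script non-interactively; the module and
# script names are stand-ins for my actual setup
module load matlab/R2021a
matlab -nodisplay -nosplash -r "run_nbs; exit"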
My analysis involves 2464 connectivity matrices, each of size 400 by 400. The design contrast is [1 -1], and I'm running 5000 permutations as part of a standard two-group comparison, using MATLAB R2021a and the NBS v1.2 toolbox. The job was submitted with 8 cores and 300 GB of memory. I'm also not sure whether NBS supports parallel processing, so I'm uncertain whether the additional cores I requested are actually being used.
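For what it's worth, my back-of-envelope estimate of the data footprint (assuming everything is held in double precision) comes out far below 300 GB, which is part of what confuses me:

% Rough memory estimate, assuming double precision (8 bytes/element)
n_subj = 2464;                    % subjects
n_node = 400;                     % nodes per connectivity matrix
n_perm = 5000;                    % permutations
n_edge = n_node*(n_node-1)/2;     % 79,800 unique edges

full_stack = n_subj * n_node^2 * 8;   % all 400x400 matrices: ~3.2 GB
edge_mat   = n_subj * n_edge  * 8;    % edge-by-subject matrix: ~1.6 GB
null_stats = n_perm * n_edge  * 8;    % per-edge stats for every permutation,
                                      % if NBS keeps them all at once: ~3.2 GB

fprintf('full stack:  %.1f GB\n', full_stack/1e9);
fprintf('edge matrix: %.1f GB\n', edge_mat/1e9);
fprintf('null stats:  %.1f GB\n', null_stats/1e9);

Even if all three of these sat in memory at once, that is under 10 GB, so I suspect some intermediate array or per-permutation copy is what actually blows up.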
I would really appreciate your suggestions on how to handle this better. Would you recommend reducing the number of permutations or splitting the analysis into batches? Or is this setup exceeding realistic memory limits for the nodes available on our system? Any guidance on how to make this analysis more efficient or more stable would be very helpful.
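In the meantime, I was planning to test whether memory scales with the permutation count by scripting a small run outside the GUI. This is only a sketch based on my reading of NBSrun.m in v1.2, so the field names and the one-argument call may well be off:

% Scripted NBS run with a reduced permutation count, to test whether
% memory usage scales with the number of permutations.
% UI field names follow my reading of NBSrun.m (NBS v1.2) and may be
% inaccurate; the .mat paths and threshold are placeholders.
UI.method.ui     = 'Run NBS';
UI.test.ui       = 't-test';       % two-group comparison
UI.size.ui       = 'Extent';
UI.thresh.ui     = '3.1';          % placeholder primary threshold
UI.perms.ui      = '100';          % reduced from 5000 for the test
UI.alpha.ui      = '0.05';
UI.contrast.ui   = '[1,-1]';
UI.design.ui     = 'design.mat';   % placeholder path
UI.matrices.ui   = 'matrices.mat'; % placeholder path
UI.exchange.ui   = '';
UI.node_coor.ui  = '';
UI.node_label.ui = '';
NBSrun(UI);

If that test run stays flat while the 5000-permutation run explodes, that would point to the null-distribution storage rather than the data itself.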
Thanks in advance,
Aslı