Aug 15, 2015 10:08 AM | S Mody - Indian Institute of Science, Bangalore
RE: number of clusters
Originally posted by Adam Steel:
Hi Shengwei -
I'm having the same issue at the moment. Was this issue ever resolved?
Thanks
Adam
It appears that at least some of the code may not complete in a practical amount of time at higher resolutions. I used preprocessed fMRI data from the Human Connectome Project, which has a voxel resolution of (2 x 2 x 2) mm, and a cluster count of 1000. The function make_local_connectivity_tcorr() with a grey mask of about 255,000 voxels runs successfully and creates a correlation matrix with around 1,000,000 non-zero entries. However, the next step, binfile_parcellate(), effectively runs forever: the eigenvalue decomposition of the Laplacian of the correlation matrix has not converged after more than 24 hours of run time.
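For anyone wanting to see the shape of the problem, here is a small synthetic sketch (my own illustration, not the toolbox's code) of the object binfile_parcellate() ultimately has to decompose: the graph Laplacian built from the sparse voxel-connectivity matrix that make_local_connectivity_tcorr() writes out. The matrix names and sizes below are stand-ins; at full 2 mm resolution the real matrix is on the order of 255,000 x 255,000 with ~1,000,000 non-zeros.

```python
# Synthetic stand-in for the sparse connectivity matrix and its
# Laplacian; the eigendecomposition of L is the step that stalls.
import numpy as np
from scipy import sparse

n = 500                                   # tiny stand-in for ~255,000 voxels
A = sparse.random(n, n, density=0.01, random_state=0)
W = (A + A.T) * 0.5                       # symmetric "correlation" weights
W.setdiag(1.0)                            # each voxel connected to itself

d = np.asarray(W.sum(axis=1)).ravel()     # weighted degree of each voxel
L = sparse.diags(d) - W                   # unnormalized graph Laplacian

print(L.shape, L.nnz)
# Row sums of a Laplacian are exactly zero by construction:
print(np.allclose(L @ np.ones(n), 0))     # True
```

At real scale, asking an iterative solver for hundreds of eigenvectors of this matrix is where the 24+ hour run times come from.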
There is a comment in the ncut() function of the toolbox about regularizing the matrix for better stability, so it appears that the author himself may have had problems with the eigenvalue decomposition - which is not at all unexpected. Finding the eigenvectors of a large sparse matrix can be hell. Hopefully the author reads this and comments.
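I don't know exactly which regularization that comment in ncut() refers to, but two standard tricks for a stalling sparse symmetric eigensolver are (a) shifting the diagonal slightly so the singular Laplacian becomes positive definite, and (b) using shift-invert mode rather than asking ARPACK for smallest-magnitude eigenvalues directly. A hedged SciPy sketch on synthetic data:

```python
# Two generic stabilizations (the toolbox's actual regularization may
# differ): diagonal shift + shift-invert around sigma = 0.
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import eigsh

n = 300
A = sparse.random(n, n, density=0.05, random_state=1)
W = (A + A.T) * 0.5
W.setdiag(1.0)
d = np.asarray(W.sum(axis=1)).ravel()
L = sparse.diags(d) - W                   # singular: smallest eigenvalue is 0

# (a) A tiny diagonal shift makes L positive definite, which both the
#     sparse LU factorization and ARPACK handle much more gracefully.
eps = 1e-6
L_reg = L + eps * sparse.eye(n)

# (b) Shift-invert converges to the smallest eigenpairs far faster than
#     which="SM", the classic non-convergence trap for graph Laplacians.
vals, vecs = eigsh(L_reg, k=4, sigma=0, which="LM")
print(np.round(vals - eps, 8))            # smallest eigenvalue of L is ~0
```

Whether this is enough at 255,000 voxels I can't say, but it is the usual first thing to try before throwing more hours of compute at plain which="SM".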
My next step is to resample the data and grey mask to (4 x 4 x 4) mm and rerun the code.
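The back-of-envelope reason resampling should help: halving resolution in each dimension divides the in-mask voxel count by roughly 8, which shrinks both the sparse matrix and the eigenproblem. In practice one would resample with a real tool (FSL, AFNI, nilearn); the plain NumPy block-averaging below is just a sketch of the idea on a hypothetical MNI-sized grid.

```python
# Block-average a volume by a factor of 2 per axis (2 mm -> 4 mm);
# stand-in for a proper resampler, to show the ~8x reduction.
import numpy as np

def block_downsample(vol, factor=2):
    """Average non-overlapping factor^3 blocks (trims edge remainders)."""
    x, y, z = (s - s % factor for s in vol.shape)
    v = vol[:x, :y, :z]
    v = v.reshape(x // factor, factor, y // factor, factor,
                  z // factor, factor)
    return v.mean(axis=(1, 3, 5))

vol = np.ones((91, 109, 91))          # 2 mm MNI-sized grid (example)
low = block_downsample(vol, 2)
print(vol.size, "->", low.size)       # roughly 8x fewer voxels
```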
Wondering whether anyone else has had a similar experience.
Regards,
Sandeep Mody.
Threaded View

| Author | Date |
|---|---|
| Shengwei Zhang | Jan 27, 2015 |
| Daniel Lurie | Aug 17, 2015 |
| Daniel Lurie | Aug 17, 2015 |
| S Mody | Aug 15, 2015 |
| Adam Steel | Aug 13, 2015 |
| S Mody | Aug 15, 2015 |
| Shengwei Zhang | Aug 13, 2015 |