May 1, 2024  06:05 AM | Zack Shan
HCP preprocessing batch

Hi,


I am using 'CONN_BATCH_HUMANCONNECTOMEPROJECT.m' to preprocess HCP resting-state fMRI data. The script executed without errors, and the "s.nii" files look correct, but the "ds*.nii" files show no intensity variation across the 1200 volumes.


I searched the forum and found another person who had encountered the same problem, but no answer was ever posted.


I would appreciate any suggestions or help on this matter.


Thank you,


Zack

May 6, 2024  03:05 PM | Alfonso Nieto-Castanon - Boston University
RE: HCP preprocessing batch

Hi Zack,


If the ds*.nii files contain constant data (not varying at all across time), that means the denoising procedure was too aggressive and left no effective degrees of freedom in the denoised data. To investigate this further, please let me know what changes, if any, were made to your version of the conn_batch_humanconnectomeproject script. For example, if you skipped the renaming step when importing the realignment data (renaming the .txt files to .deg.txt), that data could be misinterpreted as containing rotation values in radians, while in the HCP files rotations are specified in degrees. This could in turn cause all time points to be flagged as outliers and their BOLD signal fluctuations removed from the denoised data.
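To see why this mix-up is so destructive, here is a minimal sketch of the arithmetic (the 50 mm head radius and the example rotation value are illustrative assumptions, not values from the script): a rotation entry of 1.0 that is actually in degrees, if read as 1.0 radian, corresponds to roughly 57.3 degrees, so the apparent displacement it implies explodes past any plausible outlier threshold.

```python
import math

# Illustrative sketch of the degrees-vs-radians mix-up described above.
# HEAD_RADIUS_MM is a conventional radius used to convert rotations into
# an equivalent displacement at the head surface (an assumption here).
HEAD_RADIUS_MM = 50.0
rotation_value = 1.0  # one hypothetical entry from a realignment .txt file

# Correct reading: the value is in degrees, so convert before scaling.
correct_mm = math.radians(rotation_value) * HEAD_RADIUS_MM   # ~0.87 mm

# Mistaken reading: the value is taken to already be in radians.
misread_mm = rotation_value * HEAD_RADIUS_MM                 # 50.0 mm

print(round(correct_mm, 2), round(misread_mm, 2))
```

With every time point appearing to move tens of millimetres, scrubbing regresses out essentially all of the BOLD signal, which matches the flat ds*.nii files Zack observed.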


Best


Alfonso

