Jan 28, 2025  06:01 AM | Chi-Hyeon Yoo - Athinoula A. Martinos Center for Biomedical Imaging, MGH
Analysis for macaque brain

Hello,

I have been enjoying the CONN toolbox for its powerful visualizations and well-organized QC in human FC analysis.

I would like to use these functions for NHP brain data as well, especially the preprocessing and denoising steps with their visual QC, not just the outputs.

I have a template with labels including GM, WM, and CSF, and I can do the registration outside this toolbox.

So I thought of two potential approaches:

1) I could try to swap the MNI-related template for an NHP template. However, I am not sure where to find the template file (should I look inside the SPM12 folders?), and this may be too simplistic an approach.

2) Or I could register the structural and functional data to the template before these steps, do subject-space preprocessing, and later provide GM, WM, and CSF time series exported elsewhere for denoising.

I am eager to find a way to apply CONN's great QC to NHP data, so any thoughts would be helpful.

Best,

Chi-Hyeon

Jan 28, 2025  11:01 AM | Alfonso Nieto-Castanon - Boston University
RE: Analysis for macaque brain

Hi Chi-Hyeon,

If going for option (1) (performing normalization within CONN), the file that CONN uses by default is spm/tpm/TPM.nii. This file contains the tissue probability maps for several tissue classes in humans, but there is no need to overwrite it: you may simply select an alternative TPM file during preprocessing (when prompted, change the option that reads "Use default Tissue Probability Map (SPM/TPM)" to "Custom TPM file: [select file]" and then select your new tissue probability map to be used during normalization/segmentation).
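
For reference, a minimal sketch of this option using CONN's batch interface; the file names below are placeholders for your own data, and the tpm_template preprocessing field is taken from the conn_batch documentation, so please double-check it against "help conn_batch" in your CONN version:

% Sketch only: CONN preprocessing with a custom (NHP) tissue probability map
clear batch;
batch.filename = fullfile(pwd,'conn_NHP.mat');              % CONN project file
batch.Setup.isnew     = 1;
batch.Setup.nsubjects = 1;
batch.Setup.RT        = 2;                                  % TR in seconds (example value)
batch.Setup.functionals{1}{1} = 'func_session1.nii';        % raw functional series (placeholder)
batch.Setup.structurals{1}    = 'anat.nii';                 % raw structural image (placeholder)
batch.Setup.preprocessing.steps        = 'default_mni';     % or your own list of preprocessing steps
batch.Setup.preprocessing.tpm_template = 'macaque_TPM.nii'; % custom TPM used for segmentation/normalization (assumed field name)
batch.Setup.done = 1;
conn_batch(batch);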


If going for option (2) (performing normalization outside of CONN), you may simply import the resulting normalized functional and anatomical images in the Setup.Functional and Setup.Structural tabs, and also the normalized tissue masks (for Gray matter, White matter, and CSF) in the Setup.ROIs tab, and CONN will extract the corresponding BOLD signals within those areas during the Setup step.
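
And a similarly hedged batch sketch of option (2), importing images already normalized outside CONN together with NHP tissue masks as ROIs; again, all file names are placeholders:

% Sketch only: import externally normalized data and tissue masks, then run Setup and Denoising
clear batch;
batch.filename = fullfile(pwd,'conn_NHP.mat');
batch.Setup.isnew     = 1;
batch.Setup.nsubjects = 1;
batch.Setup.RT        = 2;                                  % TR in seconds (example value)
batch.Setup.functionals{1}{1} = 'wfunc_session1.nii';       % functional data normalized to your NHP template
batch.Setup.structurals{1}    = 'wanat.nii';                % structural data normalized to your NHP template
batch.Setup.rois.names = {'Grey Matter','White Matter','CSF'};        % default tissue ROI names in CONN
batch.Setup.rois.files = {'NHP_GM.nii','NHP_WM.nii','NHP_CSF.nii'};   % tissue masks in template space
batch.Setup.done     = 1;
batch.Denoising.done = 1;   % WM/CSF signals extracted from these ROIs enter the default confound list
conn_batch(batch);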


Hope this helps


Alfonso

Jan 30, 2025  03:01 AM | Chi-Hyeon Yoo - Athinoula A. Martinos Center for Biomedical Imaging, MGH
RE: Analysis for macaque brain

Hello Alfonso,

Thank you for suggesting these approaches.

I have successfully run preprocessing and denoising on the functional data.

However, the denoised image (dsau*) somehow appears zoomed out. Before denoising, the subject-space preprocessed image (sau*) looked fine. I am not sure why the denoised image was resampled (zoomed out), since I did not specify any registration options in that step.

Could this be because I do not have a proper atlas or network template matched to the NHP image and left these at the default (MNI)? For now I am only testing the preprocessing and denoising steps, so I do not have that template ready yet.

Best,

Chi-Hyeon