Feb 28, 2020 11:02 AM | Daniel van de Velden
Denoised result in different space dimensions
Dear conn-community,
I have been using the CONN toolbox for some time now and have implemented a very robust analysis pipeline.
However, the last few of my subjects' functional datasets end up, after the denoising step, on a 256x256x256 voxel grid with a 1x1x1mm voxel size. This increases the size of my .nii files from ~30GB to over 145GB and makes them hardly readable/loadable.
The input dimensions of my structural and functional data are the same as before.
I am using an implicit brainmask correctly.
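For reference, this is how I compared the voxel grids of the denoising input and output (a minimal sketch using MATLAB's built-in niftiinfo; the file names are just placeholders, substitute your own paths):

% Compare the voxel grids of the preprocessed input and the denoised output.
% File names below are placeholders -- substitute your own paths.
info_in  = niftiinfo('swauFUNCTIONAL.nii');                    % smoothed, preprocessed run
info_out = niftiinfo('niftiDATA_Subject001_Condition000.nii'); % denoised output
fprintf('input : %s voxels at %s mm\n', ...
    mat2str(info_in.ImageSize), mat2str(double(info_in.PixelDimensions)));
fprintf('output: %s voxels at %s mm\n', ...
    mat2str(info_out.ImageSize), mat2str(double(info_out.PixelDimensions)));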
% preprocessing steps and configurations
preproc_STEPS = {'functional_art' ...
'functional_coregister_affine' ...
'structural_segment' ...
'functional_segment' ...
'functional_smooth' ...
};
batch.Setup.preprocessing.steps = preproc_STEPS;
batch.Setup.preprocessing.fwhm = 3;
batch.Setup.preprocessing.coregtomean = 1;
batch.Setup.preprocessing.art_thresholds = [3 0.5];
batch.Setup.preprocessing.removescans = params.FMRI.early_drop;
batch.Setup.preprocessing.voxelsize_anat = 1;
batch.Setup.preprocessing.voxelsize_func = 1;
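As a side note on the file sizes: resampling the functionals to 1mm is expensive by itself. A back-of-envelope sketch, under illustrative assumptions (float32 data, a hypothetical 400 volumes per run):

% Rough per-run sizes for a typical 2mm MNI grid vs. the observed 256^3 1mm grid
% (illustrative assumptions: float32 data, 400 volumes).
nvox_2mm = 91*109*91;   % typical 2mm MNI bounding box
nvox_1mm = 256^3;       % observed denoised grid
T = 400;                % hypothetical number of volumes
fprintf('2mm run: ~%.1f GB\n', nvox_2mm*4*T/1e9);   % ~1.4 GB
fprintf('1mm run: ~%.1f GB\n', nvox_1mm*4*T/1e9);   % ~26.8 GB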
% denoising steps and configurations
batch.Setup.outputfiles(2) = 1;
batch.Setup.analysisunits = 1;
batch.Setup.voxelmask = 2;
batch.Setup.voxelmaskfile = subject.brainmask;
% batch.Setup.voxelresolution = 2;
batch.Setup.analysisunits = 1;
batch.Denoising.done = 1;
batch.Denoising.filter = [0.08 0.9];
batch.Denoising.confounds.names = {'Grey Matter', 'CSF', 'White Matter', 'realignment'};
batch.Denoising.detrending = 1;
batch.Denoising.overwrite = 'Yes';
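For completeness, I run the batch in the usual way. Note that the analysis-space resolution is controlled by Setup.voxelresolution, which is commented out above; the value semantics in the sketch below are my reading of help conn_batch, so please double-check them for your CONN version:

% Sketch: pin the analysis space explicitly before running the batch.
% To my understanding: 1 = volume-based template (2mm isotropic),
% 2 = same as structurals, 3 = same as functionals (see help conn_batch).
batch.Setup.voxelresolution = 1;
conn_batch(batch);   % executes the configured Setup/Denoising steps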
My preprocessing steps produce a correctly dimensioned ("preprocessed") smoothed functional dataset, and this resulting .nii file is of course what goes into denoising. After that step the dimensions are wrong, as described above.
Does anyone know why, and/or how to fix it?
Greetings and all the best
Daniel van de Velden
Threaded View
| Author | Date |
|---|---|
| Daniel van de Velden | Feb 28, 2020 |
| Alfonso Nieto-Castanon | Feb 28, 2020 |
| Daniel van de Velden | Mar 3, 2020 |
| Alfonso Nieto-Castanon | Mar 3, 2020 |