May 17, 2022 02:05 PM | Katie Surrence
dramatic normalization failure after denoising
Hi all,
I was hoping to use the CONN denoising tools before normalization in my preprocessing pipeline for task-based fMRI, to remove a visible stripe artifact. I wanted to denoise before normalization because my preexisting normalization was OK but not great (internal features looked good, but the outline of the brain was shifted somewhat forward relative to the template). That meant two things: first, I wasn't confident the normalized white matter and CSF masks were accurate, so I didn't want to regress out their signal after normalization; and second, I hoped that denoising might actually improve normalization.
Instead, confusingly, normalization has gotten much worse. Or rather, some small aspects look better (the functional data at the front of the brain conforms better to the shape of the template), but the result is not aligned with the template at all. I am using exactly the same scripts I used to normalize before -- the only difference is that I now select the functional files with the 'bd' prefix.
Here's my whole preprocessing pipeline in SPM:
1) slice time correction
2) realignment (without writing out files)
3) use this procedure to set the origin of the functional files: https://github.com/rordenlab/spmScripts/blob/master/nii_setOrigin.m
4) hand realign the anatomical files to have the origin at the anterior commissure
5) segment and skullstrip the anatomical files
6) coregister the skullstripped file to the mean functional file, applying the same transform to the unskullstripped file
7) resegment the unskullstripped file
8) This was the point at which I ran denoising.
My preprocessing scripts run inside an idiosyncratic framework, but I'm attaching the function I wrote within it so you can see exactly what I ran. I tried to follow the example here: https://web.conn-toolbox.org/fmri-methods/denoising-pipeline#h.p_payQ7SwXgn1h
function runDenoise(ExpInfo, newdir)
% Loop over subjects: pick up the subject-space WM/CSF masks from
% segmentation, then denoise each run.
[dataPath, subjects] = fc.setup(ExpInfo);
for i = 1:length(subjects)
    subject = subjects(i);
    white = spm_select('FPList', fullfile(dataPath, subject.ID, 'anat'), ...
        '^c2rs.*\.nii$');
    csf = spm_select('FPList', fullfile(dataPath, subject.ID, 'anat'), ...
        '^c3rs.*\.nii$');
    for j = 1:length(subject.runs)
        session = subject.runs{j};
        runConn(dataPath, subject.ID, session, white, csf);
    end
end
end
function runConn(dataPath, subjID, session, white, csf)
% Denoise one run: regress out realignment, scrubbing, WM, and CSF
% signals, then high-pass filter, following the CONN denoising example.
selectors = {'^t.*\.nii', '^rp.*\.txt', '^art_regression_outliers_t.*\.mat'};
sessionPath = fullfile(dataPath, subjID, 'func', session);
files = cellfun(@(x) spm_select('FPList', sessionPath, x), ...
    selectors, 'UniformOutput', false);
[functionals, rp, art] = files{:};
conn_module('preprocessing', ...
    'functionals', functionals, ...
    'covariates', struct( ...
        'names', {{'realignment','scrubbing'}}, ...
        'files', {{rp, art}}), ...
    'masks', struct( ...
        'White', {{white}}, ...
        'CSF', {{csf}}), ...
    'steps', {'functional_regression', 'functional_bandpass'}, ...
    'reg_names', {'realignment','scrubbing','White Matter','CSF'}, ...
    'reg_dimensions', [inf, inf, 5, 5], ...
    'reg_deriv', [1, 0, 0, 0], ...
    'bp_filter', [0.008 inf]);
end
9) calculate the DARTEL flowfields
10) run DARTEL normalization
I'm attaching two pages of images. The first shows a single functional run before normalization. The first functional image is the "d" prefixed image and the second is the "bd" prefixed image. The anatomical and the template follow.
The second shows normalization failure -- that's the smoothed normalized functional image, the normalized anatomical, and the template. I also note that the anterior portion of the anatomical looks a little crinkly in a way it never did before.
I know in some ways this is a question about the failure of a preprocessing step I did outside of CONN, but given that normalization at least roughly worked when I fed it my slice-time-corrected/realigned functional images directly, I hoped someone here might have insight into what caused the problem. On visual inspection CONN was doing a good job of removing the stripe artifact, and I would like to be able to use denoising in my pipeline. Thanks in advance for your help!
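In case it helps with diagnosis: since the only difference is the 'bd' prefix, one thing worth checking is whether denoising rewrote the NIfTI headers and discarded the origin set in step 3. Below is a minimal sketch (pure Python, assuming uncompressed little-endian NIfTI-1 files; the filenames at the bottom are placeholders, not my actual files) that reads the sform affine rows straight from the header so the 'd' and 'bd' versions can be compared:

```python
import struct

def read_sform(path):
    """Return the three sform affine rows from a NIfTI-1 header."""
    with open(path, 'rb') as f:
        hdr = f.read(348)  # the NIfTI-1 header is 348 bytes
    # sizeof_hdr must be 348 for a little-endian NIfTI-1 file
    assert struct.unpack_from('<i', hdr, 0)[0] == 348, 'not little-endian NIfTI-1'
    # srow_x, srow_y, srow_z: three rows of four float32, starting at byte 280
    return [struct.unpack_from('<4f', hdr, 280 + 16 * i) for i in range(3)]

# Placeholder filenames: compare pre- and post-denoising headers for one run.
# Any difference means the affine (and hence the origin) was rewritten.
# before = read_sform('dtrun1.nii')
# after  = read_sform('bdtrun1.nii')
# print(before == after)
```

If the rows differ, a rigid shift relative to the template like the one in my images would be expected.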
Best,
Katie