May 10, 2017 04:05 PM | Matti Gärtner
large filesize after matc2nii conversion
Dear Conn Experts,
I want to extract confound-corrected time-course data, so I checked the box in Setup --> Options --> "Create confound-corrected time-series". My datasets consist of 205 images x 2 conditions, with a file size of 350 MB per condition. However, the confound-corrected NIfTI images (niftiDATA_Subject004_Condition000.nii) have a much larger file size of 1.5 GB.

Since this seemed too large, I inspected the data with spm_vol and got a structure array whose length equals the number of images (Z = 410x1 struct). Looking at size(Z(1).private.dat), I found that it refers to 4-D data containing all 410 images, and Z(2), Z(3), ... Z(n) all appear to refer to the same 4-D data. I suspect this explains the large file size.

My question: did I make a mistake somewhere, or is there a reason why the same 4-D data is saved 410 times?
Thanks a lot for your help in advance
Matti
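For reference, here is the rough size estimate I did. The assumptions are mine, not from the CONN documentation: the raw data are stored as int16 (2 bytes/voxel), CONN writes the corrected series as float32 (4 bytes/voxel), and both conditions' 205 volumes end up in a single 410-volume 4-D file.

```python
# Back-of-the-envelope check: is 1.5 GB plausible for the
# confound-corrected 4-D NIfTI file?
# Assumptions (hypothetical, not confirmed by CONN docs):
#   - raw volumes stored as int16 (2 bytes per voxel)
#   - corrected volumes written as float32 (4 bytes per voxel)
#   - all 2 x 205 = 410 volumes saved in one 4-D file

raw_mb_per_condition = 350                 # MB for 205 raw volumes
volumes_per_condition = 205
mb_per_int16_volume = raw_mb_per_condition / volumes_per_condition
mb_per_float32_volume = mb_per_int16_volume * 2   # int16 -> float32 doubles size
total_volumes = 2 * volumes_per_condition          # 410 volumes across both conditions
expected_mb = total_volumes * mb_per_float32_volume

print(round(expected_mb))  # 1400, i.e. about 1.4 GB
```

Under these assumptions the expected size is about 1.4 GB, which is close to the 1.5 GB I observe, so the size itself may be consistent with a single float32 4-D file rather than the data being duplicated 410 times.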
Threaded View

| Author | Date |
|---|---|
| Matti Gärtner | May 10, 2017 |
| Alfonso Nieto-Castanon | May 24, 2017 |