
[ENH]: Smoothing images as a preprocessing step #161

Merged: 12 commits merged into main from feat/smoothing-preprocessor on Apr 9, 2024

Conversation

@synchon (Member) commented Mar 8, 2024

Are you requiring a new dataset or marker?

  • I understand this is not a marker or dataset request

Which feature do you want to include?

One standard preprocessing step in many MRI analyses (which fMRIPrep does not apply) is smoothing images after confound regression. It would be good to have this as an additional inbuilt preprocessing step.

How do you imagine this integrated in junifer?

The easiest way will probably be to use the nilearn function for smoothing images, although one could also use AFNI; I don't expect that to yield very different results.
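
A minimal sketch of what this could look like with nilearn (the file names and the 6 mm FWHM kernel are illustrative assumptions, not junifer defaults):

```python
# Sketch: Gaussian smoothing of a post-confound-regression BOLD image
# using nilearn's smooth_img. Paths and kernel size are placeholders.
from nilearn import image

smoothed = image.smooth_img("sub-01_task-rest_bold.nii.gz", fwhm=6)
smoothed.to_filename("sub-01_task-rest_bold_smoothed.nii.gz")
```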

Do you have a sample code that implements this outside of junifer?

No response

Anything else to say?

No response

@LeSasse added the enhancement and triage labels on Dec 22, 2022
@LeSasse (Collaborator, Author) commented Feb 5, 2023

One particular smoothing function that is quite different from the others is FSL's SUSAN. It is particularly interesting because it is the smoothing function used by the XCP engine. If we do want to implement it (as an additional dependency...), the nipype function create_susan_smooth may be of interest. This blog post shows how to use it, and the command-line usage/wiki for SUSAN is here. There is also a Python wrapper for the susan command.
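
For reference, a hedged sketch of the Python-wrapper route via nipype's FSL interface (a working FSL installation is assumed; the file name and brightness threshold are illustrative):

```python
# Sketch: edge-preserving smoothing with FSL SUSAN through nipype.
# Requires FSL on the PATH; the values below are placeholders.
from nipype.interfaces import fsl

susan = fsl.SUSAN()
susan.inputs.in_file = "sub-01_task-rest_bold.nii.gz"
susan.inputs.fwhm = 6.0                     # kernel FWHM in mm
susan.inputs.brightness_threshold = 1000.0  # often ~75% of the median intensity
result = susan.run()                        # writes <in_file>_smooth.nii.gz
```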

@synchon (Member) commented Feb 14, 2023

@LeSasse Do you already have a use case for this? It will help to check the implementation(s) against real-world usage.

@synchon added the preprocess and WIP labels and removed the triage label on Feb 14, 2023
@LeSasse (Collaborator, Author) commented Feb 14, 2023

So, I have a use case where I would like to test the current pipeline without SUSAN and compare it against the results when using SUSAN. In addition, I have equivalent data preprocessed with the XCP engine, which uses SUSAN, so we can also compare against that.

@synchon (Member) commented Feb 14, 2023

Sounds good, will let you know when I have something concrete.

@synchon (Member) commented Mar 7, 2023

@LeSasse @fraimondo I think we can have either (i) one class handling the 1/2/3 "backends" (nilearn, AFNI, FSL) or (ii) three classes named accordingly. Having one class will of course bloat it but keep it in one piece. I personally prefer (ii), but if (i) is better from a usage point of view, I don't mind it either.

@synchon added this to the 0.0.4 (alpha 3) milestone on Jul 21, 2023
@synchon (Member) commented Mar 8, 2024

> @LeSasse @fraimondo I think we can have either (i) one class handling the 1/2/3 "backends" (nilearn, AFNI, FSL) or (ii) three classes named accordingly. Having one class will of course bloat it but keep it in one piece. I personally prefer (ii), but if (i) is better from a usage point of view, I don't mind it either.

Since there's no agreement or disagreement, I'll proceed with (ii).


codecov bot commented Mar 8, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 100.00%. Comparing base (c36165b) to head (9054ff4).


@@            Coverage Diff            @@
##              main      #161   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files            1         1           
  Lines            1         1           
=========================================
  Hits             1         1           
Flag   Coverage      Δ
docs   100.00% <ø>   (ø)

Flags with carried forward coverage won't be shown.

@fraimondo (Contributor) commented:

If there is no "common" API that the different "backends" share, then having one single class can be difficult (e.g., it would accumulate too many optional parameters conditional on the backend).

In that case, I would KISS with (ii).
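
As a hypothetical illustration of that bloat (class and parameter names are made up, not junifer's API), option (i) forces a single class to carry every backend's parameters:

```python
# Sketch of option (i): one class whose signature accumulates
# backend-specific options, most of which are invalid for the other backends.
class Smoothing:
    def __init__(
        self,
        using,                      # "nilearn" | "afni" | "fsl"
        fwhm=None,                  # meaningful for all backends
        brightness_threshold=None,  # FSL SUSAN only
        use_median=None,            # FSL SUSAN only
    ):
        # Cross-parameter validation grows with each backend added.
        if using == "fsl" and brightness_threshold is None:
            raise ValueError("brightness_threshold is required for FSL SUSAN")
        self.using = using
        self.fwhm = fwhm
        self.brightness_threshold = brightness_threshold
        self.use_median = use_median
```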

@fraimondo (Contributor) commented:

For me, different backends should change only the tool used, not the result. If the result is not the same, then it should not be a backend but a different class.

@synchon (Member) commented Mar 8, 2024

> For me, different backends should change only the tool used, not the result. If the result is not the same, then it should not be a backend but a different class.

After looking at the implementations, they don't do exactly the same thing, in the sense that they have different parameters which affect the result, so having distinct classes makes sense.
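
A rough sketch of option (ii) under this reasoning (class and method names are illustrative, not the final junifer API): each backend becomes its own class, exposing only the parameters its tool actually understands.

```python
# Sketch of option (ii): one preprocessor class per backend.
from nilearn import image


class NilearnSmoothing:
    """Plain Gaussian smoothing via nilearn.image.smooth_img."""

    def __init__(self, fwhm):
        self.fwhm = fwhm

    def preprocess(self, img):
        return image.smooth_img(img, fwhm=self.fwhm)


class FSLSmoothing:
    """Edge-preserving smoothing via FSL SUSAN (extra FSL dependency)."""

    def __init__(self, fwhm, brightness_threshold):
        self.fwhm = fwhm
        self.brightness_threshold = brightness_threshold
```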

@LeSasse (Collaborator, Author) commented Mar 8, 2024

> @LeSasse @fraimondo I think we can have either (i) one class handling the 1/2/3 "backends" (nilearn, AFNI, FSL) or (ii) three classes named accordingly. Having one class will of course bloat it but keep it in one piece. I personally prefer (ii), but if (i) is better from a usage point of view, I don't mind it either.
>
> Since there's no agreement or disagreement, I'll proceed with (ii).

Sounds good to me.

@synchon force-pushed the feat/smoothing-preprocessor branch from c637037 to 9054ff4 on April 9, 2024, 12:24
@synchon (Member) commented Apr 9, 2024

@LeSasse This PR works for you, right? Before merging, it would be cool to check that your pipeline comparison works as expected.

@LeSasse (Collaborator, Author) commented Apr 9, 2024

> @LeSasse This PR works for you, right? Before merging, it would be cool to check that your pipeline comparison works as expected.

Saw this just now. Yeah, it works smoothly on my side. Consider it approved!

@LeSasse (Collaborator) left a review comment

LGTM

@synchon merged commit 0fde637 into main on Apr 9, 2024
26 of 28 checks passed
@synchon deleted the feat/smoothing-preprocessor branch on April 9, 2024, 14:18