Inference operator for "Algo bundles" trained with Auto3DSeg #418
Comments
@nvahmadi Thanks for your request. Yes, we have encountered such models, along with custom app scripts, in the MONAI Model Zoo, e.g. the Lung Nodule Detection model. The inference config for these models uses steps/sections that are needed for inference but not supported by the MONAIBundleInference, i.e. anything that is not `pre_processing`, `infer`, or `post_processing`. The App SDK is designed to allow a dev user to create custom operators, while leveraging built-in ones where applicable, to compose the application. This means that a custom inference operator will need to be developed, using the code snippet shown in the issue description. DICOM input can still be processed by the built-in operators; for NIfTI input, either a new loader operator or the built-in NIfTI data loader can be used. Please check out this example for creating the deploy app for the aforementioned Lung Nodule Detection model (please note this app also integrates with the Nuance platform, so it has extra modules; I do have a MONAI Deploy native version, but it was not checked in).
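The composition approach described in this comment (built-in DICOM operators feeding a custom inference operator) can be sketched as below. To keep the sketch self-contained and runnable, `Application` and `Operator` are minimal stand-ins for the classes in `monai.deploy.core`, and the operator names are placeholders for built-ins such as `DICOMDataLoaderOperator` and `DICOMSeriesToVolumeOperator` from `monai.deploy.operators`; this is not the actual Lung Nodule Detection app.

```python
# Self-contained sketch: Application/Operator stand in for monai.deploy.core
# classes, and the three operators below are placeholders for the SDK's
# built-in DICOM operators plus the custom inference operator to be written.
class Operator:
    pass


class Application:
    def __init__(self):
        self.flows = []

    def add_flow(self, src, dst, io_map=None):
        # records that dst consumes src's output (mirrors the SDK's add_flow)
        self.flows.append((type(src).__name__, type(dst).__name__))


class DICOMLoader(Operator): pass        # stands in for DICOMDataLoaderOperator
class SeriesToVolume(Operator): pass     # stands in for DICOMSeriesToVolumeOperator
class CustomInference(Operator): pass    # the custom inference operator a dev would write


class MyApp(Application):
    def compose(self):
        loader, to_volume, infer = DICOMLoader(), SeriesToVolume(), CustomInference()
        self.add_flow(loader, to_volume)
        self.add_flow(to_volume, infer)
```

The point of the pattern is that only `CustomInference` needs new code; the DICOM handling stays with the built-in operators.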
Is your feature request related to a problem? Please describe.
I would like to create a MAP from a MONAI Algo "bundle" that was trained with Auto3DSeg. It seems that this is not supported yet, because Auto3DSeg does not produce regular bundles. For example, the `monai_bundle_inference_operator` first checks whether there is a JSON/YAML config file. But Auto3DSeg Algos are structured differently, with Python files for training/inference under the subfolder `./scripts`, and an execution model based on Python Fire.

Describe the solution you'd like
An inference operator for MONAI "bundles" that are Algo directories trained with Auto3DSeg, e.g. `monai_algo_inference_operator.py`. It would be great to be able to take trained Algos and deploy them directly. 🙂

Describe alternatives you've considered
I've spent a few hours trying to understand all the steps performed in `monai_bundle_inference_operator.py` and to implement my own operator class by inheriting from it, i.e. `class MonaiAlgoInferenceOperator(MonaiBundleInferenceOperator)`. But it was too complex, or it would have taken me too long, given certain slight differences: for example, the `MonaiBundleInferenceOperator` assumes a separation of the inference into three parts: pre-processing, compute, and post-processing. But a trained Auto3DSeg Algo object encapsulates these steps in a single inference class, see below. Maybe there is a way to re-use that and create a very simple InferenceOperator for the Deploy App SDK. Hopefully the code snippet below can help illustrate what I mean, maybe even help towards a solution.

Additional context
In principle, inference with an Algo object is easy, following this code snippet: let's assume I want to run inference with the trained fold 0 of the Algo template `segresnet`, and that I want to run it on a NIfTI file located at `image_filepath` on disk.

I still haven't fully understood the requirements for creating an Inference Operator for the Deploy App SDK. But maybe it's possible to inherit from `InferenceOperator` directly and implement a very simple inference operator like this. I know this is very amateurish, apologies 😅
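For illustration, a hedged sketch of what "inference with an Algo object" could look like. It assumes, hypothetically, that the trained Algo directory (e.g. `work_dir/segresnet_0` for fold 0 of `segresnet`) contains a `scripts/infer.py` exposing an `InferClass` whose constructor takes the Algo's config file and whose `infer()` method accepts an image file path; the actual class, method, and config-file names come from the Algo template and may differ.

```python
import sys
from pathlib import Path


def make_algo_predictor(algo_dir: str):
    """Return a callable mapping an image file path to a prediction.

    Hypothetical sketch: assumes the trained Algo's ./scripts/infer.py
    exposes InferClass(config_file=...) with an infer(image_file) method
    (names vary between Auto3DSeg templates).
    """
    scripts_dir = Path(algo_dir) / "scripts"
    sys.path.insert(0, str(scripts_dir))
    from infer import InferClass  # provided by the Algo template's scripts folder

    config_file = Path(algo_dir) / "configs" / "hyper_parameters.yaml"
    infer_obj = InferClass(config_file=str(config_file))
    return infer_obj.infer


# usage, e.g. for the trained fold 0 of the segresnet template:
# predict = make_algo_predictor("work_dir/segresnet_0")
# seg = predict(image_filepath)
```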
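And a correspondingly hedged sketch of the "very simple inference operator". To stay self-contained, `Operator` below is a stand-in for the Deploy App SDK's operator base class (the real one would also declare its inputs/outputs, e.g. via the SDK's decorators), the dict-style I/O mimics the SDK's input/output contexts, and `algo_infer` is assumed to be the Algo's own inference object as in the snippet above.

```python
class Operator:
    """Stand-in for monai.deploy.core.Operator, only to keep the sketch runnable."""
    pass


class MonaiAlgoInferenceOperator(Operator):
    """Wraps a trained Auto3DSeg Algo's inference object.

    Unlike MonaiBundleInferenceOperator, no pre/infer/post config split is
    needed: the Algo encapsulates pre-processing, inference, and
    post-processing in a single call.
    """

    def __init__(self, algo_infer):
        # algo_infer: the Algo's inference object, e.g. an InferClass instance
        # created from the template's scripts/infer.py (hypothetical name)
        super().__init__()
        self._algo_infer = algo_infer

    def compute(self, op_input, op_output, context=None):
        image = op_input["image"]             # real SDK: op_input.get("image")
        pred = self._algo_infer.infer(image)  # one call does pre/infer/post
        op_output["seg_image"] = pred         # real SDK: op_output.set(...)
```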
Looking forward to more expert suggestions!