Automated processing using fastr

The Network

For this tutorial we created a network that demonstrates one of fastr's features. In this example we use FSL's BET to calculate a brain mask. To make the method a little more robust, we perform a parameter sweep over some of BET's parameters and then combine the resulting masks into a single segmentation.

network = fastr.create_network(id="fsl_bet")

# Source images
source_image = network.create_source('DicomImageFile',
                                     id='input',
                                     step_id='input')

# Parameter sweep
fraction_threshold_values = {
    "0_20": 0.20,
    "0_35": 0.35,
    "0_50": 0.50,
    "0_65": 0.65,
    "0_80": 0.80,
}

fraction_gradient_values = {
    "-0_20": -0.20,
    "-0_10": -0.10,
    "0_00": 0.00,
    "0_10": 0.10,
    "0_20": 0.20,
}

# Create parameter sweep constant
fraction_threshold = network.create_constant('Number', fraction_threshold_values,
                                      node_group='fraction_threshold',
                                      id='fraction_threshold',
                                      step_id='brain_extraction')

fraction_gradient = network.create_constant('Number', fraction_gradient_values,
                                      node_group='fraction_gradient',
                                      id='fraction_gradient',
                                      step_id='brain_extraction')

# DICOM to NIfTI conversion
image_dicom_to_nifti = network.create_node('dcm2nii/DicomToNifti:0.1',
                                           id='image_dicom_to_nifti',
                                           tool_version='0.1',
                                           step_id='input')
image_dicom_to_nifti.inputs['dicom_image'] = source_image.output

# Use FSL BET for automated brain extraction
brain_extraction = network.create_node('fsl/Bet2:5.0.2',
                                       tool_version="0.2",
                                       id='brain_extraction',
                                       step_id='brain_extraction')

brain_extraction.inputs['image'] = image_dicom_to_nifti.outputs['image']
brain_extraction.inputs['fraction_threshold'] = fraction_threshold.output
brain_extraction.inputs['fraction_threshold'].input_group = 'fraction_threshold'

brain_extraction.inputs['fraction_gradient'] = fraction_gradient.output
brain_extraction.inputs['fraction_gradient'].input_group = 'fraction_gradient'

# Use pxcastconvert to cast the masks to unsigned char, as required by PxCombineSegmentations
cast_brainmask = network.create_node('itktools/PxCastConvert:0.3.0',
                                     tool_version="0.1",
                                     id='cast_brainmask')
cast_brainmask.inputs['image'] = brain_extraction.outputs['mask']
cast_brainmask.inputs['component_type'] = 'unsigned_char',

# Combine the BET masks into one segmentation
combine_masks = network.create_node('itktools/PxCombineSegmentations:0.3.0',
                                    tool_version="0.1",
                                    id='combine_masks',
                                    step_id='output')

combine_mask_link = network.create_link(cast_brainmask.outputs['image'], combine_masks.inputs['images'])
combine_mask_link.collapse = ['fraction_threshold', 'fraction_gradient']
combine_masks.inputs['method'] = 'VOTE',
combine_masks.inputs['number_of_classes'] = 2,

# Save brain mask
brain_mask = network.create_sink('NiftiImageFileCompressed', id='brain_mask', step_id='output')
brain_mask.inputs['input'] = combine_masks.outputs['hard_segment']

return network
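
The code above ends with return network, so it is intended to live inside a function that builds and returns the network; the platform then runs that network for you. For reference, a fastr network like this can also be executed by hand. Below is a minimal sketch, assuming the function is called create_network() and using made-up vfs:// locations for the source and sink data; neither is part of the automated workflow described here.

# Minimal sketch of running the network by hand. It assumes the code above
# is wrapped in a function called create_network() in the same file, and all
# vfs:// locations are placeholders for your own data.
network = create_network()

source_data = {
    # example location of the input DICOM data (matches the source id 'input')
    'input': 'vfs://tmp/data/t1w_dicom/',
}
sink_data = {
    # fastr sink format string; {sample_id} and {ext} are filled in per sample
    'brain_mask': 'vfs://tmp/results/{sample_id}/brain_mask{ext}',
}

network.execute(source_data, sink_data)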

Automated processing will have started shortly after pressing “Save & Finish” in the ViewR. You can check the state of the experiment using the Study Governor. To check progress, we can inspect the log file created by fastr:

cat tracr/project-data/scratch/fastr_fsl_bet_1_stdout.txt
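
If the pipeline is still running, you can also follow the log while it is being written:

tail -f tracr/project-data/scratch/fastr_fsl_bet_1_stdout.txt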

You should see all jobs being created, executed, and finished. For some extra inspection we can make use of the Pipeline Inspection Monitoring (PIM) tool, which is being developed at the LUMC (Thomas Kroes): https://pim-production.scaleout-emc.surf-hosted.nl. Because this tool is used for all pipelines, you need to look up the URL that is specific to your run. Run the above command again and look for the following snippet somewhere near the beginning, in particular the line “Run registered in PIM at <YOUR URL>”:

[INFO] networkrun:0517 >> ####################################
[INFO] networkrun:0518 >> #     network execution STARTED    #
[INFO] networkrun:0519 >> ####################################
[INFO] networkrun:0544 >> Running network via /usr/local/lib/python3.7/site-packages/fastr/api/__init__.py (last modified Thu Nov 21 08:47:17 2019)
[INFO] networkrun:0545 >> FASTR loaded from /usr/local/lib/python3.7/site-packages/fastr
[INFO] networkrun:0561 >> Network run tmpdir: /tmp/fastr_fsl_bet_1
[INFO] pimreporter:0580 >> Using PIM API version 2
[INFO] pimreporter:0462 >> Registering fsl_bet_2019-11-21T09-20-30 with PIM at https://pim-production.scaleout-emc.surf-hosted.nl/api/runs/
[INFO] pimreporter:0477 >> Run registered in PIM at https://pim-production.scaleout-emc.surf-hosted.nl/runs/fsl_bet_2019-11-21T09-20-30
[INFO]   noderun:0576 >> Creating job for node fastr:///networks/fsl_bet/0.0/runs/fsl_bet_2019-11-21T09-20-30/nodelist/input sample id <SampleId ('ANONYMIZ001_BRAIN',)>, index <SampleIndex (0)>
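
Rather than scanning the whole log by eye, you can also pull the registration line out directly:

grep "Run registered in PIM at" tracr/project-data/scratch/fastr_fsl_bet_1_stdout.txt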

Copying this URL into your browser (there might be some issues in Safari) should show you the progress of your pipeline. After the pipeline has finished you should be able to see the result in XNAT. To check, go to the experiment in XNAT and select Manage Files from the menu on the right. Under the t1w NIFTI resource you should see the ‘t1w_brainmask_fslbet.nii.gz’ and ‘t1w_brainmask_fslbet_nii_gz.json’ files that were uploaded by the pipeline.
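
If you prefer the command line over the web interface, the same file listing is available through XNAT's REST API. The host, experiment, and scan identifiers below are placeholders that you need to replace with the ones for your own session:

curl -u USERNAME "https://<your-xnat-host>/data/experiments/<EXPERIMENT_ID>/scans/<SCAN_ID>/resources/NIFTI/files?format=json"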