Deep learning for medical image segmentation



Min Tang
Vincent Zhang
Sepehr Valipour
Ryan Kiros
Dana Cobzas
Martin Jagersand

References

Tang, M., Valipour, S., Zhang, V., Cobzas, D. and Jagersand, M. A Deep Level Set Method for Image Segmentation. DLMIA@MICCAI 2017.

Kiros, R., Popuri, K., Cobzas, D. and Jagersand, M. Stacked Multiscale Feature Learning for Domain Independent Medical Image Segmentation. MLMI@MICCAI 2014.



Description

We proposed (DLMIA@MICCAI17) a novel image segmentation approach that integrates fully convolutional networks (FCNs) with a level set model. Compared with an FCN alone, the integrated method can incorporate smoothing and prior information to achieve a more accurate segmentation. Furthermore, rather than using the level set model as a post-processing tool, we integrate it into the training phase to fine-tune the FCN. This allows the use of unlabeled data during training in a semi-supervised setting.
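As a rough illustration of the level set component, the sketch below evolves a simplified Chan-Vese-style contour initialized from an FCN probability map. The energy, the Laplacian approximation of curvature, the dropped delta-function factor, and all parameter values are illustrative assumptions, not the formulation used in the paper.

```python
# Minimal sketch (not the paper's exact energy): a Chan-Vese-style level set
# update initialized from an FCN probability map.  "prob_map" and "image" are
# assumed to be 2-D arrays, with intensities normalized to [0, 1].
import numpy as np
from scipy import ndimage

def refine_with_level_set(prob_map, image, n_iters=100, dt=0.5,
                          mu=0.2, lam1=1.0, lam2=1.0):
    """Evolve a level set phi initialized from the FCN probability map."""
    # Initialize phi so it is positive inside the FCN foreground.
    phi = prob_map - 0.5

    for _ in range(n_iters):
        inside = phi > 0
        c1 = image[inside].mean() if inside.any() else 0.0      # mean inside
        c2 = image[~inside].mean() if (~inside).any() else 0.0  # mean outside

        # Smoothness prior: curvature crudely approximated with a Laplacian
        # (the full model uses div(grad phi / |grad phi|) and a delta factor).
        curvature = ndimage.laplace(phi)

        # Region terms pull the contour toward the two mean intensities.
        region = lam1 * (image - c1) ** 2 - lam2 * (image - c2) ** 2

        # Gradient-descent step on the simplified energy.
        phi = phi + dt * (mu * curvature - region)

    return (phi > 0).astype(np.float32)  # refined binary mask
```

The returned binary mask plays the role of the refined contour that, in the training loop sketched after Fig 1 below, serves as a pseudo-label for unlabeled images.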


Fig 1: Overview of the proposed FCN-levelset model. The pre-trained FCN is refined by further training with both labeled (top) and unlabeled data (bottom). The level set is initialized with the probability map produced by the pre-trained FCN and provides a refined contour for fine-tuning the FCN.
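A minimal sketch of the fine-tuning loop in Fig 1, assuming a toy two-layer stand-in for the pre-trained FCN, batches of a single image, and the hypothetical refine_with_level_set helper from the sketch above; the actual architecture, loss, and training procedure in the paper are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

fcn = nn.Sequential(                       # toy stand-in for a pre-trained FCN
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)
optimizer = torch.optim.Adam(fcn.parameters(), lr=1e-4)

def fine_tune_step(image, mask=None):
    """One update: labeled images use the ground-truth mask, unlabeled
    images use the level-set-refined contour as a pseudo-label.
    `image` is a (1, 1, H, W) float tensor; `mask` is the same shape or None."""
    prob = torch.sigmoid(fcn(image))                 # FCN probability map

    if mask is None:                                 # unlabeled branch
        refined = refine_with_level_set(
            prob.detach()[0, 0].numpy(), image[0, 0].numpy())
        mask = torch.from_numpy(refined)[None, None]

    loss = F.binary_cross_entropy(prob, mask.float())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Labeled data would be fed as fine_tune_step(image, mask), while unlabeled data would call fine_tune_step(image) and be supervised by the level-set-refined contour instead.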

We also proposed (MLMI@MICCAI14) a framework for learning features from the data itself at multiple scales and depths. Our method uses two layers of stacked dictionaries, and the resulting features can be easily integrated into classifiers or energy-based segmentation algorithms. We tested the proposed method on two MICCAI grand challenges, obtaining the top score on VESSEL12 (winning the challenge) and competitive performance on BRATS2012.


Fig 2: Visualization of our feature learning approach. Each volume slice is scaled using a Gaussian pyramid. Patches are extracted at each scale to learn a dictionary D using orthogonal matching pursuit (OMP). Convolution is performed over all scales with the dictionary filters, resulting in k feature maps. After training the first layer, the feature maps can then be used as input to a second layer.
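A minimal sketch of the first feature-learning layer in Fig 2, assuming scikit-learn's MiniBatchDictionaryLearning with OMP sparse coding; the patch size, number of atoms k, pyramid depth, and patch counts are illustrative choices rather than the settings used in the paper.

```python
import numpy as np
from scipy import ndimage, signal
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import MiniBatchDictionaryLearning

def gaussian_pyramid(img, n_scales=3):
    """Blur-and-downsample pyramid of one volume slice."""
    scales = [img]
    for _ in range(n_scales - 1):
        img = ndimage.gaussian_filter(img, sigma=1.0)[::2, ::2]
        scales.append(img)
    return scales

def learn_first_layer(volume_slice, k=16, patch=8, n_scales=3):
    scales = gaussian_pyramid(volume_slice, n_scales)

    # Collect patches from every scale and learn one dictionary D with OMP.
    patches = np.vstack([
        extract_patches_2d(s, (patch, patch), max_patches=2000,
                           random_state=0).reshape(-1, patch * patch)
        for s in scales
    ])
    patches = patches - patches.mean(axis=1, keepdims=True)  # per-patch de-mean
    dico = MiniBatchDictionaryLearning(
        n_components=k, transform_algorithm='omp',
        transform_n_nonzero_coefs=2, random_state=0).fit(patches)

    # Convolve every scale with each dictionary atom -> k feature maps per scale.
    feature_maps = [
        [signal.convolve2d(s, atom.reshape(patch, patch), mode='same')
         for atom in dico.components_]
        for s in scales
    ]
    return dico, feature_maps
```

Stacking the second layer would amount to repeating learn_first_layer with the first layer's feature maps as input, as described in the caption.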