An improvement for Multi-Source Domain Adaptation in Semantic Segmentation, applied to medical image processing.
- Use StarGAN instead of CycleGAN to improve model scalability. The task now needs only one generator and one discriminator, instead of the N generator/discriminator pairs required before.
- Use MixMatch (a semi-supervised method) to reduce the model's dependence on the amount of labeled data.
- The two source-domain datasets are collected from BraTS2018. I select the MRI-T1CE modality as the first source domain (1000 labeled images) and the MRI-T2 modality as the second (800 labeled images).
- The target-domain dataset is collected from OpenBayes and has 500 images.
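The StarGAN-style single-generator setup works by conditioning the generator on a target-domain label. A minimal sketch of one common conditioning scheme (this is an illustration of the idea, not the exact code used here): the one-hot domain label is tiled spatially and concatenated to the input image as extra channels, so one generator/discriminator pair can serve any number of domains.

```python
import numpy as np

def with_domain_label(image, domain_idx, num_domains):
    """Concatenate a spatially tiled one-hot domain label to an image.

    StarGAN-style conditioning: a single generator maps an image toward
    any of `num_domains` targets by reading the label channels, so one
    G/D pair replaces the many generators pairwise CycleGANs would need.
    """
    h, w, _ = image.shape
    onehot = np.zeros(num_domains, dtype=image.dtype)
    onehot[domain_idx] = 1.0
    label_map = np.tile(onehot, (h, w, 1))          # shape (h, w, num_domains)
    return np.concatenate([image, label_map], axis=-1)

img = np.random.rand(128, 128, 1).astype(np.float32)   # grayscale MRI slice
x = with_domain_label(img, domain_idx=2, num_domains=3)
print(x.shape)  # (128, 128, 4): 1 image channel + 3 label channels
```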
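At the heart of MixMatch is label guessing on unlabeled data: the model's predictions over K augmentations of the same unlabeled image are averaged, then sharpened with a temperature to produce a low-entropy pseudo-label. A minimal numpy sketch of that step (function names are mine, not from the repository):

```python
import numpy as np

def sharpen(p, T=0.5):
    """Temperature sharpening used by MixMatch on guessed labels."""
    p = p ** (1.0 / T)
    return p / p.sum(axis=-1, keepdims=True)

def guess_labels(preds_per_aug, T=0.5):
    """Average predictions over K augmentations of one unlabeled image,
    then sharpen the average toward a confident pseudo-label."""
    mean_pred = np.mean(preds_per_aug, axis=0)
    return sharpen(mean_pred, T)

# Softmax outputs for two augmentations of the same sample (3 classes)
preds = np.array([[0.6, 0.3, 0.1],
                  [0.5, 0.4, 0.1]])
q = guess_labels(preds, T=0.5)
print(q)  # sharper than the plain average [0.55, 0.35, 0.10]
```

MixMatch then mixes these pseudo-labeled examples with labeled ones via mixup before computing the supervised and consistency losses.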
I transferred the two source-domain image sets to the target domain. Here are two samples.
I compare model performance in three settings: without DA, with DA, and with DA+MixMatch. The results are shown in the table below; the full method achieves the best performance.
Here are two segmentation samples.
I implement the code with TensorFlow and Keras. Here is my environment:
- Python 3.7
- Tensorflow 1.14.0
- Keras 2.3.1
- numpy 1.17.2
- opencv 4.1.1.26
I implement my code based on this CycleGAN code. I use the LSGAN loss and a PatchGAN discriminator.
Initialize the dataset with this function. If your image size differs from the image shape in config.py, set the parameter need_resize=1.
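For reference, the LSGAN objective replaces the usual cross-entropy GAN loss with least-squares terms, and with a PatchGAN the discriminator outputs a grid of per-patch scores rather than a single scalar. A small numpy sketch of those losses (an illustration of the standard formulation, assuming real patches target 1 and fake patches target 0):

```python
import numpy as np

def lsgan_d_loss(d_real, d_fake):
    """Least-squares GAN discriminator loss (targets: real=1, fake=0).

    With a PatchGAN, d_real/d_fake are per-patch score maps
    (e.g. 16x16); the mean runs over all patches.
    """
    return 0.5 * np.mean((d_real - 1.0) ** 2) + 0.5 * np.mean(d_fake ** 2)

def lsgan_g_loss(d_fake):
    """Generator loss: push fake patch scores toward 1."""
    return 0.5 * np.mean((d_fake - 1.0) ** 2)

d_real = np.full((16, 16), 0.9)   # patch scores for a real image
d_fake = np.full((16, 16), 0.2)   # patch scores for a generated image
print(lsgan_d_loss(d_real, d_fake))  # 0.5*0.01 + 0.5*0.04 = 0.025
print(lsgan_g_loss(d_fake))          # 0.5*0.64 = 0.32
```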
self.target=Dataset(image_path='./OpenBayes',label_path='./Openbayes_label',need_resize=1)
Initialize your dataset with the right paths in the init function in model.py.
Start training:
python model.py
Initialize the labeled and unlabeled datasets with these functions. If your image size differs from the image shape in config.py, set the parameter need_resize=1.
ldataSet=DataSet(image_path='./unet2_image',label_path='./unet2_label')
udataSet=TestSet(image_path='./OpenBayes',label_path='./Openbayes_label',need_resize=1)
Start training:
python cycle_mixmatch.py
Start evaluation:
python evalution.py
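The evaluation script presumably reports a segmentation overlap metric; the Dice coefficient is the usual choice for brain-tumor segmentation (this is an assumption about the metric, and the function below is my own sketch, not code from evalution.py):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice score for binary masks: 2*|A & B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

pred   = np.array([[1, 1, 0, 0]])
target = np.array([[1, 0, 1, 0]])
print(dice_coefficient(pred, target))  # 2*1 / (2+2) = 0.5
```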
- [1] MixMatch: A Holistic Approach to Semi-Supervised Learning (arXiv)
- [2] StarGAN: Unified Generative Adversarial Networks for Multi-Domain Image-to-Image Translation (full paper)