Recursive Deformable Image Registration Network with Mutual Attention

Deformable image registration, which estimates the spatial transformation between different images, is an important task in medical imaging. Many previous studies have used multi-stage learning-based methods for 3D image registration to improve performance. The performance of multi-stage approaches, however, is limited by the size of the receptive field, because complex motion does not occur at a single spatial scale. We propose a new registration network that combines a recursive network architecture with a mutual attention mechanism to overcome these limitations. Compared with state-of-the-art deep learning methods, our recursive network achieves the highest accuracy on a lung Computed Tomography (CT) dataset (Dice score of 92% and average surface distance of 3.8 mm for the lungs) and among the most accurate results on an abdominal CT dataset containing nine organs of various sizes (Dice score of 55% and average surface distance of 7.8 mm). We also show that three recursive networks are sufficient to achieve state-of-the-art results without a significant increase in inference time.
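The abstract describes a recursive (multi-stage) registration cascade with a mutual attention mechanism. Below is a minimal PyTorch sketch of the recursive part only, assuming a toy stand-in CNN in place of the paper's mutual-attention registration stage: the same stage is applied repeatedly, each pass predicting a residual displacement field that is accumulated before the original moving image is re-warped. All names here (`TinyRegNet`, `recursive_register`, `warp`) are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of recursive cascading for deformable registration.
# Assumption: single-channel 3D volumes and an additive approximation
# to displacement-field composition; the paper's mutual-attention
# stage is replaced by a tiny CNN stand-in.
import torch
import torch.nn as nn
import torch.nn.functional as F


def warp(image, flow):
    """Warp a volume (N, C, D, H, W) with a voxel-space displacement field (N, 3, D, H, W)."""
    n, _, d, h, w = image.shape
    # Identity grid in voxel coordinates (channel order x, y, z)
    zz, yy, xx = torch.meshgrid(
        torch.arange(d), torch.arange(h), torch.arange(w), indexing="ij")
    grid = torch.stack((xx, yy, zz), dim=0).float().to(image.device)   # (3, D, H, W)
    coords = grid.unsqueeze(0) + flow                                   # displaced coordinates
    # Normalise to [-1, 1] as required by grid_sample
    sizes = torch.tensor([w, h, d], device=image.device).view(1, 3, 1, 1, 1).float()
    coords = 2.0 * coords / (sizes - 1) - 1.0
    coords = coords.permute(0, 2, 3, 4, 1)                              # (N, D, H, W, 3)
    return F.grid_sample(image, coords, align_corners=True)


class TinyRegNet(nn.Module):
    """Stand-in for one registration stage; predicts a residual displacement field."""
    def __init__(self, ch=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(2, ch, 3, padding=1), nn.ReLU(),
            nn.Conv3d(ch, ch, 3, padding=1), nn.ReLU(),
            nn.Conv3d(ch, 3, 3, padding=1))

    def forward(self, moving, fixed):
        return self.net(torch.cat([moving, fixed], dim=1))


def recursive_register(model, moving, fixed, steps=3):
    """Apply the same stage recursively, accumulating residual flows."""
    flow = torch.zeros(moving.size(0), 3, *moving.shape[2:], device=moving.device)
    warped = moving
    for _ in range(steps):
        flow = flow + model(warped, fixed)   # residual update (additive approximation)
        warped = warp(moving, flow)          # always re-warp the original moving image
    return warped, flow
```

As a usage example, a three-step recursion (matching the "three recursive networks" setting reported in the abstract) would be `recursive_register(TinyRegNet(), moving, fixed, steps=3)` with `moving` and `fixed` of shape `(1, 1, D, H, W)`.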


Datasets

Unpaired-lung-CT, Unpaired-abdomen-CT

Results

Task               | Dataset             | Model | Metric   | Value | Global Rank
-------------------|---------------------|-------|----------|-------|------------
Image Registration | Unpaired-abdomen-CT | Dnet  | DSC      | 0.47  | #9
Image Registration | Unpaired-abdomen-CT | Dnet  | ASD (mm) | 8.72  | #4
Image Registration | Unpaired-abdomen-CT | RMAn  | DSC      | 0.55  | #6
Image Registration | Unpaired-abdomen-CT | RMAn  | ASD (mm) | 7.78  | #3
Image Registration | Unpaired-lung-CT    | Dnet  | DSC      | 0.88  | #4
Image Registration | Unpaired-lung-CT    | Dnet  | ASD (mm) | 5.01  | #4
Image Registration | Unpaired-lung-CT    | RMAn  | DSC      | 0.92  | #2
Image Registration | Unpaired-lung-CT    | RMAn  | ASD (mm) | 3.83  | #3

DSC = Dice similarity coefficient; ASD = average surface distance.
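For reference, a small NumPy/SciPy sketch of how the two reported metrics are commonly computed from binary segmentation masks. This is my own illustration (the helper names `dice` and `average_surface_distance` are assumptions, not the benchmark's evaluation code); ASD is returned in millimetres when the voxel spacing is given in millimetres.

```python
# Sketch of the two metrics in the table: Dice similarity coefficient (DSC)
# and a symmetric average surface distance (ASD) between binary masks.
import numpy as np
from scipy import ndimage


def dice(a, b):
    """DSC = 2|A ∩ B| / (|A| + |B|) for boolean masks a and b."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())


def average_surface_distance(a, b, spacing=(1.0, 1.0, 1.0)):
    """Symmetric mean distance between the surfaces of masks a and b."""
    a, b = a.astype(bool), b.astype(bool)
    # Surface voxels: mask minus its binary erosion
    surf_a = a ^ ndimage.binary_erosion(a)
    surf_b = b ^ ndimage.binary_erosion(b)
    # Distance (scaled by voxel spacing) from every voxel to the other surface
    dist_to_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing)
    dist_to_a = ndimage.distance_transform_edt(~surf_a, sampling=spacing)
    return 0.5 * (dist_to_b[surf_a].mean() + dist_to_a[surf_b].mean())
```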

Methods


No methods listed for this paper.