In Vivo Super-Resolution Ultrasound Imaging

S. Harput, K. Christensen-Jeffries, J. Brown, Y. Li, K.J. Williams, A.H. Davies, R.J. Eckersley, C. Dunsby, and M-X. Tang, “Two-stage Motion Correction for Super-Resolution Ultrasound Imaging in Human Lower Limb”

IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control (vol. 65, no. 5, pp. 803-814, 2018)

DOI: 10.1109/TUFFC.2018.2824846


S. Harput, Y. Li, and M. X. Tang: ULIS Group, Department of Bioengineering, Imperial College London, London, SW7 2AZ, UK
K. Christensen-Jeffries, J. Brown, and R. J. Eckersley: Biomedical Engineering Department, Division of Imaging Sciences, King’s College London, SE1 7EH, London, UK
K. J. Williams and A. H. Davies: Section of Surgery, Imperial College London, Charing Cross Hospital, London, UK
C. Dunsby: Department of Physics and the Centre for Pathology, Imperial College London, London, SW7 2AZ, UK



The structure of the microvasculature cannot be resolved with conventional ultrasound imaging because of the fundamental diffraction limit at clinical ultrasound frequencies. This limit can be overcome by localizing individual microbubbles across multiple frames and forming a super-resolved image, which typically requires seconds to minutes of acquisition. Over this interval, motion is inevitable, and tissue movement is typically a combination of large- and small-scale translation and deformation. Super-resolution imaging is therefore prone to motion artefacts, as are other imaging modalities built from multiple acquisitions.
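The localization principle described above can be sketched in a few lines: when a frame contains an isolated, diffraction-limited bubble echo, its centre can be estimated to sub-pixel precision, and accumulating those positions over many frames builds a map far finer than the diffraction limit. The sketch below is purely illustrative, not the paper's pipeline; the Gaussian blob model, intensity-weighted centroid localizer, and 8x upsampled accumulation grid are all assumptions chosen for simplicity.

```python
import numpy as np

def gaussian_psf(shape, center, sigma=3.0):
    """Diffraction-limited image of one bubble, modelled as a Gaussian blob
    (an illustrative stand-in for the real point-spread function)."""
    y, x = np.mgrid[:shape[0], :shape[1]]
    return np.exp(-((x - center[1]) ** 2 + (y - center[0]) ** 2) / (2.0 * sigma ** 2))

def localize(frame):
    """Sub-pixel bubble position from the intensity-weighted centroid."""
    y, x = np.mgrid[:frame.shape[0], :frame.shape[1]]
    w = frame / frame.sum()
    return float((w * y).sum()), float((w * x).sum())

def super_resolve(bubble_positions, shape=(64, 64), upsample=8):
    """Accumulate one localization per frame onto a finer grid,
    building the super-resolved image over many frames."""
    sr = np.zeros((shape[0] * upsample, shape[1] * upsample))
    for pos in bubble_positions:
        frame = gaussian_psf(shape, pos)          # simulated acquired frame
        cy, cx = localize(frame)                  # sub-pixel localization
        sr[int(round(cy * upsample)), int(round(cx * upsample))] += 1
    return sr
```

Even though each simulated blob is several pixels wide, the centroid recovers its centre to a small fraction of a pixel, which is the mechanism that lets the accumulated image resolve structure below the diffraction limit.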

This study investigates the feasibility of a two-stage motion estimation method, combining affine and non-rigid registration, for super-resolution ultrasound imaging. First, the motion correction accuracy of the proposed method is evaluated in simulations of increasing motion complexity; a mean absolute error of 12.2 μm was achieved in the worst-case scenario. The motion correction algorithm was then applied to a clinical dataset to demonstrate its potential to enable in vivo super-resolution ultrasound imaging in the presence of patient motion. The sizes of the microvessels identified in the clinical super-resolution images were measured to assess the feasibility of the two-stage method, which reduced the width of the motion-blurred microvessels approximately 1.5-fold.


June 2018 front cover image of the IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control


The figure shows super-resolution ultrasound images of the microvasculature in a healthy human leg before and after motion correction. A Philips iU22 ultrasound scanner was used with a handheld 6 MHz linear array probe. Contrast-enhanced ultrasound frames were acquired using a diluted contrast agent solution, and localization-based super-resolution images were created by processing contrast-mode frames acquired in under one minute. Two-stage motion correction was applied to the clinical dataset in the presence of patient motion; the method combines affine image registration, which estimates global motion, with non-rigid image registration, which estimates the local deformation of tissue. The effect of motion correction is presented in eight super-resolution image pairs, displayed in two columns: the left-hand image of each pair is without motion correction and the right-hand image is with motion correction.
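As a rough illustration of the two-stage idea, the sketch below first estimates a single global translation by FFT cross-correlation (a much-simplified stand-in for the paper's affine stage) and then estimates a per-block residual shift field (a crude stand-in for the non-rigid stage). All function names, the block size, and the translation-only model are assumptions for illustration; the paper's actual registration is affine plus non-rigid, not pure translation.

```python
import numpy as np

def estimate_shift(ref, moving):
    """Integer translation that aligns `moving` to `ref`, found as the peak
    of the circular cross-correlation computed via the FFT."""
    xcorr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Wrap peaks beyond half the frame size into negative shifts.
    return tuple(int(p - s) if p > s // 2 else int(p)
                 for p, s in zip(peak, xcorr.shape))

def correct_two_stage(ref, moving, block=16):
    """Stage 1: remove the global shift (stand-in for affine registration).
    Stage 2: estimate a residual per-block shift field (crude stand-in for
    non-rigid registration of local tissue deformation)."""
    dy, dx = estimate_shift(ref, moving)
    stage1 = np.roll(moving, (dy, dx), axis=(0, 1))
    field = {}
    for by in range(0, ref.shape[0], block):
        for bx in range(0, ref.shape[1], block):
            r = ref[by:by + block, bx:bx + block]
            m = stage1[by:by + block, bx:bx + block]
            field[(by, bx)] = estimate_shift(r, m)
    return stage1, (dy, dx), field
```

In this toy form the residual field is all zeros once the global shift is removed; in vivo, the second stage is what absorbs the local deformation that a global transform alone cannot explain.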


Supporting Bodies:

This work was supported in part by the EPSRC under Grant EP/N015487/1 and EP/N014855/1, in part by the King’s College London and Imperial College London EPSRC Centre for Doctoral Training in Medical Imaging (EP/L015226/1), in part by the Wellcome EPSRC Centre for Medical Engineering at King’s College London (WT 203148/Z/16/Z), in part by the Department of Health through the National Institute for Health Research comprehensive Biomedical Research Centre Award to Guy’s and St Thomas’ NHS Foundation Trust in partnership with King’s College London and King’s College Hospital NHS Foundation Trust, and in part by the Graham-Dixon Foundation.