Neural-network-based motion tracking for breast ultrasound strain elastography: An initial assessment of performance and feasibility

Document Type

Article

Publication Date

3-1-2020

Department

Department of Computer Science

Abstract

Accurate tracking of tissue motion is critically important for several ultrasound elastography methods. In this study, we investigate the feasibility of using three published convolutional neural network (CNN) models built for optical flow (hereafter referred to as CNN-based tracking) by the computer vision community for breast ultrasound strain elastography. Elastographic datasets produced by finite element and ultrasound simulations were used to retrain three published CNN models: FlowNet-CSS, PWC-Net, and LiteFlowNet. After retraining, the three improved CNN models were evaluated using computer-simulated phantoms, tissue-mimicking phantoms, and in vivo breast ultrasound data. CNN-based tracking results were compared with two published two-dimensional (2D) speckle tracking methods: the coupled tracking and GLobal Ultrasound Elastography (GLUE) methods. Our preliminary data showed that, based on Wilcoxon rank-sum tests, the improvements due to retraining were statistically significant (p < 0.05) for all three CNN models. We also found that the PWC-Net model was the best neural network model for the data investigated, and its overall performance was on par with that of the coupled tracking method. Contrast-to-noise ratio (CNR) values estimated from in vivo axial and lateral strain elastograms showed that the GLUE algorithm outperformed both the retrained PWC-Net model and the coupled tracking method, though the GLUE algorithm exhibited some biases. The PWC-Net model was also able to achieve approximately 45 frames/second for 2D speckle tracking on the data investigated.
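The abstract reports CNR values from strain elastograms and Wilcoxon rank-sum tests on retraining improvements. As an illustrative sketch only (the paper's exact CNR definition and region selection are not given here), the snippet below computes CNR between an inclusion and background region of a synthetic strain image, using a common elastography definition, CNR = sqrt(2(mu_b - mu_t)^2 / (sigma_b^2 + sigma_t^2)), and runs a rank-sum test on two illustrative score samples with SciPy.

```python
import numpy as np
from scipy.stats import ranksums

def cnr(target, background):
    """Contrast-to-noise ratio between a target (inclusion) region and a
    background region of a strain image. Uses a common elastography form:
    CNR = sqrt(2 * (mu_b - mu_t)^2 / (sigma_b^2 + sigma_t^2)).
    (Assumed definition; the paper may use a different convention.)"""
    mu_t, mu_b = np.mean(target), np.mean(background)
    var_t, var_b = np.var(target), np.var(background)
    return np.sqrt(2.0 * (mu_b - mu_t) ** 2 / (var_b + var_t))

# Synthetic strain image: a stiff inclusion (lower strain) in a softer background.
rng = np.random.default_rng(0)
strain = rng.normal(0.02, 0.002, size=(128, 128))              # background strain
strain[48:80, 48:80] = rng.normal(0.01, 0.002, size=(32, 32))  # inclusion strain

target = strain[48:80, 48:80].ravel()
background = strain[:32, :32].ravel()
print(f"CNR = {cnr(target, background):.2f}")

# Wilcoxon rank-sum test comparing two sets of per-frame quality scores,
# e.g. before vs. after retraining (values here are purely illustrative).
before = rng.normal(0.70, 0.05, size=30)
after = rng.normal(0.78, 0.05, size=30)
stat, p = ranksums(before, after)
print(f"Wilcoxon rank-sum p-value = {p:.4g}")
```

The rank-sum test is nonparametric, so it makes no normality assumption about the per-frame scores, which matches its use for comparing tracking quality before and after retraining.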

Publication Title

Ultrasonic Imaging
