Automated Tumor and FUS Lesion Quantification on Multi-frequency Harmonic Motion and B-mode Imaging Using a Multi-modality Neural Network

Shiqi Hu, Yangpei Liu, Xiaoyue Judy Li et al.

bioRxiv (Cold Spring Harbor Laboratory), Journal Year: 2024, Volume and Issue: unknown

Published: Oct. 3, 2024

Abstract

Harmonic Motion Imaging (HMI) is an ultrasound elasticity imaging method that measures the mechanical properties of tissue using amplitude-modulated acoustic radiation force (AM-ARF). By estimating the tissue's on-axis oscillatory motion, HMI-derived displacement images represent localized relative stiffness and can predict tumor response to neoadjuvant chemotherapy (NACT) and monitor focused ultrasound (FUS) ablation therapy. Multi-frequency HMI (MF-HMI) excites tissue at various AM frequencies simultaneously, which allows for image optimization without prior knowledge of the inclusion size or stiffness. However, challenges remain in size estimation, as inconsistent boundary effects result in different perceived sizes across frequencies. Herein, we developed automated tumor and FUS lesion quantification using a transformer-based multi-modality neural network, HMINet. It was trained on 380 pairs of MF-HMI and B-mode images of phantoms and in vivo orthotopic breast cancer mice (4T1). Test datasets included phantoms (n = 32), 4T1 mice (n = 24), breast cancer patients (n = 16), and FUS-induced lesions, with average segmentation accuracies (Dice Similarity Score) of 0.95, 0.86, 0.82, and 0.87, respectively. To increase the generalizability of HMINet, we applied a transfer learning strategy, i.e., fine-tuning the model with patient data. For NACT patients, displacement ratios (DR) between the tumor and the surrounding tissue were calculated based on HMINet-segmented boundaries to track stiffness changes.
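The segmentation accuracies above are reported as Dice Similarity Scores. As a point of reference (not the authors' code), the metric can be sketched for binary masks as follows; the function name and the toy 10x10 masks are illustrative assumptions:

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8) -> float:
    """Dice Similarity Coefficient between two binary segmentation masks:
    2 * |A ∩ B| / (|A| + |B|). Returns a value in [0, 1]."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    # eps guards against division by zero when both masks are empty
    return 2.0 * intersection / (pred.sum() + truth.sum() + eps)

# Toy example: two partially overlapping square "lesion" masks on a 10x10 grid
pred = np.zeros((10, 10))
pred[2:6, 2:6] = 1    # predicted lesion (16 pixels)
truth = np.zeros((10, 10))
truth[3:7, 3:7] = 1   # ground-truth lesion (16 pixels, 9-pixel overlap)
print(round(dice_score(pred, truth), 3))  # → 0.562
```

A score of 1.0 means perfect overlap with the ground-truth boundary, so the reported values (0.82 to 0.95) indicate substantial agreement with manual segmentations.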

Language: English


Citations: 1