TY - JOUR
T1 - A Deep Learning Approach for Segmentation, Classification, and Visualization of 3-D High-Frequency Ultrasound Images of Mouse Embryos
AU - Qiu, Ziming
AU - Xu, Tongda
AU - Langerman, Jack
AU - Das, William
AU - Wang, Chuiyu
AU - Nair, Nitin
AU - Aristizabal, Orlando
AU - Mamou, Jonathan
AU - Turnbull, Daniel H.
AU - Ketterling, Jeffrey A.
AU - Wang, Yao
N1 - Funding Information:
Manuscript received December 31, 2020; accepted March 17, 2021. Date of publication March 23, 2021; date of current version June 29, 2021. This work was supported in part by the NIH under Grant EB022950 and Grant HD097485. (Corresponding author: Yao Wang.) Ziming Qiu, Tongda Xu, Nitin Nair, and Yao Wang are with the Department of Electrical and Computer Engineering, New York University, Brooklyn, NY 11201 USA (e-mail: zq415@nyu.edu; x.tongda@nyu.edu; nn1174@nyu.edu; yw523@nyu.edu).
Publisher Copyright:
© 2021 IEEE.
PY - 2021/7
Y1 - 2021/7
AB - Segmentation and mutant classification of high-frequency ultrasound (HFU) mouse embryo brain ventricle (BV) and body images can provide valuable information for developmental biologists. However, manual segmentation and identification of the BV and body require substantial time and expertise. This article proposes an accurate, efficient, and explainable deep learning pipeline for automatic segmentation and classification of the BV and body. For segmentation, a two-stage framework is implemented. The first stage produces a low-resolution segmentation map, which is then used to crop a region of interest (ROI) around the target object and serves as the autocontext probability-map input for the second-stage fine-resolution refinement network. Segmentation thus becomes tractable on high-resolution 3-D images without time-consuming sliding windows. The proposed segmentation method significantly reduces inference time (from 102.36 to 0.09 s/volume, roughly 1000× faster) while maintaining accuracy comparable to previous sliding-window approaches. Based on the BV and body segmentation map, a volumetric convolutional neural network (CNN) is trained to perform a mutant classification task. By backpropagating the gradients of the predictions to the input BV and body segmentation map, the trained classifier is found to focus largely on the region where the Engrailed-1 (En1) mutation phenotype is known to manifest itself. This suggests that gradient backpropagation of deep learning classifiers may provide a powerful tool for automatically detecting unknown phenotypes associated with a known genetic mutation.
KW - Classification and visualization
KW - deep learning
KW - high-frequency ultrasound (HFU)
KW - image segmentation
KW - mouse embryo
KW - Neural Networks, Computer
KW - Animals
KW - Image Processing, Computer-Assisted
KW - Ultrasonography
KW - Mice
KW - Imaging, Three-Dimensional
KW - Deep Learning
UR - http://www.scopus.com/inward/record.url?scp=85103238757&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85103238757&partnerID=8YFLogxK
U2 - 10.1109/TUFFC.2021.3068156
DO - 10.1109/TUFFC.2021.3068156
M3 - Article
C2 - 33755564
AN - SCOPUS:85103238757
SN - 0885-3010
VL - 68
SP - 2460
EP - 2471
JO - IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control
JF - IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control
IS - 7
M1 - 9383281
ER -