Yufang Jiang1, Haiyan Liu1, *, Linyan Ke2
1Department of Ultrasonography, Changxing County People's Hospital, Huzhou 313100, Zhejiang Province, China - 2Department of Ultrasound, Lishui Municipal People's Hospital, Lishui 323000, Zhejiang Province, China
To explore the application value of ultrasonic image features based on a deep residual network (ResNet) in Breast Imaging Reporting and Data System (BI-RADS) classification and prognosis of breast cancer, 60 patients with breast cancer were selected and divided evenly into a control group (traditional manual feature extraction) and an experimental group (image feature extraction based on a deep ResNet). The results showed that the region of interest (ROI) extracted by the deep ResNet model was more accurate than that obtained by traditional manual extraction. For BI-RADS categories 3, 4, 5, and 6, the experimental group's accuracy (0.75, 0.79, 0.82, and 0.89, respectively), recall (0.81, 0.78, 0.93, and 0.77, respectively), and F1 scores (0.76, 0.78, 0.88, and 0.84, respectively) were all higher than those of the control group (accuracy: 0.57, 0.67, 0.74, and 0.68; recall: 0.65, 0.65, 0.69, and 0.68; F1: 0.59, 0.61, 0.73, and 0.65, respectively). Likewise, for subcategories 4a, 4b, and 4c, the experimental group's accuracy (0.73, 0.78, and 0.82, respectively), recall (0.78, 0.78, and 0.80, respectively), and F1 scores (0.71, 0.74, and 0.73, respectively) were also higher than those of the control group (accuracy: 0.58, 0.64, and 0.69; recall: 0.67, 0.70, and 0.66; F1: 0.65, 0.63, and 0.66, respectively) (P < 0.05). Breast tumor size was closely related to lymph node metastasis and patient prognosis. In conclusion, deep learning-based ultrasonic image features of breast cancer have notable application value in BI-RADS classification and prognosis assessment.
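The per-category recall and F1 scores reported above follow the standard one-vs-rest definitions over the confusion matrix. As a minimal sketch (not the authors' evaluation code, and using hypothetical labels purely for illustration), the metrics for each BI-RADS category can be computed as follows:

```python
def per_class_metrics(y_true, y_pred, label):
    """One-vs-rest precision, recall, and F1 for a single BI-RADS category."""
    # True positives: predicted as this category and actually this category.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
    # False positives: predicted as this category but actually another.
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
    # False negatives: actually this category but predicted as another.
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical ground-truth and predicted BI-RADS categories for 8 lesions.
y_true = ["3", "3", "4", "4", "5", "5", "5", "3"]
y_pred = ["3", "4", "4", "4", "5", "5", "3", "3"]
for c in ("3", "4", "5"):
    p, r, f = per_class_metrics(y_true, y_pred, c)
    print(f"BI-RADS {c}: precision={p:.2f} recall={r:.2f} F1={f:.2f}")
```

The same figures can be obtained with `sklearn.metrics.precision_recall_fscore_support`; the explicit version is shown only to make the per-category definitions unambiguous.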
Keywords: breast cancer, ultrasonic image, BI-RADS classification, deep ResNet, support vector machine, prognosis of breast cancer.