
CN113768544A - Ultrasonic imaging method and equipment for mammary gland - Google Patents


Info

Publication number
CN113768544A
CN113768544A
Authority
CN
China
Prior art keywords
rads
feature
value set
breast
ultrasonic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110968952.2A
Other languages
Chinese (zh)
Inventor
姜玉新
王红燕
李建初
徐雯
安兴
朱磊
董多
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mindray Bio Medical Electronics Co Ltd
Peking Union Medical College Hospital Chinese Academy of Medical Sciences
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Peking Union Medical College Hospital Chinese Academy of Medical Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd and Peking Union Medical College Hospital Chinese Academy of Medical Sciences
Priority to CN202110968952.2A
Publication of CN113768544A
Legal status: Pending


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08: Clinical applications
    • A61B8/0825: Clinical applications for diagnosis of the breast, e.g. mammography
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Radiology & Medical Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract



Embodiments of the present invention provide a breast ultrasound imaging method and device. The method includes: acquiring an ultrasound image of a breast region of a subject; analyzing the ultrasound image with a pre-trained intelligent breast-lesion analysis model to obtain a first feature value set corresponding to the values of the BI-RADS feature set of a breast lesion in the subject's breast region; detecting the user's modification or confirmation of the first feature value set to obtain a second feature value set; and determining the BI-RADS grade of the breast lesion from the first feature value set, the second feature value set, and the ultrasound image. When determining the BI-RADS grade of a breast lesion, the method uses not only the ultrasound image of the breast region but also the user's feedback, improving the accuracy of the BI-RADS grading.


Description

Ultrasonic imaging method and equipment for mammary gland
Technical Field
The embodiment of the invention relates to the technical field of medical image processing, in particular to an ultrasonic imaging method and equipment for mammary glands.
Background
Breast cancer is a malignant tumor arising from mammary epithelial tissue, and cancer statistics show that it ranks first in incidence among malignant tumors in women, so early screening for breast cancer is particularly important. Breast ultrasound can clearly display the position, morphology, and internal structure of each soft-tissue layer of the breast, lesions within those tissues, and changes in adjacent tissues. It is economical, convenient, non-invasive, painless, radiation-free, and highly repeatable, and has therefore become one of the important means of early breast cancer screening.
The signs of breast lesions are complex, and the diagnostic standard most widely used and most authoritative in clinical practice is the Breast Imaging Reporting and Data System (BI-RADS) proposed by the American College of Radiology (ACR). BI-RADS uses unified, professional terminology to describe and classify lesions, but its diagnostic rules are numerous and complex, and they are difficult for junior physicians or doctors in primary hospitals to remember, which reduces clinicians' diagnostic efficiency. With the rapid development of artificial intelligence, especially deep learning, computer-aided diagnosis can analyze breast ultrasound images intelligently, providing clinicians with an automatic and efficient auxiliary diagnosis tool of great clinical value. Although existing artificial-intelligence-based breast ultrasound analysis methods and systems help improve clinicians' diagnostic efficiency, they usually take only image information as input, so the accuracy of their auxiliary diagnoses still needs improvement.
Disclosure of Invention
The embodiments of the invention provide a breast ultrasound imaging method and device to address the limited accuracy of existing methods.
In a first aspect, an embodiment of the present invention provides a method for ultrasound imaging of a breast, including:
acquiring an ultrasonic image of a breast area of a subject;
analyzing the ultrasound image based on a pre-trained intelligent breast-lesion analysis model to obtain a first feature value set corresponding to the values of the BI-RADS feature set of a breast lesion in the subject's breast area, wherein the intelligent breast-lesion analysis model is trained on sample ultrasound images labeled with BI-RADS feature set values;
detecting the operation of modifying or confirming the first characteristic value set by a user to obtain a second characteristic value set;
determining a BI-RADS rating of the breast lesion from the first set of feature values, the second set of feature values, and the ultrasound image.
In one embodiment, the BI-RADS feature set includes a shape type, a direction type, an edge type, an echo type, a posterior echo type, a calcification type, and a blood supply type.
In one embodiment, said determining a BI-RADS rating of said breast lesion from said first set of feature values, said second set of feature values and said ultrasound image comprises:
obtaining an image information feature vector based on the ultrasound image:

X = Σ_{i=1}^{n} w_i · x_i

wherein x_i is the feature vector corresponding to the i-th BI-RADS feature in the BI-RADS feature set, extracted from the ultrasound image, w_i is the initial weight of x_i, and n is the number of BI-RADS features in the BI-RADS feature set;

if the value of the i-th BI-RADS feature is the same in the first feature value set and the second feature value set, adjusting the weight of x_i by w'_i = w_i + Δ; if the values differ, adjusting it by w'_i = w_i - Δ; wherein Δ is the weight adjustment amount and w'_i is the adjusted weight of x_i;

determining, from the feature vector of each BI-RADS feature and its corresponding adjusted weight, the image information feature vector used to determine the BI-RADS grade of the breast lesion:

X' = Σ_{i=1}^{n} w'_i · x_i
Inputting the image information feature vector X' for determining the BI-RADS classification of the breast lesion into a pre-trained BI-RADS classification model to obtain the BI-RADS classification of the breast lesion, wherein the BI-RADS classification model is obtained by training based on a sample feature vector labeled with the BI-RADS classification.
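The weight-adjustment and weighted-sum steps above can be sketched as follows. This is a minimal illustration; the feature names, the value of Δ, and the per-feature vector shapes are assumptions, not the patent's actual implementation.

```python
def adjust_weights(first_set, second_set, init_weights, delta=0.1):
    """Raise the weight of a BI-RADS feature the user confirmed,
    lower it where the user's value differs from the model's."""
    adjusted = {}
    for name, w in init_weights.items():
        if first_set[name] == second_set[name]:
            adjusted[name] = w + delta   # w'_i = w_i + delta
        else:
            adjusted[name] = w - delta   # w'_i = w_i - delta
    return adjusted

def weighted_image_vector(feature_vectors, weights):
    """X' = sum_i w'_i * x_i (element-wise over same-length vectors)."""
    n = len(next(iter(feature_vectors.values())))
    x_prime = [0.0] * n
    for name, vec in feature_vectors.items():
        for j, v in enumerate(vec):
            x_prime[j] += weights[name] * v
    return x_prime
```

For example, a user confirming the model's "shape" value while correcting "margin" would raise the shape weight and lower the margin weight before X' is formed.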
In one embodiment, the method further comprises:
obtaining an attribute feature vector based on the second feature value set:

Y = Σ_{i=1}^{n} r_i · y_i

wherein y_i is the value of the i-th BI-RADS feature in the second feature value set and r_i is the initial weight of y_i;

if the value of the i-th BI-RADS feature is the same in the first feature value set and the second feature value set, adjusting the weight of y_i by r'_i = r_i + Δ; if the values differ, adjusting it by r'_i = r_i - Δ; wherein r'_i is the adjusted weight of y_i;

determining, from the value of each BI-RADS feature in the second feature value set and its corresponding adjusted weight, the attribute information feature vector used to determine the BI-RADS grade of the breast lesion:

Y' = Σ_{i=1}^{n} r'_i · y_i
Fusing the image information feature vector X 'for determining the BI-RADS classification of the breast lesion and the attribute information feature vector Y' for determining the BI-RADS classification of the breast lesion;
and inputting the fused feature vector into the pre-trained BI-RADS grading model to obtain the BI-RADS grading of the mammary gland lesion.
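A companion sketch for the attribute-vector and fusion steps. The numeric `encode` mapping and concatenation-based fusion are assumptions, since the patent does not specify how feature values are encoded or how the two vectors are fused.

```python
def attribute_vector(second_set, first_set, init_weights, encode, delta=0.1):
    """Build Y' = [r'_1 * y_1, ..., r'_n * y_n] from the user-confirmed
    (second) feature value set; r'_i is raised when the user agreed with
    the model's value and lowered otherwise."""
    y_prime = []
    for name, r in init_weights.items():
        r_adj = r + delta if first_set[name] == second_set[name] else r - delta
        y_prime.append(r_adj * encode(name, second_set[name]))
    return y_prime

def fuse(x_prime, y_prime):
    """Fuse by simple concatenation before the grading model (an assumption)."""
    return list(x_prime) + list(y_prime)
```

The fused vector would then be passed to the pre-trained grading model in place of X' alone.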
In an embodiment, if the obtained ultrasound image of the breast area of the subject is a multi-frame ultrasound image, the analyzing the ultrasound image based on the pre-trained breast lesion intelligent analysis model to obtain a first feature value set corresponding to a BI-RADS feature set value of the breast lesion in the breast area of the subject includes:
analyzing each frame of the multi-frame ultrasound images based on the pre-trained intelligent breast-lesion analysis model to obtain a value set of the BI-RADS feature set corresponding to that frame;
and obtaining the first feature value set from the value sets of the BI-RADS feature sets corresponding to the multiple frames according to a preset strategy.
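One plausible reading of the "preset strategy" is a per-feature majority vote over the value sets produced for each frame. This specific strategy is an assumption; the patent leaves the strategy unspecified.

```python
from collections import Counter

def aggregate_frames(per_frame_sets):
    """Per-feature majority vote across the value sets obtained from
    each frame of a multi-frame acquisition (assumed strategy)."""
    features = per_frame_sets[0].keys()
    return {
        f: Counter(s[f] for s in per_frame_sets).most_common(1)[0][0]
        for f in features
    }
```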
In one embodiment, said determining a BI-RADS rating of said breast lesion from said first set of feature values, said second set of feature values and said ultrasound image comprises:
when the value of a BI-RADS feature in the value set corresponding to any one frame of the multi-frame ultrasound images is the same as its value in the second feature value set, increasing the weight given to that frame for that feature when determining the BI-RADS grade; and when the value of that feature in the frame's value set differs from its value in the second feature value set, reducing the weight given to that frame for that feature when determining the BI-RADS grade.
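The per-frame weighting rule can be sketched as follows; the initial weight and the adjustment amount are illustrative assumptions.

```python
def frame_weights(per_frame_sets, second_set, init_weight=1.0, delta=0.1):
    """For each frame, raise its per-feature weight when the frame's
    model-derived value agrees with the user-confirmed (second) set,
    and lower it when the values disagree."""
    weights = []
    for frame_set in per_frame_sets:
        w = {}
        for name, value in frame_set.items():
            if value == second_set[name]:
                w[name] = init_weight + delta
            else:
                w[name] = init_weight - delta
        weights.append(w)
    return weights
```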
In one embodiment, before the detecting the user's operation of modifying or confirming the first set of feature values, the method further comprises:
displaying the ultrasound image and the first set of feature values in a contrasting manner on a display interface.
In a second aspect, an embodiment of the present invention provides an ultrasound imaging apparatus, including:
the ultrasonic probe is used for transmitting ultrasonic waves to a target tissue of a detected person, receiving echoes of the ultrasonic waves returned by the target tissue, and outputting ultrasonic echo signals based on the received echoes of the ultrasonic waves, wherein the ultrasonic echo signals carry tissue structure information of the target tissue;
the transmitting circuit is used for outputting the corresponding transmitting sequence to the ultrasonic probe according to a set mode so as to control the ultrasonic probe to transmit the corresponding ultrasonic wave;
the receiving circuit is used for receiving the ultrasonic echo signal output by the ultrasonic probe and outputting ultrasonic echo data;
a display for outputting visual information;
a processor for performing the ultrasound imaging method of the breast as provided in any of the embodiments above.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, in which computer-executable instructions are stored, and when the computer-executable instructions are executed by a processor, the computer-readable storage medium is used for implementing the ultrasound imaging method for the breast as provided in any one of the above embodiments.
According to the breast ultrasound imaging method and device provided by the embodiments of the invention, an ultrasound image of the subject's breast region is acquired; the image is first analyzed with a pre-trained intelligent breast-lesion analysis model to obtain a first feature value set corresponding to the BI-RADS feature set values of a breast lesion in the region; the user's modification or confirmation of the first feature value set is then detected to obtain a second feature value set; and the BI-RADS grade of the breast lesion is finally determined from the first feature value set, the second feature value set, and the ultrasound image. Because the second feature value set reflects not only the ultrasound image information of the breast lesion but also the user's clinical judgment of it, determining the BI-RADS grade of the lesion on this basis helps improve the accuracy of the grading.
Drawings
Fig. 1 is a block diagram of an ultrasound imaging apparatus according to an embodiment of the present invention;
fig. 2 illustrates a breast ultrasound imaging method according to an embodiment of the present invention;
fig. 3 illustrates a breast ultrasound imaging method according to another embodiment of the present invention;
figs. 4A-4C are schematic diagrams of a display interface provided in accordance with an embodiment of the present invention;
fig. 5 illustrates a breast ultrasound imaging method provided by a further embodiment of the present invention;
fig. 6 is a schematic diagram of a display interface provided in accordance with another embodiment of the present invention;
fig. 7 illustrates a breast ultrasound imaging method according to another embodiment of the present invention.
Detailed Description
The present invention is described in further detail below with reference to the detailed description and the accompanying drawings, in which like elements in different embodiments share associated reference numerals. In the following description, numerous specific details are set forth to provide a thorough understanding of the application. However, those skilled in the art will recognize that in some instances certain features may be omitted or replaced by other elements, materials, or methods. In some instances, certain operations related to the application are not shown or described in detail to avoid obscuring its core with excessive description; detailed description of those operations is unnecessary, as those skilled in the art can fully understand them from the specification and general knowledge in the art.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments, and the steps or actions in the method descriptions may be swapped or reordered in a manner apparent to one skilled in the art. Accordingly, the various sequences in the specification and drawings are for clarity of description of certain embodiments only and do not imply a required order unless a particular sequence is explicitly stated.
Ordinal terms such as "first" and "second" are used herein only to distinguish the objects described and carry no sequential or technical meaning. Unless otherwise indicated, the terms "connected" and "coupled" as used in this application include both direct and indirect connections (couplings).
As shown in fig. 1, the ultrasound imaging apparatus provided by the present invention may include: an ultrasound probe 20, a transmission/reception circuit 30 (i.e., a transmission circuit 310 and a reception circuit 320), a beam synthesis module 40, an IQ demodulation module 50, a memory 60, a processor 70, and a human-computer interaction device. The processor 70 may include a control module 710 and an image processing module 720.
The ultrasonic probe 20 includes a transducer (not shown) composed of a plurality of array elements arranged in an array, the plurality of array elements are arranged in a row to form a linear array, or arranged in a two-dimensional matrix to form an area array, and the plurality of array elements may also form a convex array. The array element is used for emitting ultrasonic beams according to the excitation electric signals or converting the received ultrasonic beams into electric signals. Each array element can thus be used to perform a mutual transformation of the electrical impulse signal and the ultrasound beam, thereby performing an emission of ultrasound waves into a target region of human tissue (e.g. a breast region in this embodiment) and also to receive echoes of ultrasound waves reflected back through the tissue. In performing ultrasonic testing, it is possible to control which array elements are used for transmitting ultrasonic beams and which array elements are used for receiving ultrasonic beams, or to control the time slots of the array elements for transmitting ultrasonic beams or receiving echoes of ultrasonic beams, through the transmitting circuit 310 and the receiving circuit 320. The array elements participating in the ultrasonic wave transmission can be simultaneously excited by the electric signals, so that the ultrasonic waves are transmitted simultaneously; or the array elements participating in the ultrasonic wave transmission can be excited by a plurality of electric signals with certain time intervals, so that the ultrasonic waves with certain time intervals are continuously transmitted.
In this embodiment, the user selects a suitable position and angle by moving the ultrasound probe 20 to transmit the ultrasound waves to the mammary gland region 10 and receive the echoes of the ultrasound waves returned by the mammary gland region 10, and obtains and outputs the electric signals of the echoes, where the electric signals of the echoes are channel analog electric signals formed by using the receiving array elements as channels, and carry amplitude information, frequency information, and time information.
The transmitting circuit 310 is configured to generate a transmitting sequence according to the control of the control module 710 of the processor 70, where the transmitting sequence is configured to control some or all of the plurality of array elements to transmit ultrasonic waves to the biological tissue, and parameters of the transmitting sequence include the position of the array element for transmission, the number of array elements, and ultrasonic beam transmitting parameters (e.g., amplitude, frequency, number of transmissions, transmitting interval, transmitting angle, wave pattern, focusing position, etc.). In some cases, the transmit circuitry 310 is further configured to phase delay the transmitted beams to cause different transmit elements to transmit ultrasound at different times so that each transmitted ultrasound beam can be focused at a predetermined region of interest. In different operation modes, such as a B image mode, a C image mode, and a D image mode (doppler mode), the parameters of the transmit sequence may be different, and the echo signals received by the receiving circuit 320 and processed by subsequent modules and corresponding algorithms may generate a B image reflecting the tissue anatomy, a C image reflecting the tissue anatomy and blood flow information, and a D image reflecting the doppler spectrum image.
The receiving circuit 320 is used for receiving the electrical signal of the ultrasonic echo from the ultrasonic probe 20 and processing the electrical signal of the ultrasonic echo. The receive circuit 320 may include one or more amplifiers, analog-to-digital converters (ADCs), and the like. The amplifier is used for amplifying the electric signal of the received ultrasonic echo after proper gain compensation, the analog-to-digital converter is used for sampling the analog echo signal according to a preset time interval so as to convert the analog echo signal into a digitized signal, and the digitized echo signal still retains amplitude information, frequency information and phase information. The data output from the receiving circuit 320 may be output to the beam forming module 40 for processing or may be output to the memory 60 for storage.
The beam forming module 40 is connected to the receiving circuit 320 and performs beam forming processing, such as delay and weighted summation, on the signals output by the receiving circuit 320. Because the distances from an ultrasound receiving point in the measured tissue to the different receiving array elements differ, the channel data for the same receiving point output by different elements have delay differences; delay processing is therefore applied to align the phases, and the different channel data for the same receiving point are then weighted and summed to obtain beamformed ultrasound image data, also called radio-frequency (RF) data. The beam forming module 40 outputs the RF data to the IQ demodulation module 50. In some embodiments, the beam forming module 40 may also output the RF data to the memory 60 for buffering or saving, or directly to the image processing module 720 of the processor 70 for image processing.
The beam forming module 40 may perform the above functions in hardware, firmware, or software. For example, it may include a central processing unit (CPU), one or more microprocessor chips, or any other electronic component capable of processing input data according to specific logic instructions; when implemented in software, it may execute instructions stored on a tangible, non-transitory computer-readable medium (e.g., the memory 60) to perform beamforming calculations using any suitable beamforming method.
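The delay-and-sum processing described above can be illustrated with a deliberately simplified sketch. Integer sample delays (real systems interpolate sub-sample delays) and the toy data are assumptions, not the apparatus's actual beamformer.

```python
def delay_and_sum(channel_data, delays, weights):
    """Align per-channel sample streams by their computed delays
    (in whole samples) and form an apodization-weighted sum.
    channel_data: list of per-channel sample lists;
    delays: integer sample delay per channel;
    weights: apodization weight per channel."""
    out_len = min(len(ch) - d for ch, d in zip(channel_data, delays))
    out = []
    for t in range(out_len):
        out.append(sum(w * ch[t + d]
                       for ch, d, w in zip(channel_data, delays, weights)))
    return out
```

With two channels carrying the same pulse offset by one sample, delaying the late channel by one sample aligns the pulses so they sum coherently.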
Through IQ demodulation, the IQ demodulation module 50 removes the signal carrier, extracts the tissue structure information contained in the signal, and filters out noise; the signal obtained at this point is referred to as a baseband signal (IQ data pair). The IQ demodulation module 50 outputs the IQ data pairs to the image processing module 720 of the processor 70 for image processing. In some embodiments, the IQ demodulation module 50 also buffers or saves the IQ data pairs to the memory 60, from which the image processing module 720 reads the data for subsequent image processing.
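The carrier-removal step can likewise be illustrated with a minimal IQ demodulation sketch: mix the RF samples with a cosine and negative sine at the carrier frequency, then low-pass filter. The moving-average filter here is a stand-in assumption for the module's actual filtering.

```python
import math

def iq_demodulate(rf, fc, fs, lp_taps=8):
    """Mix rf (sampled at fs) down from carrier fc and low-pass the
    mixer outputs with a simple moving average, yielding I and Q."""
    i_mix = [s * math.cos(2 * math.pi * fc * n / fs) for n, s in enumerate(rf)]
    q_mix = [-s * math.sin(2 * math.pi * fc * n / fs) for n, s in enumerate(rf)]

    def lp(x):
        # causal moving average as a crude low-pass filter
        return [sum(x[max(0, n - lp_taps + 1): n + 1]) / min(lp_taps, n + 1)
                for n in range(len(x))]

    return lp(i_mix), lp(q_mix)
```

Feeding in a pure carrier tone should, after the filter settles, give I near 0.5 (half the mixed-down amplitude) and Q near zero.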
The processor 70 may be configured as a central processing unit (CPU), one or more microprocessors, a graphics processing unit (GPU), or any other electronic component capable of processing input data according to specific logic instructions. It may control peripheral electronic components according to input or predetermined instructions, read data from and/or save data to the memory 60, or process input data by executing programs in the memory 60, such as performing one or more processing operations on the acquired ultrasound data according to one or more working modes. These operations include, but are not limited to, adjusting or defining the form of the ultrasound waves emitted by the ultrasound probe 20, generating image frames for display by the display 80 of the human-computer interaction device, adjusting or defining the content and form displayed on the display 80, and adjusting one or more image display settings (e.g., ultrasound images, interface components, positioning of a region of interest) shown on the display 80. The processor 70 provided in this embodiment can be used to execute the breast ultrasound imaging method provided in any embodiment of the present invention.
The image processing module 720 is used for processing the data output by the beam synthesis module 40 or the data output by the IQ demodulation module 50 to generate a gray-scale image of signal intensity variation within the scanning range, which reflects the anatomical structure inside the tissue, and is called a B image. The image processing module 720 may output the B image to the display 80 of the human-computer interaction device for display.
The human-computer interaction device performs human-computer interaction, i.e., it receives user input and outputs visual information. User input may be received via a keyboard, operating buttons, a mouse, a trackball, or a touch screen integrated with the display; visual information is output via the display 80.
The memory 60 may be a tangible, non-transitory computer-readable medium, such as a flash memory card, solid-state memory, or hard disk, for storing data or programs. For example, the memory 60 may store acquired ultrasound data or image frames generated by the processor 70 that are not displayed immediately, or it may store a graphical user interface, one or more default image display settings, and programming instructions for the processor, the beam forming module, or the IQ demodulation module.
Referring to fig. 2, an ultrasound imaging method of a breast is provided based on the ultrasound imaging apparatus shown in fig. 1.
As shown in fig. 2, the ultrasound imaging method for a breast provided by this embodiment may include:
s101, obtaining an ultrasonic image of the breast area of the detected person.
S102, analyzing the ultrasonic image based on a pre-trained breast focus intelligent analysis model to obtain a first feature value set corresponding to a BI-RADS feature set value of a breast focus in a breast area of the detected person, wherein the breast focus intelligent analysis model is obtained based on sample ultrasonic image training labeled with the BI-RADS feature set value.
S103, detecting the operation of modifying or confirming the first characteristic value set by a user to obtain a second characteristic value set.
S104, determining the BI-RADS rating of the breast lesion according to the first feature value set, the second feature value set and the ultrasonic image.
In an alternative embodiment, determining a BI-RADS ranking of the breast lesion from the first set of feature values, the second set of feature values, and the ultrasound image may include:
obtaining an image information feature vector based on the ultrasound image:

X = Σ_{i=1}^{n} w_i · x_i

wherein x_i is the feature vector corresponding to the i-th BI-RADS feature in the BI-RADS feature set, extracted from the ultrasound image, w_i is the initial weight of x_i, and n is the number of BI-RADS features in the BI-RADS feature set;

if the value of the i-th BI-RADS feature is the same in the first feature value set and the second feature value set, adjusting the weight of x_i by w'_i = w_i + Δ; if the values differ, adjusting it by w'_i = w_i - Δ; wherein Δ is the weight adjustment amount and w'_i is the adjusted weight of x_i;

determining, from the feature vector of each BI-RADS feature and its corresponding adjusted weight, the image information feature vector used to determine the BI-RADS grade of the breast lesion:

X' = Σ_{i=1}^{n} w'_i · x_i
Inputting the image information feature vector X' for determining the BI-RADS classification of the breast lesion into a pre-trained BI-RADS classification model to obtain the BI-RADS classification of the breast lesion, wherein the BI-RADS classification model is obtained by training based on a sample feature vector labeled with the BI-RADS classification.
In order to further improve the accuracy of BI-RADS ranking, attribute feature vectors may also be added in determining the BI-RADS ranking, which may specifically include:
obtaining an attribute feature vector based on the second feature value set:

Y = Σ_{i=1}^{n} r_i · y_i

wherein y_i is the value of the i-th BI-RADS feature in the second feature value set and r_i is the initial weight of y_i;

if the value of the i-th BI-RADS feature is the same in the first feature value set and the second feature value set, adjusting the weight of y_i by r'_i = r_i + Δ; if the values differ, adjusting it by r'_i = r_i - Δ; wherein r'_i is the adjusted weight of y_i;

determining, from the value of each BI-RADS feature in the second feature value set and its corresponding adjusted weight, the attribute information feature vector used to determine the BI-RADS grade of the breast lesion:

Y' = Σ_{i=1}^{n} r'_i · y_i
Fusing the image information feature vector X 'for determining the BI-RADS classification of the breast lesion and the attribute information feature vector Y' for determining the BI-RADS classification of the breast lesion;
and inputting the fused feature vector into the pre-trained BI-RADS grading model to obtain the BI-RADS grading of the mammary gland lesion.
In the breast ultrasound imaging method provided by this embodiment, an ultrasound image of the subject's breast region is acquired; the image is first analyzed with the pre-trained intelligent breast-lesion analysis model to obtain a first feature value set corresponding to the BI-RADS feature set values of a breast lesion in the region; the user's modification or confirmation of the first feature value set is then detected to obtain a second feature value set; and the BI-RADS grade of the breast lesion is finally determined from the first feature value set, the second feature value set, and the ultrasound image. Because the second feature value set reflects not only the ultrasound image information of the breast lesion but also the user's clinical judgment of it, determining the BI-RADS grade of the lesion on this basis helps improve the accuracy of the grading.
Referring to fig. 3, an ultrasound imaging method of a breast is provided based on the ultrasound imaging apparatus shown in fig. 1.
As shown in fig. 3, the ultrasound imaging method for a breast provided by this embodiment may include:
S201, obtaining an ultrasonic image of the breast region of the subject.
In this embodiment, the ultrasound image of the breast area may be acquired in real time, or may be read from a pre-stored ultrasound image in a storage medium. For example, the ultrasound probe 20 of the ultrasound imaging apparatus in fig. 1 may transmit ultrasound waves to the breast area of the subject in real time, and the receiving circuit 320 processes the ultrasound echo electrical signals received from the ultrasound probe 20, and then processes the signals through the beam forming module 40, the IQ demodulation module 50, and the image processing module 720, so as to obtain an ultrasound image of the breast area of the subject in real time. For example, a previously acquired ultrasound image of the breast area of the subject may be acquired from the memory 60.
In this embodiment, a single ultrasound image of the breast region of the subject may be acquired, or a plurality of ultrasound images of the breast region of the subject may be acquired; the number of ultrasound images is not limited in this embodiment.
S202, determining a value set corresponding to a BI-RADS feature set of the breast lesion in the breast area of the subject according to the ultrasonic image so as to obtain a first feature value set.
In this embodiment, after obtaining the ultrasound image of the breast Region of the subject, a Region of Interest (ROI) of the breast lesion may be detected in the obtained ultrasound image, a boundary of the breast lesion may be segmented, and then BI-RADS features of the breast lesion may be analyzed, so as to determine a value set corresponding to a BI-RADS feature set of the breast lesion in the breast Region of the subject.
The breast lesion ROI may be extracted based on deep learning, machine learning, or a conventional image processing method; the specific implementation of breast lesion ROI detection is not limited in this embodiment. For breast lesion ROI detection based on deep learning, a deep learning ROI detection network needs to be trained based on collected breast region ultrasound image data and the labeling results of senior physicians for the breast lesion ROI in the ultrasound images. The ROI may be labeled with coordinate information; for example, the ROI may be labeled with a rectangular box. The deep learning ROI detection network may use, but is not limited to, RCNN, Faster RCNN, SSD, YOLO, and the like. In the network training stage, the error between the detection result and the labeling result of the breast lesion ROI is calculated in each iteration, the network weights are continuously updated with the objective of minimizing this error, and the process is repeated so that the detection result gradually approaches the true value of the breast lesion ROI, thereby obtaining a trained breast lesion ROI detection model. The model can automatically detect and extract the breast lesion ROI from input ultrasound image data.
Breast lesion ROI detection by a conventional image processing method or machine learning method can generally be divided into the following steps: (a) finding candidate regions based on an image processing method, for example using the Selective Search algorithm; (b) converting each candidate region to a fixed size and extracting feature vectors such as the gradient and texture of the image using image processing methods, for example the SIFT operator, the HOG operator, the GLCM gray level co-occurrence matrix and the like; (c) training a classification model for the candidate regions from their feature vectors through a traditional machine learning algorithm; (d) obtaining a rectangular box (bounding box) of the target, i.e. the breast lesion ROI, by a regression method. Another implementation for extracting the breast lesion ROI based on machine learning is to train a machine learning model based on collected ultrasound images and ROI labeling results of breast lesions; for example, a machine learning model such as SVM, K-means or C-means may be used to perform binary classification on the gray values or texture values of pixels and judge whether each pixel belongs to the ROI region, thereby extracting the breast lesion ROI.
Methods for boundary segmentation of breast lesions include, but are not limited to: (1) performing boundary extraction on the detected breast lesion ROI or the full ultrasound image based on a deep learning segmentation network, which may adopt Unet, FCN, or networks improved on their basis. For deep learning segmentation, the ultrasound image and its corresponding labeled region are input; the labeled region may be a binarized image of the breast lesion, or the position information of the breast lesion may be written into a labeling file such as xml or json. The error between the segmentation result output by the deep learning segmentation network and the labeling result is calculated and iteratively minimized until the segmentation result approaches the true value. (2) Performing boundary extraction with a multi-task deep learning network that detects and segments synchronously; common networks include Mask R-CNN, PolarMask, SOLO and the like, whose first step is to locate the rough position of the ROI and then finely segment the target region. (3) Adopting traditional image processing and region-based segmentation algorithms, which mainly include the region growing method, the watershed algorithm, the Otsu threshold method and the like, or gradient-based segmentation algorithms such as the Sobel and Canny operators.
(4) Adopting a machine learning based method to realize segmentation of the breast lesion: a machine learning segmentation model is trained based on collected ultrasound images and labeling results; machine learning models such as SVM, K-means or C-means may be used to perform binary classification on the gray values or texture values of image pixels, judging whether each pixel, or the texture feature vector representing the current pixel, belongs to the breast lesion ROI, thereby realizing segmentation of the breast lesion ROI boundary.
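As a concrete instance of the region-based algorithms listed above, the Otsu threshold method can be implemented in a few lines. This is a generic sketch of the standard algorithm, not code from the disclosed embodiment:

```python
import numpy as np

def otsu_threshold(image):
    """Return the gray level that maximizes between-class variance,
    splitting pixels into two classes (e.g. background vs. lesion)."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    total = hist.sum()
    sum_all = float(np.dot(np.arange(256), hist))
    w0 = 0.0     # cumulative weight of the lower class
    sum0 = 0.0   # cumulative intensity sum of the lower class
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]
        sum0 += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        between_var = w0 * w1 * (m0 - m1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, t
    return best_t
```

On a bimodal image the returned threshold falls between the two intensity modes, which is what makes it usable as a first-pass lesion/background split.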
Methods for breast lesion BI-RADS feature analysis include, but are not limited to: analyzing each BI-RADS feature based on deep learning alone, based on traditional image features combined with machine learning, or based on a combination of the two schemes. Specifically: (1) predicting multiple BI-RADS features with a multi-task deep learning network. In an alternative embodiment, the extracted ROI region of the breast lesion may be used as input to directly predict each BI-RADS feature using multiple branches of the multi-task deep learning network. For example, the shape type, direction type, echo type, calcification type and edge type can be regarded as 5 prediction tasks, and the large multi-task deep learning network comprises 5 branches that respectively handle the 5 different prediction tasks. The backbone network used by each branch includes, but is not limited to, typical deep learning convolutional classification networks such as AlexNet, Resnet, VGG and the like. When training the multi-task deep learning network, the classification sub-network for each BI-RADS feature may be trained independently, or the whole network may be trained simultaneously. Specifically, the error between the prediction result and the calibration result of each branch is calculated, where the calibration results can be understood as the true shape type, direction type, echo type, calcification type and edge type of the breast lesion; the prediction results then gradually approach the calibration results through continuous iteration, finally yielding a multi-task deep learning network model capable of predicting multiple BI-RADS features.
(2) Constructing a deep learning network for each BI-RADS feature separately, with multiple deep learning networks built independently in parallel to analyze the multiple BI-RADS features; the deep learning networks may adopt deep learning convolutional networks including but not limited to AlexNet, Resnet, VGG and the like. (3) Extracting features for each BI-RADS characteristic with a feature extraction algorithm, and then either setting a suitable threshold on the extracted features or cascading a machine learning algorithm to analyze them. In an alternative embodiment, features of the breast lesion, including but not limited to the histogram, gray level co-occurrence matrix features and the like, may be extracted and input into a machine learning model such as SVM, K-means or KNN to predict the echo type of the breast lesion, thereby obtaining the analysis result for the echo type. (4) Regarding each BI-RADS feature as a prediction or classification task, and designing an algorithm or model suited to that feature based on different schemes for different BI-RADS features, where the schemes may be based on deep learning or on a traditional image processing method combined with machine learning.
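The multi-branch idea in scheme (1) can be sketched with plain linear classification heads sharing one backbone feature vector; the task names, class counts and random weights here are illustrative stand-ins for trained network branches:

```python
import numpy as np

class MultiTaskHeads:
    """One classification branch per BI-RADS feature, all consuming the
    same backbone feature vector."""

    def __init__(self, feature_dim, tasks, seed=0):
        rng = np.random.default_rng(seed)
        # tasks maps a BI-RADS feature name to its number of classes
        self.weights = {
            name: rng.standard_normal((feature_dim, n_classes))
            for name, n_classes in tasks.items()
        }

    def predict(self, backbone_features):
        """Return the argmax class index for every branch."""
        return {
            name: int(np.argmax(backbone_features @ w))
            for name, w in self.weights.items()
        }
```

In a real network the linear heads would be trained branches on top of a shared convolutional backbone; the point of the sketch is only the one-backbone, many-heads structure.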
In an alternative embodiment, the BI-RADS feature set may include a shape type, a direction type, an edge type, an echo type, a posterior echo type, a calcification type, and a blood supply type. For example, if the values of the respective BI-RADS features of the breast lesion in the breast region of the subject determined from the ultrasound image are: the shape type is irregular, the direction type is parallel, the edge type is angulation, the echo type is hypoechoic, the posterior echo type is no change, the calcification type is calcification within the mass, and the blood supply type is internal blood flow, then the first feature value set may be: irregular, parallel, angulation, hypoechoic, no change, calcification within the mass, and internal blood flow. It should be noted that the BI-RADS feature set in this embodiment may also include more or fewer BI-RADS features than those described above. For example, the BI-RADS feature set may include only a shape type, an orientation type, and an edge type.
The BI-RADS classification of a breast lesion in a breast region of a subject may also be determined from the ultrasound image in this embodiment.
S203, displaying the first feature value set on a display interface.
After the first feature value set is obtained, the first feature value set may be displayed on a display interface for facilitating the user to view and modify or confirm the first feature value set. For example, the BI-RADS feature names and the corresponding values thereof may be displayed in an associated manner.
In an optional embodiment, to further facilitate the user's modification or confirmation of the first feature value set, the ultrasound image of the breast region of the subject and the first feature value set may be displayed side by side on the display interface for comparison. For example, the ultrasound image and the first feature value set may be displayed in different areas of the display interface, so that the user can check the values of the BI-RADS features in the first feature value set while viewing the ultrasound image. When multiple ultrasound images of the breast region of the subject are acquired, the multiple ultrasound images may be automatically scrolled and displayed at a preset frequency, or only the breast lesion ROI in the multiple ultrasound images may be displayed.
The BI-RADS ranking of the breast lesion in the breast region of the subject determined from the ultrasound image may also be displayed on the display interface in this embodiment.
Referring to fig. 4A, fig. 4A is a schematic diagram illustrating a first set of feature values displayed on a display interface according to an embodiment.
S204, detecting the operation of modifying or confirming the first characteristic value set by the user to obtain a second characteristic value set.
After the user views the first feature value set, if the user doubts the value of a certain BI-RADS feature, that value can be modified through an input device such as a mouse or keyboard; if the user agrees with the value of a certain BI-RADS feature, that value can be confirmed through the input device. In a specific implementation, the value range of each BI-RADS feature can be displayed using pull-down menus, radio buttons and the like for the user to modify or confirm.
It is understood that the second feature value set may reflect not only the ultrasound image information of the breast lesion but also the judgment information of the breast lesion by the user according to the clinical experience.
S205, displaying the second feature value set on a display interface.
To facilitate the user's viewing of the values of the respective BI-RADS features after modification or confirmation, the second feature value set may be displayed on the display interface; for example, it may be displayed in real time as the user performs the modification or confirmation.
For example, if the user considers that the "angulation" value of the edge type in fig. 4A is not accurate enough, modifies it to "lobulation and spiculation", and performs a confirmation operation on the values of the other BI-RADS features in fig. 4A, the second feature value set is displayed on the display interface as shown in fig. 4B; fig. 4B is a schematic diagram of displaying the second feature value set on the display interface according to an embodiment. The user may save the second feature value set via a "save" button, and initiate analysis of the breast lesion BI-RADS classification via an "analyze" button.
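The modify-or-confirm interaction that turns the first feature value set into the second can be sketched as a simple merge; the function name and feature values shown are illustrative:

```python
def apply_user_feedback(first_set, edits):
    """Build the second feature value set from the first.

    first_set: {feature name: value from the intelligent analysis}
    edits:     {feature name: value the user entered}; features absent
               from edits are treated as confirmed unchanged.
    """
    second_set = dict(first_set)
    second_set.update(edits)
    modified = set(edits)
    confirmed = set(first_set) - modified
    return second_set, modified, confirmed
```

Editing only the edge type, as in the example above, leaves every other feature in the confirmed group, which is exactly the modified/confirmed split the later weighting steps rely on.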
S206, determining the BI-RADS classification of the breast lesion according to the second feature value set.
The second feature value set, obtained after modifying or confirming the values of the BI-RADS features in the first feature value set derived from the ultrasound image, reflects not only the breast lesion information in the ultrasound image but also the user's judgment of the breast lesion; therefore, determining the BI-RADS classification of the breast lesion according to the second feature value set can improve the accuracy of the BI-RADS classification.
In an alternative embodiment, the second set of feature values may be used as an input to output a BI-RADS ranking of the breast lesion. For example, a machine learning method may be used to pre-train the BI-RADS classification model based on the BI-RADS feature value sets labeled with the BI-RADS classification.
In another alternative embodiment, in order to fully utilize the information of the ultrasound image, the ultrasound image of the breast region and the second feature value set can be used as input at the same time, and the BI-RADS grade of the breast lesion can be output. For example, a machine learning method may be used to pre-train the BI-RADS classification model based on the ultrasound images labeled with the BI-RADS classification and the BI-RADS feature value sets.
S207, displaying the BI-RADS classification of the breast lesion on a display interface.
To facilitate the user's viewing of the BI-RADS classification of the breast lesion, so that the user may perform corresponding diagnosis and treatment operations according to it, in this embodiment the BI-RADS classification of the breast lesion may be displayed on the display interface after it is determined. For the example of the second feature value set shown in fig. 4B, fig. 4C shows the BI-RADS classification of the breast lesion determined from that set. It is understood that the second feature value set and the BI-RADS classification of the breast lesion may also be displayed on the display interface simultaneously for ease of viewing.
In the ultrasound imaging method of a breast provided by this embodiment, an ultrasound image of the breast region of a subject is obtained; a value set corresponding to the BI-RADS feature set of a breast lesion in the breast region is first determined from the ultrasound image to obtain a first feature value set, which is displayed on the display interface; the user's operation of modifying or confirming the first feature value set is then detected to obtain a second feature value set, which is displayed on the display interface; and finally the BI-RADS classification of the breast lesion is determined according to the second feature value set and displayed on the display interface. The second feature value set reflects not only the ultrasound image information of the breast lesion but also the user's judgment of the breast lesion based on clinical experience, so the BI-RADS classification determined from the second feature value set helps improve the accuracy of the BI-RADS classification.
On the basis of the above embodiments, how to determine the BI-RADS classification of the breast lesion from the second feature value set is described in further detail below. Determining the BI-RADS classification from the second feature value set in combination with the user's clinical feedback may be realized by performing weighted optimization on one or more of: the ultrasound image used by the BI-RADS classification algorithm, the feature vectors corresponding to the ultrasound image, and the BI-RADS feature values. The weighted optimization strategy may include: (1) strengthening, in the BI-RADS classification process, the effect of the ultrasound image, the corresponding feature vectors and/or the BI-RADS feature values associated with BI-RADS features the doctor did not modify (i.e. confirmed); the significance is that, for BI-RADS features not modified by the doctor, the feature values obtained by the intelligent algorithm are consistent with the doctor's evaluation, so the corresponding ultrasound image, feature vectors and feature values are relatively reliable for BI-RADS classification analysis, and their roles in the classification should be strengthened; (2) weakening, in the BI-RADS classification process, the effect of the ultrasound image, the corresponding feature vectors and/or the BI-RADS feature values associated with BI-RADS features the doctor modified; the significance is that, for BI-RADS features modified by the doctor, the feature values obtained by the intelligent algorithm are inconsistent with the doctor's evaluation, so the corresponding ultrasound image, feature vectors and feature values are relatively unreliable for BI-RADS classification analysis; for example, the ultrasound image may be blurred or its features may not be obvious, leading to inaccurate BI-RADS feature analysis results.
In an alternative embodiment, determining the BI-RADS classification of the breast lesion according to the second feature value set may specifically comprise: for any BI-RADS feature in the BI-RADS feature set, extracting the feature vector corresponding to that BI-RADS feature from the ultrasound image; when the value of that BI-RADS feature is the same in the first feature value set and the second feature value set, increasing the weight of the corresponding feature vector in determining the BI-RADS classification of the breast lesion; and when the value of that BI-RADS feature differs between the first feature value set and the second feature value set, reducing the weight of the corresponding feature vector in determining the BI-RADS classification of the breast lesion.
When the value of a BI-RADS feature is the same in the first feature value set and the second feature value set, the user has not modified it, indicating that the evaluation result of the intelligent algorithm based on the ultrasound image is consistent with the user's evaluation based on clinical experience, so the effect of the corresponding feature vector in determining the BI-RADS classification should be strengthened; when the values of a BI-RADS feature differ between the first feature value set and the second feature value set, the user has modified it, indicating that the evaluation result of the intelligent algorithm based on the ultrasound image is inconsistent with the user's evaluation based on clinical experience, so the effect of the corresponding feature vector in determining the BI-RADS classification should be weakened.
Taking as an example a BI-RADS feature set including a shape type, a direction type, an edge type, an echo type, a posterior echo type, a calcification type and a blood supply type, let X denote the image information feature vector for the BI-RADS classification task acquired from the ultrasound image; for example, X is the feature vector input into the BI-RADS classification model.
X = w0x0 + w1x1 + w2x2 + w3x3 + w4x4 + w5x5 + w6x6
wherein x0 is the feature vector corresponding to the shape type extracted from the ultrasound image, and w0 is the weight of that feature vector. By analogy, x1, x2, x3, x4, x5 and x6 are the feature vectors corresponding to the direction type, edge type, echo type, posterior echo type, calcification type and blood supply type extracted from the ultrasound image, and w1~w6 are the weights of those feature vectors.
After the first feature value set is obtained, if the user modifies the values of the shape type, the edge type and the calcification type through an input device and confirms the values of the other BI-RADS features, the weights may be adjusted for weighted optimization as follows: w′0 = w0 - Δ; w′2 = w2 - Δ; w′5 = w5 - Δ; w′1 = w1 + Δ; w′3 = w3 + Δ; w′4 = w4 + Δ; w′6 = w6 + Δ; X′ = w′0x0 + w′1x1 + w′2x2 + w′3x3 + w′4x4 + w′5x5 + w′6x6; where Δ is a weight adjustment amount that may be preset, for example Δ = 0.1.
The BI-RADS classification is then determined according to the weighted-optimized image information feature vector X′, so as to strengthen the effect of the feature vectors corresponding to the user-confirmed BI-RADS features in the classification process and weaken the effect of the feature vectors corresponding to the user-modified BI-RADS features, thereby improving the accuracy of the BI-RADS classification.
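Under the assumption that each xi has the same dimensionality, the Δ adjustment and the weighted sum for X′ can be sketched as follows; the function names and the 0.1 value are assumptions taken from the example:

```python
import numpy as np

DELTA = 0.1  # assumed preset weight adjustment amount

def reweight_image_features(weights, modified_indices):
    """w'_i = w_i - DELTA for user-modified features,
    w'_i = w_i + DELTA for user-confirmed features."""
    return np.array([
        w - DELTA if i in modified_indices else w + DELTA
        for i, w in enumerate(weights)
    ])

def image_information_vector(feature_vectors, weights):
    """X' = w'_0*x_0 + w'_1*x_1 + ... + w'_6*x_6 (weighted vector sum)."""
    return sum(w * np.asarray(x, dtype=float)
               for w, x in zip(weights, feature_vectors))
```

With the example's modifications (shape, edge and calcification at indices 0, 2 and 5), those three weights drop by Δ while the four confirmed weights rise by Δ.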
In another alternative embodiment, determining the BI-RADS classification of the breast lesion according to the second feature value set may specifically comprise: for any BI-RADS feature in the BI-RADS feature set, when the value of that BI-RADS feature is the same in the first feature value set and the second feature value set, increasing the weight of its value in the second feature value set in determining the BI-RADS classification of the breast lesion; and when the value of that BI-RADS feature differs between the first feature value set and the second feature value set, reducing the weight of its value in the second feature value set in determining the BI-RADS classification of the breast lesion.
When the value of a BI-RADS feature is the same in the first feature value set and the second feature value set, the user has not modified it, indicating that the evaluation result of the intelligent algorithm based on the ultrasound image is consistent with the user's evaluation based on clinical experience, so the effect of that BI-RADS feature value in determining the BI-RADS classification can be strengthened; when the values of a BI-RADS feature differ between the first feature value set and the second feature value set, the user has modified it, indicating that the two evaluation results are inconsistent, so the effect of that BI-RADS feature value in determining the BI-RADS classification can be weakened.
Still taking the example that the BI-RADS feature set includes a shape type, a direction type, an edge type, an echo type, a posterior echo type, a calcification type and a blood supply type, Y represents a BI-RADS attribute feature vector for the BI-RADS ranking task acquired based on the ultrasound image.
Y = r0y0 + r1y1 + r2y2 + r3y3 + r4y4 + r5y5 + r6y6
wherein y0 is the value of the shape type in the second feature value set; by analogy, y1, y2, y3, y4, y5 and y6 are the values of the direction type, edge type, echo type, posterior echo type, calcification type and blood supply type in the second feature value set, and r0~r6 are the initial weights of the BI-RADS feature values. Assuming the user modifies the values of the shape type, the edge type and the calcification type through the input device and confirms the values of the other BI-RADS features, the weights may be adjusted for weighted optimization as follows: r′0 = r0 - Δ; r′2 = r2 - Δ; r′5 = r5 - Δ; r′1 = r1 + Δ; r′3 = r3 + Δ; r′4 = r4 + Δ; r′6 = r6 + Δ; Y′ = r′0y0 + r′1y1 + r′2y2 + r′3y3 + r′4y4 + r′5y5 + r′6y6; where Δ is a weight adjustment amount that may be preset, for example Δ = 0.1.
The BI-RADS classification is then determined according to the weighted-optimized BI-RADS attribute feature vector Y′, so as to strengthen the effect of the user-confirmed BI-RADS feature values in the classification process and weaken the effect of the user-modified BI-RADS feature values, thereby improving the accuracy of the BI-RADS classification.
The above respectively performs weighted optimization on the image information feature vector X and on the BI-RADS attribute feature vector Y to improve the accuracy of the BI-RADS classification. Weighted optimization may also be performed on both at the same time, i.e. the BI-RADS classification is determined from the weighted-optimized image information feature vector X′ together with the weighted-optimized BI-RADS attribute feature vector Y′. For example, X′ + Y′ may be used as the feature vector when determining the BI-RADS classification, where "+" indicates fusion of the two feature vectors; for example, concatenation of the feature vectors may be used, and the fusion manner is not limited in this embodiment.
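One possible realization of the "+" fusion, concatenation of the two weighted vectors, is shown below; since the embodiment explicitly leaves the fusion manner open, this is only one option:

```python
import numpy as np

def fuse_feature_vectors(x_prime, y_prime):
    """Fuse X' and Y' by concatenation before feeding the
    BI-RADS classification model."""
    return np.concatenate([np.ravel(x_prime), np.ravel(y_prime)])
```

Other choices, such as element-wise addition of equal-length vectors, would also satisfy the text; concatenation has the advantage of preserving both vectors' information unchanged.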
When the acquired ultrasound image of the breast region of the subject is a multi-frame ultrasound image, each frame may be analyzed separately to obtain the value set of the BI-RADS feature set corresponding to that frame. Determining the BI-RADS classification of the breast lesion according to the second feature value set may then specifically comprise: when the value of any BI-RADS feature in the value set corresponding to any one frame of the multi-frame ultrasound images is the same as the value of that BI-RADS feature in the second feature value set, increasing the weight of that frame in determining the value of that BI-RADS feature in the BI-RADS classification; and when the value of any BI-RADS feature in the value set corresponding to any one frame differs from the value of that BI-RADS feature in the second feature value set, reducing the weight of that frame in determining the value of that BI-RADS feature in the BI-RADS classification.
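For the multi-frame case, the per-frame weighting can be read as a weighted vote over the frames' values for one BI-RADS feature; the base and delta values here are illustrative assumptions:

```python
def weighted_frame_vote(per_frame_values, user_value, base=1.0, delta=0.1):
    """Aggregate one BI-RADS feature across frames: a frame whose value
    agrees with the user-confirmed value contributes weight base + delta,
    a disagreeing frame contributes base - delta; the value with the
    highest total score wins."""
    scores = {}
    for value in per_frame_values:
        weight = base + (delta if value == user_value else -delta)
        scores[value] = scores.get(value, 0.0) + weight
    return max(scores, key=scores.get)
```

Note that down-weighting disagreeing frames does not simply echo the user's value back: if enough frames agree with each other, their combined score can still outweigh the user-confirmed value.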
The above embodiments elaborate on determining the BI-RADS classification by combining the user's feedback on the BI-RADS feature values with the ultrasound image of the breast region, that is, improving the accuracy of the BI-RADS classification by detecting the user's confirmation or modification operations on the BI-RADS feature values. The following describes how to modify the values of a BI-RADS information set by combining the user's feedback on the BI-RADS information set with the ultrasound image of the breast region to improve the accuracy of the BI-RADS information, where the BI-RADS information set includes the BI-RADS feature set and the BI-RADS classification; that is, the user may confirm or modify the BI-RADS feature values, and may also confirm or modify the BI-RADS classification. Referring to fig. 5, fig. 5 is a diagram illustrating a breast ultrasound imaging method according to another embodiment of the present invention. As shown in fig. 5, the ultrasound imaging method for a breast provided by this embodiment may include:
S401, obtaining an ultrasonic image of the breast region of the subject.
In this embodiment, reference may be made to step S201 in the above embodiment for a specific implementation manner of obtaining an ultrasound image of a breast area of a subject, which is not described herein again.
S402, determining a value set corresponding to a BI-RADS information set of the breast lesion in the breast area of the subject according to the ultrasonic image to obtain a first value set, wherein the BI-RADS information set comprises a BI-RADS characteristic set and a BI-RADS classification.
In an alternative embodiment, after the ultrasound image of the breast area of the subject is acquired, a Region of Interest (ROI) of the breast lesion may be detected in the acquired ultrasound image, the boundary of the breast lesion may be segmented, and then BI-RADS information of the breast lesion may be analyzed to determine a value set corresponding to the BI-RADS information set of the breast lesion in the breast area of the subject. For a specific implementation manner of the breast lesion ROI region detection and the breast lesion boundary segmentation, reference may be made to step S202 in the above embodiment, which is not described herein again. The BI-RADS information of the breast lesion may be analyzed by referring to the method for analyzing the BI-RADS characteristics of the breast lesion in step S202, and the BI-RADS classification may be added to the output.
The following specifically explains, from two angles, the traditional image processing method and the deep learning method, how to determine the value set corresponding to the BI-RADS information set of the breast lesion in the breast region of the subject from the ultrasound image. With a traditional image processing method, a feature vector corresponding to the breast lesion is first extracted from the ultrasound image, the feature vector comprising one or more of the histogram, gray level co-occurrence matrix features, Scale Invariant Feature Transform (SIFT) features and Histogram of Oriented Gradients (HOG) features; the value set corresponding to the BI-RADS information set of the breast lesion in the breast region of the subject is then determined from the feature vector. With a deep learning method, the breast lesion region of interest is first obtained automatically or manually: for example, the ultrasound image is input into a pre-trained region-of-interest detection model, trained on ultrasound images labeled with breast lesion regions of interest, to obtain the breast lesion region of interest of the ultrasound image; or an operator's operation of tracing the breast lesion region of interest in the ultrasound image is detected to obtain the breast lesion region of interest of the ultrasound image. The breast lesion region of interest of the ultrasound image is then input into a pre-trained BI-RADS information identification model, trained on ultrasound images labeled with BI-RADS information values, to obtain the value set corresponding to the BI-RADS information set of the breast lesion.
The BI-RADS information identification model may adopt a multitask deep-learning network in which each branch identifies one type of BI-RADS information; alternatively, it may adopt a plurality of parallel deep-learning networks, each of which identifies one type of BI-RADS information.
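The multitask arrangement can be pictured as one shared feature extractor feeding a separate classification head per BI-RADS information type. The numpy sketch below is schematic only: the weights are random and untrained, and the attribute names and layer sizes are assumptions; it shows only the branching structure, not a usable identification model.

```python
import numpy as np

rng = np.random.default_rng(0)
HEADS = {"shape": 3, "orientation": 2, "margin": 5, "echo": 5}  # classes per attribute

W_shared = rng.normal(size=(64, 128))                            # shared backbone layer
W_heads = {name: rng.normal(size=(128, k)) for name, k in HEADS.items()}

def forward(x):
    """Compute shared features once, then one prediction per BI-RADS attribute."""
    h = np.maximum(x @ W_shared, 0.0)                # shared ReLU feature layer
    return {name: int(np.argmax(h @ W)) for name, W in W_heads.items()}

predictions = forward(rng.normal(size=64))           # one class index per attribute head
```

In the parallel-network variant, each attribute would instead have its own full backbone rather than sharing `W_shared`.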
In this embodiment, the BI-RADS information set includes a BI-RADS feature set and a BI-RADS rating, which means that the user may modify or confirm the BI-RADS feature or the BI-RADS rating. The BI-RADS feature set may include, among other things, a shape type, a direction type, an edge type, an echo type, a posterior echo type, a calcification type, and a blood supply type.
And S403, displaying the first value set on a display interface.
In this embodiment, after the first value set is obtained, the first value set may be displayed on a display interface in order to facilitate viewing by a user and modification or confirmation operations for the first value set. For example, the name of the BI-RADS information and the corresponding value thereof may be displayed in association.
In an optional embodiment, to further facilitate the user to modify or confirm the first value set, the ultrasound image of the breast area of the subject and the first value set may be displayed on the display interface in a contrasting manner. For example, the ultrasound image and the first value set may be respectively displayed in different areas of the display interface, so that the user can check the values of the BI-RADS information in the first value set while viewing the ultrasound image. Referring to fig. 6, fig. 6 is a schematic view of a display interface according to another embodiment of the present invention. In fig. 6, the left side of the display interface shows the ultrasound image of the breast area, and the right side shows the names and corresponding values of the respective BI-RADS information. It should be noted that, in this embodiment, the position relationship and the display mode of the ultrasound image and the first value set on the display interface are not limited.
S404, detecting the operation of modifying or confirming the first value set by the user to obtain a second value set.
After the user views the first value set, if the value of a certain BI-RADS information is in doubt, the value of the BI-RADS information can be modified through an input device such as a mouse, a keyboard and the like; if the value of a certain BI-RADS information is confirmed, the value of the BI-RADS information can be confirmed through input equipment such as a mouse, a keyboard and the like. In the specific implementation, the value range of each BI-RADS information can be displayed by adopting pull-down menus, radio boxes and other modes for a user to modify or confirm.
It can be understood that the second value set may not only reflect the ultrasound image information of the breast lesion, but also reflect the judgment information of the user on the breast lesion according to the clinical experience.
And S405, displaying the second value set on a display interface.
In order to facilitate the user to view the values of the respective BI-RADS information after modification or confirmation, the second value set may be displayed on the display interface, for example, the second value set may be displayed in real time when the user performs modification or confirmation.
And S406, determining a value set corresponding to the BI-RADS information set of the breast lesion according to the second value set and the ultrasonic image to obtain a third value set.
The judgment information of the user on the breast lesion is fully embodied by modifying or confirming the value of each BI-RADS information in the first value set obtained based on the ultrasonic image, so that the value of the BI-RADS information set of the breast lesion is determined by integrating the second value set and the ultrasonic image, and the accuracy of the BI-RADS information value can be remarkably improved.
In an optional embodiment, the second value set and the ultrasound image may be used as input, and the third value set may be output, for example, by using a machine learning method.
Taking fig. 6 as an example, analyzing the ultrasound image of the breast area of the subject yields the BI-RADS information values shown in fig. 6, that is, the first value set. If the doctor considers the edge-type value inaccurate, he or she changes it from "angular" to "lobulated and spiculated", confirms the values of the other BI-RADS information, and can trigger re-analysis by the system through the "analyze" button in fig. 6. During re-analysis, each BI-RADS information value is updated according to the second value set and the ultrasound image, and the BI-RADS rating is updated from 4B to 4C.
And S407, displaying the third value set on a display interface.
In order to facilitate the user to view the final values of the respective BI-RADS information of the breast lesion, so that the user performs corresponding diagnosis and treatment operations according to the final values of the respective BI-RADS information, in this embodiment, after determining the final value set corresponding to the BI-RADS information set of the breast lesion according to the second value set and the ultrasound image, that is, after determining the third value set, the third value set may be displayed on the display interface. Specifically, the BI-RADS information name and the BI-RADS information value can be displayed in a correlated manner.
In the ultrasound imaging method of the breast provided by this embodiment, by obtaining an ultrasound image of a breast area of a subject, a value set corresponding to a BI-RADS information set of a breast lesion in the breast area of the subject is first determined according to the ultrasound image to obtain a first value set, and the first value set is displayed on a display interface; then detecting the operation of modifying or confirming the first value set by a user to obtain a second value set, and displaying the second value set on a display interface; and finally, determining a value set corresponding to the BI-RADS information set of the breast lesion according to the second value set and the ultrasonic image to obtain a third value set, and displaying the third value set on a display interface. The second value set fully reflects the judgment information of the user on the breast lesion according to clinical experience, so that the BI-RADS information value of the breast lesion determined by combining the second value set is beneficial to improving the accuracy.
When the acquired ultrasound image of the breast area of the subject is a multi-frame ultrasound image, determining the value set corresponding to the BI-RADS information set of the breast lesion according to the ultrasound image to obtain the first value set may specifically include: analyzing each frame of the multi-frame ultrasound image to obtain a value set of the BI-RADS information set corresponding to that frame; and obtaining the first value set from the value sets of the BI-RADS information set corresponding to the multiple frames according to a preset strategy. For a specific implementation of analyzing an ultrasound image to obtain the corresponding value set of the BI-RADS information set, reference may be made to step S202 in the above embodiment, which is not described herein again. The preset strategy in this embodiment may, for example, adopt secondary processing, majority voting (the minority obeys the majority), averaging, and the like.
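As one concrete reading of the majority-voting strategy, the sketch below merges hypothetical per-frame value sets feature by feature; the frame results and feature names are invented for illustration.

```python
from collections import Counter

def vote_per_feature(frame_results):
    """'Minority obeys majority': vote each BI-RADS feature value over all frames."""
    merged = {}
    for feature in frame_results[0]:
        values = [frame[feature] for frame in frame_results]
        merged[feature] = Counter(values).most_common(1)[0][0]
    return merged

# Hypothetical per-frame analysis results for three frames of one lesion
frames = [
    {"shape": "oval", "margin": "circumscribed"},
    {"shape": "oval", "margin": "spiculated"},
    {"shape": "irregular", "margin": "spiculated"},
]
first_value_set = vote_per_feature(frames)
# → {'shape': 'oval', 'margin': 'spiculated'}
```

An averaging strategy would instead require numeric encodings of the feature values before taking the per-feature mean.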
On the basis of the above embodiment, determining a value set corresponding to the BI-RADS information set of the breast lesion according to the second value set and the ultrasound image to obtain a third value set specifically includes: when the value of any one BI-RADS information in the value set of the BI-RADS information set corresponding to any one ultrasonic image is the same as the value of any one BI-RADS information in the second value set, increasing the weight of any one ultrasonic image when the value of any one BI-RADS information in the third value set is determined; and when the value of any one BI-RADS information in the value set of the BI-RADS information set corresponding to any one ultrasonic image is different from the value of any one BI-RADS information in the second value set, reducing the weight of any one ultrasonic image when the value of any one BI-RADS information in the third value set is determined.
In an optional embodiment, determining a value set corresponding to the BI-RADS information set of the breast lesion according to the second value set and the ultrasound image to obtain a third value set, which may specifically include: extracting a feature vector corresponding to any one BI-RADS information from the ultrasonic image for any one BI-RADS information in the BI-RADS information set; when any one of the BI-RADS information has the same value in the first value set and the second value set, increasing the weight of the feature vector corresponding to any one of the BI-RADS information when the third value set is determined; and when the values of any one BI-RADS information in the first value set and the second value set are different, reducing the weight of the feature vector corresponding to any one BI-RADS information in determining the third value set.
When the value of a piece of BI-RADS information is the same in the first value set and the second value set, the user did not modify it, indicating that the evaluation obtained from the ultrasound image by the intelligent algorithm is consistent with the user's evaluation based on clinical experience, so the effect of the feature vector corresponding to that BI-RADS information in determining the third value set should be strengthened; when the values of a piece of BI-RADS information differ between the first value set and the second value set, the user modified it, indicating that the two evaluations are inconsistent, so the effect of the corresponding feature vector in determining the third value set should be weakened.
Taking a BI-RADS information set including the shape type, direction type, edge type, echo type, posterior echo type, calcification type, blood-supply type and BI-RADS rating as an example, let X denote the image-information feature vector used as input when determining the third value set.
X = w_0 x_0 + w_1 x_1 + w_2 x_2 + w_3 x_3 + w_4 x_4 + w_5 x_5 + w_6 x_6 + w_7 x_7
where x_0 is the feature vector corresponding to the shape type extracted from the ultrasound image, and w_0 is its weight. By analogy, x_1, x_2, x_3, x_4, x_5, x_6 and x_7 are the feature vectors corresponding to the direction type, edge type, echo type, posterior echo type, calcification type, blood-supply type and BI-RADS rating extracted from the ultrasound image, and w_1 to w_7 are their weights. This embodiment does not limit the type or the specific extraction manner of the feature vectors; a feature vector may be, for example, a gradient or a gray-level co-occurrence matrix. The feature vectors corresponding to the respective BI-RADS information may be of the same type or of different types.
Assuming that, after the first value set is obtained, the user modifies the values of the shape type, the edge type and the calcification type through the input device and confirms the values of the other BI-RADS information, weighted optimization can be performed by adjusting the weights as follows: w'_0 = w_0 - Δ; w'_2 = w_2 - Δ; w'_5 = w_5 - Δ; w'_1 = w_1 + Δ; w'_3 = w_3 + Δ; w'_4 = w_4 + Δ; w'_6 = w_6 + Δ; w'_7 = w_7 + Δ; X' = w'_0 x_0 + w'_1 x_1 + w'_2 x_2 + w'_3 x_3 + w'_4 x_4 + w'_5 x_5 + w'_6 x_6 + w'_7 x_7, where Δ is a weight adjustment amount that may be preset, for example Δ = 0.1. Optionally, the adjusted weights may be normalized.
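The Δ-adjustment and optional normalization described above can be sketched as follows; the feature vectors are random stand-ins, and the equal initial weights, Δ = 0.1 and the English feature names are assumptions for illustration.

```python
import numpy as np

DELTA = 0.1
FEATURES = ["shape", "direction", "edge", "echo",
            "posterior_echo", "calcification", "blood_supply", "birads"]

def adjust_weights(weights, modified, delta=DELTA):
    """-delta for user-modified features, +delta for confirmed ones, then normalize."""
    w = {f: (v - delta if f in modified else v + delta) for f, v in weights.items()}
    total = sum(w.values())
    return {f: v / total for f, v in w.items()}

w = {f: 1.0 / len(FEATURES) for f in FEATURES}             # equal initial weights
w_prime = adjust_weights(w, modified={"shape", "edge", "calcification"})

# Weighted image-information vector X' = sum_i w'_i * x_i
rng = np.random.default_rng(1)
x = {f: rng.normal(size=32) for f in FEATURES}             # stand-in feature vectors
X_prime = sum(w_prime[f] * x[f] for f in FEATURES)
```

Normalizing after the ±Δ step keeps the weights summing to one regardless of how many features the user modified.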
And determining a third value set according to the weighted and optimized image information feature vector X', weakening the effect of the feature vector corresponding to the BI-RADS information modified by the user by strengthening the effect of the feature vector corresponding to the BI-RADS information confirmed by the user, and improving the accuracy of the third value set.
On the basis of any of the above embodiments, determining a value set corresponding to the BI-RADS information set of the breast lesion according to the second value set and the ultrasound image to obtain a third value set, which may further include:
for any one BI-RADS information in the BI-RADS information set, when the value of the any one BI-RADS information is the same in the first value set and the second value set, the weight of the value of the any one BI-RADS information in the second value set when a third value set is determined is increased; and when the values of any one BI-RADS information in the first value set and the second value set are different, reducing the weight of the value of any one BI-RADS information in the second value set when a third value set is determined.
When the value of a piece of BI-RADS information is the same in the first value set and the second value set, the user did not modify it, indicating that the evaluation obtained from the ultrasound image by the intelligent algorithm is consistent with the user's evaluation based on clinical experience, so the effect of that BI-RADS information value in determining the BI-RADS rating can be strengthened; when the values differ between the first value set and the second value set, the user modified it, indicating that the two evaluations are inconsistent, so the effect of that BI-RADS information value in determining the BI-RADS rating can be weakened.
Still taking a BI-RADS information set comprising the shape type, direction type, edge type, echo type, posterior echo type, calcification type, blood-supply type and BI-RADS rating as an example, let Y denote the BI-RADS attribute feature vector used as input when determining the third value set.
Y = r_0 y_0 + r_1 y_1 + r_2 y_2 + r_3 y_3 + r_4 y_4 + r_5 y_5 + r_6 y_6 + r_7 y_7
where y_0 is the value of the shape type in the second value set. By analogy, y_1, y_2, y_3, y_4, y_5, y_6 and y_7 are the values in the second value set of the direction type, edge type, echo type, posterior echo type, calcification type, blood-supply type and BI-RADS rating, respectively, and r_0 to r_7 are the initial weights of the respective BI-RADS information values. Assuming that the user modifies the values of the shape type, the edge type and the calcification type through the input device and confirms the values of the other BI-RADS information, weighted optimization can be performed by adjusting the weights as follows: r'_0 = r_0 - Δ; r'_2 = r_2 - Δ; r'_5 = r_5 - Δ; r'_1 = r_1 + Δ; r'_3 = r_3 + Δ; r'_4 = r_4 + Δ; r'_6 = r_6 + Δ; r'_7 = r_7 + Δ; Y' = r'_0 y_0 + r'_1 y_1 + r'_2 y_2 + r'_3 y_3 + r'_4 y_4 + r'_5 y_5 + r'_6 y_6 + r'_7 y_7, where Δ is a weight adjustment amount that may be preset, for example Δ = 0.1. Optionally, the adjusted weights may be normalized.
And then determining a third value set according to the weighted and optimized BI-RADS attribute feature vector Y', weakening the effect of the BI-RADS information values modified by the users by strengthening the effect of the BI-RADS information values confirmed by the users, and improving the accuracy of the third value set.
In the above embodiments, the accuracy of the third value set is improved by weighted optimization of the image-information feature vector X or of the BI-RADS attribute feature vector Y alone. The two may also be weighted simultaneously; that is, the third value set is determined from both the weighted image-information feature vector X' and the weighted attribute feature vector Y'. For example, X' + Y' may be used as the feature vector for determining the third value set, where "+" denotes fusion of the two feature vectors, for example splicing (concatenation); this embodiment does not limit the fusion method.
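Taking splicing as the fusion method, combining the weighted vectors is a single concatenation; the vectors below are random and hand-picked stand-ins for the weighted image-information feature vector X' and the weighted attribute feature vector Y'.

```python
import numpy as np

rng = np.random.default_rng(2)
X_prime = rng.normal(size=64)                         # stand-in weighted image-information vector
Y_prime = np.array([1, 0, 3, 2, 1, 0, 1, 4], float)   # stand-in weighted attribute values

fused = np.concatenate([X_prime, Y_prime])            # "X' + Y'" realized as vector splicing
```

The fused vector would then be fed to the rating model in place of either vector alone.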
The above embodiments illustrate how user feedback is combined into the computer-aided diagnosis of breast lesions to improve its accuracy. The following embodiments illustrate how user feedback can likewise be combined for other lesions, for example under the Thyroid Imaging Reporting and Data System (TI-RADS) classification standard. Referring to fig. 7, fig. 7 is a diagram illustrating a breast ultrasound imaging method according to another embodiment of the present invention. As shown in fig. 7, the ultrasound imaging method for a breast provided by this embodiment may include:
S601, acquiring an ultrasonic signal of a target tissue of a subject, wherein the ultrasonic signal comprises at least one of an analog signal, a digital signal, an in-phase quadrature (IQ) signal, a radio-frequency (RF) signal, and a signal after logarithmic compression and gray-scale conversion.
In this embodiment, the ultrasound signal of the target tissue of the subject may be acquired in real time, or the ultrasound signal of the target tissue stored in the storage medium in advance may be read. The ultrasonic signal in this embodiment may be any one or more of an analog signal, a digital signal, an in-phase-quadrature IQ signal, a radio frequency RF signal, and a logarithmically compressed and grayscale-converted signal.
S602, determining a value set corresponding to the feature set of the focus in the target tissue according to the ultrasonic signal to obtain a first feature value set.
In this embodiment, a value set corresponding to a feature set of a lesion in a target tissue may be determined according to an ultrasound signal by using the existing correlation technique, for example, a feature value of a lesion in a target tissue may be determined according to an ultrasound signal based on an artificial intelligence technique, and this embodiment is not limited to a specific implementation manner.
And S603, displaying the first feature value set on a display interface.
The first set of feature values may be displayed on a display interface for easy viewing by a user and for modification or confirmation of the first set of feature values.
S604, detecting the operation of modifying or confirming the first characteristic value set by the user to obtain a second characteristic value set.
After the user views the first feature value set, if the value of a certain lesion feature is in doubt, the value of the lesion feature can be modified through an input device such as a mouse, a keyboard and the like; if the value of a certain lesion feature is confirmed, the value of the lesion feature can be confirmed through an input device such as a mouse, a keyboard and the like. In specific implementation, the value range of each focus characteristic can be displayed by adopting pull-down menus, radio boxes and other modes for a user to modify or confirm.
It is understood that the second feature value set may reflect not only the ultrasound signal information of the lesion but also the judgment information of the lesion by the user according to the clinical experience.
And S605, displaying the second feature value set on the display interface.
In order to facilitate the user to view the values of the features of each lesion after modification or confirmation, a second set of feature values may be displayed on the display interface.
S606, determining a value set corresponding to the feature set of the focus according to the first feature value set, the second feature value set and the ultrasonic signal to obtain a third feature value set.
After the first feature value set is obtained based on the ultrasound signal and the second feature value set is obtained by combining the feedback information of the user and the first feature value set, the value set corresponding to the feature set of the lesion may be determined by integrating the first feature value set, the second feature value set and the ultrasound signal to obtain the third feature value set.
And S607, displaying the third feature value set on a display interface.
In order to facilitate the user to view the final value of the focus characteristic and to facilitate the user to perform corresponding diagnosis and treatment operations according to the focus characteristic value, the third characteristic value set of the focus can be displayed on the display interface.
In the ultrasound imaging method for the breast provided by this embodiment, by acquiring an ultrasound signal of a target tissue of a subject, a value set corresponding to a feature set of a lesion in the target tissue is determined according to the ultrasound signal to obtain a first feature value set, and the first feature value set is displayed on a display interface; then detecting the operation of modifying or confirming the first characteristic value set by a user to obtain a second characteristic value set, and displaying the second characteristic value set on a display interface; and finally, determining a value set corresponding to the feature set of the focus according to the first feature value set, the second feature value set and the ultrasonic signal to obtain a third feature value set, and displaying the third feature value set on a display interface. When the focus characteristic value is determined, the ultrasonic signal of the target tissue and the feedback information of the user to the focus are utilized, and the accuracy of the focus characteristic value is improved.
On the basis of the above embodiment, determining a value set corresponding to the feature set of the lesion according to the first feature value set, the second feature value set, and the ultrasound signal to obtain a third feature value set may specifically include:
strengthening, when determining the third feature value set, the weights of the features whose values are the same in the first feature value set and the second feature value set and of the corresponding ultrasonic signals; and/or,
and weakening the weight of the ultrasonic signals corresponding to the features with different values in the first feature value set and the second feature value set when the third feature value set is determined.
When the value of a lesion feature is the same in the first feature value set and the second feature value set, the value obtained from the ultrasonic-signal analysis is consistent with the physician's evaluation, so the value of that lesion feature and the corresponding ultrasonic signal are considered reliable; strengthening their weight when determining the third feature value set improves the accuracy of the third feature value set. When the value of a lesion feature differs between the first feature value set and the second feature value set, the value obtained from the ultrasonic-signal analysis is inconsistent with the physician's evaluation, so the ultrasonic signal corresponding to that lesion feature is considered unreliable; weakening its weight when determining the third feature value set improves the accuracy of the third feature value set. The above weighted-optimization strategies may be implemented individually or jointly.
Reference is made herein to various exemplary embodiments. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary embodiments without departing from the scope hereof. For example, the various operational steps, as well as the components used to perform the operational steps, may be implemented in differing ways depending upon the particular application or consideration of any number of cost functions associated with operation of the system (e.g., one or more steps may be deleted, modified or incorporated into other steps).
Additionally, as will be appreciated by one skilled in the art, the principles herein may be reflected in a computer program product on a computer readable storage medium, which is pre-loaded with computer readable program code. Any tangible, non-transitory computer-readable storage medium may be used, including magnetic storage devices (hard disks, floppy disks, etc.), optical storage devices (CD-ROMs, DVDs, Blu Ray disks, etc.), flash memory, and/or the like. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including means for implementing the function specified. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified.
While the principles herein have been illustrated in various embodiments, many modifications of structure, arrangement, proportions, elements, materials, and components particularly adapted to specific environments and operative requirements may be employed without departing from the principles and scope of the present disclosure. The above modifications and other changes or modifications are intended to be included within the scope of this document.
The foregoing detailed description has been described with reference to various embodiments. However, one skilled in the art will recognize that various modifications and changes may be made without departing from the scope of the present disclosure. Accordingly, the disclosure is to be considered in an illustrative and not a restrictive sense, and all such modifications are intended to be included within the scope thereof. Also, advantages, other advantages, and solutions to problems have been described above with regard to various embodiments. However, the benefits, advantages, solutions to problems, and any solution to any element(s) that may cause any element(s) to occur or become more pronounced are not to be construed as critical, required, or essential. As used herein, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, system, article, or apparatus. Furthermore, the term "coupled," and any other variation thereof, as used herein, refers to a physical connection, an electrical connection, a magnetic connection, an optical connection, a communicative connection, a functional connection, and/or any other connection.
The present invention has been described in terms of specific examples, which are provided to aid understanding of the invention and are not intended to be limiting. For a person skilled in the art to which the invention pertains, several simple deductions, modifications or substitutions may be made according to the idea of the invention.

Claims (9)

1. A method of ultrasound imaging of a breast comprising:
acquiring an ultrasonic image of a breast area of a subject;
analyzing the ultrasonic image based on a pre-trained breast lesion intelligent analysis model to obtain a first feature value set corresponding to a BI-RADS feature set value of a breast lesion in the breast area of the subject, wherein the breast lesion intelligent analysis model is obtained based on sample ultrasonic image training labeled with the BI-RADS feature set value;
detecting user modification or confirmation operation on the first characteristic value set to obtain a second characteristic value set;
determining a BI-RADS rating of the breast lesion from the first set of feature values, the second set of feature values, and the ultrasound image.
2. The method of claim 1, wherein the BI-RADS feature set includes a shape type, a direction type, an edge type, an echo type, a posterior echo type, a calcification type, and a blood supply type.
3. The method of claim 1, wherein said determining a BI-RADS rating of said breast lesion from said first set of feature values, said second set of feature values, and said ultrasound image comprises:
obtaining an image-information feature vector based on the ultrasound image:
X = Σ_{i=1}^{n} w_i x_i
wherein x_i is the feature vector corresponding to the i-th BI-RADS feature in the BI-RADS feature set extracted from the ultrasound image, w_i is the weight of x_i, and n is the number of BI-RADS features in the BI-RADS feature set;
if the value of the i-th BI-RADS feature is the same in the first feature value set and the second feature value set, adjusting the weight of x_i by w'_i = w_i + Δ; if the value of the i-th BI-RADS feature differs between the first feature value set and the second feature value set, adjusting the weight of x_i by w'_i = w_i - Δ; wherein Δ is a weight adjustment amount and w'_i is the adjusted weight of x_i;
determining, according to the feature vector corresponding to each BI-RADS feature and the corresponding adjusted weight, an image-information feature vector for determining the BI-RADS rating of the breast lesion:
X' = Σ_{i=1}^{n} w'_i x_i;
Inputting the image information feature vector X' for determining the BI-RADS classification of the breast lesion into a pre-trained BI-RADS classification model to obtain the BI-RADS classification of the breast lesion, wherein the BI-RADS classification model is obtained by training based on a sample feature vector labeled with the BI-RADS classification.
4. The method of claim 3, wherein the method further comprises:
obtaining an attribute feature vector based on the second feature value set:
Y = Σ_{i=1}^{n} r_i y_i
wherein y_i is the value of the i-th BI-RADS feature in the second feature value set, and r_i is the initial weight of y_i;
if the value of the i-th BI-RADS feature is the same in the first feature value set and the second feature value set, adjusting the weight of y_i by r'_i = r_i + Δ; if the value of the i-th BI-RADS feature differs between the first feature value set and the second feature value set, adjusting the weight of y_i by r'_i = r_i - Δ; wherein r'_i is the adjusted weight of y_i;
determining, according to the values of the BI-RADS features in the second feature value set and the corresponding adjusted weights, the attribute information feature vector used to determine the BI-RADS classification of the breast lesion:

Y' = Σ_{i=1}^{n} r'_i·y_i;
Fusing the image information feature vector X 'for determining the BI-RADS classification of the breast lesion and the attribute information feature vector Y' for determining the BI-RADS classification of the breast lesion;
and inputting the fused feature vectors into the pre-trained BI-RADS classification model to obtain the BI-RADS classification of the breast lesion.
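The claim above fuses X' and Y' before classification but does not fix the fusion operator. One common choice is concatenation, sketched below; the vector contents are hypothetical:

```python
import numpy as np

# Hypothetical image-information vector X' and attribute-information vector Y'.
x_img = np.array([2.4, 2.0])
y_attr = np.array([0.9, 1.1, 0.5])

# Concatenation is one simple fusion choice (an assumption, not fixed by the
# claim); the fused vector would then feed the BI-RADS classification model.
fused = np.concatenate([x_img, y_attr])
```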
5. The method of claim 1, wherein the ultrasound image of the breast region of the subject comprises multiple frames of ultrasound images, and wherein analyzing the ultrasound image based on the pre-trained breast lesion intelligent analysis model to obtain the first feature value set of the BI-RADS feature set of the breast lesion in the breast region of the subject comprises:
analyzing each frame of the multiple frames based on the pre-trained breast lesion intelligent analysis model to obtain a value set of the BI-RADS feature set corresponding to that frame;

and obtaining the first feature value set from the value sets of the BI-RADS feature set corresponding to the multiple frames according to a preset strategy.
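The claim leaves the "preset strategy" open. One plausible instance, shown only as an assumption, is a per-feature majority vote across the per-frame value sets; the feature values below are hypothetical:

```python
from collections import Counter

def first_value_set(per_frame_value_sets):
    # Per-feature majority vote: each inner list is the value set predicted
    # for one frame; the ith column collects the ith BI-RADS feature.
    n_features = len(per_frame_value_sets[0])
    return [
        Counter(frame[i] for frame in per_frame_value_sets).most_common(1)[0][0]
        for i in range(n_features)
    ]

frames = [["oval", "circumscribed"],
          ["oval", "indistinct"],
          ["round", "indistinct"]]
result = first_value_set(frames)  # ['oval', 'indistinct']
```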
6. The method of claim 5, wherein determining the BI-RADS classification of the breast lesion according to the first feature value set, the second feature value set, and the ultrasound image comprises:
when the value of a BI-RADS feature in the value set of the BI-RADS feature set corresponding to any one frame of the multiple frames is the same as the value of that BI-RADS feature in the second feature value set, increasing the weight of that frame when determining the BI-RADS classification; and when the value of a BI-RADS feature in the value set corresponding to any one frame differs from the value of that BI-RADS feature in the second feature value set, decreasing the weight of that frame when determining the BI-RADS classification.
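The per-frame weighting described in claim 6 can be sketched as below. This is an illustrative sketch; the base weight, Δ, and feature values are assumptions:

```python
def frame_weights(per_frame_value_sets, second_values, base_weight, delta):
    # Each frame's weight grows by delta for every feature value that matches
    # the user-confirmed (second) set and shrinks by delta for every mismatch.
    return [
        base_weight + sum(delta if v == s else -delta
                          for v, s in zip(frame, second_values))
        for frame in per_frame_value_sets
    ]

frames = [["oval", "circumscribed"],
          ["oval", "indistinct"],
          ["round", "indistinct"]]
weights = frame_weights(frames, ["oval", "indistinct"],
                        base_weight=1.0, delta=0.1)
# The fully agreeing middle frame ends up with the largest weight.
```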
7. The method of any one of claims 1-6, wherein before detecting the user's operation of modifying or confirming the first feature value set, the method further comprises:

displaying the ultrasound image and the first feature value set in correspondence on a display interface.
8. An ultrasound imaging apparatus, comprising:
an ultrasonic probe, configured to transmit ultrasonic waves to a target tissue of a subject, receive echoes of the ultrasonic waves returned by the target tissue, and output an ultrasonic echo signal based on the received echoes, the ultrasonic echo signal carrying tissue structure information of the target tissue;

a transmitting circuit, configured to output a corresponding transmit sequence to the ultrasonic probe according to a set mode, so as to control the ultrasonic probe to transmit corresponding ultrasonic waves;

a receiving circuit, configured to receive the ultrasonic echo signal output by the ultrasonic probe and output ultrasonic echo data;

a display, configured to output visual information; and

a processor, configured to perform the breast ultrasound imaging method of any one of claims 1 to 7.
9. A computer-readable storage medium having stored thereon computer-executable instructions which, when executed by a processor, implement the breast ultrasound imaging method of any one of claims 1 to 7.
CN202110968952.2A 2021-08-23 2021-08-23 Ultrasonic imaging method and equipment for mammary gland Pending CN113768544A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110968952.2A CN113768544A (en) 2021-08-23 2021-08-23 Ultrasonic imaging method and equipment for mammary gland

Publications (1)

Publication Number Publication Date
CN113768544A true CN113768544A (en) 2021-12-10

Family

ID=78838862

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110968952.2A Pending CN113768544A (en) 2021-08-23 2021-08-23 Ultrasonic imaging method and equipment for mammary gland

Country Status (1)

Country Link
CN (1) CN113768544A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114757953A (en) * 2022-06-15 2022-07-15 深圳瀚维智能医疗科技有限公司 Medical ultrasonic image recognition method, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN106204465B (en) The enhancing of Knowledge based engineering ultrasound image
US20210177373A1 (en) Ultrasound system with an artificial neural network for guided liver imaging
CN109949271B (en) Detection method based on medical image, model training method and device
CN111768366A (en) Ultrasound imaging system, BI-RADS classification method and model training method
US9277902B2 (en) Method and system for lesion detection in ultrasound images
WO2018129737A1 (en) Method for measuring parameters in ultrasonic image and ultrasonic imaging system
CN101084511A (en) Method and apparatus for automatically developing a high performance classifier for producing medically meaningful descriptors in medical diagnosis imaging
US20240386556A1 (en) Method and system to assess medical images for suitability in clinical interpretation
CN111260606B (en) Diagnostic device and diagnostic method
US20190374194A1 (en) Ultrasound evaluation of anatomical features
Gao et al. Segmentation of ultrasonic breast tumors based on homogeneous patch
CN112568933B (en) Ultrasonic imaging method, apparatus and storage medium
Kumar et al. A Novel Approach for Breast Cancer Detection by Mammograms
US20220189613A1 (en) Analyzing apparatus and analyzing method
CN113749686A (en) Ultrasound imaging method, device and storage medium
CN113768544A (en) Ultrasonic imaging method and equipment for mammary gland
CN114159099A (en) Breast ultrasound imaging method and equipment
CN114170241A (en) Breast ultrasound image segmentation method and device
CN118717171A (en) Breast ultrasound imaging method and ultrasound imaging system
CN113229850B (en) Ultrasonic pelvic floor imaging method and ultrasonic imaging system
CN115708694A (en) Ultrasonic image processing method and equipment
CN115517709A (en) Ultrasound imaging method and ultrasound imaging system
WO2022112540A1 (en) Predicting a likelihood that an individual has one or more lesions
CN114202514A (en) Breast ultrasound image segmentation method and device
Lakide et al. Precise Lung Cancer Prediction using ResNet–50 Deep Neural Network Architecture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination