Disclosure of Invention
In order to solve the technical problem of low segmentation accuracy in existing heart ultrasound image segmentation methods, the invention aims to provide an artificial intelligence-based heart ultrasound image segmentation method and system. The adopted technical scheme is as follows:
an embodiment of the invention provides an artificial intelligence-based heart ultrasonic image segmentation method, which comprises the following steps:
Acquiring a heart ultrasound image sequence through heart ultrasound imaging, and further determining each candidate heart tissue region in each frame of heart ultrasound image in the heart ultrasound image sequence;
Determining the regional characteristic value of each candidate heart tissue region according to the edge change characteristic and the gray expression characteristic of each candidate heart tissue region;
analyzing the edge difference condition and the distance difference condition of two candidate heart tissue areas positioned in different heart ultrasonic images, and determining a first area difference factor of the two candidate heart tissue areas;
Determining the target region matching degree of each candidate heart tissue region according to the region characteristic value of each candidate heart tissue region and a plurality of first region difference factors;
Correcting the initial cluster radius of each candidate heart tissue region by utilizing the matching degree of the target region to obtain the corrected cluster radius of each candidate heart tissue region;
And, in combination with the corrected cluster radius, segmenting each frame of heart ultrasound image by using a clustering algorithm to obtain each frame of heart ultrasound image with the artifact region marked.
Further, the determining the region feature value of each candidate heart tissue region according to the edge variation feature and the gray scale expression feature of each candidate heart tissue region comprises:
For any candidate heart tissue region, carrying out edge detection on the candidate heart tissue region to obtain each edge pixel point of the candidate heart tissue region;
Determining a gray level expression index of each pixel point according to the gray level value of each pixel point in the candidate heart tissue region;
Determining a second region characteristic factor of the candidate heart tissue region according to the average value of gray scale expression indexes of all pixel points in the candidate heart tissue region and the difference condition of gray scale expression indexes between the two adjacent edge pixel points;
And determining the region characteristic value of the candidate heart tissue region by combining the first region characteristic factor and the second region characteristic factor of the candidate heart tissue region.
Further, the determining the gray-scale expression index of each pixel point according to the gray value includes:
For any pixel point, calculating a gray difference value between the pixel point and any adjacent pixel point;
Combining the gray value of the pixel point with the gray difference value to determine the gray expression index of the pixel point;
wherein, the gray value of the pixel point and the gray expression index show positive correlation, and the gray difference value and the gray expression index show negative correlation.
Further, the determining the second region feature factor of the candidate heart tissue region according to the average value of the gray scale expression indexes of all the pixel points in the candidate heart tissue region and the difference condition of the gray scale expression indexes between the two adjacent edge pixel points includes:
Calculating a first ratio of gray scale expression indexes between two adjacent edge pixel points, calculating the absolute value of the difference between 1 and the first ratio, calculating a second ratio of the average value of gray scale expression indexes of all pixel points in a candidate heart tissue region and the absolute value of the difference between 1 and the first ratio, and adding all the second ratios to obtain a second region characteristic factor of the candidate heart tissue region.
Further, the analyzing the edge difference condition and the distance difference condition of two candidate heart tissue regions located in different heart ultrasound images, determining a first region difference factor of the two candidate heart tissue regions, includes:
Taking a candidate heart tissue region in any heart ultrasonic image as a region to be analyzed, and taking candidate heart tissue regions in heart ultrasonic images except the heart ultrasonic image as a contrast region;
for any region to be analyzed and any comparison region, determining the centroid positions of the region to be analyzed and the comparison region and the position of each edge pixel point on the boundary of the region;
Calculating the average value of the distances from the centroids of the region to be analyzed and the contrast region to all the edge pixel points on the corresponding region boundary according to the centroid position and the position of each edge pixel point on the region boundary;
Determining the distance difference degree of the to-be-analyzed area and the comparison area according to the difference between the distance average value of the to-be-analyzed area and the distance average value of the comparison area;
And determining a first region difference factor of the region to be analyzed and the comparison region according to the distance difference degree of the region to be analyzed and the comparison region, the centroid distance and the difference of the number of edge pixel points of the region boundary.
Further, the determining a first region difference factor of the region to be analyzed and the region to be compared according to the difference degree of the distance between the region to be analyzed and the region to be compared, the centroid distance and the difference of the number of edge pixel points of the region boundary includes:
mapping the mass centers of the comparison areas into the areas to be analyzed, calculating the distance between the two mass centers, and recording the distance as the mass center distance;
counting the number of edge pixel points on the area boundary of the area to be analyzed and the contrast area, and taking a difference value between the two edge pixel point numbers as the difference of the number of the edge pixel points;
and taking the product of the distance difference degree, the centroid distance and the difference of the number of the edge pixel points as a first region difference factor of the region to be analyzed and the comparison region.
Further, the determining the matching degree of the target region of each candidate heart tissue region according to the region characteristic value of each candidate heart tissue region and the plurality of first region difference factors includes:
Determining a second region difference factor of the region to be analyzed and the comparison region according to the region characteristic value of the region to be analyzed and the difference condition between the region characteristic values of the region to be analyzed and the comparison region;
determining the region matching degree of the region to be analyzed and the region to be compared by combining the first region difference factor and the second region difference factor of the region to be analyzed and the region to be compared;
Wherein the first region difference factor and the second region difference factor are inversely correlated with the region matching degree;
And acquiring the region matching degree of the region to be analyzed and each comparison region, and selecting the maximum value from all the region matching degrees as the target region matching degree of the region to be analyzed.
Further, the determining a second region difference factor of the region to be analyzed and the contrast region according to the region characteristic value of the region to be analyzed and the difference condition between the region characteristic values of the region to be analyzed and the contrast region, includes:
calculating the absolute value of the difference between the regional characteristic value of the region to be analyzed and the regional characteristic value of the comparison region, and taking the ratio of the absolute value of the difference to the regional characteristic value of the region to be analyzed as a second regional difference factor of the region to be analyzed and the comparison region.
Further, the determining an initial cluster radius for each candidate cardiac tissue region includes:
for any candidate heart tissue region, calculating a difference value between a maximum gray value and a minimum gray value in the candidate heart tissue region as a first cluster influence factor;
Calculating a distance value between any two pixel points at the same gray level, further calculating the average value of all the distance values at the same gray level, recording it as a distance average value, and taking the average of the distance average values corresponding to all gray levels as the second cluster influence factor;
combining the first clustering influence factor and the second clustering influence factor to determine an initial clustering radius of the candidate heart tissue region;
Wherein the first cluster influence factor, the second cluster influence factor, and the initial cluster radius exhibit a negative correlation.
An embodiment of the present invention further provides an artificial intelligence-based cardiac ultrasound image segmentation system, including a processor and a memory, the processor being configured to process instructions stored in the memory to implement an artificial intelligence-based cardiac ultrasound image segmentation method.
The invention has the following beneficial effects:
The invention provides an artificial intelligence-based heart ultrasound image segmentation method and system. First, each candidate heart tissue region in each frame of heart ultrasound image is determined, which reduces the data volume of image analysis to a certain extent and simplifies the image segmentation steps. Second, a target region matching degree is determined for each candidate heart tissue region by combining the image characteristics of a normal heart tissue region; this in-depth analysis quantifies how closely the candidate heart tissue region fits a normal heart tissue region, and the multi-step determination improves the numerical accuracy and reliability of the target region matching degree. Finally, an initial cluster radius is determined and corrected by using the target region matching degree to obtain a corrected cluster radius; the corrected cluster radius, which incorporates the target region matching degree, facilitates clustering the pixels belonging to the heart tissue region and thus supports subsequent high-precision segmentation. Segmentation of the heart ultrasound image based on the corrected cluster radius yields an accurate segmentation result, from which the artifact region or the region affected by the artifact can be marked, thereby improving, to a certain extent, the accuracy of a doctor's diagnosis based on the heart image.
Detailed Description
In order to further describe the technical means and effects adopted by the present invention to achieve its intended purpose, the specific implementation, structure, features and effects of the technical solution according to the present invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The application scenario aimed by the invention can be:
The heart ultrasound image may specifically be a two-dimensional echocardiogram, an imaging technique commonly used in the clinical diagnosis of heart disease. However, due to the physical characteristics of ultrasonic waves, the interaction between ultrasonic waves and human tissue, the performance of the ultrasound instrument and other factors, various artifacts can appear in ultrasound imaging. Cardiac artifacts affect the accuracy of heart segmentation in the ultrasound image and thereby the accuracy of a doctor's diagnosis.
In order to overcome the adverse effect of artifacts on heart ultrasound image segmentation, the method corrects an initial cluster radius to obtain a corrected cluster radius based on the characteristics of the heart tissue regions and artifact regions in the heart ultrasound image and on the regional characteristic differences among multiple heart ultrasound images. Region segmentation of the heart ultrasound image using the corrected cluster radius yields a heart ultrasound image with the artifact region marked, and the marked heart ultrasound image can provide data support for accurate diagnosis by doctors.
Specifically, an embodiment of the present invention provides an artificial intelligence based cardiac ultrasound image segmentation method, as shown in fig. 1, including the following steps:
s1, acquiring a heart ultrasonic image sequence through a heart ultrasonic image technology, and further determining each candidate heart tissue region in each frame of heart ultrasonic image in the heart ultrasonic image sequence.
Here, the cardiac ultrasound image sequence includes multiple frames of cardiac ultrasound images, and the positions of the artifact regions in each frame of cardiac ultrasound image may be different, so that in order to accurately distinguish between the normal cardiac tissue region and the artifact region in the cardiac ultrasound image, multiple frames of cardiac ultrasound images need to be acquired, and all regions possibly being cardiac tissue in each frame of cardiac ultrasound image are determined, that is, candidate cardiac tissue regions are determined, where the candidate cardiac tissue regions may be the normal cardiac tissue region, the artifact region, the region affected by the artifact, and the abnormal region.
The above step S1 may be achieved by steps S11 to S12 (not shown in the drawings):
s11, acquiring a heart ultrasonic image sequence through a heart ultrasonic image technology.
In this embodiment, in order to subsequently mark an artifact region or a region affected by an artifact in a heart ultrasound image, the heart ultrasound image needs to be acquired by an ultrasound instrument. A heart ultrasound image is a single static image obtained by ultrasound examination of the heart; it shows the heart at a certain moment and is typically used to observe the structure of the heart and evaluate abnormalities. In addition, a single ultrasound image only provides detailed information about a certain section or view of the heart, and multiple images are typically required to obtain comprehensive diagnostic information. Therefore, in actual operation, the ultrasound image at each moment is continuously acquired by the ultrasound apparatus to form a dynamic sequence, through which a detailed image of the heart at each moment can be provided.
Specifically, the heart of a patient is scanned by an ultrasound instrument to obtain a heart scan video corresponding to the patient, and the heart scan video is converted into images to form an initial heart ultrasound image sequence. Because of the dynamic characteristics of scanning, continuous artifacts can appear in images at adjacent time points in the heart scan video; therefore all images in the initial heart ultrasound image sequence are divided into groups of 15 frames, the first frame of each group is selected as the image representing that group, and the representative images of all groups are arranged in time order to form a new image set, namely the heart ultrasound image sequence. Image analysis is then performed on each heart ultrasound image in this sequence. The group size used to divide the initial heart ultrasound image sequence may be set by an implementer according to the specific practical situation and is not specifically limited here; an example of a single frame of heart ultrasound image is shown in fig. 2.
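For reference, the frame extraction and grouping described above can be sketched as follows; this is a minimal illustrative sketch in Python with OpenCV, and the function name, the video path argument and the group size of 15 are assumptions of the example rather than limitations of the embodiment:

```python
import cv2

def sample_cardiac_frames(video_path: str, group_size: int = 15):
    """Read a cardiac scan video and keep the first frame of every group
    of `group_size` consecutive frames as the representative image."""
    cap = cv2.VideoCapture(video_path)
    sampled = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % group_size == 0:                    # first frame of each group
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            sampled.append(gray)
        index += 1
    cap.release()
    return sampled                                     # the heart ultrasound image sequence
```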
It should be noted that, regarding ultrasound imaging and ultrasound images, a doctor may rely on ultrasound imaging when assessing the heart condition because it includes all of the image data, whereas a single ultrasound image is only one part of that imaging. Since the doctor performs detailed analysis and diagnosis on individual ultrasound images, image analysis is required for each ultrasound image to achieve more accurate ultrasound image segmentation.
S12, determining each candidate heart tissue region in each frame of heart ultrasonic image in the heart ultrasonic image sequence.
In this embodiment, in order to determine the nature of the candidate cardiac tissue region, the artifact region in the image is marked, and first, the cardiac tissue region that may be normal cardiac tissue, artifact or affected by the artifact needs to be selected from the cardiac ultrasound image, that is, each candidate cardiac tissue region in the cardiac ultrasound image is determined.
Specifically, in order to analyze the gray-scale features of the image, graying processing is performed on each frame of heart ultrasound image to obtain a grayed heart ultrasound image. A connected-domain algorithm is then used to identify the connected domains in the heart ultrasound image, obtaining each connected domain in each frame. Threshold segmentation is then performed on each frame of heart ultrasound image: regions with gray values greater than the threshold are marked as foreground regions, and regions with gray values smaller than the threshold are marked as background regions. Because the gray values of heart tissue regions and artifact regions are relatively high, the foreground regions are taken as candidate heart tissue regions, while regions such as the atria of the heart have lower gray values and fall in the background region, so under normal circumstances the background region is not confused with the artifact region.
The implementation processes of the graying process, the connected domain algorithm and the threshold segmentation algorithm are all in the prior art, and are not in the scope of the invention, and are not described in detail herein, and an example diagram of the heart ultrasound image after the threshold segmentation process is shown in fig. 3.
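For reference, this candidate-region extraction step can be sketched as follows; Otsu's method is used here as one possible way of choosing the gray threshold, and the minimum region area is an illustrative assumption:

```python
import cv2
import numpy as np

def extract_candidate_regions(gray: np.ndarray, min_area: int = 50):
    """Threshold a grayscale cardiac ultrasound frame and return one boolean
    mask per bright connected component (candidate cardiac tissue region)."""
    # Bright pixels (tissue or artifact) become foreground; dark chambers stay background.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    num_labels, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    regions = []
    for lbl in range(1, num_labels):                   # label 0 is the background
        if stats[lbl, cv2.CC_STAT_AREA] >= min_area:   # discard small specks as noise
            regions.append(labels == lbl)
    return regions
```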
To this end, the present embodiment obtains respective candidate cardiac tissue regions in each frame of cardiac ultrasound images.
S2, determining the regional characteristic value of each candidate heart tissue region according to the edge change characteristic and the gray scale expression characteristic of each candidate heart tissue region.
The region feature value refers to the degree to which a candidate heart tissue region exhibits the image characteristics of heart tissue. The larger the region feature value, the greater the possibility that the candidate heart tissue region is a normal heart tissue region and the smaller the possibility that it is an artifact region or a region affected by an artifact. The image characteristics of the heart tissue region are quantified from two aspects, namely the edge variation condition and the gray-scale representation condition of the region; determining the feature value from these two aspects effectively enhances its numerical accuracy and representativeness.
The above step S2 may be implemented by steps S21 to S24 (not shown in the drawings):
And S21, for any candidate heart tissue region, carrying out edge detection on the candidate heart tissue region to obtain each edge pixel point of the candidate heart tissue region, and determining a first region characteristic factor of the candidate heart tissue region according to the difference condition of gradient amplitude values between two adjacent edge pixel points.
In this embodiment, since the edges of the heart tissue region in the ultrasound image are generally regular and clear, the gray gradient amplitude of the edges in the normal heart tissue region is relatively uniform, i.e. the gradient variation degree of the edge pixel points is relatively uniform, and the artifacts are generally caused by noise, reflection errors or other imaging problems of the ultrasound signals, the edge expression of the artifact region in the image is relatively irregular, and the edges of the artifact region may be disordered or blurred. Thus, the artifact region is significantly different from the normal heart tissue region in terms of edge pixel distribution.
Specifically, edge detection is performed on any candidate heart tissue region using the Canny edge detection operator to obtain each edge in the candidate heart tissue region and thus each edge pixel point on each edge. The gradient amplitude is calculated from the gray value of each edge pixel point, negative correlation processing is applied to the gradient amplitude difference value of every two adjacent edge pixel points, and the values after negative correlation processing are accumulated to obtain the first region feature factor of the candidate heart tissue region. The process of realizing edge detection with the Canny edge detection operator is prior art, is not within the protection scope of the present invention, and is not described in detail here.
As an example, the calculation formula of the first region feature factor of the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image may be:

$$A_{k,i}=\sum_{w=1}^{n-1}\frac{1}{\left|T_{k,i}^{w}-T_{k,i}^{w+1}\right|}$$

In the formula, $A_{k,i}$ represents the first region feature factor of the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image, $n$ represents the number of edge pixel points within the i-th candidate cardiac tissue region, $T_{k,i}^{w}$ represents the gradient amplitude of the w-th edge pixel point in the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image, $T_{k,i}^{w+1}$ represents the gradient amplitude of the (w+1)-th edge pixel point, $\left|\cdot\right|$ represents taking the absolute value, $\left|T_{k,i}^{w}-T_{k,i}^{w+1}\right|$ represents the gradient amplitude difference value, and $1/\left|T_{k,i}^{w}-T_{k,i}^{w+1}\right|$ represents its negative correlation processing.
It should be noted that, under normal circumstances, two adjacent edge pixel points do not have exactly equal gradient amplitudes; should such a special case occur, a constant, such as 0.01, needs to be added to the denominator to ensure that the denominator cannot be zero.
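Purely as an illustration, this first region feature factor could be computed roughly as sketched below (Python with OpenCV). The sketch assumes the reciprocal, with the small constant 0.01 in the denominator, as the negative correlation processing, and it uses contour traversal of the region mask as a simple stand-in for ordering "adjacent" edge pixel points; both choices are assumptions of the example.

```python
import cv2
import numpy as np

def first_region_feature_factor(gray: np.ndarray, region_mask: np.ndarray,
                                eps: float = 0.01) -> float:
    """Sum of reciprocal gradient-amplitude differences between consecutive
    edge pixels of one candidate region (larger = more uniform edge)."""
    # Gradient magnitude of the whole frame.
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)

    # Edge pixels of the candidate region, ordered by contour traversal
    # (used here as a simple stand-in for "adjacent edge pixel points").
    contours, _ = cv2.findContours(region_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    factor = 0.0
    for contour in contours:
        pts = contour[:, 0, :]                     # (x, y) pairs along the edge
        mags = magnitude[pts[:, 1], pts[:, 0]]     # gradient amplitude per edge pixel
        diffs = np.abs(np.diff(mags))              # adjacent gradient amplitude differences
        factor += float(np.sum(1.0 / (diffs + eps)))
    return factor
```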
S22, determining the gray scale expression index of each pixel point according to the gray scale value of each pixel point in the candidate heart tissue area.
In this embodiment, because the echo signal of normal heart tissue is relatively stable, a real heart tissue region appears in the ultrasound image as a brighter and more uniform area, exhibiting a higher gray level with relatively gradual changes. Artifacts, by contrast, may arise from signal noise or other imaging problems, so artifact or other abnormal regions typically appear darker in the ultrasound image, have lower gray values and show certain variations; the change between pixels in an artifact region is typically greater than that in a heart tissue region. Accordingly, the gray-scale expression index quantifies, based on the gray-scale characteristics of a candidate heart tissue region, the likelihood that it is a normal heart tissue region: the larger the gray-scale expression index, the greater that likelihood.
Specifically, for any pixel point, calculating a gray difference value between the pixel point and any adjacent pixel point; and determining the gray level expression index of the pixel point by combining the gray level value and the gray level difference value of the pixel point, wherein the gray level value and the gray level expression index of the pixel point are positively correlated, and the gray level difference value and the gray level expression index are negatively correlated. The gray difference value refers to the absolute value of the difference between the gray values of two pixel points.
As an example, the calculation formula of the gray-scale expression index of the r-th pixel point in the i-th candidate tissue region in the k-th frame of cardiac ultrasound image may be:

$$H_{k,i}^{r}=\frac{G_{k,i}^{r}}{\left|G_{k,i}^{r}-G_{k,i}^{r+1}\right|+0.1}$$

In the formula, $H_{k,i}^{r}$ represents the gray-scale expression index of the r-th pixel point in the i-th candidate tissue region in the k-th frame of cardiac ultrasound image, $G_{k,i}^{r}$ represents the gray value of the r-th pixel point in the i-th candidate tissue region in the k-th frame of cardiac ultrasound image, $G_{k,i}^{r+1}$ represents the gray value of the (r+1)-th pixel point, $\left|G_{k,i}^{r}-G_{k,i}^{r+1}\right|$ represents the gray difference value of the r-th pixel point, and the constant 0.1 prevents the denominator from being zero.
In the calculation formula of the gray-scale expression index, the smaller the gray difference value, the smaller the difference between the gray value of the r-th pixel point and that of the (r+1)-th pixel point, i.e. the flatter and more uniform the gray-scale change; the larger the gray value of the r-th pixel point, the lighter the color of the pixel. When the gray-scale change of a pixel is flatter and its gray value is larger, its gray-scale expression index is larger, and the possibility that the pixel belongs to a heart tissue region is greater.
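A brief sketch of this per-pixel index follows; the choice of the right-hand neighbor as the "adjacent pixel point" is an assumption of the example (any neighboring pixel could be used), and the constant 0.1 follows the formula above.

```python
import numpy as np

def gray_expression_index(gray: np.ndarray) -> np.ndarray:
    """Per-pixel gray-scale expression index: gray value divided by the
    absolute gray difference to a neighboring pixel, plus 0.1."""
    gray = gray.astype(np.float64)
    neighbor = np.roll(gray, -1, axis=1)   # right-hand neighbor (wrap at the last column is ignored)
    diff = np.abs(gray - neighbor)         # gray difference value
    return gray / (diff + 0.1)             # bright and smooth -> large index
```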
S23, determining a second region characteristic factor of the candidate heart tissue region according to the average value of gray scale expression indexes of all pixel points in the candidate heart tissue region and the difference condition of gray scale expression indexes between two adjacent edge pixel points.
The method comprises the steps of calculating a first ratio of gray scale expression indexes between two adjacent edge pixel points, calculating a difference absolute value between 1 and the first ratio, calculating a second ratio of an average value of gray scale expression indexes of all pixel points in a candidate heart tissue region to the difference absolute value, and adding all the second ratios to obtain a second region characteristic factor of the candidate heart tissue region.
As an example, the calculation formula of the second region feature factor of the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image may be:

$$B_{k,i}=\sum_{w=1}^{n-1}\frac{\overline{H}_{k,i}}{\left|1-\dfrac{H_{k,i}^{w}}{H_{k,i}^{w+1}}\right|}$$

In the formula, $B_{k,i}$ represents the second region feature factor of the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image, $n$ represents the number of edge pixel points within the i-th candidate cardiac tissue region, $\overline{H}_{k,i}$ represents the average value of the gray-scale expression indexes of all pixel points in the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image, $H_{k,i}^{w}$ represents the gray-scale expression index of the w-th edge pixel point in the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image, $H_{k,i}^{w+1}$ represents the gray-scale expression index of the (w+1)-th edge pixel point, $H_{k,i}^{w}/H_{k,i}^{w+1}$ represents the first ratio, and $\overline{H}_{k,i}\big/\left|1-H_{k,i}^{w}/H_{k,i}^{w+1}\right|$ represents the second ratio.
In the calculation formula of the second region feature factor, the gray-scale expression index in the denominator of the first ratio cannot take the value zero. The closer the first ratio is to 1, the more similar the gray-scale expression indexes of adjacent edge pixel points are and the better they represent the characteristics of a heart tissue region, so the absolute value of the difference between 1 and the first ratio is inversely related to the second region feature factor. The average value of the gray-scale expression indexes characterizes the degree to which the candidate heart tissue region exhibits heart tissue features: the larger the overall gray-scale expression of the region, the more the gray-scale image features of the candidate heart tissue region fit the gray-scale characteristics of a heart tissue region, so this average value is positively correlated with the second region feature factor.
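Continuing the sketch, the second region feature factor can be assembled from these indexes as shown below. The index map is assumed to come from a function such as the one sketched above, the edge points are assumed to be an ordered (x, y) array of boundary pixels (for instance from a contour traversal), and the small constants guarding the denominators are added safeguards of the example rather than part of the formula.

```python
import numpy as np

def second_region_feature_factor(index_map: np.ndarray,
                                 region_mask: np.ndarray,
                                 edge_points: np.ndarray,
                                 eps: float = 1e-6) -> float:
    """Sum over consecutive edge pixels of mean(index) / |1 - index_w / index_{w+1}|."""
    mean_index = float(index_map[region_mask].mean())          # mean index over the whole region
    edge_vals = index_map[edge_points[:, 1], edge_points[:, 0]]
    edge_vals = np.maximum(edge_vals, eps)                     # guard against zeros (added safeguard)
    first_ratio = edge_vals[:-1] / edge_vals[1:]               # index_w / index_{w+1}
    second_ratio = mean_index / (np.abs(1.0 - first_ratio) + eps)
    return float(np.sum(second_ratio))
```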
S24, combining the first region characteristic factors and the second region characteristic factors of the candidate heart tissue regions to determine the region characteristic values of the candidate heart tissue regions.
Specifically, the first region feature factor, the second region feature factor and the region feature value all show positive correlation, so that the product of the first region feature factor and the second region feature factor can be used as the region feature value of the corresponding candidate heart tissue region.
Of course, the value obtained by adding the first region feature factor and the second region feature factor may be used as the region feature value of the candidate heart tissue region.
Thus, the present embodiment can obtain the region feature value of each candidate cardiac tissue region in each frame of cardiac ultrasound image.
S3, analyzing the edge difference condition and the distance difference condition of two candidate heart tissue areas positioned in different heart ultrasonic images, and determining a first area difference factor of the two candidate heart tissue areas.
Here, when an ultrasound instrument is used, small movements of the probe may cause slight variations in shape and gray-scale appearance in the image, but the relative positional shift of the same normal heart tissue region between frames of heart ultrasound images is typically small, and the same heart tissue region appears similar in gray scale across different heart ultrasound images. Therefore, when performing a matching analysis of two candidate heart tissue regions located in different heart ultrasound images, the region difference between the frames needs to be considered, i.e. a first region difference factor is determined. The first region difference factor characterizes the edge matching condition and the distance condition between two candidate heart tissue regions located in different heart ultrasound images; the larger the first region difference factor, the smaller the matching degree of the two candidate heart tissue regions.
The step S3 may be implemented by the steps shown in fig. 4:
and S31, taking the candidate heart tissue region in any heart ultrasonic image as a region to be analyzed, and taking the candidate heart tissue region in the heart ultrasonic images except the heart ultrasonic image as a comparison region.
In this embodiment, the first region difference factor is determined based on candidate heart tissue regions in two different heart ultrasound images, so each heart ultrasound image needs to undergo region matching analysis with every other heart ultrasound image. For ease of understanding, all candidate heart tissue regions in any one heart ultrasound image are taken as regions to be analyzed, and the candidate heart tissue regions in the heart ultrasound images other than that image are taken as contrast regions; each region to be analyzed thus has a corresponding plurality of contrast regions.
S32, for any region to be analyzed and any comparison region, determining the centroid positions of the region to be analyzed and the comparison region and the position of each edge pixel point on the boundary of the region.
In this embodiment, taking any region to be analyzed and any comparison region as an example, a first region difference factor of the region to be analyzed and the comparison region is determined. Before analyzing the first region difference factor, in order to facilitate the subsequent determination of the degree of distance difference, the centroid positions of the region to be analyzed and the contrast region and the position of each edge pixel point on the region boundary need to be determined first. The determination process of the center of mass and the region boundary in the region is the prior art, and is not in the scope of the invention, and will not be described in detail herein, and the region boundary specifically refers to the edge located at the outermost layer of the region, which can be obtained through an edge detection algorithm.
S33, calculating the average value of the distances from the mass centers of the region to be analyzed and the contrast region to all the edge pixel points on the corresponding region boundary according to the mass center positions and the positions of each edge pixel point on the region boundary.
In this embodiment, the position of the centroid of a region relative to its region boundary can be represented by a distance average. The distance average is a data index used later to calculate the distance difference degree: the smaller the difference between the distance averages of the region to be analyzed and the contrast region, the greater the probability that the two regions belong to the same region, and the smaller the first region difference factor of the region to be analyzed and the contrast region will be.
The method comprises the steps of calculating Euclidean distances from the centroid of a region to be analyzed to all edge pixel points on the boundary of the corresponding region, taking the average value of all Euclidean distances as the distance average value of the region to be analyzed, calculating the Euclidean distances from the centroid of a comparison region to all edge pixel points on the boundary of the corresponding region, and taking the average value of all Euclidean distances as the distance average value of the comparison region. The calculation formula of the euclidean distance is in the prior art, and is not in the scope of the present invention, and will not be described in detail here.
S34, determining the distance difference degree of the to-be-analyzed area and the comparison area according to the difference between the distance average value of the to-be-analyzed area and the distance average value of the comparison area.
Specifically, calculating the absolute value of the difference between the distance average value of the area to be analyzed and the distance average value of the comparison area, and taking the absolute value of the difference between the two distance average values as the distance difference degree of the area to be analyzed and the comparison area.
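For illustration, steps S32 to S34 can be sketched as follows (Python with OpenCV); representing each region by a boolean mask, taking the centroid as the mean pixel coordinate, and extracting the boundary with a contour call are implementation assumptions of the example.

```python
import cv2
import numpy as np

def centroid_and_distance_mean(region_mask: np.ndarray):
    """Return the centroid (x, y), the boundary points, and the mean Euclidean
    distance from the centroid to the boundary pixels of one region."""
    ys, xs = np.nonzero(region_mask)
    centroid = np.array([xs.mean(), ys.mean()])                # centroid position
    contours, _ = cv2.findContours(region_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    boundary = np.vstack([c[:, 0, :] for c in contours])       # (x, y) edge pixels on the boundary
    dist_mean = float(np.linalg.norm(boundary - centroid, axis=1).mean())
    return centroid, boundary, dist_mean

def distance_difference_degree(mask_analyzed: np.ndarray, mask_contrast: np.ndarray) -> float:
    """|distance average of region to be analyzed - distance average of contrast region|."""
    _, _, mean_a = centroid_and_distance_mean(mask_analyzed)
    _, _, mean_b = centroid_and_distance_mean(mask_contrast)
    return abs(mean_a - mean_b)
```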
S35, determining a first region difference factor of the region to be analyzed and the comparison region according to the distance difference degree of the region to be analyzed and the comparison region, the centroid distance and the difference of the number of edge pixel points of the region boundary.
The above step S35 may be implemented by steps S351 to S353 (not shown in the drawing):
S351, mapping the mass centers of the comparison areas into the areas to be analyzed, calculating the distance between the two mass centers, and recording the distance as the mass center distance.
S352, counting the number of edge pixel points on the area boundary of the area to be analyzed and the contrast area, and taking the difference value between the two edge pixel point numbers as the difference of the number of the edge pixel points.
And S353, taking the product of the distance difference degree, the centroid distance and the difference of the number of the edge pixels as a first region difference factor of the region to be analyzed and the comparison region.
As an example, the calculation formula of the first region difference factor of the region to be analyzed and the contrast region may be:

$$D=d\times L\times N$$

In the formula, $D$ represents the first region difference factor of the region to be analyzed and the contrast region, $d$ represents the distance difference degree between the region to be analyzed and the contrast region, $L$ represents the centroid distance between the region to be analyzed and the contrast region, and $N$ represents the difference in the number of edge pixel points of the region to be analyzed and the contrast region.
In the calculation formula of the first region difference factor, the centroid distance $L$ reflects the difference in the positions of the region to be analyzed and the contrast region within the heart ultrasound image; the smaller the positional difference, the more likely the two regions are the same region. $N$ represents the difference between the numbers of edge pixel points of the region to be analyzed and the contrast region; the smaller $N$ is, the more similar the edge states of the two regions are, and the more likely the region to be analyzed is a normal heart tissue region.
It should be noted that, the area to be analyzed and each comparison area are subjected to calculation analysis of the first area difference factors, so that a plurality of first area difference factors corresponding to the area to be analyzed can be obtained, and the calculation mode of the first area difference factors is referred to, so that a plurality of first area difference factors of each candidate heart tissue area are obtained.
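Given the centroids, distance averages and boundary pixel counts of the two regions (for instance from the helper sketched above), the first region difference factor reduces to a single product; a minimal sketch:

```python
import numpy as np

def first_region_difference_factor(centroid_a: np.ndarray, centroid_b: np.ndarray,
                                   dist_mean_a: float, dist_mean_b: float,
                                   n_edge_a: int, n_edge_b: int) -> float:
    """D = distance difference degree x centroid distance x edge pixel count difference."""
    d = abs(dist_mean_a - dist_mean_b)                               # distance difference degree
    centroid_dist = float(np.linalg.norm(centroid_a - centroid_b))   # centroid distance
    n_diff = abs(n_edge_a - n_edge_b)                                # difference of edge pixel counts
    return d * centroid_dist * n_diff
```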
To this end, the present embodiment obtains a number of first region difference factors for each candidate cardiac tissue region.
And S4, determining the matching degree of the target areas of the candidate heart tissue areas according to the area characteristic values of the candidate heart tissue areas and the first area difference factors.
Here, when artifact region marking is performed, the artifact region constitutes false information that appears in the image due to various factors and affects the accurate segmentation of the normal heart tissue region. Artifacts are formed in various ways and usually do not appear or repeat in the same form across multiple images, so an artifact may present differently in different images. Specifically, an artifact may be connected with a normal heart tissue region, changing the boundary and width of the normal region, or it may be independent of the normal tissue region, forming an area distinct from the surrounding tissue; in either case it distorts the characteristics of the normal heart tissue region and thus affects the accuracy of region segmentation.
Therefore, the candidate heart tissue region in the current heart ultrasound image needs to be compared with the candidate heart tissue regions in the other heart ultrasound images. If the matching degree between the candidate heart tissue region in the current heart ultrasound image and some candidate heart tissue region in another heart ultrasound image is high, the candidate heart tissue region in the current image is likely a real heart tissue region conforming to the characteristics of normal heart tissue; if the matching result shows that the matching degree between the current candidate heart tissue region and the candidate heart tissue regions in the other heart ultrasound images is low, the current candidate heart tissue region is likely affected by an artifact region or is itself an isolated artifact region.
In this embodiment, the target region matching degree is determined by combining the region feature value and the first region difference factor, where the target region matching degree refers to a matching degree corresponding to a region that is most matched with the candidate heart tissue region, and the target region matching degree may represent a degree of an image feature of the heart tissue region represented by the candidate heart tissue region, and the greater the target region matching degree, the greater the likelihood that the candidate heart tissue region is a normal heart tissue region.
The step S4 may be implemented by the steps shown in fig. 5:
S41, determining a second region difference factor of the region to be analyzed and the comparison region according to the region characteristic value of the region to be analyzed and the difference condition between the region characteristic values of the region to be analyzed and the comparison region.
Specifically, calculating the absolute value of the difference between the regional characteristic value of the region to be analyzed and the regional characteristic value of the comparison region, and taking the ratio of the absolute value of the difference to the regional characteristic value of the region to be analyzed as a second regional difference factor of the region to be analyzed and the comparison region.
S42, combining the first region difference factor and the second region difference factor of the region to be analyzed and the comparison region, and determining the region matching degree of the region to be analyzed and the comparison region.
In this embodiment, the first region difference factor and the second region difference factor are all inversely related to the region matching degree.
As an example, the calculation formula of the region matching degree of the region to be analyzed and the contrast region may be:

$$E=\exp\!\left(-\left(\frac{\left|R-R_{c}\right|}{R}+D\right)\right)$$

In the formula, $E$ represents the region matching degree of the region to be analyzed and the contrast region, $\exp$ represents an exponential function based on the natural constant, $R$ represents the region feature value of the region to be analyzed, $R_{c}$ represents the region feature value of the contrast region, $\left|R-R_{c}\right|$ represents the absolute value of the difference between the region feature value of the region to be analyzed and that of the contrast region, $\left|R-R_{c}\right|/R$ represents the second region difference factor, and $D$ represents the first region difference factor.
In the calculation formula of the region matching degree, the region to be analyzed has one region matching degree with each contrast region. The term $\left|R-R_{c}\right|/R$ characterizes the difference of the region feature values between the region to be analyzed and the contrast region: the smaller the difference of the region feature values, the higher the similarity between the two regions and the more likely the region to be analyzed is a normal heart tissue region; the larger the region feature value of the region to be analyzed, the more obvious its heart tissue characteristics, the higher the possibility of matching and the higher the region matching degree. The smaller the first region difference factor, the smaller the overall difference between the region to be analyzed and the contrast region, the more likely the two regions are the same region, and the greater the corresponding region matching degree. $\exp(-\cdot)$ can be used to realize the negative correlation of the data; of course, an implementer may also realize the negative correlation in other calculation modes, such as the reciprocal.
S43, obtaining the region matching degree of the region to be analyzed and each comparison region, and selecting the maximum value from all the region matching degrees as the target region matching degree of the region to be analyzed.
In this embodiment, referring to the calculation process of the region matching degree between the region to be analyzed and any contrast region, the region matching degree between the region to be analyzed and each contrast region can be obtained. The larger a region matching degree, the more similar the region to be analyzed is to that contrast region and the greater the possibility that the region to be analyzed is a normal heart tissue region; since the matching degree is subsequently used to correct the initial cluster radius of the region to be analyzed, the maximum value is taken as the target region matching degree of the region to be analyzed.
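A compact sketch of steps S41 to S43 is given below; it follows the reconstruction above in summing the two difference factors inside exp(-·), which is an assumption about how the factors are combined.

```python
import numpy as np

def region_matching_degree(feat_analyzed: float, feat_contrast: float,
                           first_diff_factor: float) -> float:
    """E = exp(-(second region difference factor + first region difference factor)).
    Assumes a non-zero region feature value for the region to be analyzed."""
    second_diff_factor = abs(feat_analyzed - feat_contrast) / feat_analyzed
    return float(np.exp(-(second_diff_factor + first_diff_factor)))

def target_region_matching_degree(feat_analyzed: float,
                                  contrast_feats: list,
                                  first_diff_factors: list) -> float:
    """Maximum matching degree of the analyzed region over all its contrast regions."""
    return max(region_matching_degree(feat_analyzed, f, d)
               for f, d in zip(contrast_feats, first_diff_factors))
```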
Thus far, the present embodiment obtains the target region matching degree for each candidate heart tissue region.
And S5, determining the initial cluster radius of each candidate heart tissue region, and correcting the initial cluster radius of each candidate heart tissue region by utilizing the matching degree of the target region to obtain the corrected cluster radius of each candidate heart tissue region.
Here, the initial cluster radius can be obtained by quantifying the gray-scale distribution of the pixel points, but it does not take into account the many image characteristic differences between a normal heart tissue region and an artifact region. The cluster segmentation result determined by the initial cluster radius therefore has deviations, the heart tissue region affected by the artifact region is difficult to segment accurately, and the difficulty of diagnosing the heart state increases for the doctor. For this reason, the initial cluster radius needs to be corrected by using the target region matching degree, so that the corrected cluster radius is more reliable and more accurate.
The above step S5 may be implemented by steps S51 to S52 (not shown in the drawings):
S51, determining initial cluster radii of each candidate heart tissue region.
The method comprises the steps of calculating a difference value between a maximum gray value and a minimum gray value in a candidate heart tissue region for any candidate heart tissue region to be used as a first clustering influence factor, calculating a distance value between any two pixel points under the same gray level, further calculating an average value of all the distance values under the same gray level to be recorded as a distance average value, taking the average value of the distance average values corresponding to all the gray levels as a second clustering influence factor, and determining an initial clustering radius of the candidate heart tissue region by combining the first clustering influence factor and the second clustering influence factor.
Wherein the first cluster influence factor and the second cluster influence factor are in negative correlation with the initial cluster radius.
As an example, the calculation formula of the initial cluster radius of the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image may be:

$$\sigma_{k,i}=\frac{1}{\left(g_{k,i}^{\max}-g_{k,i}^{\min}\right)\times\dfrac{1}{I}\sum_{j=1}^{I}\left(\dfrac{2}{m_{j}\left(m_{j}-1\right)}\sum_{r=1}^{m_{j}-1}\sum_{f=r+1}^{m_{j}}d_{r,f}^{\,j}\right)}$$

In the formula, $\sigma_{k,i}$ represents the initial cluster radius of the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image, $g_{k,i}^{\max}$ represents the maximum gray value in the i-th candidate cardiac tissue region in the k-th frame of cardiac ultrasound image, $g_{k,i}^{\min}$ represents the minimum gray value in that region, $g_{k,i}^{\max}-g_{k,i}^{\min}$ represents the first cluster influence factor, $I$ represents the number of gray levels in the i-th candidate cardiac tissue region, $m_{j}$ represents the number of pixel points at the j-th gray level, $d_{r,f}^{\,j}$ represents the distance between the r-th pixel point and the f-th pixel point at the j-th gray level in the i-th candidate cardiac tissue region, and the averaged distance term represents the second cluster influence factor.

In the calculation formula of the initial cluster radius, the larger the difference between the maximum gray value and the minimum gray value in the i-th candidate cardiac tissue region, the more likely the region is an artifact region or a region affected by an artifact, and, in order to distinguish the artifact region from a normal heart tissue region, the smaller the value of the initial cluster radius should be. The smaller the average distance between all pixel points with the same gray value in the i-th candidate cardiac tissue region, the more concentrated the distribution of pixels with similar gray values, the more likely the region is a normal heart tissue region, and the larger the initial cluster radius should be.
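For illustration, the initial cluster radius can be sketched as below; the sketch follows the reciprocal form reconstructed above (an assumption), and the small constant added to the denominator is a safeguard of the example rather than part of the formula.

```python
import numpy as np

def initial_cluster_radius(gray: np.ndarray, region_mask: np.ndarray,
                           eps: float = 1e-6) -> float:
    """Initial cluster radius of one candidate region: reciprocal of
    (gray value range) x (mean pairwise distance of same-gray-level pixels)."""
    values = gray[region_mask]
    first_factor = float(values.max() - values.min())           # first cluster influence factor
    ys, xs = np.nonzero(region_mask)
    coords = np.stack([xs, ys], axis=1).astype(np.float64)
    level_means = []
    for level in np.unique(values):
        pts = coords[values == level]
        if len(pts) < 2:
            continue                                             # need at least one pixel pair
        dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
        iu = np.triu_indices(len(pts), k=1)
        level_means.append(dists[iu].mean())                     # distance average at this gray level
    second_factor = float(np.mean(level_means)) if level_means else 0.0  # second cluster influence factor
    return 1.0 / (first_factor * second_factor + eps)
```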
S52, correcting the initial cluster radius of each candidate heart tissue region by utilizing the matching degree of the target region, and obtaining the corrected cluster radius of each candidate heart tissue region.
Specifically, the initial cluster radius of each candidate heart tissue region is multiplied by the matching degree of the target region to be used as the corrected cluster radius of the corresponding candidate heart tissue region, so that the corrected cluster radius of each candidate heart tissue region is obtained.
It should be noted that, based on the correction clustering radius, the segmentation processing of the cardiac ultrasound image is realized, an accurate segmentation result can be obtained, and the artifact region and the region affected by the artifact can be marked based on the accurate segmentation result, so that the degree of the influence of the artifact on the cardiac ultrasound image is reduced to a certain extent, and the accuracy of the diagnosis result of a doctor can be further effectively improved.
Thus far, the present embodiment obtains a modified cluster radius for each candidate cardiac tissue region.
And S6, combining the corrected cluster radius, and carrying out segmentation processing on each frame of heart ultrasonic image by using a clustering algorithm to obtain each frame of heart ultrasonic image of the marked artifact region.
In this embodiment, a DPC (Density Peak Clustering) algorithm is used to cluster the images in combination with the corrected cluster radius. The corrected cluster radius is used as the neighborhood range for computing local density: the local density is calculated with the corrected cluster radius, the relative distance is then obtained, a decision graph is drawn from the local density and the relative distance to obtain the cluster centers, and the remaining pixel points are allocated to the closest cluster center to form the clusters. After the images are clustered, the heart tissue region and the artifact region can be segmented and marked separately, yielding each frame of heart ultrasound image with the artifact region marked, where the marked artifact regions include both isolated artifact regions and regions affected by artifacts. The implementation process of the DPC clustering algorithm is prior art, is not within the protection scope of the present invention, and is not described in detail here.
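A self-contained sketch of the DPC step is given below; the cutoff distance dc plays the role of the corrected cluster radius, the feature points could be, for example, (x, y, gray) vectors of the pixels to be clustered, and the fixed number of cluster centers is an assumption of the example (the decision graph in the embodiment is not restricted to a fixed count).

```python
import numpy as np

def dpc_cluster(points: np.ndarray, dc: float, n_clusters: int = 2) -> np.ndarray:
    """Density Peak Clustering sketch: dc is the corrected cluster radius;
    returns one cluster label per feature point."""
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)

    # Local density: number of neighbours closer than the corrected cluster radius.
    rho = (dist < dc).sum(axis=1) - 1

    # Relative distance: distance to the nearest point of higher density.
    delta = np.full(n, dist.max())
    nearest_higher = np.full(n, -1)
    order = np.argsort(-rho)                        # indices by decreasing density
    for rank, i in enumerate(order):
        if rank == 0:
            continue                                # the densest point keeps the maximum distance
        higher = order[:rank]
        j = higher[np.argmin(dist[i, higher])]
        delta[i] = dist[i, j]
        nearest_higher[i] = j

    # "Decision graph": take the points with the largest rho * delta as cluster centres.
    centers = np.argsort(-(rho * delta))[:n_clusters]
    labels = np.full(n, -1)
    labels[centers] = np.arange(n_clusters)
    if labels[order[0]] == -1:
        labels[order[0]] = 0                        # the densest point must seed a cluster

    # Assign remaining points, in decreasing density order, to the label of
    # their nearest higher-density neighbour.
    for i in order:
        if labels[i] == -1:
            labels[i] = labels[nearest_higher[i]]
    return labels
```

In practice, the pixels associated with each candidate region would be clustered with that region's corrected cluster radius as dc, after which the resulting clusters are labeled as heart tissue or artifact to obtain the marked heart ultrasound image.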
Specifically, an embodiment of the present invention further provides an artificial intelligence-based cardiac ultrasound image segmentation system, which includes a processor and a memory, where the processor is configured to process the instructions stored in the memory to implement the artificial intelligence-based cardiac ultrasound image segmentation method as described above.
Although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those skilled in the art that the foregoing embodiments may be modified or equivalents may be substituted for some of the features thereof, and that the modification or substitution does not depart from the scope of the embodiments of the present invention.