CN114648527A - Urothelial cell slide image classification method, device, equipment and medium - Google Patents
- Publication number
- CN114648527A (application CN202210543459.0A)
- Authority
- CN
- China
- Prior art keywords: cell, classification, depth, image, classification result
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/0012: Biomedical image inspection (G06T7/00 Image analysis; G06T7/0002 Inspection of images, e.g. flaw detection)
- G06F18/24: Classification techniques (G06F18/00 Pattern recognition; G06F18/20 Analysing)
- G06F18/253: Fusion techniques of extracted features (G06F18/25 Fusion techniques)
- G06T7/10: Segmentation; Edge detection
- G06T2207/10056: Microscopic image (G06T2207/10 Image acquisition modality)
- G06T2207/30024: Cell structures in vitro; Tissue sections in vitro (G06T2207/30004 Biomedical image processing)
Abstract
The application relates to the technical field of computer vision and provides a urothelial cell slide image classification method, device, equipment and medium. The method improves the applicability and broadens the application range of the data generated during classification, and improves the accuracy of the classification result. The method comprises the following steps: inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; filtering the depth classification result to obtain a suspicious target sub-image set; inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and a cell feature classification result output by the cell feature classification model; performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features; and determining the classification result of the urothelial cell slide image according to the overall classification features.
Description
Technical Field
The application relates to the technical field of computer vision, in particular to a method, a device, equipment and a medium for classifying urothelial cell slide images.
Background
With the development of computer vision technology, more and more practitioners apply it to the classification of cell slide images, which can effectively improve the efficiency of classifying urothelial cell slide images.
At present, deep neural network models are commonly applied to the classification of urothelial cell slide images. However, the data generated during classification by such models suffers from poor applicability and a limited application range, and the accuracy of the classification results still needs to be improved.
Disclosure of Invention
In view of the above, it is necessary to provide a urothelial cell slide image classification method, device, equipment and medium that solve the above technical problems.
In a first aspect, the present application provides a urothelial cell slide image classification method. The method comprises the following steps:
inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein the plurality of urothelial cell slide sub-images are obtained by segmenting a urothelial cell slide image;
filtering the depth classification result to obtain a suspicious target sub-image set;
inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and cell feature classification results output by the cell feature classification model;
performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features;
and determining the classification result of the urothelial cell slide image according to the overall classification features.
In a second aspect, the application also provides a urothelial cell slide image classification device. The device comprises:
the depth classification unit is used for inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein the plurality of urothelial cell slide sub-images are obtained by segmenting a urothelial cell slide image;
the depth classification result filtering unit is used for filtering the depth classification result to obtain a suspicious target sub-image set;
the cell feature classification unit is used for inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and a cell feature classification result output by the cell feature classification model;
the overall classification feature acquisition unit is used for performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features;
and the result output unit is used for determining the classification result of the urothelial cell slide image according to the overall classification features.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the following steps when executing the computer program:
inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein the plurality of urothelial cell slide sub-images are obtained by segmenting a urothelial cell slide image; filtering the depth classification result to obtain a suspicious target sub-image set; inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and a cell feature classification result output by the cell feature classification model; performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features; and determining the classification result of the urothelial cell slide image according to the overall classification features.
In a fourth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein the plurality of urothelial cell slide sub-images are obtained by segmenting a urothelial cell slide image; filtering the depth classification result to obtain a suspicious target sub-image set; inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and a cell feature classification result output by the cell feature classification model; performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features; and determining the classification result of the urothelial cell slide image according to the overall classification features.
In a fifth aspect, the present application further provides a computer program product. The computer program product comprising a computer program which when executed by a processor performs the steps of:
inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein the plurality of urothelial cell slide sub-images are obtained by segmenting a urothelial cell slide image; filtering the depth classification result to obtain a suspicious target sub-image set; inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and a cell feature classification result output by the cell feature classification model; performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features; and determining the classification result of the urothelial cell slide image according to the overall classification features.
According to the above urothelial cell slide image classification method, device, equipment, medium and program product, a plurality of urothelial cell slide sub-images are first input into a depth classification model to obtain a depth classification result output by the depth classification model, the plurality of urothelial cell slide sub-images being obtained by segmenting the urothelial cell slide image. The depth classification result is then filtered to obtain a suspicious target sub-image set. The suspicious target sub-image set is then input into a cell feature classification model to obtain cell features and a cell feature classification result output by the cell feature classification model. Feature fusion is then performed on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features. Finally, the classification result of the urothelial cell slide image is determined according to the overall classification features. Because this scheme classifies urothelial cell slide images by combining hand-crafted learning with deep learning, it improves the applicability of the data generated during classification, broadens the application range of that data, and effectively improves the accuracy of the classification result.
Drawings
FIG. 1 is a schematic flow chart diagram of a method for classifying urothelial cell slide images in one embodiment;
FIG. 2 is a schematic flow chart illustrating how a cell feature classification model yields cell feature classification results according to one embodiment;
FIG. 3 is a flow diagram that illustrates the manner in which the depth segmentation module generates image data in one embodiment;
FIG. 4 is a schematic flow chart illustrating the manner in which the cell feature extraction module extracts cell features according to one embodiment;
FIG. 5 is a flowchart illustrating how a depth classification model yields depth classification results in one embodiment;
FIG. 6 is a block diagram showing the configuration of an image classification device for a urothelial cell slide in one embodiment;
FIG. 7 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The urothelial cell slide image classification method provided by the embodiments of the application can be executed by a server. The server may be implemented as an independent server or as a server cluster composed of a plurality of servers.
In one embodiment, as shown in fig. 1, a urothelial cell slide image classification method is provided. The method is described here as applied to a server and comprises the following steps:
step S101, inputting a plurality of urothelium cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein, the plurality of sub images of the urothelium cell slide are obtained by segmenting the urothelium cell slide image.
In this step, the depth classification model may be a pre-trained deep-learning classification model capable of outputting a depth classification result based on depth classification features. The depth classification result corresponds to the plurality of urothelial cell slide sub-images, which may all have the same size. The urothelial cell slide image may be segmented using a sliding window, and the resulting sub-images may conform to a preset sub-image size. The urothelial cell slide image may be a digital pathology slide image of urothelial cells.
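As an illustration of the sliding-window segmentation just described, the following sketch tiles a slide image into fixed-size sub-images; the tile size and stride are assumptions, since the patent only specifies that the sub-images conform to a preset size:

```python
import numpy as np

def tile_slide(slide_image: np.ndarray, tile_size: int = 512, stride: int = 512):
    """Split a whole-slide RGB image of shape (H, W, 3) into fixed-size tiles."""
    tiles, coords = [], []
    h, w = slide_image.shape[:2]
    for y in range(0, h - tile_size + 1, stride):
        for x in range(0, w - tile_size + 1, stride):
            tiles.append(slide_image[y:y + tile_size, x:x + tile_size])
            coords.append((x, y))  # keep positions so results map back to the slide
    return tiles, coords
```

Keeping the tile coordinates allows the later per-sub-image results to be mapped back onto the whole slide.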
Step S102, filtering the depth classification result to obtain a suspicious target sub-image set.
In this step, the depth classification result may be filtered by removing, from the depth classification result, sub-images that do not belong to the suspicious target sub-images. Filtering reduces the number of sub-images that the cell feature classification model must process, which improves the overall efficiency of classifying the urothelial cell slide image.
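A minimal sketch of this filtering step is shown below; the label names and the probability threshold are illustrative assumptions, not values taken from the patent:

```python
def filter_suspicious(depth_results, prob_threshold=0.5):
    """depth_results: list of dicts such as
    {"tile": ..., "label": "suspicious" or "background", "prob": float}."""
    return [r for r in depth_results
            if r["label"] == "suspicious" and r["prob"] >= prob_threshold]
```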
Step S103, inputting the suspicious target sub-image set into the cell feature classification model to obtain the cell features and the cell feature classification result output by the cell feature classification model.
In this step, the cell feature classification result may be obtained by the cell feature classification model based on the cell features.
Step S104, performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features.
In this step, feature fusion of the depth classification result, the cell features and the cell feature classification result may be performed, on the basis of an algorithm optimization strategy, using feature engineering techniques such as dimensionality reduction and matching; for example, the data may be directly concatenated, or a principal component analysis method from machine learning may be applied. The overall classification features combine the characteristics of the depth classification result, the cell features and the cell feature classification result.
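A minimal sketch of the two fusion options mentioned here, direct concatenation and principal component analysis, might look as follows; the array shapes and the component count are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA

def fuse_features(depth_result, cell_features, cell_result, n_components=None):
    """Each input is an array of shape (n_subimages, d_i); returns fused features."""
    fused = np.concatenate([depth_result, cell_features, cell_result], axis=1)
    if n_components is not None:  # optional dimensionality-reduction branch
        fused = PCA(n_components=n_components).fit_transform(fused)
    return fused
```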
Step S105, determining the classification result of the urothelial cell slide image according to the overall classification features.
In this step, the classification result of the urothelial cell slide image may be determined by performing whole-slide classification over the plurality of urothelial cell slide sub-images on the basis of the overall classification features. After the classification result of the urothelial cell slide image is obtained, machine learning techniques such as decision trees may further be applied, using a strategy that maximizes the classification probability of the current category, so that the classification result of the urothelial cell slide image can be further optimized.
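An illustrative sketch of such a whole-slide decision, aggregating the per-sub-image features and selecting the maximum-probability category, is given below; the use of a fitted scikit-learn classifier and mean aggregation are assumptions:

```python
import numpy as np

def classify_slide(overall_features: np.ndarray, classifier) -> int:
    """overall_features: (n_subimages, n_features); classifier: any fitted
    estimator with predict_proba, e.g. a decision tree."""
    probs = classifier.predict_proba(overall_features)  # per-sub-image probabilities
    slide_probs = probs.mean(axis=0)                    # aggregate over the slide
    return int(np.argmax(slide_probs))                  # maximize class probability
```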
In summary, a plurality of urothelial cell slide sub-images are first input into the depth classification model to obtain the depth classification result output by the depth classification model, the sub-images being obtained by segmenting the urothelial cell slide image. The depth classification result is then filtered to obtain the suspicious target sub-image set. The suspicious target sub-image set is then input into the cell feature classification model to obtain the cell features and the cell feature classification result output by the cell feature classification model. Feature fusion is then performed on the depth classification result, the cell features and the cell feature classification result to obtain the overall classification features. Finally, the classification result of the urothelial cell slide image is determined according to the overall classification features. Because this scheme combines hand-crafted learning with deep learning, it improves the applicability of the data generated during classification, broadens the application range of that data, and effectively improves the accuracy of the classification result.
As to the manner in which the cell feature classification model generates the cell feature classification result, in an embodiment, as shown in fig. 2, the cell feature classification model is configured with a depth segmentation module, a cell feature extraction module, and a cell feature classification module, where the step S103 specifically includes:
step S201, inputting the suspicious target sub-image set to a depth segmentation module to obtain image data output by the depth segmentation module.
In this step, the depth segmentation module may be formed by a pre-trained deep-learning segmentation model, and the image data may be the image data derived from the suspicious target sub-image set.
Step S202, inputting the image data to a cell feature extraction module to obtain the cell features output by the cell feature extraction module.
In this step, the cell feature extraction module may be composed of a cell feature extractor based on the Paris System for reporting urinary cytology, and the cell features may be obtained statistically from the image data.
Step S203, inputting the cell features to the cell feature classification module to obtain the cell feature classification result output by the cell feature classification module.
In this step, the cell feature classification result may be obtained based on the cell features.
In this embodiment, the cell feature classification model performs a secondary, cell-level classification on the suspicious target sub-image set output by the depth classification model, and the classification result obtained by the depth classification model is confirmed and corrected a second time on the basis of the Paris System. This addresses the limited accuracy of classification results produced by traditional hand-crafted learning methods alone, ensures the rationality and accuracy of the classification result of the urothelial cell slide image, improves the applicability of the data generated during classification, and broadens the application range of that data.
As to the manner of generating the image data by the depth segmentation module, in an embodiment, as shown in fig. 3, the step S201 specifically includes:
step S301, inputting the suspicious target sub-atlas into a depth segmentation module, and identifying and calculating each pixel point data of the suspicious target sub-atlas one by the depth segmentation module to determine an image area to which each pixel point data belongs; the image area includes a target area and a background area.
In this step, the target region may be a cell nucleus region or a cytoplasm region. Identifying and computing each pixel one by one means recognizing each pixel of the suspicious target sub-image set in turn and determining, by calculation, the image region into which it falls.
Step S302, removing, from the pixels, those whose image region is the background region, to obtain the target pixels whose image region is the target region.
In this step, removing the background pixels may mean filtering out, from the pixels of the suspicious target sub-image set, those belonging to the background region; the target pixels are those belonging to a cell nucleus region or a cytoplasm region.
Step S303, obtaining the image data according to the target pixels.
In this step, the image data may be generated from the target pixels and kept consistent, in size and pixel positions, with each suspicious sub-image in the suspicious target sub-image set.
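The following sketch illustrates steps S301 to S303: applying per-pixel segmentation labels, discarding background pixels, and keeping an image of identical size in which only nucleus and cytoplasm pixels survive; the class-index convention is an assumption:

```python
import numpy as np

BACKGROUND, NUCLEUS, CYTOPLASM = 0, 1, 2  # assumed label convention

def mask_background(sub_image: np.ndarray, pixel_labels: np.ndarray):
    """sub_image: (H, W, 3); pixel_labels: (H, W) output of the depth
    segmentation module. Returns the same-size image with background zeroed."""
    target = np.isin(pixel_labels, (NUCLEUS, CYTOPLASM))
    masked = np.where(target[..., None], sub_image, 0)  # keep only target pixels
    return masked, target
```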
In this embodiment, filtering out the pixels whose image region is the background region narrows the range of data that the cell feature classification model must process, which improves the operating efficiency of the cell feature classification model and effectively improves the accuracy of the classification result of the urothelial cell slide image.
As to the manner of extracting the cell features by the cell feature extraction module, in an embodiment, as shown in fig. 4, the step S202 specifically includes:
step S401, inputting the image data to a cell feature extraction module, and converting each pixel point data in the image data into statistical index data; the statistical index data includes the kernel area, the pulp area, the kernel roundness and the kernel color intensity.
In this step, converting each pixel of the image data into statistical index data may be done by identifying the pixels of the image data in turn and accumulating them into statistical indices such as the nuclear area, the cytoplasm area, the nuclear roundness and the nuclear color intensity. For example, the nuclear color intensity may be calculated as

I_N = (R̄ + Ḡ + B̄) / 3

where I_N is the nuclear color intensity, R̄ is the average of the R values of all points in the nucleus, Ḡ is the average of the G values of all points in the nucleus, and B̄ is the average of the B values of all points in the nucleus.
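A hedged sketch of computing the four statistical indices from the segmentation masks follows. The pixel-count areas and the mean-RGB color intensity follow from the definitions above; the 4πA/P^2 roundness formula is a standard shape measure used here as an assumption, since the patent's own formula is not reproduced:

```python
import numpy as np
import cv2  # OpenCV, used for the contour-based perimeter

def nucleus_indices(image: np.ndarray, nucleus_mask: np.ndarray,
                    cytoplasm_mask: np.ndarray) -> dict:
    """image: (H, W, 3) RGB; masks: boolean (H, W) from the segmentation step."""
    nuclear_area = int(nucleus_mask.sum())        # pixel count of the nucleus
    cytoplasm_area = int(cytoplasm_mask.sum())    # pixel count of the cytoplasm
    contours, _ = cv2.findContours(nucleus_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    perimeter = max((cv2.arcLength(c, True) for c in contours), default=1.0)
    roundness = 4.0 * np.pi * nuclear_area / perimeter ** 2
    color_intensity = float(image[nucleus_mask].mean())  # mean of all R, G, B values
    return {"nuclear_area": nuclear_area, "cytoplasm_area": cytoplasm_area,
            "nuclear_roundness": roundness, "nuclear_color_intensity": color_intensity}
```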
Step S402, determining the cell features corresponding to the image data according to the statistical index data.
In this step, the cell features corresponding to the image data may be determined by a cell feature extractor based on the Paris System, applying the statistical index data to mathematical formulas consistent with cell morphology definitions.
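As a purely illustrative sketch of such a rule-based extractor, the following applies Paris System-style criteria to the indices; the thresholds (for example, a nuclear-to-cytoplasmic ratio of at least 0.7 as a high-grade criterion) echo published Paris System guidance, but their use and exact values here are assumptions, not the patent's rules:

```python
def classify_cell(indices: dict) -> str:
    """indices: output of nucleus_indices() above; returns a coarse category."""
    cell_area = indices["nuclear_area"] + indices["cytoplasm_area"]
    nc_ratio = indices["nuclear_area"] / max(cell_area, 1)  # nucleus vs. whole cell
    if nc_ratio >= 0.7 and indices["nuclear_color_intensity"] < 100:  # dark, large nucleus
        return "suspicious_high_grade"
    if nc_ratio >= 0.5:
        return "atypical"
    return "benign"
```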
In this embodiment, the cell features corresponding to the image data are obtained by a cell feature extractor based on the Paris System, which provides a reasonable basis for determining the cell feature classification result from the cell features. This fully exploits the advantage of classifying urothelial cell slide images on the basis of pathology theory, improves the applicability of the data generated during classification, and broadens the application range of that data.
As for the manner in which the depth classification model produces the depth classification result, in an embodiment, as shown in fig. 5, the depth classification model is configured with a depth classification feature extraction module and a depth classification module, and the step S101 specifically includes:
step S501, a plurality of urothelium cell slide sub-images are input into a depth classification feature extraction module, and depth classification features output by the depth classification feature extraction module are obtained.
In this step, the depth classification feature extraction module may be formed by a pre-trained depth classification feature extraction model. Extraction may proceed as follows: the pre-trained model recognizes and computes over the data of the plurality of urothelial cell slide sub-images as a whole; data representing background colors and background image textures is removed; data representing cell edges, cell colors and cell image textures is extracted as depth classification feature information; and a high-dimensional depth feature vector is finally generated from this information and serves as the depth classification features output by the depth classification feature extraction module.
Step S502, inputting the depth classification features into a depth classification module to obtain a depth classification result output by the depth classification module.
In this step, the depth classification result may be determined based on the depth classification features.
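A minimal sketch of such a depth classification model, a pretrained CNN backbone producing a high-dimensional feature vector followed by a classifier head, is shown below; the choice of ResNet-18 and a two-class head is an assumption, since the patent does not name a backbone:

```python
import torch
import torchvision.models as models

backbone = models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()  # expose the 512-d feature vector
backbone.eval()
head = torch.nn.Linear(512, 2)     # e.g. suspicious vs. background

def depth_classify(batch: torch.Tensor):
    """batch: (N, 3, H, W) normalized sub-images; returns features and probabilities."""
    with torch.no_grad():
        features = backbone(batch)   # the depth classification features
        logits = head(features)      # the depth classification result
    return features, logits.softmax(dim=1)
```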
In this embodiment, using the depth classification model to output the depth classification result improves the degree of automation of the urothelial cell slide image classification process and the efficiency of classification, and effectively alleviates the insufficient accuracy that can result from classifying urothelial cell slide images with hand-crafted learning methods alone.
As to the manner of filtering the depth classification result, in an embodiment, the step S102 specifically includes:
and filtering the depth classification result, and removing each background sub-graph in the depth classification result to obtain a suspicious target sub-graph set.
Each sub-image in the depth classification result can be divided into either a background sub-image or a suspicious target sub-image.
In this embodiment, filtering the background sub-images out of the depth classification result improves the efficiency with which the cell feature classification model produces the cell feature classification result, and effectively improves the accuracy of that result.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown in the order indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages, which need not be completed at the same time but may be executed at different times, and whose execution order need not be sequential; they may be executed in turn or alternately with other steps or with sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides a urothelial cell slide image classification device for implementing the above urothelial cell slide image classification method. The implementation scheme of this device is similar to that described in the method above, so for specific limitations in one or more embodiments of the urothelial cell slide image classification device provided below, reference can be made to the limitations of the urothelial cell slide image classification method above, which are not repeated here.
In one embodiment, as shown in fig. 6, there is provided a urothelial cell slide image classification apparatus 600, comprising:
the depth classification unit 601, which is used for inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein the plurality of urothelial cell slide sub-images are obtained by segmenting a urothelial cell slide image;
the depth classification result filtering unit 602, which is used for filtering the depth classification result to obtain a suspicious target sub-image set;
the cell feature classification unit 603, which is used for inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and a cell feature classification result output by the cell feature classification model;
the overall classification feature acquisition unit 604, which is used for performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features;
and the result output unit 605, which is used for determining the classification result of the urothelial cell slide image according to the overall classification features.
In one embodiment, the cell feature classification unit 603 is configured with a depth segmentation module, a cell feature extraction module and a cell feature classification module, and is used for inputting the suspicious target sub-image set to the depth segmentation module to obtain the image data output by the depth segmentation module; inputting the image data into the cell feature extraction module to obtain the cell features output by the cell feature extraction module; and inputting the cell features into the cell feature classification module to obtain the cell feature classification result output by the cell feature classification module.
In one embodiment, the cell feature classification unit 603 is used for inputting the suspicious target sub-image set into the depth segmentation module, the depth segmentation module identifying and computing the pixels of the suspicious target sub-image set one by one to determine the image region to which each pixel belongs, the image regions comprising a target region and a background region; removing the pixels whose image region is the background region to obtain the target pixels; and obtaining the image data based on the target pixels.
In one embodiment, the cell feature classification unit 603 is used for inputting the image data to the cell feature extraction module and converting each pixel of the image data into statistical index data, the statistical index data comprising a nuclear area, a cytoplasm area, a nuclear roundness and a nuclear color intensity; and determining the cell features corresponding to the image data according to the statistical index data.
In one embodiment, in the depth classification unit 601 the depth classification model is configured with a depth classification feature extraction module and a depth classification module, and the unit is used for inputting the plurality of urothelial cell slide sub-images into the depth classification feature extraction module to obtain the depth classification features output by the depth classification feature extraction module; and inputting the depth classification features into the depth classification module to obtain the depth classification result output by the depth classification module.
In one embodiment, the depth classification result filtering unit 602 is used for filtering the depth classification result and removing each background sub-image from it to obtain the suspicious target sub-image set.
All or part of the units in the urothelial cell slide image classification apparatus may be implemented by software, by hardware, or by a combination thereof. Each unit may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to each unit.
In one embodiment, a computer device is provided, which may be a server whose internal structure may be as shown in fig. 7. The computer device includes a processor, a memory, and a network interface connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing data such as urothelial cell slide images. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements a urothelial cell slide image classification method.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of part of the structure associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or arrange the components differently.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In an embodiment, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetic Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases; non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like, without limitation.
It should be noted that, the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.
Claims (9)
1. A urothelial cell slide image classification method, the method comprising:
inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein the plurality of urothelial cell slide sub-images are obtained by segmenting a urothelial cell slide image;
filtering the depth classification result to obtain a suspicious target sub-image set;
inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and cell feature classification results output by the cell feature classification model;
performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features;
and determining the classification result of the urothelial cell slide image according to the overall classification features.
2. The method of claim 1, wherein the cell feature classification model is configured with a depth segmentation module, a cell feature extraction module, and a cell feature classification module; and the step of inputting the suspicious target sub-image set into the cell feature classification model to obtain the cell features and the cell feature classification result output by the cell feature classification model comprises the following steps:
inputting the suspicious target sub-image set into the depth segmentation module to obtain image data output by the depth segmentation module;
inputting the image data into the cell feature extraction module to obtain the cell features output by the cell feature extraction module;
inputting the cell features into the cell feature classification module to obtain the cell feature classification result output by the cell feature classification module.
3. The method of claim 2, wherein inputting the suspicious target sub-image set to the depth segmentation module to obtain the image data output by the depth segmentation module comprises:
inputting the suspicious target sub-image set to the depth segmentation module, the depth segmentation module identifying and computing the pixels of the suspicious target sub-image set one by one to determine the image region to which each pixel belongs; the image regions comprising a target region and a background region;
removing, from the pixels, those whose image region is the background region, to obtain target pixels whose image region is the target region;
and obtaining the image data according to the target pixels.
4. The method of claim 2, wherein inputting the image data to the cell feature extraction module to obtain the cell features output by the cell feature extraction module comprises:
inputting the image data into the cell feature extraction module and converting each pixel of the image data into statistical index data; the statistical index data comprising a nuclear area, a cytoplasm area, a nuclear roundness and a nuclear color intensity;
and determining the cell features corresponding to the image data according to the statistical index data.
5. The method of claim 1, wherein the depth classification model is configured with a depth classification feature extraction module and a depth classification module; and the step of inputting the plurality of urothelial cell slide sub-images into the depth classification model to obtain the depth classification result output by the depth classification model comprises the following steps:
inputting the plurality of urothelial cell slide sub-images into the depth classification feature extraction module to obtain the depth classification features output by the depth classification feature extraction module;
and inputting the depth classification features into the depth classification module to obtain the depth classification result output by the depth classification module.
6. The method according to any one of claims 1 to 5, wherein the filtering the depth classification result to obtain the suspicious target sub-image set includes:
filtering the depth classification result and removing each background sub-image from the depth classification result to obtain the suspicious target sub-image set.
7. A urothelial cell slide image classification device, comprising:
the depth classification unit, which is used for inputting a plurality of urothelial cell slide sub-images into a depth classification model to obtain a depth classification result output by the depth classification model; wherein the plurality of urothelial cell slide sub-images are obtained by segmenting the urothelial cell slide image;
the depth classification result filtering unit, which is used for filtering the depth classification result to obtain a suspicious target sub-image set;
the cell feature classification unit, which is used for inputting the suspicious target sub-image set into a cell feature classification model to obtain cell features and a cell feature classification result output by the cell feature classification model;
the overall classification feature acquisition unit, which is used for performing feature fusion on the depth classification result, the cell features and the cell feature classification result to obtain overall classification features;
and the result output unit, which is used for determining the classification result of the urothelial cell slide image according to the overall classification features.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 6.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210543459.0A CN114648527B (en) | 2022-05-19 | 2022-05-19 | Urothelial cell slide image classification method, device, equipment and medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114648527A true CN114648527A (en) | 2022-06-21 |
CN114648527B CN114648527B (en) | 2022-08-16 |
Family
ID=81997454
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210543459.0A Active CN114648527B (en) | 2022-05-19 | 2022-05-19 | Urothelial cell slide image classification method, device, equipment and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114648527B (en) |
Filing events: 2022-05-19, CN application CN202210543459.0A, granted as patent CN114648527B (Active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180211380A1 (en) * | 2017-01-25 | 2018-07-26 | Athelas Inc. | Classifying biological samples using automated image analysis |
CN110135271A (en) * | 2019-04-19 | 2019-08-16 | 上海依智医疗技术有限公司 | A kind of cell sorting method and device |
CN110119710A (en) * | 2019-05-13 | 2019-08-13 | 广州锟元方青医疗科技有限公司 | Cell sorting method, device, computer equipment and storage medium |
CN110852288A (en) * | 2019-11-15 | 2020-02-28 | 苏州大学 | Cell image classification method based on two-stage convolutional neural network |
CN114202494A (en) * | 2020-08-31 | 2022-03-18 | Method, device and apparatus for classifying cells based on a cell classification model |
CN113902669A (en) * | 2021-08-24 | 2022-01-07 | Method and system for reading urine exfoliated-cell cytology smears |
Non-Patent Citations (1)
Title |
---|
ZHANG Mengqian et al.: "Coarse-to-Fine Two-Stage Convolutional Neural Network Algorithm", Journal of Frontiers of Computer Science and Technology (《计算机科学与探索》) * |
Also Published As
Publication number | Publication date |
---|---|
CN114648527B (en) | 2022-08-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |