CN116978005B - Microscope image processing system based on attitude transformation
- Publication number
- CN116978005B CN116978005B CN202311226987.4A CN202311226987A CN116978005B CN 116978005 B CN116978005 B CN 116978005B CN 202311226987 A CN202311226987 A CN 202311226987A CN 116978005 B CN116978005 B CN 116978005B
- Authority
- CN
- China
- Prior art keywords
- sample
- measuring platform
- pixel point
- pixel points
- moves relative
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/806—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of extracted features
Abstract
The microscope image processing system based on attitude transformation comprises an image preprocessing module, a feature screening module, an attitude feature matching module, an attitude transformation analysis module and a splicing position tracking module. The rotation angle of the sample to be measured before and after it moves relative to the measuring platform is analyzed, the placing angle of the measuring platform is rotated according to the rotation angle, and the relative position of the measuring platform and the camera is tracked and adjusted. According to the invention, by establishing the minimum circumscribed polygon of the feature pixel points, the positions of the feature pixel points in the microscopic image acquired after the sample to be measured moves relative to the measuring platform can be accurately located; the measuring platform is adjusted according to the rotation angle and its position relative to the camera is tracked and adjusted, so that under different attitude changes the sample to be measured can be accurately positioned and returned to the splicing position it occupied before moving relative to the measuring platform, improving the continuous splicing precision of a large-size sample before and after the movement.
Description
Technical Field
The invention belongs to the technical field of microscope image processing, and relates to a microscope image processing system based on attitude transformation.
Background
The splicing range of microscope images is limited; it is determined by the lead screws that move the measuring platform along the X axis and the Y axis. When the length or width of the sample to be measured exceeds the range of the measuring platform, image acquisition and processing of the sample are restricted by the platform size, and a complete large-size morphology image of the sample cannot be spliced.
Because the position of the sample to be measured changes, its placing angle and position before and after the movement also change, which increases the difficulty of matching pixels before and after the movement relative to the measuring platform and the difficulty of splicing. The position, within the image spliced before the movement, of the microscopic image acquired after the movement cannot be obtained quickly and accurately, so the images cannot be spliced precisely, and splicing efficiency and precision are poor.
Disclosure of Invention
The invention discloses a microscope image processing system based on attitude transformation, which solves the problems existing in the prior art.
In one aspect, the invention provides a microscope image processing system based on attitude transformation, comprising an image preprocessing module, a feature screening module, an attitude feature matching module, an attitude transformation analysis module and a splicing position tracking module. The image preprocessing module acquires the gray value of each pixel point in the microscopic image and performs fusion preprocessing on each central pixel point using the gray values of its surrounding pixel points, obtaining the gray value of each pixel point after fusion processing;
the feature screening module compares each fused pixel point, in the microscopic images before and after the sample to be measured moves relative to the measuring platform, with the gray values of its surrounding pixel points to obtain the gray value difference between the pixel point and its surrounding pixel points, and screens out the feature pixel points;
the attitude feature matching module screens out from the spliced image, by gray value matching, the sets of similar feature pixel points whose gray values differ within the allowable error from those of the feature pixel points in the microscopic image acquired after the sample to be measured moves relative to the measuring platform, and, based on the distances between feature pixel points, analyzes the position coordinates of the similar feature pixel points matching the distance between any two feature pixel points in that microscopic image;
the attitude transformation analysis module analyzes the coordinates of the feature pixel points in the image spliced before the movement that match the feature pixel points in the microscopic image acquired after the sample to be measured moves relative to the measuring platform, and analyzes the rotation amount and translation amount between the two plane coordinate systems based on the attitude transformation analysis model;
and the splicing position tracking module rotates the placing angle of the measuring platform according to the rotation angle and tracks and adjusts the relative position of the measuring platform and the camera.
Further, the gray values of the pixels surrounding each pixel point are used to fuse the central pixel point, with the fusion formula

f'(x, y) = \frac{1}{k^2 - 1}\left[\sum_{i=1}^{k}\sum_{j=1}^{k} f\left(x - \frac{k+1}{2} + i,\ y - \frac{k+1}{2} + j\right) - f(x, y)\right]

where k is an odd number greater than 1, f'(x, y) is the gray value of the pixel point with position coordinates (x, y) after image fusion processing, and f(x, y) is the gray value of the pixel point with position coordinates (x, y) not subjected to image fusion processing.
Further, the calculation formula of the gray value difference between each pixel point and its surrounding pixel points is

\Delta f'(x, y) = \sum_{i=1}^{3}\sum_{j=1}^{3}\left|f'(x, y) - f'(x - 2 + i,\ y - 2 + j)\right|

where f'(x−2+i, y−2+j) is the gray value of the (x−2+i, y−2+j) pixel point in the microscopic image after image fusion processing, and Δf'(x, y) is the difference between the gray values of the (x, y) pixel point and its surrounding pixel points in the microscopic image.
Further, a gray value matching mode is adopted for matching analysis between a microscopic image acquired after a sample to be measured moves relative to a measuring platform and an image spliced before the sample to be measured moves relative to the measuring platform, and the method comprises the following steps:
Step A1, extracting the gray values of all feature pixel points in the image spliced from the microscopic images acquired before the sample to be measured moves relative to the measuring platform, and the gray values of all feature pixel points after the sample to be measured moves relative to the measuring platform;
Step A2, for the v-th feature pixel point after the sample to be measured moves relative to the measuring platform, extracting from the image spliced before the movement all feature pixel points whose gray values differ from its gray value within the allowable error, proceeding in sequence until all D feature pixel points after the movement have been processed;
Step A3, constructing the minimum circumscribed polygon of the feature pixel points after the sample to be measured moves relative to the measuring platform;
Step A4, screening out the number W of similar feature pixel point groups, each consisting of one similar feature pixel point from the position set of the 1st feature pixel point and one from the position set of the d-th feature pixel point forming the minimum circumscribed polygon, for which the distance between the coordinates of the two similar feature pixel points equals, within the allowable error, the distance between the 1st and d-th feature pixel points;
Step A5, judging whether the number W of similar feature pixel point groups satisfies W > 1; if W = 1, the feature pixel points corresponding, on the image spliced before the sample to be measured moves relative to the measuring platform, to the feature pixel points forming the minimum circumscribed polygon are obtained; if W > 1, executing step A6;
Step A6, counting the number W of similar feature pixel point groups for which the distance between any similar feature pixel point in the group and a similar feature pixel point in the position set of the (d+1)-th feature pixel point of the minimum circumscribed polygon equals, within the allowable error, the distance between the corresponding two feature pixel points forming the minimum circumscribed polygon, and executing step A5 until W = 1, thereby obtaining the feature pixel points corresponding, on the image spliced before the sample to be measured moves relative to the measuring platform, to all feature pixel points forming the minimum circumscribed polygon.
Further, the method for constructing the minimum circumscribed polygon in the step A3 includes the following steps:
Step Q1, taking one of the feature pixel points after the sample to be measured moves relative to the measuring platform as the starting point and as the 1st boundary pixel point, and calculating the included angle between each line segment connecting the starting point to one of the remaining feature pixel points and the positive X-axis direction of the microscopic image;
q2, screening out a characteristic pixel point with the smallest included angle as a next boundary pixel point;
Step Q3, taking the next boundary pixel point as the new starting point, judging whether the included angle, along the positive X-axis direction of the microscopic image, of the line segment connecting it back to the 1st boundary pixel point is smaller than or equal to the minimum included angle of the line segments connecting it to the other feature pixel points; if it is larger than that minimum, repeatedly executing step Q2; if it is smaller than or equal to that minimum, executing step Q4;
and Q4, judging whether the next boundary pixel point is overlapped with the boundary pixel points which are sequentially used as starting points.
Further, it is calculated whether the sum of the distances between the feature pixel points within the minimum circumscribed polygon in the spliced image lies within the range LD × (1 ± 0.1%), where

LD = \sum_{v=1}^{D-1}\sum_{u=v+1}^{D}\sqrt{(x_v - x_u)^2 + (y_v - y_u)^2}

LD is the sum of the distances between the feature pixel points in the microscopic image acquired by the microscope lens after the sample to be measured moves relative to the measuring platform, (x_v, y_v) and (x_u, y_u) are the position coordinates of the v-th and u-th feature pixel points in the microscopic image, and D is the number of feature pixel points in the microscopic image after the movement.
Further, a second plane coordinate system is established by using the spliced image before the sample to be measured moves relative to the measuring platform, and the position coordinates of each characteristic pixel point on the spliced image in the second plane coordinate system are analyzed, and the analysis method comprises the following steps:
Step B1, acquiring the upper-left corner position coordinates (Sx, Sy) of each single-frame microscopic image forming the image spliced before the sample to be measured moves relative to the measuring platform;
Step B2, analyzing the relative position coordinates (Δxv, Δyv), within the microscopic image whose upper-left corner is located at (Sx, Sy), of each feature pixel point on that microscopic image;
Step B3, obtaining the absolute position coordinates of each feature pixel point of step B2 in the second plane coordinate system: x'v = Sx + Δxv, y'v = Sy + Δyv, v = 1, 2, …, D.
Further, a transformation association model is established between the position coordinates of the same key pixel point in the first plane coordinate system, in which the sample to be measured lies after moving relative to the measuring platform, and in the second plane coordinate system, in which it lies before moving:

x'_v = x_v\cos\beta - y_v\sin\beta + \Delta x,\qquad y'_v = x_v\sin\beta + y_v\cos\beta + \Delta y

The rotation angle β and the translation amounts Δx and Δy are calculated by screening the position coordinates of at least 3 feature pixel points in the first and second plane coordinate systems respectively.
Further, the method for tracking and adjusting the relative position of the measuring platform and the camera comprises the following steps:
step C1, controlling the measuring platform to rotate by an angle beta, wherein the plane of the measuring platform after rotation is in a horizontal state, so that the placing angle of the sample to be measured placed on the measuring platform under the visual field of a camera is kept unchanged before and after the sample to be measured moves;
Step C2, extracting the translation amounts Δx and Δy along the x' axis and y' axis between the first plane coordinate system and the second plane coordinate system;
Step C3, taking the upper-left corner of the image acquired by the camera after the sample to be measured moves relative to the measuring platform as the origin, establishing the measurement platform coordinate system x''0y'' after the measuring platform rotates by the angle β, and establishing the camera coordinate system x'''0y''' under the camera view;
Step C4, analyzing the splicing movement amount (x1, y1) required for the camera, after the sample to be measured moves relative to the measuring platform, to continue splicing the image spliced before the movement;
Step C5, transforming the position coordinates (x1, y1) in the camera coordinate system x'''0y''' into the position coordinates (x2, y2) in the measurement platform coordinate system x''0y'';
Step C6, controlling the measuring platform to move the distance x2 along its X axis and the distance y2 along its Y axis, so as to continue splicing the image spliced before the sample to be measured moved relative to the measuring platform.
The beneficial effects are that:
according to the microscope image processing system based on gesture transformation, the gray values of the surrounding pixels of each pixel are used for carrying out fusion pretreatment on the pixels at the central position, the difference of the gray values of the corresponding pixels of the same characteristic region in a microscope image due to different placement angles of a sample to be detected is eliminated, the correlation of the gray values of the surrounding pixels to the gray values of the pixels at the central position is established, the accuracy of matching of the same characteristic region before and after the position adjustment of the sample to be detected is improved, and the matching difficulty caused by the placement position is reduced.
Gray value analysis is performed between each fused pixel point and its surrounding pixel points, and feature pixel points whose gray value difference is larger than the set gray value difference are screened out. The minimum circumscribed polygon of the feature pixel points is established, and accurate screening is performed against the feature pixel points obtained before the sample to be measured moved relative to the measuring platform, which reduces the matching computation, improves the matching speed and precision, and accurately locates, within the image spliced before the movement, the positions of the feature pixel points of the microscopic image acquired after the movement. Meanwhile, the position screened on the basis of the minimum circumscribed polygon is checked against the sum of the distances between the feature pixel points within the polygon, improving the positioning accuracy before and after the sample moves relative to the measuring platform.
The rotation angle of the sample to be measured before and after it moves relative to the measuring platform is analyzed through the transformation association model, and the measuring platform is adjusted according to that angle, ensuring that the placing angles of the sample relative to the camera before and after the movement are consistent. The relative position of the measuring platform and the camera is tracked and adjusted, so that under different attitude and position changes the sample can be accurately positioned and returned to the splicing position it occupied before the movement, improving the efficiency and precision of splicing the microscopic images after the movement. This avoids the situation in which the complete morphology of a sample larger than the measuring platform cannot be spliced because of the relative position changes among the measuring platform, the camera and the sample, and improves the continuous splicing precision of a large-size sample before and after the movement.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a microscopic image collected under the initial placement position and angle of a sample to be measured;
FIG. 2 is a microscopic image taken at the next placement position and angle of the sample to be measured;
FIG. 3 is a schematic view of the distribution of feature pixels in a microscopic image;
FIG. 4 is a schematic diagram illustrating a set of positions of similar feature pixels corresponding to feature pixels;
FIG. 5 is a schematic illustration of a minimum circumscribing polygon;
FIG. 6 is a schematic diagram of a second planar coordinate system;
FIG. 7 is a schematic view of a first planar coordinate system and a partial enlargement;
FIG. 8 is a schematic diagram of a multi-coordinate system;
fig. 9 is a schematic plan view of a measurement platform.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present disclosure. It will be apparent that the described embodiments are some, but not all, of the embodiments of the present disclosure. All other embodiments, which can be made by one of ordinary skill in the art without the need for inventive faculty, are within the scope of the present disclosure, based on the described embodiments of the present disclosure.
When the size of the sample to be measured is larger than that of the measuring platform, image acquisition and processing of the sample are restricted by the platform size. At present, when a large-size sample is imaged, its position relative to the measuring platform must be adjusted manually and repeatedly to meet the requirements of image acquisition and measurement. Because the position of the sample changes, its placing angle and position before and after the movement also change, so the splicing precision of the microscopic images acquired before and after the movement is poor, and the image information of the complete surface morphology of the sample cannot be obtained.
The microscope image processing system based on attitude transformation comprises an image preprocessing module, a feature screening module, an attitude feature matching module, an attitude transformation analysis module and a splicing position tracking module.
The image preprocessing module acquires the gray value of each pixel point in the microscopic image, and fusion preprocessing is carried out on the pixel points at the central position by adopting the gray value of the surrounding pixel points of each pixel point, so as to acquire the gray value of each pixel point after fusion processing.
As shown in fig. 1 and fig. 2, when the sample to be measured is readjusted and placed on the measurement platform, the next placement position and angle of the sample to be measured are changed from the first placement position and angle, and the view direction collected by the camera on the microscope is parallel to the placement position of the measurement platform, so that the gray values of the corresponding pixels in the microscopic images of the same characteristic region on the sample to be measured under the front and rear placement positions and angles are different.
The gray values of the pixel points in the microscopic images acquired at the initial placement and at the next placement position and angle are fused, establishing the correlation of the gray values of the surrounding pixel points, centered on each pixel point, with the gray value of the central pixel point. Because the acquisition view angle of the camera is unchanged, the size and number of the pixel regions covered by the same characteristic region A differ between the two placement states, as shown in Figs. 1 and 2, which increases the difficulty of matching and comparing the same characteristic region A before and after the position of the sample to be measured is adjusted.
Fusion formula:

f'(x, y) = \frac{1}{k^2 - 1}\left[\sum_{i=1}^{k}\sum_{j=1}^{k} f\left(x - \frac{k+1}{2} + i,\ y - \frac{k+1}{2} + j\right) - f(x, y)\right]

k is an odd number greater than 1 and generally takes the value 3, f'(x, y) is the gray value of the pixel point with position coordinates (x, y) after image fusion processing, and f(x, y) is the gray value of the pixel point with position coordinates (x, y) not subjected to image fusion processing.
The gray values of the surrounding pixels centered on each pixel are used to fuse the gray value of the central pixel, which reduces the interference of each central pixel's own gray value on the fusion result and effectively avoids the differences between the gray values of corresponding pixel points of the same characteristic region before and after the sample to be measured moves, caused by the change in placing angle between the camera and the sample.
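To make the fusion preprocessing concrete, the following is a minimal Python sketch (the function name, the NumPy dependency and the edge-replication padding are illustrative assumptions, not specified by the system): it replaces each pixel's gray value with the mean of its k × k neighbourhood, excluding the central pixel, as in the fusion formula above.

```python
import numpy as np

def fuse_preprocess(img, k=3):
    """Fusion preprocessing sketch: mean gray value of the k x k
    neighbourhood around each pixel, with the centre pixel excluded."""
    assert k % 2 == 1 and k > 1, "k must be an odd number greater than 1"
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")  # replicate borders
    total = np.zeros(img.shape, dtype=float)
    for i in range(k):                       # accumulate the k x k window sums
        for j in range(k):
            total += padded[i:i + img.shape[0], j:j + img.shape[1]]
    total -= img.astype(float)               # remove the centre pixel's own gray value
    return total / (k * k - 1)               # f'(x, y)
```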
The feature screening module is used for comparing each pixel point in the microscopic image before and after the sample to be tested is moved relative to the measuring platform after the fusion processing with the gray value of each surrounding pixel point to obtain the gray value difference value between the pixel point and the surrounding pixel points, and screening out the feature pixel points.
And taking each pixel point with the gray value difference value between the pixel point and the gray values of all surrounding pixel points being larger than the set gray value difference value as a characteristic pixel point.
Gray value difference:

\Delta f'(x, y) = \sum_{i=1}^{3}\sum_{j=1}^{3}\left|f'(x, y) - f'(x - 2 + i,\ y - 2 + j)\right|

f'(x−2+i, y−2+j) is the gray value of the (x−2+i, y−2+j) pixel point in the microscopic image after image fusion processing, and Δf'(x, y) is the difference between the gray values of the (x, y) pixel point and its surrounding pixel points. The gray value difference reflects the abruptness between the gray value of a pixel point and those of its surrounding pixel points: the more a pixel point stands out from its surroundings, the larger the difference. Screening out the feature pixel points whose gray value difference exceeds the set gray value difference provides the pixel points used to match the microscopic image acquired after the sample to be measured moves relative to the measuring platform against the image spliced from the microscopic images acquired before the movement.
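A corresponding sketch of the feature screening follows (the threshold argument and the names are hypothetical): it sums the absolute gray-value differences between each fused pixel and its 3 × 3 neighbourhood and keeps the pixels whose sum exceeds the set gray value difference.

```python
def screen_feature_pixels(fused, threshold):
    """Return the (x, y) positions whose summed gray-value difference to the
    3 x 3 neighbourhood exceeds the set difference (feature pixel points)."""
    h, w = fused.shape
    feature_points = []
    for x in range(1, h - 1):
        for y in range(1, w - 1):
            diff = sum(abs(fused[x, y] - fused[x - 2 + i, y - 2 + j])
                       for i in range(1, 4) for j in range(1, 4))
            if diff > threshold:
                feature_points.append((x, y))
    return feature_points
```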
The attitude feature matching module screens out from the spliced image, by gray value matching, the sets of similar feature pixel points whose gray values differ within the allowable error from those of the feature pixel points in the microscopic image acquired after the sample to be measured moves relative to the measuring platform, and, based on the distances between feature pixel points, analyzes the position coordinates of the similar feature pixel points matching the distance between any two feature pixel points in that microscopic image.
As shown in Fig. 3, after the sample to be measured moves relative to the measuring platform, the sum of the distances between the feature pixel points in the microscopic image acquired by the microscope lens is

LD = \sum_{v=1}^{D-1}\sum_{u=v+1}^{D}\sqrt{(x_v - x_u)^2 + (y_v - y_u)^2}

where LD is the sum of the distances between the feature pixel points in the microscopic image acquired after the movement, (x_v, y_v) and (x_u, y_u) are the position coordinates of the v-th and u-th feature pixel points in the microscopic image, and D is the number of feature pixel points in the microscopic image after the movement.
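As a small illustration, the distance sum LD can be computed directly from the feature pixel coordinates (the names are illustrative):

```python
import numpy as np

def distance_sum(points):
    """LD: sum of pairwise distances between the D feature pixel points."""
    pts = np.asarray(points, dtype=float)
    D = len(pts)
    return sum(np.hypot(*(pts[v] - pts[u]))
               for v in range(D - 1) for u in range(v + 1, D))
```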
When the length of the sample to be measured is greater than that of the measuring platform, or its width is greater than that of the platform, image acquisition is first performed on the local sample region lying within the measuring range of the platform; the sample is then moved so that the region previously outside the measuring range is placed on the platform. Each feature pixel point in the microscopic image acquired after the movement is compared with the feature pixel points in the image spliced before the movement, which allows for the fact that the feature pixel points of the post-movement image are not all contained in a single pre-movement frame and improves the matching precision between feature pixel points.
As shown in fig. 4, a gray value matching mode is adopted for matching analysis between a microscopic image acquired after a sample to be measured moves relative to a measurement platform and an image spliced before the sample to be measured moves relative to the measurement platform, and the method comprises the following steps:
Step A1, extracting the gray values of all feature pixel points in the image spliced from the microscopic images acquired before the sample to be measured moves relative to the measuring platform, and the gray values of all feature pixel points after the sample to be measured moves relative to the measuring platform;
Step A2, for the v-th feature pixel point after the sample to be measured moves relative to the measuring platform, extracting from the image spliced before the movement all feature pixel points whose gray values differ from its gray value within the allowable error, proceeding in sequence until all D feature pixel points after the movement have been processed;
For the v-th feature pixel point after the sample to be measured moves relative to the measuring platform, the positions of its similar feature pixel points in the image spliced before the movement form the set Gv(x, y) = {gv1(x, y), gv2(x, y), …, gvf(x, y), …, gvs(x, y)}, where gvf(x, y) denotes the position of the f-th similar feature pixel point whose gray value lies within the allowable error of that of the v-th feature pixel point, and s is the number of such similar feature pixel points.
A3, constructing a minimum external polygon of each characteristic pixel point after the sample to be measured moves relative to the measuring platform;
The vertices of the minimum circumscribed polygon are all feature pixel points.
As shown in fig. 4 and 5, the construction method of the minimum circumscribing polygon includes the following steps:
Step Q1, taking one of the feature pixel points after the sample to be measured moves relative to the measuring platform as the starting point and as the 1st boundary pixel point, and calculating the included angle between each line segment connecting the starting point to one of the remaining feature pixel points and the positive X-axis direction of the microscopic image;
q2, screening out a characteristic pixel point with the smallest included angle as a next boundary pixel point;
when a plurality of included angles exist, the characteristic pixel point with the smallest included angle is selected from all included angles, the starting point is used as a first boundary pixel point, and the connecting line segments u10 and u10 between the starting point and the next boundary pixel point are expressed as effective rectangular edges, the connecting line segments u11 and u11 between the starting point and other characteristic pixel points (other than the next boundary pixel point) are expressed as invalid rectangular edges, when the connecting line segment between the next boundary pixel point and the next boundary pixel point is used as the effective rectangular edges after the next boundary pixel point is used as the starting point, the connecting line segment is marked as un0, and when the connecting line segment is used as the invalid rectangular edges, the connecting line segment is marked as un1, and n is the edge number of the smallest polygon formed.
Step Q3, taking the next boundary pixel point as the new starting point, judging whether the included angle, along the positive X-axis direction of the microscopic image, of the line segment connecting it back to the 1st boundary pixel point is smaller than or equal to the minimum included angle of the line segments connecting it to the other feature pixel points; if it is larger than that minimum, repeatedly executing step Q2; if it is smaller than or equal to that minimum, executing step Q4;
Step Q4, judging whether the next boundary pixel point coincides with one of the boundary pixel points that have in turn served as starting points; if they coincide, a closed polygon is formed, and it is judged whether all feature pixel points lie within the closed polygon. If they do, the minimum circumscribed polygon of the feature pixel points in the microscopic image acquired after the sample to be measured moves relative to the measuring platform is determined; if not, the boundary pixel point that served as the starting point is discarded.
When the next boundary pixel point coincides with a boundary pixel point that has served as a starting point, that boundary pixel point is taken as the final starting point and the starting point of step Q1 is discarded. In this way the minimum circumscribed polygon region of the feature pixel points in the post-movement microscopic image can be determined effectively, with all feature pixel points located on or inside its edges, which conveniently bounds the position regions of all feature pixel points after the movement.
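Steps Q1–Q4 describe an angle-based gift-wrapping (Jarvis march) construction. The sketch below is one reading of those steps under simplifying assumptions (the starting point is taken as the lowest feature pixel, which is guaranteed to lie on the boundary, and degenerate collinear or duplicated points are not handled):

```python
import math

def min_circumscribed_polygon(points):
    """Gift-wrapping sketch of steps Q1-Q4: from a boundary starting point,
    repeatedly pick the feature pixel with the smallest turning angle until
    the walk returns to the start, yielding a closed polygon that contains
    all feature pixel points on or inside its edges."""
    pts = [tuple(p) for p in points]
    if len(pts) < 3:
        return pts
    start = min(pts, key=lambda p: (p[1], p[0]))   # lowest point lies on the boundary
    hull, current, prev_angle = [start], start, 0.0
    while True:
        best, best_turn = None, None
        for q in pts:
            if q == current:
                continue
            ang = math.atan2(q[1] - current[1], q[0] - current[0]) % (2 * math.pi)
            turn = (ang - prev_angle) % (2 * math.pi)  # angle relative to the last edge
            if best is None or turn < best_turn:
                best, best_turn = q, turn
        if best == start:                  # next boundary pixel coincides with the start
            return hull                    # closed minimum circumscribed polygon
        hull.append(best)
        prev_angle = (prev_angle + best_turn) % (2 * math.pi)
        current = best
```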
Step A4, screening out the number W of similar characteristic pixel point groups with the same distance between coordinates of two similar characteristic pixel points in a position set of the 1 st and the d th characteristic pixel points forming the minimum external polygon and within an error allowable range, wherein y=2, and y represents the number of the characteristic pixel points forming the minimum external polygon;
Step A5, judging whether the number W of similar feature pixel point groups satisfies W > 1; if W = 1, the feature pixel points corresponding, on the image spliced before the sample to be measured moves relative to the measuring platform, to the feature pixel points forming the minimum circumscribed polygon are obtained; if W > 1, executing step A6;
Step A6, counting the number W of similar feature pixel point groups for which the distance between any similar feature pixel point in the group and a similar feature pixel point in the position set of the (d+1)-th feature pixel point of the minimum circumscribed polygon equals, within the allowable error, the distance between the corresponding two feature pixel points forming the minimum circumscribed polygon, wherein y = y + 1 and y ≤ D, and executing step A5 until W = 1, thereby obtaining the feature pixel points corresponding, on the image spliced before the sample to be measured moves relative to the measuring platform, to all feature pixel points forming the minimum circumscribed polygon.
The position coordinates of the feature pixel points in the image spliced from the microscopic images before the sample to be measured moved relative to the measuring platform take the upper-left corner of the spliced image as the origin, with the splicing movement directions of the camera as the X axis and the Y axis respectively.
sd1 denotes the distance between the 1st and the d-th feature pixel points in the minimum circumscribed polygon, and sd(d+1) denotes the distance between the d-th and the (d+1)-th feature pixel points; d and y each take any value from 1 to H, where H is the number of feature pixel points in the minimum circumscribed polygon and H ≤ D.
The position coordinates of the similar feature pixel points, screened from the similar feature pixel point sets, that match the minimum circumscribed polygon of the feature pixel points after the sample to be measured moves relative to the measuring platform are obtained, giving the feature pixel points of the minimum circumscribed polygon in the image spliced before the movement. It is then calculated whether the sum of the distances between the feature pixel points within the minimum circumscribed polygon in the spliced image lies within the range LD × (1 ± 0.1%). If it does, the requirement of high-accuracy feature matching is met, the located feature pixel points are verified, and the positions of the feature pixel points of the post-movement microscopic image are accurately located and screened out of the spliced image.
Boundary processing is performed on the feature pixel points after the sample to be measured moves relative to the measuring platform to determine the minimum circumscribed polygon containing all of them. Based on the distance between any two feature pixel points forming the polygon, and on the similar feature pixel points whose gray values lie within the allowable error, the best-matching similar feature pixel point groups are screened out; by continually increasing the number of feature pixel points used, the number of groups meeting the distance requirement is driven to 1, so the minimum circumscribed polygon can be screened out effectively, accurately and quickly. Comparing the feature pixel points contained in the screened polygon then allows the feature pixel points of the post-movement image to be accurately located on the image spliced before the movement.
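The group screening of steps A4–A6 can be sketched as follows. This is an interpretation rather than the literal procedure (the names, the tol tolerance and the check against all previously fixed vertices, instead of only consecutive ones, are assumptions): candidate groups are grown one polygon vertex at a time and kept only while their mutual distances match the polygon's vertex distances.

```python
import math

def _dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def screen_similar_groups(poly_pts, candidate_sets, tol):
    """poly_pts: vertices of the post-movement minimum circumscribed polygon;
    candidate_sets[d]: position set of similar feature pixels for vertex d."""
    groups = [(c,) for c in candidate_sets[0]]        # start from the 1st vertex
    for d in range(1, len(poly_pts)):
        kept = []
        for g in groups:
            for c in candidate_sets[d]:
                if all(abs(_dist(g[i], c) - _dist(poly_pts[i], poly_pts[d])) <= tol
                       for i in range(d)):
                    kept.append(g + (c,))
        groups = kept
        if len(groups) == 1:                          # W = 1: unique match found
            break
    return groups
```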
Wherein a first planar coordinate system and a second planar coordinate system are established respectively.
A first planar coordinate system x0y: the upper left corner of the microscopic image after the sample to be measured moves relative to the measuring platform is taken as an origin of coordinates, the length direction of the microscopic image is taken as an x axis, and the width direction of the microscopic image is taken as a y axis.
A second plane coordinate system x'0y': taking the upper-left corner of the image spliced before the sample to be measured moves relative to the measuring platform as the coordinate origin, the length direction parallel to the measuring platform as the x' axis, and the width direction parallel to the measuring platform as the y' axis; the coordinates of the feature pixel points on the spliced image that match the feature pixel points in the microscopic image after the movement are calculated in this system, as shown in Fig. 6.
The position coordinates of each characteristic pixel point on the spliced image in the second plane coordinate system are analyzed as follows:
Step B1, acquiring the upper-left corner coordinates (Sx, Sy) of each single-frame microscopic image forming the image spliced before the sample to be measured moves relative to the measuring platform, where Sx = (a − 1)Lx and Sy = (b − 1)Ly; a is the index of the microscopic image acquired along the x' axis of the second plane coordinate system, b is the index along the y' axis, and Lx and Ly are the numbers of pixel points of a single-frame microscopic image along the x' and y' axes respectively;
Step B2, analyzing the relative position coordinates (Δxv, Δyv), within the microscopic image whose upper-left corner is located at (Sx, Sy), of each feature pixel point on that microscopic image;
Step B3, obtaining the absolute position coordinates of each feature pixel point in the second plane coordinate system: x'v = Sx + Δxv, y'v = Sy + Δyv, v = 1, 2, …, D.
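Steps B1–B3 reduce to simple tile arithmetic. The sketch below assumes, as in step B1, that adjacent single-frame images are laid out in the mosaic without overlap (the function and variable names are illustrative):

```python
def absolute_position(a, b, Lx, Ly, dxv, dyv):
    """Coordinates of a feature pixel in the second plane coordinate system.
    a, b: 1-based tile indices along the x' and y' axes of the mosaic;
    Lx, Ly: pixel counts of a single frame along x' and y';
    (dxv, dyv): position of the feature pixel inside that single frame."""
    Sx = (a - 1) * Lx          # upper-left corner of the tile in the mosaic
    Sy = (b - 1) * Ly
    return Sx + dxv, Sy + dyv  # (x'_v, y'_v)
```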
The attitude transformation analysis module analyzes the coordinates of the feature pixel points in the image spliced before the movement that match the feature pixel points in the microscopic image acquired after the sample to be measured moves relative to the measuring platform, and analyzes the rotation and translation between the two plane coordinate systems based on the attitude transformation analysis model.
As shown in Figs. 7 and 8, a transformation association model is established between the position coordinates of the same key pixel point in the first plane coordinate system, in which the sample to be measured lies after moving relative to the measuring platform, and in the second plane coordinate system, in which it lies before moving:

x'_v = x_v\cos\beta - y_v\sin\beta + \Delta x,\qquad y'_v = x_v\sin\beta + y_v\cos\beta + \Delta y

The rotation angle β and the translation amounts Δx and Δy are calculated by screening the position coordinates of at least 3 feature pixel points in the first and second plane coordinate systems respectively.
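With at least three matched point pairs, β, Δx and Δy can be estimated. One standard way, not prescribed by the system itself, is the closed-form two-dimensional least-squares rigid alignment (2-D Procrustes):

```python
import numpy as np

def solve_attitude_transform(pts_after, pts_before):
    """Estimate (beta, dx, dy) in the model  p_before = R(beta) @ p_after + t.
    pts_after: (N, 2) coordinates in the first plane coordinate system;
    pts_before: (N, 2) matched coordinates in the second plane coordinate system."""
    a = np.asarray(pts_after, dtype=float)
    b = np.asarray(pts_before, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    a0, b0 = a - ca, b - cb                  # centre both point sets
    num = np.sum(a0[:, 0] * b0[:, 1] - a0[:, 1] * b0[:, 0])
    den = np.sum(a0[:, 0] * b0[:, 0] + a0[:, 1] * b0[:, 1])
    beta = np.arctan2(num, den)              # least-squares rotation angle
    R = np.array([[np.cos(beta), -np.sin(beta)],
                  [np.sin(beta),  np.cos(beta)]])
    t = cb - R @ ca                          # translation (dx, dy)
    return beta, t
```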
The splicing position tracking module rotates the placing angle of the measuring platform according to the rotation angle and tracks and adjusts the relative position of the measuring platform and the camera. This facilitates accurate position tracking and adjustment, ensures that whatever the placing position and angle of the sample to be measured after the movement, it can be accurately positioned and returned to the splicing position it occupied before moving relative to the measuring platform, and improves the efficiency and precision of splicing the microscopic images after the movement.
The method for tracking and adjusting the relative positions of the measuring platform and the camera comprises the following steps:
step C1, controlling the measuring platform to rotate by an angle beta, wherein the plane of the measuring platform after rotation is in a horizontal state, so that the placing angle of the sample to be measured placed on the measuring platform under the visual field of a camera is kept unchanged before and after the sample to be measured moves;
According to the rotation angle between the first and second plane coordinate systems analyzed by the attitude transformation analysis module, the measuring platform carrying the moved sample is rotated by the angle β, so that the x axis of the first plane coordinate system and the x' axis of the second are parallel and the placing of the sample under the camera's view angle is unchanged between the positions before and after the movement;
Step C2, extracting the translation amounts Δx and Δy along the x' axis and y' axis between the first plane coordinate system and the second plane coordinate system;
Step C3, taking the upper-left corner of the image acquired by the camera after the sample to be measured moves relative to the measuring platform as the origin, establishing the measurement platform coordinate system x''0y'' after the measuring platform rotates by the angle β, and establishing the camera coordinate system x'''0y''' under the camera view, as shown in Figs. 8 and 9;
the movement amount of the measuring platform is determined by a motor for controlling the measuring platform to move along the X-axis direction of the measuring platform and a motor for controlling the measuring platform to move along the Y-axis direction of the measuring platform.
After the sample to be measured moves relative to the measuring platform, the position coordinates of the measuring platform along its X-axis and Y-axis directions corresponding to the microscopic image acquired by the camera are recorded as (X0, Y0).
Step C4, analyzing the splicing movement amount (x1, y1) required for the camera, after the sample to be measured moves relative to the measuring platform, to continue splicing the image spliced before the movement, where x1 = m(dx1 − dx0) and y1 = Δy; dx1 is the actual width of the measured object corresponding to a single acquired frame, dx0 is the overlapping width, in the x' direction, of two adjacent microscopic images spliced before the sample moved relative to the measuring platform, and m is the number of single frames spliced along the x' axis of the second plane coordinate system before the movement.
After the sample to be measured moves relative to the measuring platform, the splicing movement amount with which the camera continues splicing the image spliced before the movement is equal to the position coordinates (x1, y1) in the camera coordinate system x'''0y''';
In step C5, the position coordinates (x1, y1) in the camera coordinate system x'''0y''' are transformed into the position coordinates (x2, y2) in the measurement platform coordinate system x''0y'', i.e. x2 = m(dx1 − dx0)cos β − Δy·sin β and y2 = m(dx1 − dx0)sin β + Δy·cos β, where the x''' axis of the camera coordinate system is parallel to the X axis when the measuring platform is not rotated.
Step C6, controlling the measuring platform to move the distance x2 along its X axis and the distance y2 along its Y axis, so as to continue splicing the image spliced before the sample to be measured moved relative to the measuring platform, the splicing overlap width along the X axis of the measuring platform being dx0.
By establishing the transformation relation between the rotated measurement platform coordinate system and the camera coordinate system, the splicing movement amount with which the camera continues splicing after the sample moves relative to the measuring platform can be transformed into the movement distances of the platform along its X-axis and Y-axis directions in the rotated measurement platform coordinate system, improving the precision of continuous splicing despite the changes in the placing angle and position of the sample before and after the movement.
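The arithmetic of steps C4–C6 is summarized in the short sketch below, which simply follows the formulas above (the names are illustrative):

```python
import math

def platform_movement(m, dx1, dx0, dy, beta):
    """Stitching movement in the camera coordinate system x'''0y''',
    transformed into the rotated measurement platform coordinate system x''0y''."""
    x1 = m * (dx1 - dx0)   # movement to the next splicing column (step C4)
    y1 = dy                # translation along y' between the two coordinate systems
    x2 = x1 * math.cos(beta) - y1 * math.sin(beta)   # step C5 transformation
    y2 = x1 * math.sin(beta) + y1 * math.cos(beta)
    return x2, y2          # drive the platform along its X and Y axes (step C6)
```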
The foregoing is merely illustrative and explanatory of the principles of the invention, as various modifications and additions may be made to the specific embodiments described, or similar thereto, by those skilled in the art, without departing from the principles of the invention or beyond the scope of the appended claims.
Claims (8)
1. A microscope image processing system based on attitude transformation, characterized by comprising an image preprocessing module, a feature screening module, an attitude feature matching module, an attitude transformation analysis module and a splicing position tracking module;
the image preprocessing module acquires the gray value of each pixel point in the microscopic image, and fusion preprocessing is carried out on the pixel points at the central position by adopting the gray value of the surrounding pixel points of each pixel point to acquire the gray value of each pixel point after fusion processing;
the feature screening module compares each fused pixel point, in the microscopic images before and after the sample to be measured moves relative to the measuring platform, with the gray values of its surrounding pixel points to obtain the gray value difference between the pixel point and its surrounding pixel points, and screens out the feature pixel points;
the attitude feature matching module screens out from the spliced image, by gray value matching, the sets of similar feature pixel points whose gray values differ within the allowable error from those of the feature pixel points in the microscopic image acquired after the sample to be measured moves relative to the measuring platform, and, based on the distances between feature pixel points, analyzes the position coordinates of the similar feature pixel points matching the distance between any two feature pixel points in that microscopic image;
the matching analysis between the microscopic image acquired after the sample to be measured moves relative to the measuring platform and the spliced image before the sample to be measured moves relative to the measuring platform adopts a gray value matching mode, and the matching analysis comprises the following steps:
Step A1, extracting the gray values of all feature pixel points in the image spliced from the microscopic images acquired before the sample to be measured moves relative to the measuring platform, and the gray values of all feature pixel points after the sample to be measured moves relative to the measuring platform;
Step A2, for the v-th feature pixel point after the sample to be measured moves relative to the measuring platform, extracting from the image spliced before the movement all feature pixel points whose gray values differ from its gray value within the allowable error, proceeding in sequence until all D feature pixel points after the movement have been processed;
Step A3, constructing the minimum circumscribed polygon of the feature pixel points after the sample to be measured moves relative to the measuring platform;
Step A4, screening out the number W of similar feature pixel point groups, each consisting of one similar feature pixel point from the position set of the 1st feature pixel point and one from the position set of the d-th feature pixel point forming the minimum circumscribed polygon, for which the distance between the coordinates of the two similar feature pixel points equals, within the allowable error, the distance between the 1st and d-th feature pixel points;
Step A5, judging whether the number W of similar feature pixel point groups satisfies W > 1; if W = 1, the feature pixel points corresponding, on the image spliced before the sample to be measured moves relative to the measuring platform, to the feature pixel points forming the minimum circumscribed polygon are obtained; if W > 1, executing step A6;
Step A6, counting the number W of similar feature pixel point groups for which the distance between any similar feature pixel point in the group and a similar feature pixel point in the position set of the (d+1)-th feature pixel point of the minimum circumscribed polygon equals, within the allowable error, the distance between the corresponding two feature pixel points forming the minimum circumscribed polygon, and executing step A5 until W = 1, thereby obtaining the feature pixel points corresponding, on the image spliced before the sample to be measured moves relative to the measuring platform, to all feature pixel points forming the minimum circumscribed polygon;
the attitude transformation analysis module is used for analyzing the coordinates, in the stitched image obtained before the sample to be measured moves relative to the measuring platform, of the feature pixel points matched with the feature pixel points in the post-movement microscopic image, and for analyzing the rotation amount and the translation amount between the two plane coordinate systems based on the attitude transformation analysis model;
and the stitching position tracking module rotates the measuring platform according to the rotation angle and tracks and adjusts the relative position of the measuring platform and the camera.
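Purely by way of illustration, the following Python sketch implements the gray-value candidate screening of steps A1-A2 and a simplified pairwise version of the distance disambiguation of steps A4-A6; the function names and the tolerance parameters `tol` and `dist_tol` are assumptions made for brevity, not part of the claim.

```python
import numpy as np

def candidate_matches(gray_after, gray_before, tol):
    """Steps A1-A2: for each feature pixel point in the post-movement image,
    collect the indices of stitched-image feature pixel points whose gray
    value differs from it by at most `tol` (the allowable error)."""
    gray_before = np.asarray(gray_before, dtype=np.float64)
    return [np.flatnonzero(np.abs(gray_before - g) <= tol) for g in gray_after]

def disambiguate_by_distance(p_after, q_before, cand, v, u, dist_tol):
    """Simplified steps A4-A6: among the gray-value candidates of feature
    pixel points v and u, keep candidate pairs whose mutual distance matches
    |p_v - p_u| within `dist_tol`; the size of the returned list plays the
    role of the counter W in steps A5-A6."""
    d_ref = np.linalg.norm(p_after[v] - p_after[u])
    pairs = []
    for a in cand[v]:
        for b in cand[u]:
            if abs(np.linalg.norm(q_before[a] - q_before[b]) - d_ref) <= dist_tol:
                pairs.append((a, b))
    return pairs
```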
2. The attitude transformation-based microscope image processing system according to claim 1, wherein each pixel point at the center position is fused using the gray values of its surrounding pixel points, and the fusion formula is:

$F(x,y)=\dfrac{1}{k^{2}}\sum_{i=1}^{k}\sum_{j=1}^{k}f\!\left(x-\dfrac{k+1}{2}+i,\; y-\dfrac{k+1}{2}+j\right)$

where k is an odd number greater than 1, $F(x,y)$ is the gray value of the pixel point with position coordinates (x, y) after the image fusion processing, and $f(x,y)$ is the gray value of the pixel point with position coordinates (x, y) before the image fusion processing.
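A minimal Python sketch of the k × k neighborhood fusion reconstructed above, assuming a NumPy grayscale image; the reflect padding at the borders and the function name `fuse_image` are illustrative assumptions.

```python
import numpy as np

def fuse_image(f: np.ndarray, k: int = 3) -> np.ndarray:
    """Average each pixel with its k x k neighborhood (k odd, k > 1),
    matching the reconstructed claim-2 fusion formula."""
    assert k % 2 == 1 and k > 1, "k must be an odd number greater than 1"
    r = k // 2
    # Reflect-pad so border pixels also see a full k x k neighborhood.
    padded = np.pad(f.astype(np.float64), r, mode="reflect")
    F = np.zeros(f.shape, dtype=np.float64)
    for i in range(k):
        for j in range(k):
            F += padded[i:i + f.shape[0], j:j + f.shape[1]]
    return F / (k * k)
```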
3. The attitude transformation-based microscope image processing system according to claim 1, wherein the gray-value difference between each pixel point and its surrounding pixel points is calculated as:

$T(x,y)=\sum_{i=1}^{3}\sum_{j=1}^{3}\left|F(x-2+i,\; y-2+j)-F(x,y)\right|$

where $F(x-2+i,\, y-2+j)$ is the gray value of the (x−2+i, y−2+j) pixel point in the microscopic image after the image fusion processing, and $T(x,y)$ is the gray-value difference between the (x, y) pixel point and its surrounding pixel points.
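Likewise, a sketch of the claim-3 difference measure together with the feature screening of claim 1; the edge padding mode and the scalar `threshold` are assumptions.

```python
import numpy as np

def gray_difference(F: np.ndarray) -> np.ndarray:
    """Sum of absolute gray-value differences between each pixel point and
    its 3 x 3 neighborhood, matching the reconstructed claim-3 formula
    (the center term contributes zero)."""
    padded = np.pad(F.astype(np.float64), 1, mode="edge")
    T = np.zeros(F.shape, dtype=np.float64)
    for i in range(3):
        for j in range(3):
            T += np.abs(padded[i:i + F.shape[0], j:j + F.shape[1]] - F)
    return T

def screen_features(T: np.ndarray, threshold: float) -> np.ndarray:
    """Feature screening: (row, col) coordinates of pixels whose difference
    exceeds the (assumed) scalar threshold."""
    return np.argwhere(T > threshold)
```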
4. The attitude transformation-based microscope image processing system according to claim 1, wherein the method for constructing the minimum circumscribing polygon in step A3 comprises the following steps (an illustrative sketch follows step Q4):
step Q1, taking one of the feature pixel points obtained after the sample to be measured moves relative to the measuring platform as a starting point and as the 1st boundary pixel point, and calculating the included angle between the line segment connecting it to each remaining feature pixel point and the positive X-axis direction of the microscopic image;
step Q2, screening out the feature pixel point with the smallest included angle as the next boundary pixel point;
step Q3, taking that next boundary pixel point as the new starting point, judging whether the included angle of its connecting line back to the 1st boundary pixel point is less than or equal to the minimum included angle, measured from the positive X-axis direction of the microscopic image, of its connecting lines to the remaining feature pixel points; if it is greater than that minimum, repeating step Q2; if it is less than or equal to that minimum, executing step Q4;
and step Q4, judging whether the next boundary pixel point coincides with one of the boundary pixel points that have already served as starting points, in which case the polygon is closed.
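Steps Q1-Q4 describe an angle-minimizing boundary walk; the sketch below realises it with the classical gift-wrapping (Jarvis march) formulation, using cross products in place of explicit angles to the X axis, which is an equivalent but numerically safer reading and our assumption, not the claim's wording.

```python
def min_circumscribing_polygon(points):
    """Gift-wrapping reading of steps Q1-Q4: starting from an extreme feature
    pixel point, repeatedly pick the point that wraps the set most tightly
    until the boundary walk returns to the start (step Q4)."""
    pts = sorted(set(points))          # (x, y) tuples, duplicates removed
    if len(pts) < 3:
        return pts

    def cross(o, a, b):                # > 0 means b is counter-clockwise of a about o
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    start = pts[0]                     # leftmost point is surely on the boundary
    hull, current = [], start
    while True:
        hull.append(current)
        candidate = pts[0] if pts[0] != current else pts[1]
        for p in pts:
            if p == current:
                continue
            c = cross(current, candidate, p)
            farther = ((p[0] - current[0]) ** 2 + (p[1] - current[1]) ** 2 >
                       (candidate[0] - current[0]) ** 2 + (candidate[1] - current[1]) ** 2)
            if c < 0 or (c == 0 and farther):   # take the tighter (or farther collinear) point
                candidate = p
        current = candidate
        if current == start:           # step Q4: walk closed at the 1st boundary pixel
            return hull
```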
5. The attitude transformation-based microscope image processing system according to claim 4, wherein the sum of the distances between the feature pixel points within the minimum circumscribing polygon in the stitched image is required to lie within the range LD × (1 ± 0.1%);

wherein $LD=\sum_{v=1}^{D}\sum_{u=v+1}^{D}\sqrt{\left(X_{v}-X_{u}\right)^{2}+\left(Y_{v}-Y_{u}\right)^{2}}$, LD is the sum of the distances between the feature pixel points in the microscopic image acquired by the microscope lens after the sample to be measured moves relative to the measuring platform, $(X_v, Y_v)$ and $(X_u, Y_u)$ are the position coordinates of the v-th and the u-th feature pixel points in that microscopic image, and D is the number of feature pixel points in the post-movement microscopic image.
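By way of illustration, a sketch of the claim-5 check, assuming NumPy coordinate arrays of shape (D, 2); the helper names are hypothetical.

```python
import numpy as np

def distance_sum(coords: np.ndarray) -> float:
    """LD: sum of pairwise distances between the D feature pixel coordinates."""
    diff = coords[:, None, :] - coords[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))
    # Keep each unordered pair once (upper triangle, excluding the diagonal).
    return float(dists[np.triu_indices(len(coords), k=1)].sum())

def polygon_matches(coords_before: np.ndarray, coords_after: np.ndarray) -> bool:
    """Claim-5 consistency check: the pre-movement sum must fall within
    LD x (1 +/- 0.1%) of the post-movement sum LD."""
    ld = distance_sum(coords_after)
    return abs(distance_sum(coords_before) - ld) <= 1e-3 * ld
```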
6. The attitude transformation-based microscope image processing system according to claim 1, wherein a second plane coordinate system is established from the stitched image obtained before the sample to be measured moves relative to the measuring platform, and the position coordinates of each feature pixel point on the stitched image in the second plane coordinate system are analyzed, the analysis method comprising:
step B1, acquiring the upper-left-corner position coordinates (Sx, Sy) of each single-frame microscopic image forming the stitched image obtained before the sample to be measured moves relative to the measuring platform;
step B2, analyzing the relative position coordinates of each feature pixel point on the microscopic image whose upper-left corner is marked (Sx, Sy);
step B3, obtaining from step B2 the absolute position coordinates $(X_v, Y_v)$ of each feature pixel point in the second plane coordinate system: $X_v = S_x + x_v$, $Y_v = S_y + y_v$, v = 1, 2, ..., D, where $(x_v, y_v)$ are the relative position coordinates obtained in step B2.
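Step B3 amounts to offsetting each relative coordinate by the tile's upper-left corner; a two-line sketch with illustrative names:

```python
def absolute_coords(sx: float, sy: float, rel_pts):
    """Step B3: add the tile's upper-left corner (Sx, Sy) to each relative
    feature-pixel coordinate to obtain stitched-image absolute coordinates."""
    return [(sx + x, sy + y) for (x, y) in rel_pts]
```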
7. The attitude transformation-based microscope image processing system according to any one of claims 1 to 6, characterized in that a transformation correlation model is established between the position coordinates of the same key pixel point in the first plane coordinate system (of the microscopic image acquired after the sample to be measured moves relative to the measuring platform) and in the second plane coordinate system (of the stitched image obtained before the movement):

$\begin{bmatrix} X_{2} \\ Y_{2} \end{bmatrix}=\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}\begin{bmatrix} X_{1} \\ Y_{1} \end{bmatrix}+\begin{bmatrix} \Delta x \\ \Delta y \end{bmatrix}$

where $(X_1, Y_1)$ and $(X_2, Y_2)$ are the position coordinates of the key pixel point in the first and the second plane coordinate systems respectively; the rotation angle $\theta$ and the translation amounts $\Delta x$ and $\Delta y$ are calculated by screening the position coordinates of at least 3 feature pixel points in the first plane coordinate system and the second plane coordinate system respectively.
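Claim 7 leaves the solving method open; one standard way to recover θ, Δx, Δy from at least 3 matched points is a 2-D least-squares (Procrustes/Kabsch-style) fit, sketched below under the assumption that `P1` and `P2` are NumPy arrays of matched coordinates in the two systems.

```python
import numpy as np

def estimate_rigid_transform(P1: np.ndarray, P2: np.ndarray):
    """Least-squares fit of P2 ~ R(theta) @ P1 + t from >= 3 matched point
    pairs of shape (N, 2); returns (theta, dx, dy)."""
    c1, c2 = P1.mean(axis=0), P2.mean(axis=0)
    H = (P1 - c1).T @ (P2 - c2)        # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    theta = np.arctan2(R[1, 0], R[0, 0])
    t = c2 - R @ c1
    return theta, float(t[0]), float(t[1])
```

With the returned θ, Δx, Δy, the claim-7 model reproduces the second-system coordinates from the first-system ones up to the fitting residual.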
8. The attitude transformation-based microscope image processing system according to claim 7, wherein the tracking adjustment of the relative position of the measuring platform and the camera comprises the following steps (a sketch of the coordinate conversion follows step C6):
step C1, controlling the measuring platform to rotate through the angle $-\theta$ while keeping the plane of the rotated measuring platform horizontal, so that the placing angle, under the camera's field of view, of the sample to be measured placed on the measuring platform remains unchanged before and after the sample moves;
step C2, extracting the translation amounts $\Delta x$ and $\Delta y$ along the X axis and the Y axis between the first plane coordinate system and the second plane coordinate system;
step C3, taking the upper-left corner of the image acquired by the camera after the sample to be measured moves relative to the measuring platform as the origin, establishing the measuring platform coordinate system obtained after the platform rotates through the angle $-\theta$, and establishing the camera coordinate system under the camera's field of view;
step C4, analyzing the stitching movement amount (x1, y1) required for the camera, after the sample to be measured moves relative to the measuring platform, to continue stitching onto the stitched image obtained before the movement;
step C5, converting the position coordinates (x1, y1) in the camera coordinate system into the position coordinates (x2, y2) in the measuring platform coordinate system;
and step C6, controlling the measuring platform to move the distance x2 along the X axis and the distance y2 along the Y axis respectively, so as to continue stitching onto the stitched image obtained before the sample to be measured moved relative to the measuring platform.
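Finally, a sketch of the step-C5 conversion, assuming the platform system differs from the camera system only by the rotation θ; the sign convention for θ and the axis alignment are assumptions, and any fixed offset between the two origins would add a constant term.

```python
import math

def camera_to_platform(x1: float, y1: float, theta: float):
    """Step C5 under the stated assumptions: rotate the stitching motion
    (x1, y1) from the camera coordinate system into the measuring-platform
    coordinate system, yielding (x2, y2) for step C6."""
    c, s = math.cos(theta), math.sin(theta)
    return c * x1 - s * y1, s * x1 + c * y1
```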
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311226987.4A CN116978005B (en) | 2023-09-22 | 2023-09-22 | Microscope image processing system based on attitude transformation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116978005A (en) | 2023-10-31 |
CN116978005B (en) | 2023-12-19 |
Family
ID=88483510
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311226987.4A Active CN116978005B (en) | 2023-09-22 | 2023-09-22 | Microscope image processing system based on attitude transformation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116978005B (en) |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102009054703A1 (en) * | 2009-12-15 | 2011-06-16 | Carl Zeiss Imaging Solutions Gmbh | Microscope calibrating method, involves calibrating recording unit for adjusted enlargement based on received images and measured positioning or control values of control module during recording of images |
CN107958442A (en) * | 2017-12-07 | 2018-04-24 | 中国科学院自动化研究所 | Gray correction method and device in several Microscopic Image Mosaicings |
JP2019049549A (en) * | 2015-08-26 | 2019-03-28 | Industrial Technology Research Institute | Surface measurement device and method thereof |
CN111179170A (en) * | 2019-12-18 | 2020-05-19 | 深圳北航新兴产业技术研究院 | Rapid panoramic stitching method for microscopic blood cell images |
CN111626936A (en) * | 2020-05-22 | 2020-09-04 | 湖南国科智瞳科技有限公司 | Rapid panoramic stitching method and system for microscopic images |
CN112164001A (en) * | 2020-09-29 | 2021-01-01 | 南京理工大学智能计算成像研究院有限公司 | Digital microscope image rapid splicing and fusing method |
CN112750078A (en) * | 2020-12-28 | 2021-05-04 | 广州市明美光电技术有限公司 | Microscopic image real-time splicing method and storage medium based on electric platform |
CN114240845A (en) * | 2021-11-23 | 2022-03-25 | 华南理工大学 | Surface roughness measuring method by adopting light cutting method applied to cutting workpiece |
CN114463231A (en) * | 2021-12-15 | 2022-05-10 | 麦克奥迪实业集团有限公司 | Intelligent splicing method, device, medium and equipment for microscope images |
CN115131350A (en) * | 2022-08-30 | 2022-09-30 | 南京木木西里科技有限公司 | Large-field-depth observation and surface topography analysis system |
CN115760654A (en) * | 2023-01-10 | 2023-03-07 | 南京木木西里科技有限公司 | Industrial microscope image processing system |
CN116309079A (en) * | 2023-05-10 | 2023-06-23 | 南京凯视迈科技有限公司 | Dynamic image acquisition, splicing and optimizing system |
CN116358841A (en) * | 2023-06-01 | 2023-06-30 | 南京木木西里科技有限公司 | Microscope lens self-identification calibration system |
CN116542857A (en) * | 2023-06-28 | 2023-08-04 | 南京凯视迈科技有限公司 | Multi-image self-adaptive splicing method based on large similarity |
CN116643393A (en) * | 2023-07-27 | 2023-08-25 | 南京木木西里科技有限公司 | Microscopic image deflection-based processing method and system |
CN116703723A (en) * | 2023-05-16 | 2023-09-05 | 中山依数科技有限公司 | High-resolution microscopic image scanning and stitching method based on microscope system |
Non-Patent Citations (3)
Title |
---|
MIST: Accurate and Scalable Microscopy Image Stitching Tool with Stage Modeling and Error Minimization; Joe Chalfoun et al.; Scientific Reports; Vol. 7; 1-10 *
Multi-image stitching of activated sludge microscopic images based on the Floyd algorithm; Zhao Lijie et al.; Laser & Optoelectronics Progress; Vol. 59, No. 22; 122-129 *
Research on image segmentation methods based on level sets constrained by target shape and relations; Chen Xueying; China Masters' Theses Full-text Database, Information Science and Technology; I138-980 *
Also Published As
Publication number | Publication date |
---|---|
CN116978005A (en) | 2023-10-31 |
Similar Documents
Publication | Title |
---|---|
CN111750805B (en) | Three-dimensional measurement device and measurement method based on binocular camera imaging and structured-light technology |
US6751338B1 (en) | System and method of using range image data with machine vision tools |
WO2017067321A1 (en) | PCB board matching method and device based on outer contour |
CN111260731A (en) | Checkerboard sub-pixel-level corner adaptive detection method |
CN103902953B (en) | Screen detection system and method |
CN110390696A (en) | Visual detection method for circular-hole pose based on image super-resolution reconstruction |
CN109029299A (en) | Dual-camera measuring device and measuring method for the docking corner of cabin-section pin holes |
US20130113897A1 (en) | Process and arrangement for determining the position of a measuring point in geometrical space |
JP2022169723A (en) | System and method for efficiently scoring probes in an image with a vision system |
CN110852213B (en) | Multi-condition automatic reading method for pointer instruments based on template matching |
CN109191527A (en) | Alignment method and device based on minimum distance deviation |
CN111126381A (en) | Insulator tilt localization and identification method based on the R-DFPN algorithm |
CN110018170A (en) | Honeycomb-model-based localization method for small aircraft-skin damage |
CN109724586A (en) | Spacecraft relative pose measurement method fusing depth map and point cloud |
CN112419224B (en) | Spherical-pin chip positioning method and system |
Wang et al. | Robust vision-based method for wing deflection angle measurement with defocus images |
CN112329880A (en) | Fast template matching method based on similarity measurement and geometric features |
Wang et al. | Bolt loosening angle detection based on binocular vision |
CN116978005B (en) | Microscope image processing system based on attitude transformation |
CN110211148A (en) | Underwater image pre-segmentation method based on target state estimation |
Peña-Haro et al. | Geometric correction and stabilization of images collected by UASs in river monitoring |
CN111260561A (en) | Fast multi-image stitching method for reticle defect detection |
CN108898585A (en) | Shaft part detection method and device |
Wang et al. | Full-period three-dimensional (3-D) reconstruction method for a low-cost single-layer lidar |
Wang et al. | Visual measurement method for large-space dynamic angles |
Legal Events
Code | Title |
---|---|
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
GR01 | Patent grant |