CN115616018A - Positioning method and device for scanning electron microscope, electronic equipment and storage medium - Google Patents
- Publication number: CN115616018A
- Application number: CN202211388684.8A
- Authority: CN (China)
- Prior art keywords: sample, coordinate, camera, target, coordinate system
- Legal status: Granted (the status listed by Google is an assumption, not a legal conclusion)
Classifications
- G01N23/2251 — Investigating or analysing materials by measuring secondary emission from the material, using incident electron beams, e.g. scanning electron microscopy [SEM]
- H01J37/20 — Means for supporting or positioning the object or the material; means for adjusting diaphragms or lenses associated with the support
- H01J37/22 — Optical, image processing or photographic arrangements associated with the tube
- H01J37/265 — Controlling the tube; circuit arrangements adapted to a particular application not otherwise provided for, e.g. bright-field-dark-field illumination
- H01J37/28 — Electron or ion microscopes with scanning beams
Abstract
The application provides a positioning method and device for a scanning electron microscope, an electronic device, and a storage medium, relating to the technical field of scanning electron microscopes. The method comprises: acquiring an initial image of a sample to be detected at a target shooting position; determining the position information of the sample cup and the sample to be detected in the initial image; determining a first coordinate correction amount according to the position information of the sample cup and the sample to be detected and the calibration parameters of the camera; acquiring the pixel coordinates of a target observation point and its target position coordinates in a second coordinate system; determining the initial position coordinates of the target observation point in the second coordinate system according to the pixel coordinates, the first coordinate correction amount, and the calibration parameters of the camera; and controlling the sample stage to move according to the position coordinates. By collecting an image of the sample to be detected and determining the real position of the target observation point from the observation point's pixel coordinates and the image information, the method achieves accurate and rapid positioning of the sample observation point and effectively improves the observation efficiency of the scanning electron microscope.
Description
Technical Field
The present disclosure relates to the field of scanning electron microscopes, and in particular, to a positioning method and apparatus for a scanning electron microscope, an electronic device, and a storage medium.
Background
A scanning electron microscope (SEM) is a large-scale scientific instrument used to observe and study the morphology, composition and photoelectric characteristics of materials at the micro- and nano-scale. When an SEM is used to observe a sample, the sample is placed on the sample stage of the microscope.
In order to observe different samples on the sample stage, or different positions on the same sample, the sample of interest must be searched for and located. Currently, a user typically determines the sample position from region features observed in the field of view of the scanning electron microscope and moves in small steps within that field of view.

Because the field of view of a scanning electron microscope is very small, finding an observation point on a sample is difficult, and searching for the sample takes a long time. It is therefore important to study how to locate sample observation points quickly and accurately.
Disclosure of Invention
The present application is directed to solving, at least to some extent, one of the technical problems in the related art.
An embodiment of a first aspect of the present application provides a positioning method for a scanning electron microscope, including:
acquiring an initial image of a sample to be detected positioned at a target shooting position and a distance between the target shooting position and a camera, wherein the sample to be detected is positioned on a sample cup, and the sample cup is positioned on a sample platform;
identifying the initial image to determine the position information of the sample cup and the sample to be detected in a first coordinate system;
determining a first coordinate correction quantity in the first coordinate system according to the distance between the target shooting position and the camera, the position information of the sample cup and the sample to be detected and the calibration parameters of the camera;
acquiring pixel coordinates of a target observation point in the initial image and target position coordinates of the target observation point in a second coordinate system;
determining the initial position coordinate of the target observation point in a second coordinate system according to the pixel coordinate, the first coordinate correction quantity and the calibration parameter of the camera;
and controlling the sample stage to move according to the initial position coordinates and the target position coordinates so as to enable the target observation point to move to a target position.
The embodiment of the second aspect of the present application provides a positioning device for a scanning electron microscope, including:
the first acquisition module is configured to acquire an initial image of the sample to be detected at a target shooting position and the distance between the target shooting position and a camera, wherein the sample to be detected is located on a sample cup and the sample cup is located on a sample stage;
the identification module is used for identifying the initial image so as to determine the position information of the sample cup and the sample to be detected in a first coordinate system;
the first determining module is used for determining a first coordinate correction amount in the first coordinate system according to the distance between the target shooting position and the camera, the position information of the sample cup and the sample to be detected and the calibration parameters of the camera;
the second acquisition module is used for acquiring the pixel coordinates of the target observation point in the initial image and the target position coordinates of the target observation point in a second coordinate system;
the second determining module is used for determining the corresponding initial position coordinate of the target observation point in a second coordinate system according to the pixel coordinate, the first coordinate correction quantity and the calibration parameter of the camera;
and the control module is configured to control the sample stage to move according to the initial position coordinates and the target position coordinates, so that the target observation point moves to a target position.
An embodiment of a third aspect of the present application provides an electronic device, including: the device comprises a memory, a processor and computer instructions stored on the memory and executable on the processor, wherein the processor executes the computer instructions to realize the method as set forth in the embodiment of the first aspect of the application.
An embodiment of a fourth aspect of the present application provides a scanning electron microscope including an electronic device as set forth in an embodiment of the third aspect of the present application.
An embodiment of a fifth aspect of the present application provides a non-transitory computer-readable storage medium storing computer instructions, which when executed by a processor implement the method as set forth in the embodiment of the first aspect of the present application.
An embodiment of the sixth aspect of the present application proposes a computer program product which, when its instructions are executed by a processor, performs the method proposed by the embodiment of the first aspect of the present application.
The positioning method, the positioning device, the electronic equipment and the storage medium for the scanning electron microscope have the following beneficial effects:
firstly, acquiring an initial image of a sample to be detected at a target shooting position and a distance between the target shooting position and a camera, and then identifying the initial image to determine position information of a sample cup and the sample to be detected in a first coordinate system; then determining a first coordinate correction amount in a first coordinate system according to the distance between the target shooting position and the camera, the position information of the sample cup and the sample to be detected and the calibration parameters of the camera; then, acquiring a pixel coordinate of the target observation point in the initial image and a target position coordinate of the target observation point in a second coordinate system, and determining a corresponding initial position coordinate of the target observation point in the second coordinate system according to the pixel coordinate, the first coordinate correction amount and a calibration parameter of the camera; and finally, controlling the sample stage to move according to the initial position coordinates and the target position coordinates so as to enable the target observation point to move to the target position.
According to the method and device, an image of the sample to be detected is acquired by the camera, and a coordinate correction amount is determined according to the position information of the sample to be detected and the sample cup in the image, the distance between the target shooting position and the camera, and the calibration parameters of the camera. The coordinate correction amount and the calibration parameters are then used to convert the pixel coordinates of the target observation point in the image, so that the real position of the target observation point is determined. Accurate and rapid positioning of the sample observation point is thus achieved, the observation efficiency of the scanning electron microscope is effectively improved, and the user is saved considerable time and effort in searching for and positioning samples.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flowchart of a positioning method for a scanning electron microscope according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating a position shift of a sample to be measured when the sample is observed from a top view;
FIG. 3 is a schematic flowchart of a positioning method for a scanning electron microscope according to another embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a camera projected from a top view according to an embodiment of the present application;
FIG. 5 is a diagram illustrating a computed imaging offset according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a positioning device for a scanning electron microscope according to an embodiment of the present disclosure;
FIG. 7 illustrates a block diagram of an exemplary electronic device suitable for use in implementing embodiments of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
A positioning method, an apparatus, an electronic device, and a storage medium for a scanning electron microscope according to embodiments of the present application are described below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a positioning method for a scanning electron microscope according to an embodiment of the present disclosure.
In the embodiments of the present application, the positioning method for a scanning electron microscope is described as being configured in a positioning apparatus for a scanning electron microscope. The positioning apparatus may be applied to a hardware device having an operating system, a touch screen and/or a display screen, so that the device can perform the positioning function for a scanning electron microscope.
As shown in fig. 1, the positioning method for a scanning electron microscope may include the following steps:
Step 101: acquiring an initial image of the sample to be detected at a target shooting position and the distance between the target shooting position and the camera.

When a sample is observed using a scanning electron microscope, the sample to be measured is usually placed on a sample cup, and the sample cup is placed on the sample stage. The sample to be measured is then brought into the field of view of the scanning electron microscope by position adjustment.
The sample to be measured can be any type of article requiring observation of microscopic morphology. For example, the sample may be a biological sample, a nano material, etc., which is not limited in this application.
In the embodiment of the application, the camera is used for collecting the initial image of the sample to be detected. Specifically, the camera may be fixed to the sample stage, and a position on the sample stage directly below the camera may be set as the target shooting position. And when the sample to be detected appears at the target shooting position, triggering the camera to acquire an initial image.
For example, a micro-optical sensor may be used to detect the sample cup carrying the sample to be tested. When the sample is detected at the target shooting position, the camera is triggered synchronously to capture an image of the sample to be detected and the sample cup.
The distance between the target shooting position and the camera is fixed and can be set as needed. For example, the target shooting position may be 90mm, 80mm, etc. from the camera, which is not limited in this application.
Step 102: identifying the initial image to determine the position information of the sample cup and the sample to be detected in the first coordinate system.
The first coordinate system is an image coordinate system and is a two-dimensional rectangular coordinate system. The relative position of the sample cup and the sample to be measured in the image can be described by means of a first coordinate system. For example, the first coordinate system may be a two-dimensional rectangular coordinate system established with the optical center of the camera in the initial image as the origin and two adjacent sides of the initial image as the x and y axes.
It should be noted that, since the sample to be measured rests on the sample cup, the sample in the image partially overlaps the cup. The color difference between the sample and the cup is often large, so the position information of both can be determined by identifying their edges.
For example, image convolution can be performed using a perspective-transformation method from computer vision. Edge operators are extracted from the convolution results, and the positions of the sample cup and the sample to be detected are preliminarily calibrated according to the similarity principle of spatial perspective.
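The patent's recognition step uses perspective-transform convolution and edge operators; as a rough, hypothetical sketch of the underlying idea only (separating the cup and the sample by their intensity contrast), a plain NumPy thresholding version might look as follows. The thresholds and the synthetic test image are made-up values, not parameters from the patent:

```python
import numpy as np

def locate_regions(image, cup_thresh=100, sample_thresh=40):
    """Return bounding boxes (row0, row1, col0, col1) for the sample cup and
    the sample, found by naive intensity thresholding.  A real system would
    use proper edge operators (e.g. Sobel/Canny) and perspective geometry."""
    cup_mask = image > cup_thresh                        # cup brighter than background
    sample_mask = (image > 0) & (image < sample_thresh)  # sample darker than cup

    def bbox(mask):
        rows = np.where(np.any(mask, axis=1))[0]
        cols = np.where(np.any(mask, axis=0))[0]
        return (rows[0], rows[-1], cols[0], cols[-1]) if rows.size else None

    return bbox(cup_mask), bbox(sample_mask)

# Synthetic top-view image: background 0, bright cup 200, darker sample 30 on top.
img = np.zeros((100, 100), dtype=np.uint8)
img[20:80, 20:80] = 200    # sample cup
img[40:60, 40:60] = 30     # sample resting on the cup
cup_box, sample_box = locate_regions(img)
```

In practice the detected boxes would feed the reference-point and pixel-length computations of the later steps.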
Step 103: determining a first coordinate correction amount in the first coordinate system according to the distance between the target shooting position and the camera, the position information of the sample cup and the sample to be detected, and the calibration parameters of the camera.
It should be noted that the sample to be measured often has a certain thickness. When a sample to be measured is observed from a top view angle, the projection position of the sample to be measured can generate certain offset. As shown in fig. 2, when the sample to be measured is photographed from the point O, the projection of the point a of the sample to be measured on the initial image is A1, and the projection of the point B is B1.
In order to accurately obtain the real position information of each point on the initial image, the offset of the initial image after projection needs to be determined. In the embodiment of the application, the first coordinate correction amount can be determined according to the distance H between the target shooting position and the camera, the position information of the sample cup and the sample to be detected, and the calibration parameters of the camera.
Step 104: acquiring the pixel coordinates of the target observation point in the initial image and the target position coordinates of the target observation point in the second coordinate system.
The target observation point on the initial image can be any position of the sample to be detected. For example, when the sample to be detected is an insect specimen, the target observation point may be a spot on a wing of the insect specimen.
Specifically, for any point to be observed, the user can click the corresponding position on the initial image with the mouse, thereby determining the pixel coordinates of the target observation point.
It should be noted that, to make it easier for the user to select an observation point, the displayed image can be magnified about the mouse click point as the center, yielding images at different magnifications. This ensures that the acquired image has sufficient resolution and improves positioning accuracy. In addition, a measurement grid can be displayed on the image in real time to facilitate measurement by the user.
The second coordinate system is a world coordinate system (also called a measurement coordinate system), which is a three-dimensional rectangular coordinate system, and the spatial positions of the camera, the scanning electron microscope and the sample to be measured can be described by using the second coordinate system as a reference.
In the embodiment of the application, the camera and the scanning electron microscope are relatively fixed in position on the sample stage. The origin of the second coordinate system may be an optical center of the camera, the x-axis and the y-axis are respectively parallel to two adjacent edges of the sample stage, and the z-axis is perpendicular to a plane where the sample stage is located.
When a sample is observed by using a scanning electron microscope, the sample needs to be moved from below the camera to below the scanning electron microscope, and a target observation point of the sample is aligned with an optical center of the scanning electron microscope. Therefore, the target position coordinate of the target observation point in the second coordinate system can be the projection position coordinate of the optical center of the scanning electron microscope on the sample stage.
Step 105: determining the corresponding initial position coordinates of the target observation point in the second coordinate system according to the pixel coordinates, the first coordinate correction amount, and the calibration parameters of the camera.
It can be understood that, when the camera is used for shooting the image of the real scene, the correlation between the three-dimensional geometric position of a certain point on the surface of the space object and the corresponding point in the image can be determined based on the calibration parameters of the camera, so as to reconstruct the three-dimensional scene of the image.
The calibration parameters comprise internal parameters, external parameters and distortion parameters of the camera. The image can be subjected to distortion correction according to the distortion parameters of the camera, the corresponding relation among pixel coordinates, image coordinates, camera coordinates and world coordinates can be established according to internal parameters and external parameters of the camera, the real world coordinates can be determined according to the pixel coordinates, and the accurate positioning relation can be established.
Therefore, in the embodiment of the present application, determining the initial position coordinate of the target observation point in the second coordinate system according to the pixel coordinate, the first coordinate correction amount, and the calibration parameter of the camera may include the following steps:
firstly, determining the image coordinates of a target observation point in a first coordinate system according to the pixel coordinates and the pixel length;
wherein the pixel length comprises a per-pixel length in the x-direction and the y-direction. The pixel length can be obtained by the camera parameters, or can be determined according to the number of pixels in a certain length. For example, the length L/a per pixel can be calculated from the diameter L of the sample cup in the initial image and the number a of pixels included in the diameter.
In the embodiment of the present application, the first coordinate system is an image coordinate system. The first coordinate system takes the optical center position G of the camera on the initial image as an original point, and according to the pixel coordinates and the pixel length, the horizontal distance and the vertical distance between the target observation point and the original point can be calculated, so that the image coordinates of the target observation point in the first coordinate system can be determined.
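The pixel-to-image conversion just described can be sketched in a few lines. The optical-centre pixel position and the cup diameter and pixel count below are hypothetical values, not figures from the patent:

```python
def pixel_to_image(u, v, optical_center_px, mm_per_px):
    """Convert pixel coordinates (u, v) to image-plane coordinates in mm,
    relative to the camera optical centre G on the initial image."""
    cx, cy = optical_center_px
    return (u - cx) * mm_per_px, (v - cy) * mm_per_px

# Per-pixel length from the cup: assumed diameter L = 25 mm spanning a = 500 px.
mm_per_px = 25.0 / 500                      # 0.05 mm per pixel
x, y = pixel_to_image(820, 540, optical_center_px=(640, 480), mm_per_px=mm_per_px)
# The clicked point lies 9.0 mm right of and 3.0 mm below the optical centre.
```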
And secondly, correcting the image coordinates according to the first coordinate correction quantity to determine the corrected coordinates of the target observation point.
And finally, determining the initial position coordinate of the target observation point in the second coordinate system according to the corrected coordinate and the calibration parameter of the camera.
Wherein the second coordinate system is a world coordinate system. The position coordinates of the target observation point in reality can be described through the second coordinate system.
Because the target observation point only needs to be positioned in the horizontal plane, the second coordinate system can ignore the displacement in the vertical direction and only determine the two-dimensional rectangular coordinate in the horizontal plane.
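The whole chain of this step (pixel coordinates, then image coordinates, then correction, then world coordinates in the horizontal plane) can be sketched as below. The correction amount `corr` and the camera-axis position on the stage `cam_offset` are hypothetical stand-ins for the full set of calibration parameters:

```python
def pixel_to_stage(u, v, cx, cy, mm_per_px, corr, cam_offset):
    """Pixel coords -> image coords -> apply first coordinate correction ->
    initial position coords in the horizontal plane of the stage (world) frame.
    `corr` is the parallax correction (dx, dy); `cam_offset` is the position of
    the camera optical axis on the stage.  Both are illustrative placeholders."""
    x_img = (u - cx) * mm_per_px            # image coords relative to optical centre G
    y_img = (v - cy) * mm_per_px
    x_corr = x_img - corr[0]                # undo the projection offset
    y_corr = y_img - corr[1]
    return cam_offset[0] + x_corr, cam_offset[1] + y_corr

# Hypothetical values, in pixels and millimetres.
sx, sy = pixel_to_stage(700, 500, cx=640, cy=480, mm_per_px=0.05,
                        corr=(0.5, 0.2), cam_offset=(120.0, 80.0))
```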
Step 106: controlling the sample stage to move according to the initial position coordinates and the target position coordinates, so that the target observation point moves to the target position.
After the initial position coordinates of the target observation point in reality are determined, the initial position coordinates can be input into a control system of the sample stage, so that the sample stage moves in the x direction and the y direction, and the target observation point moves to the target position, namely, just below the optical center of the scanning electron microscope. Furthermore, the user can directly observe the target observation point through the scanning electron microscope.
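The command sent to the stage reduces to a displacement between the two coordinates; a trivial sketch with made-up coordinates:

```python
def stage_move(initial, target):
    """Displacement (dx, dy) to send to the stage controller so the target
    observation point moves from its initial position to the target position
    (the point on the stage directly below the SEM optical centre)."""
    return target[0] - initial[0], target[1] - initial[1]

# Hypothetical coordinates in mm in the stage (world) frame.
dx, dy = stage_move(initial=(12.0, 8.5), target=(50.0, 8.5))
```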
In the embodiment of the application, an image of the sample to be detected is collected by the camera, and a coordinate correction amount is determined according to the position information of the sample to be detected and the sample cup in the image, the distance between the target shooting position and the camera, and the calibration parameters of the camera. The pixel coordinates of the target observation point in the image are then converted using the coordinate correction amount and the calibration parameters, so that the real position of the target observation point is determined. Accurate and rapid positioning of the sample observation point is thereby achieved, the observation efficiency of the scanning electron microscope is effectively improved, and the user is saved considerable time and effort in searching for and positioning samples.
Fig. 3 is a schematic flowchart of a positioning method for a scanning electron microscope according to another embodiment of the present disclosure. As shown in fig. 3, the positioning method for a scanning electron microscope may include the following steps:
Step 201: acquiring an initial image of the sample to be detected at the target shooting position and the distance between the target shooting position and the camera. For the specific implementation of step 201, reference may be made to the detailed description of step 101, which is not repeated here.
Step 202: identifying the initial image to determine the position information of the sample cup and the sample to be detected in the first coordinate system. For the specific implementation of step 202, reference may be made to the detailed description of step 102, which is not repeated here.
Step 203: determining the position coordinates of the first reference point on the sample cup in the first coordinate system according to the position information of the sample cup.
Step 204: determining the position coordinates of the second reference point on the sample to be detected in the first coordinate system, and the length of the sample to be detected, according to the position information of the sample to be detected.
Wherein the first reference point may be any point on the sample cup. The second reference point may be any point on the sample to be measured. Because the lower surface of the sample to be detected and the upper surface of the sample cup are positioned on the same horizontal plane, the first reference point can represent one point on the plane where the lower surface of the sample to be detected is positioned, and the second reference point can represent one point on the plane where the upper surface of the sample to be detected is positioned.
Furthermore, based on the position information of the specimen cup, the position coordinates of the selected first reference point in the first coordinate system can be determined. And determining the position coordinates of the selected second reference point in the first coordinate system according to the position information of the sample to be detected in the first coordinate system.
It should be noted that, after the position information of the sample to be measured is determined, the length of the sample to be measured can be determined according to the position coordinates of the two ends of the sample to be measured.
Step 205: determining the thickness of the sample to be measured according to the position coordinates of the first reference point, the position coordinates of the second reference point, and the calibration parameters of the camera.
It can be understood that, according to the calibration parameters of the camera, the corresponding relationship between the first coordinate system and the second coordinate system can be established, and finally, the real world coordinates can be determined according to the image coordinates, and the accurate positioning relationship can be established.
In the embodiment of the present application, the second coordinate system is a world coordinate system, which is a three-dimensional rectangular coordinate system, and the spatial positions of the camera, the scanning electron microscope, and the sample to be measured can be described based on the second coordinate system.
For example, the origin of the second coordinate system may be the optical center of the camera, the x-axis and the y-axis are respectively parallel to two adjacent edges of the sample stage, and the z-axis is perpendicular to the plane of the sample stage.
Therefore, based on the image coordinates of the first reference point and the calibration parameters of the camera, the three-dimensional coordinates of the first reference point in the second coordinate system can be calculated; based on the image coordinates of the second reference point and the calibration parameters of the camera, the three-dimensional coordinates of the second reference point in the second coordinate system may be calculated.
Further, the coordinate difference between the first reference point and the second reference point in the z-axis (height) direction is calculated; this difference is the thickness of the sample to be measured.
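Once the calibration has yielded three-dimensional coordinates for both reference points, the thickness computation itself is just a height difference. A sketch with hypothetical coordinates (mm, stage frame):

```python
def sample_thickness(p_cup, p_sample):
    """Thickness of the sample as the z difference between a reference point
    on the cup's upper surface (the plane of the sample's lower surface) and
    a reference point on the sample's upper surface."""
    return abs(p_sample[2] - p_cup[2])

h = sample_thickness(p_cup=(10.0, 5.0, 0.0), p_sample=(12.0, 6.0, 2.5))
```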
And step 206, determining the imaging offset according to the thickness of the sample to be detected, the length of the sample to be detected and the distance between the target shooting position and the camera.
As shown in fig. 4, since the sample to be measured has a certain thickness, when the camera images it in a top-down view from above, a certain deviation occurs between the actual position of the sample to be measured and its projected position. In the figure, the solid line indicates the actual position of the sample to be measured and the dotted line indicates its projected position; the projection of point A of the sample to be measured onto the initial image along the z-axis direction is A2. When the sample to be measured is photographed from point O, the projection of point A onto the initial image is A1. In the initial image, point A1, point A2 and point G lie on the same straight line, and the angle between this line and the y-direction is θ.
It should be noted that, in practical applications, when the sample cup is transferred to the sample stage, the actual position of the sample cup may deviate from the target shooting position, so that the optical center of the camera in the initial image does not coincide with the center of the sample cup. As shown in fig. 5, G is the optical center of the camera, O1 is the center of the sample cup, and the projection of the point a of the sample to be measured on the initial image along the z-axis direction is A2.
Thus, the camera optical center G can be determined from the center point of the initial image. When the sample to be measured is photographed from point O, the projection of point A of the sample onto the initial image is A1, and the projection of point B is B1. The height of the camera above the target shooting position is H, and the thickness of the sample to be measured is h; point O, point A and point A1 lie on the same straight line, and the angle between this line and the z-axis direction is α.
As shown in fig. 5, from the principle of similar triangles, it can be known that:

A1A2 = h·tan α = GA1·h/H

wherein A1A2 is the imaging offset.
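Under the geometry just described, this relation can be sketched as follows (a hypothetical Python sketch; the function name and values are illustrative, not from the patent):

```python
def imaging_offset(ga1, h, H):
    """A1A2 = GA1 * h / H, from the similar triangles formed by the camera O
    at height H above the stage, a sample point A at height h, its
    perspective projection A1, and its vertical projection A2."""
    return ga1 * h / H

# GA1 = 20 mm, sample thickness 3 mm, camera height 300 mm:
print(imaging_offset(20.0, 3.0, 300.0))  # 0.2
```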
And step 207, determining a first coordinate correction amount according to the imaging offset and the position information of the sample to be detected.
As shown in fig. 4, from the principle of similar triangles, it can be known that:

x1 = Δx·h/H, y1 = Δy·h/H

wherein Δx is the distance in the x-direction between the corner point A1 of the sample to be measured and the camera optical center G, and Δy is the distance in the y-direction between A1 and G; x1 is the first coordinate correction in the x-direction, and y1 is the first coordinate correction in the y-direction.
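The decomposition along the two axes can be sketched as follows (a hypothetical Python sketch; names and values are illustrative, not from the patent). Since A1, A2 and G are collinear, each component of the offset scales by the same ratio h/H:

```python
def first_correction(dx, dy, h, H):
    """Decompose the imaging offset into x- and y-components:
    x1 = dx * h / H, y1 = dy * h / H."""
    return dx * h / H, dy * h / H

# Δx = 12 mm, Δy = 9 mm, thickness 3 mm, camera height 300 mm:
print(first_correction(12.0, 9.0, 3.0, 300.0))  # (0.12, 0.09)
```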
And step 208, acquiring the pixel coordinates of the target observation point in the initial image and the target position coordinates of the target observation point in the second coordinate system.
The specific implementation manner of step 208 may refer to the detailed description of the embodiment in step 104 in this application, and is not described herein again.
In step 209, the location information of the optical center of the camera and the center of the sample cup in the initial image in the first coordinate system is determined.
The optical center of the camera in the initial image is the central point of the initial image, and the center of the sample cup can be obtained by calculation from the detected edge of the sample cup, which is not described herein again.
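One simple way to obtain the cup center from its edge can be sketched as follows (a hypothetical Python sketch, not the patent's method; a least-squares circle fit would be more robust to partial edges):

```python
def cup_center(edge_points):
    """For a (near-)complete circular edge, the centroid of the edge pixels
    coincides with the circle centre."""
    n = len(edge_points)
    return (sum(p[0] for p in edge_points) / n,
            sum(p[1] for p in edge_points) / n)

# Four symmetric points on a circle centred at (100, 80):
print(cup_center([(130, 80), (70, 80), (100, 110), (100, 50)]))  # (100.0, 80.0)
```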
When the camera is calibrated, the positions of the camera, the sample cup and the scanning electron microscope are relatively fixed, and the optical center of the camera coincides with the center of the sample cup. In actual observation, however, the sample cup is placed on the sample stage and the sample stage is conveyed to a position under the camera by a conveyor belt. At this moment the center of the sample cup is not necessarily directly under the camera, i.e., the center of the sample cup deviates from the target shooting position; this error is corrected by the second coordinate correction amount.
And step 210, determining a second coordinate correction amount in the first coordinate system according to the position information of the camera optical center and the center position information of the sample cup. As shown in fig. 5, after the position information of the camera optical center G and the sample cup center O1 is determined, the second coordinate correction amount can be determined as the coordinate difference between G and O1 in the x- and y-directions:

x2 = xG − xO1, y2 = yG − yO1
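This difference can be sketched as follows (a hypothetical Python sketch; the sign convention, i.e. whether G is subtracted from O1 or vice versa, is an assumption and depends on how the correction is later applied):

```python
def second_correction(g, o1):
    """Componentwise offset between the camera optical centre G and the
    sample cup centre O1, both in the image (first) coordinate system."""
    return g[0] - o1[0], g[1] - o1[1]

# G at pixel (320, 240), cup centre detected at (310, 248):
print(second_correction((320.0, 240.0), (310.0, 248.0)))  # (10.0, -8.0)
```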
and step 211, determining the corresponding initial position coordinate of the target observation point in the second coordinate system according to the pixel coordinate, the first coordinate correction amount, the second coordinate correction amount and the calibration parameter of the camera.
Firstly, determining the image coordinates of a target observation point in a first coordinate system according to the pixel coordinates and the pixel length;
wherein the pixel length includes pixel lengths in the x-direction and the y-direction. The pixel length can be obtained by the parameters of the camera, and can also be determined according to the number of pixels in a certain length. For example, the pixel length may be L/A based on the diameter L of the sample cup in the initial image and the number of pixels A contained in the diameter.
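The L/A example above can be sketched directly (a hypothetical Python sketch; the values are illustrative, not from the patent):

```python
def pixel_length(cup_diameter, pixels_across):
    """Pixel length L/A: the real diameter of the sample cup divided by the
    number of pixels that diameter spans in the initial image."""
    return cup_diameter / pixels_across

# A 40 mm cup spanning 800 pixels:
print(pixel_length(40.0, 800))  # 0.05 (mm per pixel)
```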
In the embodiment of the present application, the first coordinate system is an image coordinate system. The first coordinate system takes the optical center position G of the camera on the initial image as an original point, and according to the pixel coordinates and the pixel length, the distances between the target observation point and the original point in the x direction and the y direction can be calculated, so that the image coordinates of the target observation point in the first coordinate system can be determined.
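The conversion from pixel coordinates to image coordinates can be sketched as follows (a hypothetical Python sketch; names and values are illustrative, not from the patent):

```python
def to_image_coords(u, v, u0, v0, px_x, px_y):
    """Convert pixel coordinates (u, v) into image coordinates relative to
    the optical-centre pixel (u0, v0), scaling by the pixel lengths in the
    x- and y-directions."""
    return (u - u0) * px_x, (v - v0) * px_y

# Optical centre at pixel (320, 240), pixel length 0.25 mm in both directions:
print(to_image_coords(420, 300, 320, 240, 0.25, 0.25))  # (25.0, 15.0)
```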
And secondly, correcting the image coordinates according to the first coordinate correction amount and the second coordinate correction amount to determine the corrected coordinates of the target observation point.
For example, if the image coordinates of the target observation point m are (xm, ym), the corrected coordinates of the target observation point m are:

(xm ± (x1 + x2), ym ± (y1 + y2))

wherein the signs are determined according to the quadrant of the target observation point in the image coordinate system. For example, when the target observation point is located in the first quadrant, the signs before the correction amounts are positive.
And finally, determining the initial position coordinate of the target observation point in the second coordinate system according to the corrected coordinate and the calibration parameter of the camera.
In the embodiment of the present application, the second coordinate system is a world coordinate system. Since the target observation point only needs to be located in the horizontal plane, the second coordinate system may be a two-dimensional rectangular coordinate system. The position coordinates of the target observation point in reality can be described through the second coordinate system.
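For the planar case described here, the final mapping can be sketched as a scale-plus-translation transform (a hypothetical Python sketch; the parameters `scale`, `tx`, `ty` stand in for quantities derived from camera calibration and are assumptions, since the patent does not give the explicit form — a full homography would be needed if the camera axis were not perpendicular to the stage):

```python
def image_to_world(pt, scale, tx, ty):
    """Map corrected image coordinates to the two-dimensional world
    (second) coordinate system with a scale factor and a translation."""
    return pt[0] * scale + tx, pt[1] * scale + ty

# Corrected image point mapped with illustrative calibration values:
print(image_to_world((5.75, 2.75), 1.0, 100.0, 200.0))  # (105.75, 202.75)
```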
And step 212, controlling the sample stage to move according to the initial position coordinate and the target position coordinate, so that the target observation point moves to the target position. The specific implementation manner of step 212 may refer to the detailed description of step 106 in this application, and is not described herein again.
In the embodiment of the application, the position deviations caused by the sample thickness and by the shooting position offset are considered together: the first coordinate correction amount and the second coordinate correction amount are determined from the sample thickness and the shooting position respectively, and the pixel coordinates of the target observation point in the image are then converted by combining these correction amounts with the calibration parameters of the camera, so as to determine the real position information of the target observation point. This further improves the positioning accuracy of the sample observation point and meets the use requirements of users.
In order to implement the above embodiments, the present application further provides a positioning device for a scanning electron microscope.
Fig. 6 is a schematic structural diagram of a positioning device for a scanning electron microscope according to an embodiment of the present disclosure.
As shown in fig. 6, the positioning apparatus 100 for a scanning electron microscope may include: the system comprises a first obtaining module 110, a recognition module 120, a first determining module 130, a second obtaining module 140, a second determining module 150 and a control module 160.
The first obtaining module 110 is configured to obtain an initial image of a sample to be detected located at a target shooting position and a distance between the target shooting position and a camera, where the sample to be detected is located on a sample cup, and the sample cup is located on a sample stage;
the identification module 120 is configured to identify the initial image to determine position information of the sample cup and the sample to be detected in the first coordinate system;
a first determining module 130, configured to determine a first coordinate correction amount in a first coordinate system according to a distance between the target shooting position and the camera, position information of the sample cup and the sample to be measured, and calibration parameters of the camera;
a second obtaining module 140, configured to obtain a pixel coordinate of the target observation point in the initial image and a target position coordinate of the target observation point in a second coordinate system;
the second determining module 150 is configured to determine an initial position coordinate of the target observation point in the second coordinate system according to the pixel coordinate, the first coordinate correction amount, and the calibration parameter of the camera;
the control module 160 controls the sample stage to move according to the initial position coordinates and the target position coordinates, so that the target observation point moves to the target position.
In one possible implementation manner, the second determining module is configured to:
determining the image coordinates of the target observation point in a first coordinate system according to the pixel coordinates and the pixel length;
correcting the image coordinates according to the first coordinate correction quantity to determine corrected coordinates of the target observation point;
and determining the initial position coordinate of the target observation point in the second coordinate system according to the corrected coordinate and the calibration parameter of the camera.
In one possible implementation, the identification module is configured to:
identifying the initial image to determine the edges of the sample cup and the sample to be detected;
and determining the position information of the sample cup and the sample to be detected in the first coordinate system based on the edges of the sample cup and the sample to be detected.
In one possible implementation manner, the first determining module is configured to:
determining the position coordinates of a first reference point on the sample cup in a first coordinate system according to the position information of the sample cup;
determining the position coordinate of a second reference point on the sample to be detected in the first coordinate system and the length of the sample to be detected according to the position information of the sample to be detected;
determining the thickness of the sample to be measured according to the position coordinates of the first reference point, the position coordinates of the second reference point and the calibration parameters of the camera;
determining imaging offset according to the thickness of the sample to be detected, the length of the sample to be detected and the distance between the target shooting position and the camera;
and determining a first coordinate correction amount according to the imaging offset and the position information of the sample to be detected.
In one possible implementation, the apparatus further includes:
the third determining module is used for determining the position information of the optical center of the camera and the center of the sample cup in the initial image in the first coordinate system;
the fourth determining module is used for determining a second coordinate correction quantity in the first coordinate system according to the position information of the optical center of the camera and the center position information of the sample cup;
a second determination module to:
and determining the corresponding initial position coordinate of the target observation point in a second coordinate system according to the pixel coordinate, the first coordinate correction quantity, the second coordinate correction quantity and the calibration parameter of the camera.
The functions and specific implementation principles of the above modules in the embodiments of the present application may refer to the above method embodiments, which are not described herein again.
The positioning device for the scanning electron microscope acquires an image of the sample to be measured through the camera and determines the coordinate correction amounts according to the position information of the sample to be measured and the sample cup in the image, the distance between the target shooting position and the camera, and the calibration parameters of the camera. The coordinate correction amounts are then combined with the calibration parameters of the camera to convert the pixel coordinates of the target observation point in the image, so as to determine the real position information of the target observation point. This realizes accurate and rapid positioning of the sample observation point, effectively improves the observation efficiency of the scanning electron microscope, and saves users a large amount of time and effort in finding and positioning samples.
In order to implement the above embodiments, the present application further provides an electronic device, including a memory, a processor, and computer instructions stored in the memory and executable on the processor, wherein the processor, when executing the computer instructions, implements the method according to the foregoing embodiments of the present application.
In order to implement the foregoing embodiments, the present application further provides a scanning electron microscope including the electronic device as set forth in the foregoing embodiments of the present application.
To achieve the above embodiments, the present application further proposes a non-transitory computer readable storage medium storing computer instructions, which when executed by a processor, implement the method as proposed by the foregoing embodiments of the present application.
In order to implement the foregoing embodiments, the present application also proposes a computer program product, wherein when the instructions in the computer program product are executed by a processor, the method as proposed by the foregoing embodiments of the present application is executed.
FIG. 7 illustrates a block diagram of an exemplary electronic device suitable for use in implementing embodiments of the present application. The electronic device 12 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the electronic device 12 is represented in the form of a general electronic device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including but not limited to an operating system, one or more application programs, other program modules, and program data, each of which or some combination of which may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the embodiments described herein.
The processing unit 16 executes various functional applications and data processing, for example, implementing the methods mentioned in the foregoing embodiments, by executing programs stored in the system memory 28.
According to the technical scheme, an initial image of a sample to be detected at a target shooting position and a distance between the target shooting position and a camera are obtained, and then the initial image is identified to determine position information of a sample cup and the sample to be detected in a first coordinate system; then determining a first coordinate correction amount in a first coordinate system according to the distance between the target shooting position and the camera, the position information of the sample cup and the sample to be detected and the calibration parameters of the camera; then, acquiring a pixel coordinate of the target observation point in the initial image and a target position coordinate of the target observation point in a second coordinate system, and determining a corresponding initial position coordinate of the target observation point in the second coordinate system according to the pixel coordinate, the first coordinate correction amount and a calibration parameter of the camera; and finally, controlling the sample stage to move according to the initial position coordinates and the target position coordinates so as to enable the target observation point to move to the target position.
An image of the sample to be measured is acquired by the camera, and the coordinate correction amounts are determined according to the position information of the sample to be measured and the sample cup in the image, the distance between the target shooting position and the camera, and the calibration parameters of the camera. The coordinate correction amounts are then combined with the calibration parameters of the camera to convert the pixel coordinates of the target observation point in the image, so as to determine the real position information of the target observation point. This realizes accurate and rapid positioning of the sample observation point, effectively improves the observation efficiency of the scanning electron microscope, and saves users a large amount of time and effort in finding and positioning samples.
In the description of the present specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specified otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware that is related to instructions of a program, and the program may be stored in a computer-readable storage medium, and when executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are exemplary and should not be construed as limiting the present application and that changes, modifications, substitutions and alterations in the above embodiments may be made by those of ordinary skill in the art within the scope of the present application.
Claims (10)
1. A positioning method for a scanning electron microscope is characterized by comprising the following steps:
acquiring an initial image of a sample to be detected positioned at a target shooting position and a distance between the target shooting position and a camera, wherein the sample to be detected is positioned on a sample cup, and the sample cup is positioned on a sample platform;
identifying the initial image to determine the position information of the sample cup and the sample to be detected in a first coordinate system;
determining a first coordinate correction quantity in the first coordinate system according to the distance between the target shooting position and the camera, the position information of the sample cup and the sample to be detected and the calibration parameters of the camera;
acquiring pixel coordinates of a target observation point in the initial image and target position coordinates of the target observation point in a second coordinate system;
determining the corresponding initial position coordinate of the target observation point in the second coordinate system according to the pixel coordinate, the first coordinate correction quantity and the calibration parameter of the camera;
and controlling the sample stage to move according to the initial position coordinates and the target position coordinates so as to enable the target observation point to move to a target position.
2. The method of claim 1, wherein determining the corresponding initial position coordinates of the target observation point in the second coordinate system according to the pixel coordinates, the first coordinate correction amount, and the calibration parameters of the camera comprises:
determining the image coordinates of the target observation point in the first coordinate system according to the pixel coordinates and the pixel length;
correcting the image coordinates according to the first coordinate correction amount to determine corrected coordinates of the target observation point;
and determining the initial position coordinate of the target observation point in the second coordinate system according to the corrected coordinate and the calibration parameter of the camera.
3. The method of claim 1, wherein the identifying the initial image to determine the position information of the sample cup and the sample to be measured in a first coordinate system comprises:
identifying the initial image to determine the edges of the sample cup and the sample to be detected;
and determining the position information of the sample cup and the sample to be detected in the first coordinate system based on the edges of the sample cup and the sample to be detected.
4. The method of claim 1, wherein the determining a first coordinate correction amount in the first coordinate system according to the distance between the target shooting position and the camera, the position information of the sample cup and the sample to be measured, and the calibration parameters of the camera comprises:
determining the position coordinates of a first reference point on the sample cup in the first coordinate system according to the position information of the sample cup;
determining the position coordinate of a second reference point on the sample to be detected in the first coordinate system and the length of the sample to be detected according to the position information of the sample to be detected;
determining the thickness of the sample to be measured according to the position coordinates of the first reference point, the position coordinates of the second reference point and the calibration parameters of the camera;
determining imaging offset according to the thickness of the sample to be detected, the length of the sample to be detected and the distance between the target shooting position and the camera;
and determining the first coordinate correction amount according to the imaging offset and the position information of the sample to be detected.
5. The method of any one of claims 1-4, wherein determining the corresponding initial position coordinates of the target observation point in the second coordinate system based on the pixel coordinates, the first coordinate correction amount, and calibration parameters of the camera, further comprises:
determining positional information of a camera optical center in the initial image and a center of the sample cup in the first coordinate system;
determining a second coordinate correction quantity in the first coordinate system according to the position information of the optical center of the camera and the center position information of the sample cup;
and determining the initial position coordinate of the target observation point in the second coordinate system according to the pixel coordinate, the first coordinate correction quantity, the second coordinate correction quantity and the calibration parameter of the camera.
6. A positioning device for a scanning electron microscope, comprising:
the device comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring an initial image of a sample to be detected at a target shooting position and a distance between the target shooting position and a camera, the sample to be detected is positioned on a sample cup, and the sample cup is positioned on a sample platform;
the identification module is used for identifying the initial image so as to determine the position information of the sample cup and the sample to be detected in a first coordinate system;
the first determining module is used for determining a first coordinate correction quantity in the first coordinate system according to the distance between the target shooting position and the camera, the position information of the sample cup and the sample to be detected and the calibration parameters of the camera;
the second acquisition module is used for acquiring the pixel coordinates of the target observation point in the initial image and the target position coordinates of the target observation point in a second coordinate system;
the second determining module is used for determining the corresponding initial position coordinate of the target observation point in a second coordinate system according to the pixel coordinate, the first coordinate correction quantity and the calibration parameter of the camera;
and the control module controls the sample stage to move according to the initial position coordinates and the target position coordinates so as to enable the target observation point to move to a target position.
7. An electronic device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor, when executing the computer instructions, implements the method of any one of claims 1-5.
8. A scanning electron microscope comprising the electronic device of claim 7.
9. A computer-readable storage medium storing computer instructions, wherein the computer instructions, when executed by a processor, implement the method of any one of claims 1-5.
10. A computer program product comprising computer instructions which, when executed by a processor, implement the method of any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211388684.8A CN115616018B (en) | 2022-11-08 | 2022-11-08 | Positioning method and device for scanning electron microscope, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115616018A true CN115616018A (en) | 2023-01-17 |
CN115616018B CN115616018B (en) | 2023-03-21 |
Family
ID=84878364
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211388684.8A Active CN115616018B (en) | 2022-11-08 | 2022-11-08 | Positioning method and device for scanning electron microscope, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115616018B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118073158A (en) * | 2024-01-10 | 2024-05-24 | 北京中科科仪股份有限公司 | Vacuum inner detection device and scanning electron microscope |
CN118623760A (en) * | 2024-05-22 | 2024-09-10 | 深圳精智达技术股份有限公司 | A method, system and device for detecting defects of a microdisplay |
CN119444627A (en) * | 2025-01-10 | 2025-02-14 | 匠岭科技(上海)有限公司 | Image correction method, device, semiconductor quantity detection method, apparatus, medium, and product |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105225909A (en) * | 2015-09-17 | 2016-01-06 | 北京大学 | A kind of sample platform of scanning electronic microscope positioner and localization method thereof |
US20190122026A1 (en) * | 2016-05-17 | 2019-04-25 | Horiba France Sas | Micro-localisation method and device for an imaging instrument and a measuring apparatus |
CN112630242A (en) * | 2020-12-03 | 2021-04-09 | 成都先进金属材料产业技术研究院有限公司 | Navigation method for scanning electron microscope sample |
CN112945996A (en) * | 2021-01-26 | 2021-06-11 | 西安科技大学 | Rapid in-situ comparison method based on scanning electron microscope |
CN113594076A (en) * | 2021-07-22 | 2021-11-02 | 上海精测半导体技术有限公司 | Method for aligning patterned wafer and semiconductor device |
CN114778583A (en) * | 2022-03-24 | 2022-07-22 | 重庆大学 | Scanning electron microscope target point in-situ observation rapid positioning method and device and storage medium |
Worldwide Applications (1)
- 2022-11-08: CN application CN202211388684.8A, granted as patent CN115616018B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN115616018B (en) | 2023-03-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115616018B (en) | Positioning method and device for scanning electron microscope, electronic equipment and storage medium | |
CN111179358B (en) | Calibration method, device, equipment and storage medium | |
EP3848901B1 (en) | Method and apparatus for calibrating external parameters of image acquisition device, device and storage medium | |
KR101781670B1 (en) | Microscope slide coordinate system registration | |
CN101839692B (en) | Method for measuring three-dimensional position and stance of object with single camera | |
CN106408609B (en) | A kind of parallel institution end movement position and posture detection method based on binocular vision | |
US7305109B1 (en) | Automated microscopic image acquisition compositing, and display | |
Herráez et al. | 3D modeling by means of videogrammetry and laser scanners for reverse engineering | |
CN109801333B (en) | Volume measurement method, device and system and computing equipment | |
US20140132729A1 (en) | Method and apparatus for camera-based 3d flaw tracking system | |
CN111263142B (en) | Method, device, equipment and medium for testing optical anti-shake of camera module | |
CN114952856B (en) | Method, system, computer and readable storage medium for calibrating hand and eye of mechanical arm | |
CN107274450B (en) | Information processing apparatus and control method thereof | |
CN112132908B (en) | Camera external parameter calibration method and device based on intelligent detection technology | |
Wohlfeil et al. | Automatic camera system calibration with a chessboard enabling full image coverage | |
CN113689397A (en) | Workpiece circular hole feature detection method and workpiece circular hole feature detection device | |
CN112504156A (en) | Structural surface strain measurement system and measurement method based on foreground grid | |
US12190543B2 (en) | Lens calibration method for digital imaging apparatus | |
CN106556350B (en) | The measuring method and a kind of microscope of microscopic slide curved surface height value | |
CN115631245A (en) | Correction method, terminal device and storage medium | |
CN115409693A (en) | Two-dimensional positioning method based on pipeline foreign matters in three-dimensional image | |
CN115307865A (en) | Model deformation measurement method for high-temperature hypersonic flow field | |
CN113358038A (en) | Thickness measuring device and method and electronic equipment | |
CN107945236B (en) | Sub-pixel level nonlinear calibration method suitable for general optical system | |
CN118759564B (en) | High-precision transmission imaging method of cosmic ray muons based on virtual space point constraints |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||