
CN112634377B - Camera calibration method, terminal and computer readable storage medium of sweeping robot - Google Patents


Info

Publication number
CN112634377B
CN112634377B
Authority
CN
China
Prior art keywords
calibration
image
camera
sweeping robot
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011595775.XA
Other languages
Chinese (zh)
Other versions
CN112634377A (en)
Inventor
杨勇
宫海涛
罗志康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen 3irobotix Co Ltd
Original Assignee
Shenzhen 3irobotix Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen 3irobotix Co Ltd filed Critical Shenzhen 3irobotix Co Ltd
Priority to CN202011595775.XA
Publication of CN112634377A
Application granted
Publication of CN112634377B
Legal status: Active
Anticipated expiration


Classifications

    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/4005: Arrangements of batteries or cells; electric power supply arrangements
    • G06T7/11: Region-based segmentation
    • G06T7/13: Edge detection
    • G06T7/136: Segmentation or edge detection involving thresholding
    • G06V10/28: Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V20/10: Terrestrial scenes
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • G06T2207/10024: Color image
    • G06T2207/20024: Filtering details

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a camera calibration method, a terminal, and a computer-readable storage medium for a sweeping robot. The camera calibration method comprises the following steps: obtaining calibration images of a calibration object shot by the cameras of the sweeping robot from different angles, wherein the calibration object is affixed to the charging pile that charges the sweeping robot; performing an image segmentation operation on the calibration image to obtain a segmented image, and performing a filtering operation on the segmented image to obtain a binary image; performing an edge extraction operation on the binary image to obtain an edge image, and calculating the center coordinates of the edge image; and calculating the camera parameters of the sweeping robot according to the center coordinates to obtain the internal parameters and external parameters of the cameras. By combining the charging pile and the calibration object with a self-calibration algorithm, the binocular sweeping robot is calibrated automatically, which solves the problem that existing camera calibration approaches are unsuitable for binocular sweeping robots and improves both the accuracy of the calibration and the robustness of the calibration algorithm.

Description

Camera calibration method, terminal and computer readable storage medium of sweeping robot
Technical Field
The present application relates to the field of camera calibration technologies, and in particular, to a camera calibration method, a terminal, and a computer readable storage medium for a sweeping robot.
Background
At present, camera calibration techniques fall into three main categories: the traditional calibration method, the active-vision calibration method, and the camera self-calibration method. The traditional method requires the calibration object to be repositioned during calibration, its result is affected by the manufacturing precision of the calibration object, and it cannot be used in settings where a calibration object cannot be placed, which limits its application. The self-calibration method is highly flexible and can calibrate the camera online, but because it relies on properties of the absolute conic or quadric, its robustness is poor. The active-vision method uses a simple algorithm and yields linear solutions, so its robustness is high, but the system is expensive and demands costly experimental equipment and strict experimental conditions, and it is unsuitable when the motion parameters are unknown or cannot be controlled. A binocular sweeping robot collides frequently and therefore needs its binocular cameras recalibrated often. Manual calibration by workers using the traditional method is impractical, the requirements of active-vision calibration are too high, and the self-calibration algorithm is not robust and produces large calibration errors, mainly because the center coordinates of the calibration object are poorly computed. The existing camera calibration approaches are therefore unsuitable for the binocular sweeping robot.
Disclosure of Invention
The embodiments of the application aim to solve the problem that existing camera calibration approaches are unsuitable for a binocular sweeping robot by providing a camera calibration method, a terminal, and a computer-readable storage medium for a sweeping robot.
In order to achieve the above object, according to one aspect of the present application, there is provided a camera calibration method of a sweeping robot, the camera calibration method of the sweeping robot comprising the steps of:
Obtaining calibration images of a calibration object shot by the cameras of the sweeping robot from different angles, wherein the calibration object is affixed to the charging pile that charges the sweeping robot;
performing image segmentation operation on the calibration image to obtain a segmented image, and performing filtering operation on the segmented image to obtain a binary image;
Performing edge extraction operation on the binary image to obtain an edge image, and calculating the center coordinates of the edge image;
and calculating camera parameters of the sweeping robot according to the center coordinates to obtain internal parameters and external parameters of the camera.
Optionally, the step of performing an image segmentation operation on the calibration image to obtain a segmented image, and performing a filtering operation on the segmented image to obtain a binary image includes:
Performing dynamic threshold segmentation operation on the calibration image to obtain the segmented image;
Filtering the segmented image to remove isolated points in the segmented image, so as to obtain the binary image.
Optionally, the step of performing an edge extraction operation on the binary image to obtain an edge image, and calculating a center coordinate of the edge image includes:
performing edge extraction on different black lattice blocks in the binary image to obtain an edge image of the black lattice blocks;
acquiring edge point data in the edge image, and calculating fitting parameters according to the edge point data and a set equation;
and calculating the center coordinates of the edge image according to the fitting parameters.
Optionally, the step of calculating the camera parameters of the sweeping robot according to the center coordinates to obtain the internal parameters and the external parameters of the camera includes:
acquiring a two-dimensional computer frame memory coordinate and a three-dimensional space coordinate corresponding to the center coordinate;
substituting the two-dimensional computer frame memory coordinates and the three-dimensional space coordinates into a camera model, and performing linear solution based on a set calibration mode to obtain internal parameters and external parameters of the camera.
Optionally, the step of obtaining the internal parameter and the external parameter of the camera by performing linear solution based on a set calibration mode includes:
Acquiring the external parameters based on the relation between the world coordinate system and the camera coordinate system in the set calibration mode, the radial alignment constraint, and the orthogonality of the external parameter matrix;
And acquiring the internal parameters according to the external parameters and the relation between the pixel coordinates and the actual physical coordinates.
Optionally, after the step of calculating the camera parameters of the sweeping robot according to the center coordinates to obtain the internal parameters and the external parameters of the camera, the method further includes:
obtaining calibration parameters and calibration images of the calibration objects shot at different heights;
Matching the calibration parameters with the center coordinates of the calibration images of the calibration objects shot at different heights, and calculating three-dimensional space coordinates of the center coordinates;
and verifying the calibration parameters based on the three-dimensional space coordinates of the center coordinates.
Optionally, before the step of calculating the center coordinates of the edge image, the method further includes:
Tracking the edge information of the edge image and acquiring the characteristic information of the edge image;
And screening the edge image based on the characteristic information so as to enable the edge image and the characteristic graph in the calibration image to form a corresponding relation.
Optionally, the step of acquiring calibration images of the calibration object photographed by the camera of the sweeping robot at different angles includes:
when the sweeping robot returns to recharge, controlling the camera to shoot the calibration object from different angles, and obtaining the calibration images shot from the different angles.
In addition, in order to achieve the above object, another aspect of the present application provides a terminal, which includes a memory, a processor, and a camera calibration program of a sweeping robot stored on the memory and running on the processor, wherein the processor implements the steps of the camera calibration method of the sweeping robot as described above when executing the camera calibration program of the sweeping robot.
In addition, in order to achieve the above object, another aspect of the present application provides a computer-readable storage medium having stored thereon a camera calibration program of a sweeping robot, which when executed by a processor, implements the steps of the camera calibration method of the sweeping robot as described above.
According to the above embodiment, calibration images of the calibration object shot by the cameras of the sweeping robot from different angles are obtained, the calibration object being affixed to the charging pile that charges the sweeping robot; an image segmentation operation is performed on the calibration image to obtain a segmented image, and a filtering operation is performed on the segmented image to obtain a binary image; an edge extraction operation is performed on the binary image to obtain an edge image, and the center coordinates of the edge image are calculated; and the camera parameters of the sweeping robot are calculated according to the center coordinates to obtain the internal parameters and external parameters of the camera. By combining the charging pile and the calibration object with a self-calibration algorithm, the binocular sweeping robot is calibrated automatically, which solves the problem that existing camera calibration approaches are unsuitable for binocular sweeping robots and improves both the accuracy of the calibration and the robustness of the calibration algorithm.
Drawings
FIG. 1 is a schematic diagram of a terminal structure of a hardware operating environment according to an embodiment of the present application;
FIG. 2 is a flowchart of a first embodiment of a camera calibration method of the sweeping robot of the present application;
FIG. 3 is a flowchart of a second embodiment of a camera calibration method of the sweeping robot of the present application;
FIG. 4 is a flowchart of a third embodiment of a camera calibration method of the sweeping robot of the present application;
FIG. 5 is a schematic flowchart of performing an image segmentation operation on the calibration image to obtain a segmented image and performing a filtering operation on the segmented image to obtain a binary image, in the camera calibration method of the sweeping robot of the present application;
FIG. 6 is a schematic flowchart of performing an edge extraction operation on the binary image to obtain an edge image and calculating the center coordinates of the edge image, in the camera calibration method of the sweeping robot;
FIG. 7 is a schematic flowchart of calculating the camera parameters of the sweeping robot according to the center coordinates to obtain the internal parameters and external parameters of the camera, in the camera calibration method of the sweeping robot;
fig. 8 is a schematic flowchart of obtaining the internal parameters and external parameters of the camera through a linear solution based on a set calibration mode, in the camera calibration method of the sweeping robot.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The main solution of the embodiments of the present application is: obtaining calibration images of a calibration object shot by the cameras of the sweeping robot from different angles, wherein the calibration object is affixed to the charging pile that charges the sweeping robot; performing an image segmentation operation on the calibration image to obtain a segmented image, and performing a filtering operation on the segmented image to obtain a binary image; performing an edge extraction operation on the binary image to obtain an edge image, and calculating the center coordinates of the edge image; and calculating the camera parameters of the sweeping robot according to the center coordinates to obtain the internal parameters and external parameters of the camera.
For the binocular sweeping robot, frequent collisions mean that the binocular cameras must often be recalibrated. Manual calibration by workers using the traditional method is impractical, the requirements of active-vision calibration are too high, and the self-calibration algorithm is not robust and produces large calibration errors, so existing camera calibration approaches are unsuitable for the binocular sweeping robot. In this application, calibration images of the calibration object shot by the cameras of the sweeping robot from different angles are obtained, the calibration object being affixed to the charging pile that charges the sweeping robot; an image segmentation operation is performed on the calibration image to obtain a segmented image, and a filtering operation is performed on the segmented image to obtain a binary image; an edge extraction operation is performed on the binary image to obtain an edge image, and the center coordinates of the edge image are calculated; and the camera parameters of the sweeping robot are calculated according to the center coordinates to obtain the internal parameters and external parameters of the camera. By combining the charging pile and the calibration object with a self-calibration algorithm, the binocular sweeping robot is calibrated automatically, which solves the problem that existing camera calibration approaches are unsuitable for binocular sweeping robots and improves both the accuracy of the calibration and the robustness of the calibration algorithm.
As shown in fig. 1, fig. 1 is a schematic diagram of a terminal structure of a hardware running environment according to an embodiment of the present application.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, a communication bus 1002. Wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include a Display, an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may further include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory or a stable memory (non-volatile memory), such as a disk memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above.
Optionally, the terminal may further include a camera, an RF (Radio Frequency) circuit, a sensor, a remote control, an audio circuit, a WiFi module, a detector, and the like. Of course, the terminal may be further configured with other sensors such as a gyroscope, a barometer, a hygrometer, a temperature sensor, etc., which will not be described herein.
It will be appreciated by those skilled in the art that the terminal structure shown in fig. 1 is not limiting of the terminal device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 1, an operating system, a network communication module, a user interface module, and a camera calibration program of the sweeping robot may be included in a memory 1005 as one type of computer-readable storage medium.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a background server and performing data communication with the background server; the user interface 1003 is mainly used for connecting a client (user side) and performing data communication with the client; and the processor 1001 may be configured to call a camera calibration program of the sweeping robot in the memory 1005, and perform the following operations:
Obtaining calibration images of calibration objects shot by cameras of the sweeping robot at different angles, wherein the calibration objects are stuck to a charging pile for charging the sweeping robot;
performing image segmentation operation on the calibration image to obtain a segmented image, and performing filtering operation on the segmented image to obtain a binary image;
Performing edge extraction operation on the binary image to obtain an edge image, and calculating the center coordinates of the edge image;
and calculating camera parameters of the sweeping robot according to the center coordinates to obtain internal parameters and external parameters of the camera.
Referring to fig. 2, fig. 2 is a flowchart of a first embodiment of a camera calibration method of the sweeping robot according to the present application.
The embodiments of the present application provide an embodiment of a camera calibration method for a sweeping robot. It should be noted that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from that shown or described herein.
The camera calibration method of the sweeping robot comprises the following steps:
Step S10, obtaining calibration images of calibration objects shot by cameras of the sweeping robot at different angles, wherein the calibration objects are stuck to a charging pile for charging the sweeping robot;
An important part of vision-based navigation technology is camera calibration: accurate navigation computation is possible only with a camera whose parameters have been calibrated and with the projection of the external scene onto the image sensor. Camera parameters are divided into internal parameters, which refer to the camera's focal length, distortion, center point, and so on, and external parameters, which mainly refer to the spatial displacement and rotation angle of the camera relative to the machine. Because the parameters of individual cameras cannot be guaranteed to be identical, and their installation also varies, each robot must calibrate the internal and external parameters of its cameras to guarantee the accuracy of the visual computation.
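As a concrete illustration of how the internal and external parameters enter the imaging model (not taken from the patent; all numeric values below are arbitrary examples), a minimal pinhole-projection sketch in Python/NumPy:

```python
import numpy as np

def project_point(X_world, K, R, t):
    """Project a 3-D world point to pixel coordinates with a pinhole model.

    K holds the internal parameters (focal lengths, principal point);
    R and t are the external parameters (rotation and translation of the
    world frame relative to the camera frame).
    """
    X_cam = R @ X_world + t          # world -> camera coordinates
    x, y, z = X_cam
    u = K[0, 0] * x / z + K[0, 2]    # u = fx * x/z + cx
    v = K[1, 1] * y / z + K[1, 2]    # v = fy * y/z + cy
    return np.array([u, v])

# Example: identity rotation, camera 2 m in front of the calibration plane.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])
uv = project_point(np.array([0.1, -0.05, 0.0]), K, R, t)
```

Calibration is the inverse problem: given enough (world point, pixel) pairs such as the checkerboard cell centers, recover K, R, and t.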
A calibration object (also called a calibration plate) is affixed to the charging pile of the binocular sweeping robot. Different calibration objects can be matched to different self-calibration algorithms, for example a black-and-white square checkerboard or a black-and-white circular checkerboard. In this application, a black-and-white square checkerboard is selected as the calibration object so that the binocular sweeping robot can conveniently shoot the calibration object and compute its center points from different directions. When the binocular sweeping robot returns to recharge, its two cameras shoot calibration images of the calibration object from different directions. Specifically, during recharging, when the calibration object appears in the common field of view of the two cameras, one calibration image shot by the left camera and one shot by the right camera are acquired; then, after the robot moves a certain distance, calibration images shot by the left and right cameras are acquired again from different directions.
Step S20, performing image segmentation operation on the calibration image to obtain a segmented image, and performing filtering operation on the segmented image to obtain a binary image;
After the binocular sweeping robot acquires the calibration image, the calibration image must be segmented, and the segmented image is then filtered to obtain a binary image. Referring to fig. 5, the step of performing an image segmentation operation on the calibration image to obtain a segmented image and performing a filtering operation on the segmented image to obtain a binary image includes:
S21, performing dynamic threshold segmentation operation on the calibration image to obtain the segmented image;
and S22, filtering the segmented image to remove isolated points in the segmented image, so as to obtain the binary image.
The histogram of the calibration image is generally bimodal, with the target appearing as a sharp single peak, so a dynamic thresholding method can be used for threshold segmentation to obtain a binary image. Specifically, different colors in the calibration image correspond to different gray values in the range 0 to 255, where black is 0 and white is 255. The calibration image is divided into mutually disjoint regions based on gray-level features, such that the gray levels are consistent or similar within each region and clearly different between regions; that is, the target is separated from the background, i.e., the black cells in the calibration image are extracted. However, because of noise, the thresholded image contains scattered clutter points. These are usually isolated points, whereas the target is a connected region, so a geometry-based filtering method is adopted: most isolated points are filtered out while the target point set is well preserved, which reduces the influence of noise on the calibration. This approach also effectively reduces the influence of ambient light, making the method suitable for different working environments. In this way, the black cells produced by shooting the calibration object, including those deformed by distortion, are segmented out to form a black-and-white image, i.e., the binary image. The black and white specks that appear at random when the camera shoots are the noise in question.
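The segmentation-and-filtering step can be sketched as follows. This is a simplified stand-in, not the patent's implementation: the local-mean threshold and the 8-neighbour isolation test are illustrative choices, and the block size and offset are arbitrary.

```python
import numpy as np

def dynamic_threshold(img, block=15, offset=10):
    """Segment dark grid cells with a per-pixel threshold from a local mean.

    A crude stand-in for dynamic threshold segmentation: each pixel is
    compared against the mean of its block x block neighbourhood (edge
    padding keeps the window full at the borders). Returns a binary image:
    1 = dark (candidate black cell), 0 = background.
    """
    h, w = img.shape
    pad = block // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros((h, w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            local_mean = padded[i:i + block, j:j + block].mean()
            out[i, j] = 1 if img[i, j] < local_mean - offset else 0
    return out

def remove_isolated(binary):
    """Drop foreground pixels with no 8-connected foreground neighbour."""
    padded = np.pad(binary, 1)  # zero border
    out = binary.copy()
    h, w = binary.shape
    for i in range(h):
        for j in range(w):
            if binary[i, j]:
                neighbours = padded[i:i + 3, j:j + 3].sum() - 1
                if neighbours == 0:
                    out[i, j] = 0
    return out
```

A connected dark cell survives both steps, while a lone noise pixel is flagged by the threshold but removed by the isolation filter.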
Step S30, performing edge extraction operation on the binary image to obtain an edge image, and calculating the center coordinates of the edge image;
In the process of collecting calibration images, because of factors such as the camera's orientation and its distortion coefficients, the shape of the calibration pattern may change when several calibration images of the same calibration object are shot from different directions: the pattern may appear as a square, a rhombus, a rectangle, an irregular quadrilateral, and so on, and the shape of the black cells in the calibration image may change accordingly. Therefore, to reduce the influence of camera distortion on the calibration result as much as possible, the calibration template should be kept as parallel to the camera as possible, and the checkerboard image should be located at the center of the imaging area.
After the binary image is obtained, morphological contour extraction is performed on it to obtain the corresponding edge images, and the center coordinates of the different edge images are then calculated. These center coordinates have a projection relationship with the center coordinates, i.e., the feature points, in the original image. During contour extraction, the contours of the different black cell blocks produced by distortion are the main objects extracted. Referring to fig. 6, the step of performing an edge extraction operation on the binary image to obtain an edge image and calculating the center coordinates of the edge image includes:
Step S31, carrying out edge extraction on different black lattice blocks in the binary image to obtain an edge image of the black lattice blocks;
Step S32, obtaining edge point data in the edge image, and calculating fitting parameters according to the edge point data and a set equation;
And step S33, calculating the center coordinates of the edge image according to the fitting parameters.
The binocular sweeping robot performs morphological contour extraction on the binary image and, according to the contours, extracts the different black cell blocks produced by distortion (squares, rhombuses, rectangles, and the like) as different shapes for feature extraction and computation. Edge point data in the edge image are acquired, fitting parameters are calculated from the edge point data and a set equation, and the center coordinates of the edge image are calculated from the fitting parameters. Specifically, the computer frame-memory coordinates (u_i, v_i) of the center of the quadrilateral are obtained by the least-squares method, with the following algorithm:
The general equations of the four sides of the quadrilateral are:
A1*x+B1*y+C1=0
A2*x+B2*y+C2=0
A3*x+B3*y+C3=0
A4*x+B4*y+C4=0
Substituting the extracted edge point data into the general line equations forms an overdetermined system; the fitting parameters A, B, and C of each side are then solved by an optimization method. The center coordinates of the quadrilateral are (X, Y), where X and Y are calculated from the fitting parameters.
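A hedged sketch of this side-fitting computation follows. The patent does not spell out the closed form for (X, Y), so here the center is taken as the mean of the four corner points obtained by intersecting adjacent fitted side lines, which is one plausible realization; the function names are illustrative.

```python
import numpy as np

def fit_line(points):
    """Total-least-squares fit of A*x + B*y + C = 0 to 2-D points.

    The line's normal (A, B) is the right singular vector of the centered
    points with the smallest singular value.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    A, B = vt[-1]
    C = -(A * centroid[0] + B * centroid[1])
    return A, B, C

def intersect(l1, l2):
    """Intersection of two lines given as (A, B, C) coefficient triples."""
    M = np.array([[l1[0], l1[1]], [l2[0], l2[1]]])
    b = -np.array([l1[2], l2[2]])
    return np.linalg.solve(M, b)

def quad_center(side_points):
    """Center of a quadrilateral as the mean of its four corner points.

    side_points: four point sets, one per side, ordered so that
    consecutive sides share a corner.
    """
    lines = [fit_line(p) for p in side_points]
    corners = [intersect(lines[i], lines[(i + 1) % 4]) for i in range(4)]
    return np.mean(corners, axis=0)
```

For a unit square sampled along its sides this returns the expected center (0.5, 0.5), and the least-squares line fit tolerates moderate noise in the edge points.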
Optionally, if the selected calibration object is a black-and-white circular checkerboard, the circles generally appear as ellipses because of factors such as the camera's orientation and distortion coefficients. A certain projection relationship exists between the center of the ellipse and the center of the circle, and the center point of the ellipse is the feature point to be extracted. An ellipse equation is fitted using the ellipse edge point data, and the computer frame-memory coordinates (u_i, v_i) of the circle's center are obtained by the least-squares method. The specific algorithm is as follows:
The general equation for an ellipse is:
Ax² + Bxy + Cy² + Dx + Ey + F = 0
Substituting the detected elliptical edge point data into this formula forms an overdetermined system of equations; the best fitting parameters A, B, C, D, E, F in the least-square sense are then solved by an optimization method, and the coordinates of the ellipse center (X, Y) are:
X = (BE - 2CD)/(4AC - B²)
Y = (BD - 2AE)/(4AC - B²)
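As a sketch of this conic fit, assuming the form Ax² + Bxy + Cy² + Dx + Ey + F = 0 (the center formulas below match this convention) and an SVD null-space solution standing in for the unspecified optimization method; the function name and test ellipse are illustrative assumptions.

```python
import numpy as np

def fit_ellipse_center(x, y):
    """Fit A x^2 + B x y + C y^2 + D x + E y + F = 0 to edge points by
    least squares (null space via SVD), then return the conic centre."""
    m = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # Best-fit parameter vector = right singular vector with the
    # smallest singular value of the design matrix.
    _, _, vt = np.linalg.svd(m)
    a, b, c, d, e, f = vt[-1]
    den = 4 * a * c - b * b
    cx = (b * e - 2 * c * d) / den
    cy = (b * d - 2 * a * e) / den
    return cx, cy

# Synthetic ellipse centred at (3, -2), semi-axes 5 and 2, rotated 30 deg.
t = np.linspace(0, 2 * np.pi, 100)
phi = np.deg2rad(30)
x = 3 + 5 * np.cos(t) * np.cos(phi) - 2 * np.sin(t) * np.sin(phi)
y = -2 + 5 * np.cos(t) * np.sin(phi) + 2 * np.sin(t) * np.cos(phi)
cx, cy = fit_ellipse_center(x, y)
print(round(cx, 3), round(cy, 3))  # ≈ 3.0 -2.0
```

Note that both the numerator and denominator of the center formulas are quadratic in the parameters, so the result is unaffected by the overall scale or sign ambiguity of the fitted conic.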
And step S40, calculating camera parameters of the sweeping robot according to the center coordinates to obtain internal parameters and external parameters of the camera.
The most commonly used method is camera calibration based on a perspective transformation matrix. The transformation from a three-dimensional object point to its two-dimensional image point is linear, so, given the three-dimensional world coordinates of enough points and the corresponding image coordinates, each element of the linear transformation matrix can be solved to obtain the perspective transformation matrix, and the internal parameters and external parameters of the camera are then obtained by decomposing the perspective matrix.
The corresponding two-dimensional coordinates and three-dimensional coordinates of the center coordinates (namely, the feature points) are obtained, and a linear solution based on a two-step calibration method completes the calibration of the camera to obtain the internal parameters and external parameters of the camera. Referring to fig. 7, the step of calculating the camera parameters of the sweeping robot according to the center coordinates to obtain the internal parameters and the external parameters of the camera includes:
step S41, acquiring two-dimensional computer frame memory coordinates and three-dimensional space coordinates corresponding to the center coordinates;
and S42, substituting the two-dimensional computer frame memory coordinates and the three-dimensional space coordinates into a camera model, and performing linear solution based on a set calibration mode to obtain internal parameters and external parameters of the camera.
The two-dimensional computer frame memory coordinates and three-dimensional space coordinates corresponding to the center coordinates are acquired and substituted into the camera model; according to the conversion relations among the four coordinate systems in the camera model (world coordinates, image coordinates, pixel coordinates and actual physical coordinates), a linear solution by the RAC (radial alignment constraint) two-step method completes the calibration of the camera and yields its internal parameters and external parameters. The Tsai two-step calibration method is one of the classical camera calibration methods: based on the radial distortion model of the camera, Tsai proposed a two-stage calibration method using the radial alignment constraint. First, most of the model parameters are solved from the radial alignment constraint; then, parameters such as the distortion coefficients and the effective focal length are solved by a nonlinear search method.
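The conversion chain among the four coordinate systems can be sketched as a distortion-free pinhole projection; all parameter values below are illustrative assumptions, not values from the patent, and the function name is hypothetical.

```python
import numpy as np

def project(pw, rot, t, f, dx, dy, u0, v0):
    """World point -> pixel coordinates via the four-coordinate-system chain:
    world -> camera (rigid transform) -> image plane (perspective) -> pixels."""
    pc = rot @ pw + t                 # camera coordinates
    x = f * pc[0] / pc[2]             # image-plane (actual physical) coordinates
    y = f * pc[1] / pc[2]
    u = x / dx + u0                   # pixel coordinates (dx, dy = pixel pitch)
    v = y / dy + v0
    return u, v

rot = np.eye(3)                  # camera aligned with the world axes
t = np.array([0.0, 0.0, 100.0])  # calibration target 100 mm in front
u, v = project(np.array([10.0, -5.0, 0.0]), rot, t,
               f=10.0, dx=0.01, dy=0.01, u0=320, v0=240)
print(u, v)  # ≈ 420.0 190.0
```

Calibration inverts this chain: given enough (world point, pixel) pairs, the unknowns rot, t, f, dx, dy, u0, v0 are recovered, which is what the two-step method does in stages.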
Further, referring to fig. 8, the step of obtaining the internal parameters and the external parameters of the camera by performing linear solution based on the set calibration mode includes:
Step S420, obtaining the external parameters based on the relation between the world coordinate system and the camera coordinate system in the set calibration mode, the radial alignment constraint condition and the orthogonality of the external parameter matrix;
Step S421, the internal parameters are obtained according to the external parameters and the relation between the pixel coordinates and the actual physical coordinates.
In the Tsai two-step calibration method, the external parameters of the camera are solved based on direct linear calibration. Specifically, after the pixel coordinates of the feature points are obtained, the relationship between the world coordinate system and the camera coordinate system is calculated:
[x, y, z]ᵀ = R·[Xw, Yw, Zw]ᵀ + t
Then, according to the radial alignment constraint, which states that the image point computed from the pixel coordinates (u, v) lies along the same radial direction as the projection of the point in the camera coordinate system:
Xd/Yd = x/y
and the orthogonality of R, namely det(R) = 1, the external parameter matrix R of the camera and the translation components t₁, t₂ are obtained. Here Xw, Yw and Zw are the coordinates of a spatial point P in the world coordinate system; x and y are the coordinates of P in the camera coordinate system; u and v are the pixel coordinates of P in the image; (Xd, Yd) are the corresponding image-plane coordinates; R is a 3×3 orthonormal rotation matrix; t is the three-dimensional translation vector; s is the uncertain scale factor (aspect ratio); and f is the effective focal length.
After the external parameters of the camera are obtained, the internal parameters are further calculated based on nonlinear optimization. Specifically, direct linear calibration does not consider lens distortion and therefore suffers from large errors and unstable results, so a nonlinear optimization is performed. The actual image point coordinates (x', y') and the ideal image point coordinates (x, y) satisfy the radial distortion relationship:
x' = x(1 + k·r²), y' = y(1 + k·r²), where r² = x² + y²
Substituting the external parameter data of the camera obtained in the previous step into this relationship leaves the internal parameters (such as the effective focal length f and the distortion coefficient k) as the only unknowns, and the accurate internal parameters of the camera can then be obtained by the least square method, where (x', y') are the actual image point coordinates.
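As an illustration of the least-square step, a first-order radial distortion coefficient can be estimated linearly from matched ideal and actual image points, assuming the model actual = ideal·(1 + k₁·r²); the function name and the synthetic data are assumptions, and a real second stage would also refine the focal length.

```python
import numpy as np

def estimate_k1(ideal, actual):
    """Least-squares estimate of k1 in  actual = ideal * (1 + k1 * r^2),
    with r^2 = x^2 + y^2 taken on the ideal image points."""
    ideal = np.asarray(ideal, float)
    actual = np.asarray(actual, float)
    r2 = (ideal ** 2).sum(axis=1)
    # One linear equation per coordinate: (actual - ideal) = ideal * r2 * k1.
    a = (ideal * r2[:, None]).ravel()
    b = (actual - ideal).ravel()
    return float(a @ b / (a @ a))   # closed-form 1-D least squares

rng = np.random.default_rng(1)
ideal = rng.uniform(-1, 1, size=(50, 2))
k1_true = 0.05
actual = ideal * (1 + k1_true * (ideal ** 2).sum(axis=1))[:, None]
print(round(estimate_k1(ideal, actual), 6))  # ≈ 0.05
```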
According to this embodiment, calibration images of a calibration object shot by the camera of the sweeping robot at different angles are obtained, the calibration object being stuck to the charging pile that charges the sweeping robot; an image segmentation operation is performed on the calibration image to obtain a segmented image, and a filtering operation is performed on the segmented image to obtain a binary image; an edge extraction operation is performed on the binary image to obtain an edge image, and the center coordinates of the edge image are calculated; the camera parameters of the sweeping robot are then calculated according to the center coordinates to obtain the internal parameters and external parameters of the camera. Automatic calibration of the binocular sweeping robot is carried out through the combination of the charging pile and the calibration object with a self-calibration algorithm, which solves the problem that existing camera calibration methods are not suitable for binocular sweeping robots and improves the accuracy of camera calibration and the robustness of the calibration algorithm.
Further, referring to fig. 3, a second embodiment of the camera calibration method of the sweeping robot of the present application is presented.
The second embodiment of the camera calibration method of the sweeping robot is different from the first embodiment of the camera calibration method of the sweeping robot in that after the step of calculating the camera parameters of the sweeping robot according to the center coordinates to obtain the internal parameters and the external parameters of the camera, the method further includes:
S43, obtaining calibration parameters and calibration images of the calibration objects shot at different heights;
step S44, matching the calibration parameters with the center coordinates of the calibration images of the calibration objects shot at different heights, and calculating the three-dimensional space coordinates of the center coordinates;
And step S45, checking the calibration parameters based on the three-dimensional space coordinates of the center coordinates.
After the calibration of the camera parameters is completed, the calibration parameters are verified: the calibration parameters and calibration images of the calibration object shot at different heights are obtained, the calibration parameters are matched with the center coordinates of those calibration images, the three-dimensional space coordinates of the center coordinates are calculated, and the calibration parameters are verified based on the three-dimensional space coordinates. In an embodiment, the calibration object is positioned at Z = 270 and Z = 280 on the Z-axis, the left and right images of the calibration object are acquired at each height, and the feature points in the two planes are used as matching corresponding points. After the calibration images are selected, the computer automatically completes the calibration of the left and right cameras. For example, the result obtained by calibrating the left camera is: image plane center coordinates (x, y) = (-10.05, -12.58), scale factor of horizontal to vertical pixels Sx = 0.88, focal length f = 10.04 mm. The calibrated parameters (including the internal parameters and external parameters) are written into a parameter file and used in spatial point matching and three-dimensional coordinate recovery, thereby preparing for three-dimensional vision measurement. To check the precision of the calibration result, the calibration parameters are used to match feature points in images of the calibration block acquired at different heights, and their three-dimensional coordinates are calculated. For example, calibration images at heights Z = 240, Z = 280 and Z = 310 are acquired, the three-dimensional space coordinates corresponding to each height are calculated, and the corresponding calibration parameters are verified based on these three-dimensional space coordinates.
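The precision check can be sketched as a reprojection-error test under a simple pinhole model; the parameter values and the RMS metric below are illustrative assumptions, whereas the patent's actual check also involves distortion and stereo matching.

```python
import numpy as np

def reprojection_error(params, points_3d, points_2d):
    """RMS pixel residual between observed points and the points reprojected
    with the calibrated parameters (here a bare pinhole model)."""
    f, u0, v0 = params
    proj = np.array([[f * x / z + u0, f * y / z + v0] for x, y, z in points_3d])
    return float(np.sqrt(((proj - points_2d) ** 2).mean()))

# Feature points on the target at heights z = 240, 280, 310; the observations
# here are generated by the same model, so the residual is ~0.
params = (800.0, 320.0, 240.0)
pts3d = np.array([[10.0, 5.0, 240.0], [-8.0, 3.0, 280.0], [4.0, -6.0, 310.0]])
pts2d = np.array([[800 * x / z + 320, 800 * y / z + 240] for x, y, z in pts3d])
err = reprojection_error(params, pts3d, pts2d)
print(err)  # 0.0
```

In practice a nonzero residual that stays below a threshold across all tested heights is what qualifies the written parameter file for use.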
In the embodiment, the calibration parameters and the characteristic points of the calibration images are matched by acquiring the calibration parameters and the calibration images shot at different heights, and meanwhile, the three-dimensional space coordinates of the characteristic points are calculated, and the calibration parameters are verified based on the three-dimensional space coordinates, so that the accuracy of the calibration parameters is ensured.
Further, referring to fig. 4, a third embodiment of the camera calibration method of the sweeping robot of the present application is presented.
The third embodiment of the camera calibration method of the sweeping robot is different from the first and second embodiments of the camera calibration method of the sweeping robot in that before the step of calculating the center coordinates of the edge image, the method further includes:
Step S34, tracking the edge information of the edge image and acquiring the characteristic information of the edge image;
And step S35, screening the edge image based on the characteristic information so as to enable the edge image and the characteristic graph in the calibration image to form a corresponding relation.
Multi-target contour tracking and target screening: general contour tracking algorithms can only track the edges of a single target. In the present application, there are multiple quadrilateral targets, so a multi-target contour tracking algorithm is needed to automatically track the edge information of all target quadrilaterals. Meanwhile, a bidirectional tracking method is adopted for some special cases, such as quadrilaterals with broken edges, so that correct results are still obtained when the point sets of these special quadrilaterals are processed, giving stronger robustness. The targets are then screened according to information such as the position and pixel count of each target, and the required target quadrilaterals are ordered so as to form a one-to-one correspondence with the feature quadrilaterals in the calibration image, in preparation for the calibration solution.
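The screening and ordering step might look like the following sketch; the area thresholds, the row-banding heuristic, and all names are assumptions for illustration, not the patent's implementation.

```python
def screen_and_sort(contours, min_area, max_area):
    """Keep contours whose pixel count lies in [min_area, max_area] and order
    them top-to-bottom, left-to-right by centroid, so that the i-th detected
    quadrilateral matches the i-th feature in the calibration image."""
    kept = []
    for pts in contours:
        if not (min_area <= len(pts) <= max_area):
            continue  # screened out by pixel count
        cx = sum(p[0] for p in pts) / len(pts)
        cy = sum(p[1] for p in pts) / len(pts)
        kept.append((cy, cx, pts))
    # Group centroids into coarse row bands, then sort by x within a band.
    kept.sort(key=lambda item: (round(item[0] / 50), item[1]))
    return [pts for _, _, pts in kept]

# Three targets plus one tiny noise blob; the blob is screened out and the
# remaining contours come back in reading order.
c1 = [(100 + i, 20) for i in range(30)]   # top row, right
c2 = [(10 + i, 25) for i in range(30)]    # top row, left
c3 = [(50 + i, 120) for i in range(30)]   # bottom row
noise = [(0, 0), (1, 1)]
ordered = screen_and_sort([c1, c2, c3, noise], min_area=10, max_area=1000)
print(len(ordered))  # 3
```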
In the embodiment, the edge information of the target quadrangle is tracked, and screening is performed based on the position information, the pixel number and other information of each target, so that the edge image and the characteristic quadrangle in the calibration image form a one-to-one correspondence relationship, and preparation is made for calibration solution.
The application also provides a camera calibration device of the sweeping robot, in one embodiment, the camera calibration device of the sweeping robot comprises a memory, a processor and a camera calibration program of the sweeping robot, wherein the camera calibration program of the sweeping robot is stored on the memory and can run on the processor, and the following steps are realized when the camera calibration program of the sweeping robot is executed by the processor:
Obtaining calibration images of calibration objects shot by cameras of the sweeping robot at different angles, wherein the calibration objects are stuck to a charging pile for charging the sweeping robot;
performing image segmentation operation on the calibration image to obtain a segmented image, and performing filtering operation on the segmented image to obtain a binary image;
Performing edge extraction operation on the binary image to obtain an edge image, and calculating the center coordinates of the edge image;
and calculating camera parameters of the sweeping robot according to the center coordinates to obtain internal parameters and external parameters of the camera.
In one embodiment, the camera calibration device of the sweeping robot comprises an acquisition module, a segmentation module, an extraction module and a calculation module;
the acquisition module is used for acquiring calibration images of calibration objects shot by cameras of the sweeping robot at different angles, and the calibration objects are stuck to a charging pile for charging the sweeping robot;
The segmentation module is used for performing image segmentation operation on the calibration image to obtain a segmented image, and performing filtering operation on the segmented image to obtain a binary image;
The extraction module is used for performing edge extraction operation on the binary image to obtain an edge image, and calculating the center coordinates of the edge image;
The calculation module is used for calculating camera parameters of the sweeping robot according to the center coordinates to obtain internal parameters and external parameters of the camera.
Further, the segmentation module comprises a segmentation unit and a filtering unit;
the segmentation unit is used for performing dynamic threshold segmentation operation on the calibration image to obtain the segmented image;
The filtering unit is used for filtering the segmented image to filter isolated points in the segmented image, so as to obtain the binary image.
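The patent does not specify which filter removes the isolated points, so the following is only a minimal sketch using an 8-neighbour rule with NumPy; the function name and the rule itself are assumptions.

```python
import numpy as np

def remove_isolated_points(binary):
    """Filter isolated foreground pixels from a 0/1 image: a pixel with no
    foreground pixel among its 8 neighbours is treated as noise and cleared."""
    img = np.asarray(binary, dtype=np.uint8)
    padded = np.pad(img, 1)
    # Count the 8 neighbours of every pixel using shifted views.
    neigh = sum(
        padded[1 + dy : 1 + dy + img.shape[0], 1 + dx : 1 + dx + img.shape[1]]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    return np.where((img == 1) & (neigh == 0), 0, img)

img = np.zeros((5, 5), int)
img[1:3, 1:3] = 1   # a 2x2 block survives
img[4, 4] = 1       # an isolated pixel is removed
out = remove_isolated_points(img)
print(out.sum())  # 4
```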
Further, the extraction module comprises an extraction unit and a first calculation unit;
the extraction unit is used for carrying out edge extraction on different black lattice blocks in the binary image to obtain an edge image of the black lattice blocks;
The first calculation unit is used for acquiring edge point data in the edge image and calculating fitting parameters according to the edge point data and a set equation;
The first calculating unit is further configured to calculate center coordinates of the edge image according to the fitting parameters.
Further, the computing module comprises an acquisition unit and a second computing unit;
the acquisition unit is used for acquiring the two-dimensional computer frame memory coordinates and the three-dimensional space coordinates corresponding to the center coordinates;
the second calculation unit is used for substituting the two-dimensional computer frame memory coordinates and the three-dimensional space coordinates into a camera model, and performing linear solution based on a set calibration mode to obtain internal parameters and external parameters of the camera.
Further, the second computing unit includes an acquisition subunit;
the acquisition subunit is used for acquiring the external parameters based on the relation between the world coordinate system and the camera coordinate system in the set calibration mode, the radial alignment constraint condition and the orthogonality of the external parameter matrix;
the obtaining subunit is further configured to obtain the internal parameter according to the external parameter and a relationship between the pixel coordinate and the actual physical coordinate.
Further, the computing module includes a verification unit;
The acquisition unit is also used for acquiring calibration parameters and calibration images of the calibration objects shot at different heights;
The second calculation unit is further used for matching the calibration parameters with the center coordinates of the calibration images of the calibration objects shot at different heights and calculating three-dimensional space coordinates of the center coordinates;
And the checking unit is used for checking the calibration parameters based on the three-dimensional space coordinates of the center coordinates.
Further, the extraction module comprises a tracking unit and a screening unit;
the tracking unit is used for tracking the edge information of the edge image and acquiring the characteristic information of the edge image;
And the screening unit is used for screening the edge image based on the characteristic information so as to enable the edge image and the characteristic graph in the calibration image to form a corresponding relation.
Further, the acquisition module comprises a shooting unit;
And the shooting unit is used for controlling the camera to shoot the calibration object from different angles when the sweeping robot is recharged, so as to obtain the calibration images shot from different angles.
The implementation of the functions of each module of the camera calibration device of the sweeping robot is similar to the process in the embodiment of the method, and will not be described in detail herein.
In addition, the application also provides a terminal, which comprises a memory, a processor and a camera calibration program of the sweeping robot stored in the memory and running on the processor. The terminal obtains calibration images of a calibration object shot by the camera of the sweeping robot at different angles, the calibration object being stuck to the charging pile that charges the sweeping robot; performs an image segmentation operation on the calibration image to obtain a segmented image, and a filtering operation on the segmented image to obtain a binary image; performs an edge extraction operation on the binary image to obtain an edge image, and calculates the center coordinates of the edge image; and calculates the camera parameters of the sweeping robot according to the center coordinates to obtain the internal parameters and external parameters of the camera. Automatic calibration of the binocular sweeping robot is carried out through the combination of the charging pile and the calibration object with a self-calibration algorithm, which solves the problem that existing camera calibration methods are not suitable for binocular sweeping robots and improves the accuracy of camera calibration and the robustness of the calibration algorithm.
In addition, the application also provides a computer readable storage medium, wherein the computer readable storage medium is stored with a camera calibration program of the sweeping robot, and the steps of the camera calibration method of the sweeping robot are realized when the camera calibration program of the sweeping robot is executed by a processor.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
While alternative embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following appended claims be interpreted as including alternative embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (9)

1. A camera calibration method for a sweeping robot, the method comprising:
Obtaining calibration images of calibration objects shot by cameras of the sweeping robot at different angles, wherein the calibration objects are stuck to a charging pile for charging the sweeping robot;
performing image segmentation operation on the calibration image to obtain a segmented image, and performing filtering operation on the segmented image to obtain a binary image;
Performing edge extraction operation on the binary image to obtain an edge image, and calculating the center coordinates of the edge image;
acquiring a two-dimensional computer frame memory coordinate and a three-dimensional space coordinate corresponding to the center coordinate;
substituting the two-dimensional computer frame memory coordinates and the three-dimensional space coordinates into a camera model, and performing linear solution based on a set calibration mode to obtain internal parameters and external parameters of the camera.
2. The camera calibration method of a sweeping robot according to claim 1, wherein the step of performing an image segmentation operation on the calibration image to obtain a segmented image and performing a filtering operation on the segmented image to obtain a binary image comprises:
Performing dynamic threshold segmentation operation on the calibration image to obtain the segmented image;
And filtering the segmented image to filter isolated points in the segmented image, so as to obtain the binary image.
3. The camera calibration method of the sweeping robot according to claim 1, wherein the step of performing an edge extraction operation on the binary image to obtain an edge image, and calculating center coordinates of the edge image includes:
performing edge extraction on different black lattice blocks in the binary image to obtain an edge image of the black lattice blocks;
acquiring edge point data in the edge image, and calculating fitting parameters according to the edge point data and a set equation;
and calculating the center coordinates of the edge image according to the fitting parameters.
4. The camera calibration method of the sweeping robot according to claim 1, wherein the step of linearly solving based on a set calibration mode to obtain the internal parameters and the external parameters of the camera comprises:
acquiring the external parameters based on the relation between the world coordinate system and the camera coordinate system in the set calibration mode, the radial alignment constraint condition and the orthogonality of the external parameter matrix;
And acquiring the internal parameters according to the external parameters and the relation between the pixel coordinates and the actual physical coordinates.
5. A camera calibration method for a sweeping robot according to any one of claims 1 to 3, wherein after the step of obtaining the internal parameters and the external parameters of the camera by performing linear solution based on a set calibration method, the method further comprises:
obtaining calibration parameters and calibration images of the calibration objects shot at different heights;
Matching the calibration parameters with the center coordinates of the calibration images of the calibration objects shot at different heights, and calculating three-dimensional space coordinates of the center coordinates;
and checking the calibration parameters based on the three-dimensional space coordinates of the center coordinates.
6. A camera calibration method of a sweeping robot according to any one of claims 1 to 3, further comprising, before the step of calculating the center coordinates of the edge image:
Tracking the edge information of the edge image and acquiring the characteristic information of the edge image;
And screening the edge image based on the characteristic information so as to enable the edge image and the characteristic graph in the calibration image to form a corresponding relation.
7. A camera calibration method for a sweeping robot according to any one of claims 1 to 3, wherein the step of acquiring calibration images of a calibration object photographed by the camera of the sweeping robot at different angles comprises:
when the sweeping robot is recharged, the camera is controlled to shoot the calibration object from different angles, and the calibration images shot from different angles are obtained.
8. A terminal comprising a memory, a processor and a camera calibration program of a sweeping robot stored on the memory and running on the processor, the processor implementing the steps of the camera calibration method of a sweeping robot according to any one of claims 1 to 7 when executing the camera calibration program of a sweeping robot.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a camera calibration program of a sweeping robot, which when executed by a processor, implements the steps of the camera calibration method of a sweeping robot according to any one of claims 1 to 7.
CN202011595775.XA 2020-12-28 2020-12-28 Camera calibration method, terminal and computer readable storage medium of sweeping robot Active CN112634377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011595775.XA CN112634377B (en) 2020-12-28 2020-12-28 Camera calibration method, terminal and computer readable storage medium of sweeping robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011595775.XA CN112634377B (en) 2020-12-28 2020-12-28 Camera calibration method, terminal and computer readable storage medium of sweeping robot

Publications (2)

Publication Number Publication Date
CN112634377A CN112634377A (en) 2021-04-09
CN112634377B true CN112634377B (en) 2024-08-13

Family

ID=75287261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011595775.XA Active CN112634377B (en) 2020-12-28 2020-12-28 Camera calibration method, terminal and computer readable storage medium of sweeping robot

Country Status (1)

Country Link
CN (1) CN112634377B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113066134B (en) * 2021-04-23 2024-11-22 深圳市商汤科技有限公司 A visual sensor calibration method and device, electronic device and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107133989A (en) * 2017-06-12 2017-09-05 中国科学院长春光学精密机械与物理研究所 A kind of 3 D scanning system parameter calibration method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108269279B (en) * 2017-07-17 2019-11-08 先临三维科技股份有限公司 Three-dimensional reconstruction method and device based on monocular 3 D scanning system
CN108416791B (en) * 2018-03-01 2021-07-23 燕山大学 A Pose Monitoring and Tracking Method of Parallel Mechanism Moving Platform Based on Binocular Vision
CN110648367A (en) * 2019-08-15 2020-01-03 大连理工江苏研究院有限公司 Geometric object positioning method based on multilayer depth and color visual information
CN111721259B (en) * 2020-06-24 2022-05-03 江苏科技大学 Recycling and positioning method of underwater robot based on binocular vision


Also Published As

Publication number Publication date
CN112634377A (en) 2021-04-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant