CN114299156A - Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area - Google Patents
Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area
- Publication number
- CN114299156A (Application CN202111511234.9A)
- Authority
- CN
- China
- Prior art keywords
- camera
- coordinate system
- coordinates
- calibration
- calibrating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 title claims abstract description 47
- 239000011159 matrix material Substances 0.000 claims abstract description 18
- 238000013519 translation Methods 0.000 claims abstract description 10
- 239000013598 vector Substances 0.000 claims abstract description 8
- 238000001514 detection method Methods 0.000 claims description 5
- 238000006243 chemical reaction Methods 0.000 claims description 3
- 230000008569 process Effects 0.000 abstract description 5
- 238000004364 calculation method Methods 0.000 abstract description 4
- 238000010586 diagram Methods 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 238000012795 verification Methods 0.000 description 1
Images
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a method for calibrating multiple cameras and unifying their coordinates in a non-overlapping area, comprising the following steps. Step one: calibrate the intrinsic parameters to obtain each camera's intrinsic matrix and distortion coefficients; step two: obtain the image coordinates of the feature points; step three: obtain the world coordinates of the feature points; step four: obtain the rotation matrix and translation vector between the camera coordinate system and the world coordinate system; step five: obtain the rotation and translation relation between each camera coordinate system and the world coordinate system; step six: take one camera as the reference camera and unify the other cameras into its camera coordinate system. Considering that large, complex calibration devices are costly and inconvenient to use, the method selects an easily obtained calibration plate for camera calibration and unifies the coordinate systems without a complicated calculation process.
Description
Technical Field
The invention relates to the technical field of computer vision, in particular to a method for calibrating and unifying coordinates of multiple cameras in a non-overlapping area.
Background
In industrial machine-vision applications, high-precision equipment demands both high accuracy and a large field of view. This requires multiple cameras, which must be unified into the same coordinate system even when their fields of view do not overlap. At present, methods and principles for global calibration of multi-camera systems fall roughly into the following categories:
based on a large-scale measuring device method: for example, a laser tracker is adopted to directly measure the three-dimensional coordinates of pixel points in the image coordinate system in the reference coordinate system, obtain the relative positions of all cameras in the reference coordinate system, and uniformly convert the relative positions into the reference coordinate system. Based on the reflector method: the transformation relationship between the cameras is solved, for example, using a planar calibration plate and mirrors. Thirdly, based on a motion model method: such as by using the precise motion of the robot and the sequence of images observed by the non-overlapping field-of-view cameras to compensate for the non-overlapping regions between the cameras.
At present, the prior art suffers from complicated calibration devices and unstable calibration results, and large measuring devices are costly and not readily available in ordinary experimental settings.
To solve these problems, the present method for calibrating multiple cameras and unifying their coordinates in a non-overlapping area is designed.
Disclosure of Invention
The invention aims to provide a method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area, so as to solve the problems that prior-art calibration devices are complicated, calibration results are unstable, and large measuring devices are costly and hard to obtain in ordinary experimental settings.
In order to achieve the purpose, the invention provides the following technical scheme:
the calibration and coordinate unification method of the multiple cameras under the non-overlapping area comprises an operation method, wherein the operation method comprises the following steps: calibrating internal reference to obtain an internal reference matrix and a distortion coefficient of the camera; step two: obtaining image coordinates of the feature points; step three: obtaining world coordinates of the feature points; step four: obtaining a rotation matrix and a translation vector of the camera coordinate system and a world coordinate system; step five: obtaining the rotation and translation relation between each camera coordinate system and the world coordinate system; step six: one camera is taken as a reference camera, and other cameras are unified to a camera coordinate system of the reference camera.
As a further aspect of the present invention, in step one, Zhang Zhengyou's camera calibration method is used: the camera is adjusted to a suitable aperture and focal length, which must not be changed during calibration. A symmetric circular calibration plate is selected, and each camera shoots at least 20 qualified calibration pictures for intrinsic calibration to obtain the camera's intrinsic matrix and distortion coefficients.
As a further aspect of the present invention, in step two, each camera shoots part of the area of a large-size ChArUco calibration board; the corner points are detected with OpenCV's built-in marker-detection function detectMarkers, the detected markers are used as feature points, and the function returns the image coordinates of the corner points.
As a further aspect of the present invention, in step three, the origin of the world coordinate system is user-defined and may be chosen to make computing the feature points' world coordinates convenient: the ChArUco calibration board is laid on a horizontal desktop, the lower-right vertex of the first marker in the lower-right corner is taken as the origin of the world coordinate system with Z = 0 (the Z axis points upward), and the lower-right vertex coordinates of all markers, i.e. their world coordinates, are computed from the arrangement rule of the markers and their ids.
As a further aspect of the present invention, in step four, the monocular pose estimation function cv.solvePnP is used to obtain the rotation matrix and translation vector between the camera coordinate system and the world coordinate system.
As a further aspect of the present invention, in step five, steps one to four are repeated for each camera to obtain each camera's extrinsic parameters, which give the conversion from each camera coordinate system to the world coordinate system.
Compared with the prior art, the invention has the beneficial effects that:
the calibration and coordinate unification method of the multiple cameras in the non-overlapping area considers the practical factors of high cost and inconvenient use of a large-scale complex calibration device, selects a calibration plate which is easier to obtain for camera calibration, and unifies the coordinate system without a complicated calculation process.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic view of a chAruCo calibration plate for a calibration and coordinate unification method of multiple cameras under a non-overlapping area according to the present invention;
fig. 2 is a schematic diagram of embodiment 1 of a calibration and coordinate unification method of multiple cameras under a non-overlap region according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1-2, the present invention provides a method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area, wherein,
according to the method, the large-size chAruCo calibration board with the two-dimensional codes is utilized, each camera can shoot a part of area of the calibration board, angular points are searched to serve as characteristic points, a world coordinate system is established, the world coordinates of each characteristic point are obtained, and therefore the 2D-3D matching relation of the characteristic points is established. One of the advantages of the chAruCo calibration plate is that the chAruCo calibration plate has a mark id and a direction, and is convenient for calculating world coordinates. Taking an internal reference matrix, a distortion coefficient, a coordinate of a characteristic point in an image coordinate system and a coordinate in a world coordinate system as input, and outputting an external reference of the camera coordinate system through a solvePnP function, wherein the number of the characteristic points is more than 3, and the internal reference matrix and the distortion coefficient of the camera are obtained through camera calibration. After the rotation matrix R and the translation vector T of each camera coordinate system and the world coordinate system are obtained, the coordinate systems of all the cameras can be unified under the coordinate system of a reference camera, wherein the reference camera is any one camera selected by the reference camera.
Fig. 1 is a schematic view of a chAruCo calibration plate for a method of calibration and coordinate unification of multiple cameras under non-overlapping regions according to the present invention, as can be seen from fig. 1,
the calibration and coordinate unification method of the multiple cameras under the non-overlapping area comprises an operation method, wherein the operation method comprises the following steps: calibrating internal reference to obtain an internal reference matrix and a distortion coefficient of the camera; step two: obtaining image coordinates of the feature points; step three: obtaining world coordinates of the feature points; step four: obtaining a rotation matrix and a translation vector of the camera coordinate system and a world coordinate system; step five: obtaining the rotation and translation relation between each camera coordinate system and the world coordinate system; step six: one camera is taken as a reference camera, and other cameras are unified to a camera coordinate system of the reference camera.
In step one, Zhang Zhengyou's camera calibration method is used: the camera is adjusted to a suitable aperture and focal length, which must not be changed during calibration. A symmetric circular calibration plate is selected, and each camera shoots at least 20 qualified calibration pictures for intrinsic calibration to obtain the camera's intrinsic matrix and distortion coefficients.
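The intrinsic matrix and distortion coefficients obtained in step one describe how camera-frame points map to pixels. A minimal numpy sketch of this mapping follows; the intrinsic values and the simple two-term radial distortion model are illustrative assumptions, not the patent's calibrated parameters:

```python
import numpy as np

# Hypothetical intrinsics of one camera (illustrative values, not measured).
K = np.array([[800.0,   0.0, 320.0],    # fx, 0,  cx
              [  0.0, 800.0, 240.0],    # 0,  fy, cy
              [  0.0,   0.0,   1.0]])
k1, k2 = -0.12, 0.03                    # assumed radial distortion coefficients

def project(p_cam):
    """Project a 3-D point in camera coordinates to pixel coordinates,
    applying a simple two-term radial distortion model."""
    x, y = p_cam[0] / p_cam[2], p_cam[1] / p_cam[2]   # normalized coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2                  # radial distortion factor
    u = K[0, 0] * x * d + K[0, 2]
    v = K[1, 1] * y * d + K[1, 2]
    return np.array([u, v])

uv = project(np.array([0.1, -0.05, 2.0]))
```

The calibration in step one estimates exactly these quantities (K and the distortion coefficients) from the circle-grid images.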
In step two, each camera shoots part of the area of the large-size ChArUco calibration board; the corner points are detected with OpenCV's built-in marker-detection function detectMarkers, the detected markers are used as feature points, and the function returns the image coordinates of the corner points.
In step three, the origin of the world coordinate system is user-defined and may be chosen to make computing the feature points' world coordinates convenient. As shown in fig. 1, the ChArUco calibration board is laid on a horizontal desktop, the lower-right vertex of the first marker in the lower-right corner is taken as the origin of the world coordinate system with Z = 0 (the Z axis points upward), and the lower-right vertex coordinates of all markers, i.e. their world coordinates, are computed from the arrangement rule of the markers and their ids.
On the calibration board, the marker ids run 0, 1, 2, … from the lower-right corner, right to left and then bottom to top. Following this arrangement rule, the world coordinate of the lower-right vertex of each marker can be computed: with the lower-right vertex of marker i denoted P_i (P_0 being the origin) and the side length of the large black squares denoted length, then:
therefore, the coordinates of all the mark points on the calibration plate can be obtained, and the world coordinates of the feature points can be directly obtained according to the id number of the mark identified after the camera takes a picture.
In step four, the monocular pose estimation function cv.solvePnP is used. Four commonly used algorithms selectable in solvePnP are SOLVEPNP_ITERATIVE, SOLVEPNP_EPNP, SOLVEPNP_P3P and SOLVEPNP_DLS. SOLVEPNP_ITERATIVE is suitable when the feature points lie in the same plane, whereas SOLVEPNP_EPNP requires feature points in different planes, and SOLVEPNP_P3P requires exactly four points. Since all feature points here lie on the same plane, SOLVEPNP_ITERATIVE is used. The following explains why solvePnP needs at least three pairs of feature points to solve the extrinsic parameters:
according to the relationship between 2D points and 3D points in the world coordinate system on the image, there are
Unfolding R and T:
namely:
substituting the formula III into the formula I and the formula II to obtain:
Here f_x, f_y, c_x, c_y are the known camera intrinsics and u, v, X_w, Y_w, Z_w are known coordinates, so in these two equations only the 12 elements of R and T are unknown. But the rotation matrix R is orthogonal: the vectors in each row and column are mutually orthogonal unit vectors, so R has only 3 degrees of freedom, and from 3 of its elements the remaining 6 can be derived. Adding the three unknown elements of T gives 6 unknowns in total. Each pair of feature points determines two equations, so solving 6 unknowns requires at least 3 pairs of feature points; therefore a 2D-3D correspondence must be established for at least three feature points.
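The projection relation above can be checked numerically: with illustrative R, T and intrinsics (the values below are assumptions for demonstration only), the matrix form of the projection and the expanded formulas obtained by substituting formula III into formulas I and II must give the same pixel coordinates:

```python
import numpy as np

# Illustrative extrinsics: rotation about Z by 30 degrees, plus a translation.
a = np.deg2rad(30.0)
R = np.array([[np.cos(a), -np.sin(a), 0.0],
              [np.sin(a),  np.cos(a), 0.0],
              [0.0,        0.0,       1.0]])
T = np.array([0.1, -0.2, 2.0])
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0   # assumed intrinsics

Pw = np.array([0.3, 0.1, 0.0])                # a world point on the board (Z = 0)

# Matrix form: transform to camera coordinates, then perspective division.
Pc = R @ Pw + T
u = fx * Pc[0] / Pc[2] + cx
v = fy * Pc[1] / Pc[2] + cy

# Expanded form (formula III substituted into formulas I and II).
num_u = R[0, 0]*Pw[0] + R[0, 1]*Pw[1] + R[0, 2]*Pw[2] + T[0]
num_v = R[1, 0]*Pw[0] + R[1, 1]*Pw[1] + R[1, 2]*Pw[2] + T[1]
den   = R[2, 0]*Pw[0] + R[2, 1]*Pw[1] + R[2, 2]*Pw[2] + T[2]
u2 = fx * num_u / den + cx
v2 = fy * num_v / den + cy
```

The orthogonality of R (rows and columns are unit vectors, mutually orthogonal) is what reduces its nine elements to three degrees of freedom.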
In step five, steps one to four are repeated for each camera to obtain each camera's extrinsic parameters, which give the conversion from each camera coordinate system to the world coordinate system. A simple check verifies whether the solved extrinsics are correct: compute the coordinates of the camera origin in the world coordinate system and compare them with the actually measured values. For the i-th camera cam_i, the camera origin expressed in world coordinates as C_i satisfies

R_i · C_i + T_i = 0,

where R_i and T_i are the rotation matrix and translation vector from the world coordinate system to the i-th camera coordinate system. Hence C_i = -R_i^T · T_i, the coordinate of the i-th camera in the world coordinate system; comparing it with the measured actual coordinate shows whether the extrinsic solution is accurate.
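This verification can be sketched with numpy: the camera origin in world coordinates is C_i = -R_i^T T_i, which by construction satisfies R_i C_i + T_i = 0 (the extrinsic values below are illustrative, not measured):

```python
import numpy as np

# Extrinsics of a hypothetical camera i (world -> camera i).
a = np.deg2rad(15.0)
R_i = np.array([[1.0, 0.0,        0.0],
                [0.0, np.cos(a), -np.sin(a)],
                [0.0, np.sin(a),  np.cos(a)]])
T_i = np.array([0.5, -0.1, 1.8])

# The camera origin C_i in world coordinates satisfies R_i @ C_i + T_i = 0.
C_i = -R_i.T @ T_i

residual = R_i @ C_i + T_i   # should be numerically zero
```

In practice C_i would then be compared against the camera position measured by hand, as the text describes.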
Step six: one camera is used as a reference camera, and other cameras are unified to a camera seat of the reference camera.
Let a point P have coordinates P_w in the world coordinate system and P_ci in the coordinate system of camera i. Using homogeneous coordinates,

[P_ci; 1] = M_i · [P_w; 1],  with M_i = [R_i, T_i; 0, 1].

Thus the extrinsic matrix from the i-th camera to the first camera (the reference camera), denoted M_1i, can be computed as

M_1i = M_1 · M_i^{-1}.
So far, the coordinate systems of all the cameras are unified into the same coordinate system.
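The unification step can be sketched as composing homogeneous extrinsic matrices; the rotation angles and translations below are illustrative values:

```python
import numpy as np

def make_M(R, T):
    """Build the 4x4 homogeneous extrinsic matrix (world -> camera)."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = T
    return M

# Illustrative extrinsics for the reference camera (1) and camera i.
a1, ai = np.deg2rad(10.0), np.deg2rad(-25.0)
R1 = np.array([[np.cos(a1), -np.sin(a1), 0.0],
               [np.sin(a1),  np.cos(a1), 0.0],
               [0.0,         0.0,        1.0]])
Ri = np.array([[ np.cos(ai), 0.0, np.sin(ai)],
               [ 0.0,        1.0, 0.0],
               [-np.sin(ai), 0.0, np.cos(ai)]])
M1 = make_M(R1, np.array([0.0, 0.1, 2.0]))
Mi = make_M(Ri, np.array([0.4, 0.0, 2.5]))

# Extrinsic matrix from camera i to the reference camera: M_1i = M_1 @ M_i^-1.
M1i = M1 @ np.linalg.inv(Mi)

# Check: mapping a world point into camera i and then through M_1i must agree
# with mapping it directly into the reference camera.
Pw = np.array([0.2, -0.3, 0.0, 1.0])          # homogeneous world point
direct  = M1 @ Pw
chained = M1i @ (Mi @ Pw)
```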
Fig. 2 is a schematic diagram of an embodiment 1 of the method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area according to the present invention, and as can be seen from fig. 2, in practical applications,
the whole process of the invention is shown in figure 2 and comprises the following steps:
and (3) evaluation standard of calibration result:
the method comprises the following steps: reprojection error Re-projection error
The three-dimensional object points are projected onto the two-dimensional image, the Euclidean distance between each projected point and the corresponding corner extracted from the image is computed, and the average over many such distances is taken as the re-projection error measuring the calibration error. The figure below visualizes the re-projection effect: the blue points are the re-projected points and the red points are the corresponding corners extracted from the two-dimensional image.
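Computing the re-projection error from matched projected and detected points reduces to a mean Euclidean distance; the coordinate values below are illustrative:

```python
import numpy as np

# Re-projected image points (from projecting the 3-D board points) and the
# corresponding corners detected in the image (illustrative values, pixels).
projected = np.array([[359.9, 220.1], [410.3, 225.4], [460.8, 230.0]])
detected  = np.array([[360.2, 219.8], [410.0, 225.9], [461.1, 230.4]])

# Mean Euclidean distance between each projected point and its detected
# corner: the re-projection error used to assess the calibration.
reproj_error = np.mean(np.linalg.norm(projected - detected, axis=1))
```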
However, the reprojection error cannot completely reflect the correctness of the internal and external reference calibration results, because the reprojection error is also affected by other factors:
detecting the angular point of the image, wherein if the angular point detection precision is poor, the reprojection error can be directly influenced;
noise exists in the camera, and the camera shakes;
③ camera resolution, since the unit is a pixel. In other cases where the conditions are otherwise consistent, the larger the resolution of the camera, the denser its pixels, and the larger the resulting reprojection error.
Method two:
two three-dimensional points are selected and projected on a two-dimensional image of a reference camera, the Euclidean distance of the two points is calculated, the Euclidean distance of the two points can be measured under world coordinates, the distance calculated under the two-dimensional image is differed from the distance in the actual world coordinates, absolute values are obtained, and multiple groups of data are obtained for calculation to obtain average errors.
Taking 6 groups of marker points, the second evaluation method gives an average error of about 0.673 mm. Because the ChArUco calibration board used in the invention is printed, its accuracy cannot be guaranteed; and since Z = 0 is assumed when establishing the world coordinates, insufficient flatness of the plane holding the calibration board increases the error. These two points are the main error sources; in addition there is the measurement error introduced when measuring the side length of the small squares while establishing the markers' world coordinates. Reducing the error from these three aspects can achieve the desired calibration accuracy.
The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is limited only by the claims and their full scope and equivalents.
Claims (7)
1. The method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area, characterized by comprising the following steps. Step one: calibrate the intrinsic parameters to obtain each camera's intrinsic matrix and distortion coefficients; step two: obtain the image coordinates of the feature points; step three: obtain the world coordinates of the feature points; step four: obtain the rotation matrix and translation vector between the camera coordinate system and the world coordinate system; step five: obtain the rotation and translation relation between each camera coordinate system and the world coordinate system; step six: take one camera as the reference camera and unify the other cameras into its camera coordinate system.
2. The method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area according to claim 1, wherein in step one the cameras are adjusted to a suitable aperture and focal length using Zhang Zhengyou's camera calibration method, and the aperture and focal length must not be changed during calibration.
3. A symmetric circular calibration plate is selected, and each camera shoots at least 20 qualified calibration pictures for intrinsic calibration to obtain the camera's intrinsic matrix and distortion coefficients.
4. The method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area according to claim 1, wherein in step two the camera shoots part of the area of the large-size ChArUco calibration plate, the corner points are detected with OpenCV's built-in marker-detection function detectMarkers, the detected markers are used as the feature points, and the function returns the image coordinates of the corner points.
5. The method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area according to claims 1 and 3, wherein in step three the origin of the world coordinate system is user-defined and may be chosen to make computing the feature points' world coordinates convenient: the ChArUco calibration board is laid on a horizontal desktop, the lower-right vertex of the first marker in the lower-right corner is taken as the origin of the world coordinate system with Z = 0 (the Z axis points upward), and the lower-right vertex coordinates of all markers, i.e. their world coordinates, are computed from the arrangement rule of the markers and their ids.
6. The method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area according to claim 1, wherein in step four the monocular pose estimation function cv.solvePnP is used to obtain the rotation matrix and translation vector between the camera coordinate system and the world coordinate system.
7. The method for calibrating and unifying the coordinates of multiple cameras in a non-overlapping area according to claim 1, wherein in step five, steps one to four are repeated for each camera to obtain each camera's extrinsic parameters, which give the conversion from each camera coordinate system to the world coordinate system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111511234.9A CN114299156A (en) | 2021-12-11 | 2021-12-11 | Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111511234.9A CN114299156A (en) | 2021-12-11 | 2021-12-11 | Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114299156A true CN114299156A (en) | 2022-04-08 |
Family
ID=80967886
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111511234.9A Withdrawn CN114299156A (en) | 2021-12-11 | 2021-12-11 | Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114299156A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114742905A (en) * | 2022-06-13 | 2022-07-12 | 魔视智能科技(武汉)有限公司 | Multi-camera parameter calibration method, device, equipment and storage medium |
CN114792344A (en) * | 2022-06-24 | 2022-07-26 | 季华实验室 | Multi-camera position calibration method, device and system and storage medium |
CN114926552A (en) * | 2022-06-17 | 2022-08-19 | 中国人民解放军陆军炮兵防空兵学院 | Method and system for calculating Gaussian coordinates of pixel points based on unmanned aerial vehicle image |
CN115035602A (en) * | 2022-06-17 | 2022-09-09 | 广东天物新材料科技有限公司 | Human gait analysis method and device, computer equipment and readable storage medium |
CN117576228A (en) * | 2024-01-16 | 2024-02-20 | 成都合能创越软件有限公司 | Real-time scene-based camera coordinate calibration method and system |
CN118411426A (en) * | 2024-05-21 | 2024-07-30 | 北京积加科技有限公司 | Camera parameter calibration method, device, electronic equipment and computer readable medium |
-
2021
- 2021-12-11 CN CN202111511234.9A patent/CN114299156A/en not_active Withdrawn
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114742905A (en) * | 2022-06-13 | 2022-07-12 | 魔视智能科技(武汉)有限公司 | Multi-camera parameter calibration method, device, equipment and storage medium |
CN114926552A (en) * | 2022-06-17 | 2022-08-19 | 中国人民解放军陆军炮兵防空兵学院 | Method and system for calculating Gaussian coordinates of pixel points based on unmanned aerial vehicle image |
CN115035602A (en) * | 2022-06-17 | 2022-09-09 | 广东天物新材料科技有限公司 | Human gait analysis method and device, computer equipment and readable storage medium |
CN114792344A (en) * | 2022-06-24 | 2022-07-26 | 季华实验室 | Multi-camera position calibration method, device and system and storage medium |
CN117576228A (en) * | 2024-01-16 | 2024-02-20 | 成都合能创越软件有限公司 | Real-time scene-based camera coordinate calibration method and system |
CN117576228B (en) * | 2024-01-16 | 2024-04-16 | 成都合能创越软件有限公司 | Real-time scene-based camera coordinate calibration method and system |
CN118411426A (en) * | 2024-05-21 | 2024-07-30 | 北京积加科技有限公司 | Camera parameter calibration method, device, electronic equipment and computer readable medium |
CN118411426B (en) * | 2024-05-21 | 2025-01-10 | 北京积加科技有限公司 | Camera parameter calibration method, device, electronic device and computer readable medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114299156A (en) | Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area | |
CN110809786B (en) | Calibration device, calibration chart, chart pattern generation device, and calibration method | |
US10924729B2 (en) | Method and device for calibration | |
Kim et al. | A camera calibration method using concentric circles for vision applications | |
JP5999615B2 (en) | Camera calibration information generating apparatus, camera calibration information generating method, and camera calibration information generating program | |
CN108734744A (en) | A kind of remote big field-of-view binocular scaling method based on total powerstation | |
CN106887023A (en) | For scaling board and its scaling method and calibration system that binocular camera is demarcated | |
CN111210468A (en) | Image depth information acquisition method and device | |
WO2023201578A1 (en) | Extrinsic parameter calibration method and device for monocular laser speckle projection system | |
CN105453546A (en) | Image processing apparatus, image processing system, image processing method, and computer program | |
CN115797461A (en) | Flame space positioning system calibration and correction method based on binocular vision | |
CN109272555B (en) | A method of obtaining and calibrating external parameters of RGB-D camera | |
US10628968B1 (en) | Systems and methods of calibrating a depth-IR image offset | |
Park et al. | Active calibration of camera-projector systems based on planar homography | |
CN113379845A (en) | Camera calibration method and device, electronic equipment and storage medium | |
Resch et al. | On-site semi-automatic calibration and registration of a projector-camera system using arbitrary objects with known geometry | |
CN104504691B (en) | Camera position and posture measuring method on basis of low-rank textures | |
Ding et al. | A robust detection method of control points for calibration and measurement with defocused images | |
CN113963068A (en) | A global calibration method for omnidirectional stereo vision sensor of mirrored single camera | |
CN117173254A (en) | Camera calibration method, system, device and electronic equipment | |
CN114693807A (en) | Method and system for reconstructing mapping data of power transmission line image and point cloud | |
Fetzer et al. | Stable intrinsic auto-calibration from fundamental matrices of devices with uncorrelated camera parameters | |
CN110044266B (en) | Photogrammetry system based on speckle projection | |
Kumar et al. | Generalized pupil-centric imaging and analytical calibration for a non-frontal camera | |
CN115375773A (en) | External parameter calibration method and related device for monocular laser speckle projection system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 20220408 |