Disclosure of Invention
The application provides an OST calibration system and an OST calibration method, which improve the degree of automation and the universality of OST calibration.
The application discloses an OST calibration system, which is used for calibrating OST-AR equipment, wherein the OST-AR equipment comprises an AR display screen and a tracking camera which are correspondingly configured, and the OST calibration system comprises:
a three-dimensional target, on which a plurality of groups of feature points are distributed;
an industrial camera for simulating a human eye;
a baffle, having a first position in which it shields the three-dimensional target from the industrial camera and a second position in which it avoids the three-dimensional target; and
an industrial personal computer for acquiring virtual target data and physical target data and correspondingly calculating the calibration parameters of the OST-AR device, wherein the virtual target data is obtained by photographing, with the industrial camera, an image target displayed on the AR display screen while the baffle is in the first position, and the physical target data is obtained by photographing the three-dimensional target with the industrial camera through the AR display screen and with the tracking camera while the baffle is in the second position.
Several alternatives are provided below. They are not additional limitations on the overall scheme above, but only further additions or preferences; absent technical or logical contradiction, each alternative may be combined individually with the overall scheme, or multiple alternatives may be combined with one another.
In one embodiment, the stereoscopic target includes a plurality of planar targets facing different directions and intersecting each other, each set of feature points is distributed on the corresponding planar target, and each feature point is circular.
In one embodiment, the OST calibration system further comprises a carrier device for mounting the OST-AR device, the industrial camera being mounted to the carrier device;
the carrier device is provided with an adjusting mechanism controlled by the industrial personal computer for changing the three-dimensional spatial posture of the carrier device.
In one embodiment, there are two industrial cameras, which move independently of each other.
In one embodiment, the stereoscopic target and the industrial camera are each provided with a six-axis adjusting device controlled by the industrial personal computer for changing its three-dimensional spatial posture.
In one embodiment, the baffle is configured with a switching mechanism controlled by the industrial personal computer, and the switching mechanism is used for driving the baffle to switch between the first position and the second position.
The application also provides an OST calibration method, which comprises the following steps:
Step S100, respectively providing an OST-AR device to be calibrated, an industrial camera, a three-dimensional target, and a baffle, wherein the three-dimensional target is provided with feature points, the OST-AR device comprises an AR display screen and a tracking camera configured correspondingly, and the baffle has a first position in which it shields the three-dimensional target from the industrial camera and a second position in which it avoids the three-dimensional target;
Step S200, when the baffle is in the first position, photographing, with the industrial camera, an image target displayed on the AR display screen to obtain virtual target data;
Step S300, when the baffle is in the second position, photographing the three-dimensional target with the industrial camera through the AR display screen and with the tracking camera, respectively, to obtain physical target data;
and Step S400, calculating the calibration parameters of the OST-AR device according to the virtual target data and the physical target data.
In one embodiment, in the step S400, calculating calibration parameters of the OST-AR device includes:
Step S410, extracting feature points from the virtual target data to obtain first information, wherein the first information is the two-dimensional coordinate information of the virtual feature points in the industrial camera imaging coordinate system;
Step S420, obtaining the homography between the imaging planes of the AR display screen and the industrial camera according to the first information and the feature point information in the image target;
Step S430, extracting feature points from the physical target data to obtain second information and third information, wherein the second information is the two-dimensional coordinate information of the physical feature points in the imaging coordinate system of the industrial camera, and the third information is the two-dimensional coordinate information of the physical feature points in the imaging coordinate system of the tracking camera;
Step S440, obtaining fourth information according to the second information and the homography, wherein the fourth information is the two-dimensional coordinate information of the physical feature points in the AR display screen imaging coordinate system;
step S450, obtaining three-dimensional coordinate information of the physical feature points in the imaging coordinate system of the tracking camera according to the third information, the physical dimensions of the physical feature points and the internal parameters and distortion parameters of the tracking camera;
And step S460, obtaining the calibration parameters according to the fourth information and the three-dimensional coordinate information.
In one embodiment, the feature points are circular, and step S410 further includes reading the internal parameters and distortion parameters of the industrial camera and performing a de-distortion operation on the virtual feature points according to these parameters to obtain the first information.
In one embodiment, in step S460, the calibration parameters include the external parameters and the projection matrix of the OST-AR device, where the projection matrix is an asymmetric projection matrix.
The OST calibration system and the OST calibration method can be used to calibrate AR head-mounted display devices. The baffle allows a clearer image to be obtained, and its position is easy to adjust, so the system suits different calibration scenarios and improves the universality of the equipment.
Detailed Description
In order that the above objects, features and advantages of the application will be readily understood, a more particular description of the application will be rendered by reference to the appended drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may be embodied in many other forms than described herein and similarly modified by those skilled in the art without departing from the spirit of the application, whereby the application is not limited to the specific embodiments disclosed below.
It will be understood that when an element is referred to as being "mounted" or "disposed" on another element, it can be directly on the other element or intervening elements may also be present. When a component is considered to be "connected" to another component, it can be directly connected to the other component or intervening components may also be present. The terms "vertical", "horizontal", "upper", "lower", "left", "right" and the like are used in the description of the present application for the purpose of illustration only and do not represent the only embodiment.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
In the present application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first feature is in direct contact with the second feature, or that the two features are in indirect contact through an intermediate medium. Moreover, a first feature "above," "over," or "on" a second feature may mean the first feature is directly above or obliquely above the second feature, or merely that the first feature is at a higher level than the second feature (in a state of use, or in a given drawing view). A first feature "under," "beneath," or "below" a second feature may mean the first feature is directly below or obliquely below the second feature, or merely that the first feature is at a lower level than the second feature (in a state of use, or in a given drawing view).
Unless defined otherwise, all technical and scientific terms used in the specification of the present application have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The term "and/or" as used in the description of the present application includes any and all combinations of one or more of the associated listed items.
In the prior art, during OST off-line calibration the target must be located in the fields of view of both the tracking camera of the AR head-mounted display device and the industrial camera at the same time so that the feature points can be extracted. At present these operations are completed under manual control; the process is tedious and depends on operator experience, and there is no guarantee that each adjustment meets the calibration requirements. Automatic calibration systems and equipment for OST off-line calibration are still at the development stage, and system-level automatic calibration equipment is lacking.
The following is a description of related terms:
VR: Virtual Reality.
AR: Augmented Reality.
OST: Optical See-Through. By means of a semi-transparent, semi-reflective optical system, a virtual image is reflected into the field of view and overlaid on the directly viewed real world.
VST: Video See-Through. A camera captures a video stream of the scene, virtual information is superimposed onto the stream, and the processed stream is rendered frame by frame on a display.
OST calibration: calibrating the tracking camera and AR display screen of an OST-AR device to ensure that virtual images are aligned with and accurately projected onto the real world.
Referring to figs. 1 to 3, an embodiment of the present application provides an OST calibration system for calibrating an OST-AR device (i.e., an AR head-mounted display device 9). The OST-AR device includes an AR display screen and a correspondingly configured tracking camera 10; the tracking camera 10 captures the stereoscopic target for data acquisition during OST calibration. The OST-AR device supports screen casting, so an image target captured by the tracking camera 10 can be projected onto the AR display screen.
The OST calibration system of this embodiment includes a stereoscopic target 12, an industrial camera, and a baffle 11. To mount the OST-AR device during calibration, a carrier device may further be provided, which may be a head mold or another supporting structure; to change the three-dimensional spatial posture of the OST-AR device, the carrier device may be equipped with an adjusting mechanism controlled by the industrial personal computer 2.
The industrial camera simulates a human eye. In this embodiment there are two industrial cameras simulating the left and right eyes respectively: the industrial camera 7 acquires the virtual and physical target data at the left-eye position, and the industrial camera 8 acquires the virtual and physical target data at the right-eye position.
The two industrial cameras are each equipped with a six-axis motor: the industrial camera 7 with the six-axis motor 3, and the industrial camera 8 with the six-axis motor 4. Each six-axis motor consists of a three-axis translation motor and a three-axis rotation motor and controls the corresponding industrial camera in six degrees of freedom. Both industrial cameras may be mounted to the carrier device, i.e., they follow the carrier device together with the OST-AR device as a whole, but each can move independently relative to the carrier device.
The stereoscopic target 12 has a substantially cubic structure and is provided with a six-axis motor 6 controlled by the industrial personal computer 2 for changing its three-dimensional spatial posture. Three adjacent, mutually perpendicular faces of the stereoscopic target 12 carry planar targets of the same type, so that the industrial camera and the tracking camera 10 can acquire image information of multiple planar targets simultaneously.
Since the stereoscopic target 12 has three adjacent planar targets, one of them may be designated the setting surface of the stereoscopic target for convenience of calculation, as shown by the dashed frame in fig. 3. Each planar target carries a group of feature points. As an improvement, the feature points used in the application are circular and arranged in rows and columns, with the feature points of adjacent rows staggered; circular feature points are easier to recognize, compute, and optimize into the final calibration parameters, further reducing the re-projection error.
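The staggered row-and-column layout of circular feature points can be sketched as follows. This is a minimal illustration, not the application's actual target design: the function name, the row/column counts, and the pitch are assumptions chosen for the example.

```python
def staggered_grid_centers(rows, cols, pitch):
    """Centers of circular feature points arranged in rows and columns,
    with adjacent rows offset by half the column pitch (staggered).
    Coordinates lie in the plane of one planar target, e.g. in millimeters."""
    centers = []
    for r in range(rows):
        # every odd row is shifted by half a pitch relative to the even rows
        x_offset = (pitch / 2.0) if (r % 2 == 1) else 0.0
        for c in range(cols):
            centers.append((c * pitch + x_offset, r * pitch))
    return centers
```

A blob detector would then match detected circle centers against this known layout to establish feature-point correspondences.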
The baffle 11 shields objects behind the AR display screen of the AR head-mounted display device 9. The baffle 11 may be provided with a switching mechanism controlled by the industrial personal computer 2; for example, a telescopic rod 5 or another drive may control its up-and-down motion so that the baffle 11 switches between the first position and the second position. In the first position, the baffle 11 shields objects behind the AR display screen, at least blocking the stereoscopic target 12, so that the optical image of the stereoscopic target 12 cannot pass through the AR display screen into the industrial camera; this helps the industrial camera photograph the image target displayed on the AR display screen and acquire clear virtual target data at the left- and right-eye positions.
In the second position, the baffle 11 clears the stereoscopic target 12, so that its optical image passes directly through the AR display screen into the industrial camera; the industrial camera can then photograph the stereoscopic target 12 through the AR display screen and acquire clear physical target data.
The baffle 11 of this embodiment is simple in structure and flexible in operation. Its shape and the motion used to switch positions are not strictly limited; a six-axis motor may be provided if necessary to adapt to posture changes of the AR head-mounted display device 9, guaranteeing the shielding effect and benefiting the clarity of image acquisition. The baffle 11 and the AR head-mounted display device 9 can move in concert or be separated from each other for easy adjustment, further improving the universality of the equipment.
The industrial personal computer 2 serves as the carrier for control, data acquisition, storage, command transmission, data processing, and the software algorithms of the cameras and the various moving parts. The industrial personal computer 2 may be connected to a display 1, on which camera previews, the software interface, data visualization, and so on are presented.
During the calibration process, the industrial personal computer 2 acquires the virtual target data and the physical target data and calculates the calibration parameters of the OST-AR device accordingly.
Taking left-eye calibration as an example (the components in fig. 2 are shown in the OST calibration state), the tracking camera 10 and the industrial camera 7 must capture stereoscopic target images simultaneously; the OST external parameters (i.e., the external parameters of the OST-AR device) from the tracking camera 10 to the left eye point, and the projection matrix of the left eye point onto the AR display screen, are calibrated from the correspondences of the feature points.
The virtual target data is obtained by photographing, with the industrial camera, the image target displayed on the AR display screen while the baffle 11 is in the first position; the physical target data is obtained by photographing the stereoscopic target with the industrial camera through the AR display screen and with the tracking camera 10 while the baffle 11 is in the second position.
The industrial personal computer 2 is configured with corresponding software modules, namely the OST calibration software in fig. 4, whose functional modules may include:
a logic control module, which sends control instructions to all cameras, controls the extension and retraction of the telescopic rod, and turns the screen casting of the AR head-mounted display device on and off;
a motion control module, which resolves the pose of each six-axis motor and controls its motion accordingly;
an algorithm invocation module, which feeds the acquired data into the OST calibration algorithm for calculation; and
a result visualization module, which visualizes the calibration result in the software for the user to view.
The OST calibration system may provide a planar-target mode and a stereoscopic-target mode to suit the tracking cameras of different AR head-mounted display devices, giving good extensibility: for example, the stereoscopic-target mode suits tracking cameras with small distortion, while the planar-target mode suits tracking cameras with large distortion.
Automatic control of the movement of the stereoscopic target greatly reduces manual operation and improves calibration efficiency; for example, the six-axis end-pose increment needed to move the stereoscopic target to the target pose can be computed directly through pose conversion, realizing automatic adjustment during calibration.
The industrial cameras used by the OST calibration system support lenses with different FOVs, and each camera has an independent motion mechanism for pose adjustment, so the system can adaptively calibrate devices with different optical-engine FOVs and different form factors, giving high device adaptability. The two industrial cameras have independent pose-control modules and can be adapted to calibrating devices of different forms.
Referring to figs. 5 and 6, an embodiment of the present application further provides an OST calibration method, which may be implemented by the OST calibration system of any of the above embodiments. The OST calibration method specifically includes:
Step S100, respectively providing an OST-AR device to be calibrated, an industrial camera, a stereoscopic target, and a baffle, wherein the stereoscopic target is provided with feature points, the OST-AR device comprises an AR display screen and a tracking camera configured correspondingly, and the baffle has a first position in which it shields the stereoscopic target from the industrial camera and a second position in which it avoids the stereoscopic target.
Step S200, when the baffle is in the first position, photographing, with the industrial camera, an image target displayed on the AR display screen to obtain virtual target data.
Step S300, when the baffle is in the second position, photographing the stereoscopic target with the industrial camera through the AR display screen and with the tracking camera, respectively, to obtain physical target data.
Step S400, calculating the calibration parameters of the OST-AR device according to the virtual target data and the physical target data.
In one embodiment, in step S400, calculating calibration parameters of the OST-AR device includes:
Step S410, extracting feature points from the virtual target data to obtain first information, where the first information is the two-dimensional coordinate information of the virtual feature points in the industrial camera imaging coordinate system.
To further optimize the calibration effect, this step also includes reading the internal parameters and distortion parameters of the industrial camera and performing a de-distortion operation on the virtual feature points according to these parameters to obtain the first information.
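The de-distortion operation can be sketched with a simplified radial model. This is an illustrative assumption, not the application's actual algorithm: only two radial coefficients (k1, k2) are modeled, and a fixed-point iteration inverts the distortion for each pixel.

```python
def undistort_point(u, v, fx, fy, cx, cy, k1, k2, iters=10):
    """Map a distorted pixel (u, v) to its undistorted pixel position.
    Assumes a simplified Brown model with radial terms k1, k2 only.
    Fixed-point iteration: start from the distorted normalized point and
    repeatedly divide out the radial factor."""
    xd = (u - cx) / fx          # distorted normalized coordinates
    yd = (v - cy) / fy
    x, y = xd, yd               # initial guess for the undistorted coords
    for _ in range(iters):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x = xd / factor
        y = yd / factor
    return (fx * x + cx, fy * y + cy)
```

For moderate distortion the iteration converges quickly, since the radial factor changes slowly near the solution.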
Step S420, according to the first information and the feature point information in the image target, the homography between the imaging planes of the AR display screen and the industrial camera is obtained; the homography satisfies the following relation:
P_2D(display) = H(testCam2display) · P_2D(testCam)
wherein:
P_2D(display) is the feature point information in the image target, namely the pixel coordinates of the virtual feature points in the imaging plane of the AR display screen;
P_2D(testCam) is the first information;
H(testCam2display) is the homography from the imaging plane of the industrial camera to the imaging plane of the AR display screen.
To adapt to different AR devices, the selected industrial camera supports lenses with different FOVs; since the image acquired by the industrial camera is de-distorted in step S410, the OST calibration error is reduced.
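A homography such as the one in step S420 is conventionally estimated from point correspondences by the Direct Linear Transform (DLT). The sketch below illustrates that standard technique; the function name and data are assumptions, not the application's implementation.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Direct Linear Transform: estimate H such that dst ~ H @ src
    in homogeneous coordinates.
    src_pts, dst_pts: (N, 2) arrays of corresponding 2D points, N >= 4."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # each correspondence contributes two linear constraints on h
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.array(rows)
    # the homography is the right-singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

In practice many feature points (far more than the minimum four) would be used, making the estimate robust to detection noise.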
Step S430, extracting feature points from the physical target data to obtain second information and third information, where the second information is the two-dimensional coordinate information of the physical feature points in the imaging coordinate system of the industrial camera, and the third information is the two-dimensional coordinate information of the physical feature points in the imaging coordinate system of the tracking camera.
Step S440, obtaining fourth information according to the second information and the homography, where the fourth information is the two-dimensional coordinate information of the physical feature points in the AR display screen imaging coordinate system.
Step S450, obtaining the three-dimensional coordinate information of the physical feature points in the imaging coordinate system of the tracking camera according to the third information, the physical dimensions of the physical feature points, and the internal parameters and distortion parameters of the tracking camera.
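One conventional way to realize a step like S450 for a single planar target is the standard planar decomposition H = s·[r1 r2 t]: the homography from the physical target plane to the undistorted, normalized image plane yields the plane's pose in the camera frame, from which the 3D coordinates of the points follow. The sketch below assumes the image points are already de-distorted and normalized (mapped through the inverse intrinsics); it is illustrative, not the application's actual algorithm.

```python
import numpy as np

def plane_points_in_camera(obj_xy, img_norm):
    """Recover the 3D camera-frame coordinates of feature points lying on a
    planar target with a known physical layout.
    obj_xy:   (N, 2) physical coordinates of the points on the target plane.
    img_norm: (N, 2) corresponding undistorted, normalized image points."""
    # --- DLT homography: img_norm ~ H @ [X, Y, 1] -------------------------
    rows = []
    for (X, Y), (x, y) in zip(obj_xy, img_norm):
        rows.append([-X, -Y, -1, 0, 0, 0, x * X, x * Y, x])
        rows.append([0, 0, 0, -X, -Y, -1, y * X, y * Y, y])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    # --- decompose H = s * [r1 r2 t] --------------------------------------
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    s = 2.0 / (np.linalg.norm(h1) + np.linalg.norm(h2))
    if s * h3[2] < 0:          # the target must lie in front of the camera
        s = -s
    r1, r2, t = s * h1, s * h2, s * h3
    # 3D position of each planar point (X, Y, 0) in the camera frame
    obj = np.asarray(obj_xy, dtype=float)
    return np.outer(obj[:, 0], r1) + np.outer(obj[:, 1], r2) + t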
Step S460, the calibration parameters are obtained according to the fourth information and the three-dimensional coordinate information. The calibration parameters comprise the external parameters and the projection matrix of the OST-AR device. Taking left-eye optimization as an example (the right eye is handled identically), the following relation applies:
residual = P_2D(display) − M · T(RGB2LeftEye) · P_3D(RGB)
wherein:
residual is the residual, i.e., the re-projection error (0 in the ideal case);
P_2D(display) is the fourth information;
P_3D(RGB) is the three-dimensional coordinate information of the physical feature points in the tracking camera imaging coordinate system;
M is the projection matrix of the OST-AR device;
T(RGB2LeftEye) is an external parameter of the OST-AR device, namely the pose transformation matrix from the tracking camera coordinate system to the left eye point.
Because the left and right industrial cameras are controlled independently, their optical axes are not strictly aligned with the normal at the center of the AR display screen. The improved algorithm therefore optimizes the projection matrix M as an asymmetric projection matrix, reducing the OST calibration error.
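The residual relation of step S460 can be sketched as a reprojection function; an optimizer would then minimize these residuals over M and T. All names here are illustrative, and representing M as a 3×3 matrix whose principal point lies off the image center (standing in for the "asymmetric" projection) is an assumption, not the application's actual parameterization.

```python
import numpy as np

def reprojection_residuals(p2d_display, p3d_rgb, M, T_rgb2eye):
    """Per-point residuals for the relation
        residual = P_2D(display) - project(M @ T(RGB2eye) @ P_3D(RGB)).
    p2d_display: (N, 2) physical feature points in the AR display imaging
                 frame (the 'fourth information').
    p3d_rgb:     (N, 3) physical feature points in the tracking camera frame.
    M:           (3, 3) projection of the eye point onto the display; letting
                 its principal point sit away from the image center makes the
                 projection asymmetric.
    T_rgb2eye:   (4, 4) pose transform, tracking camera -> eye point."""
    N = p3d_rgb.shape[0]
    homog = np.hstack([p3d_rgb, np.ones((N, 1))])      # (N, 4) homogeneous
    in_eye = (T_rgb2eye @ homog.T).T[:, :3]            # points in eye frame
    proj = (M @ in_eye.T).T                            # homogeneous pixels
    pix = proj[:, :2] / proj[:, 2:3]                   # perspective divide
    return p2d_display - pix                           # ideally all zeros
```

A nonlinear least-squares solver would stack these residuals and adjust M and the pose parameters of T until the total re-projection error is minimized.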
It should be understood that, although the steps in the related flowcharts are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times; these sub-steps or stages need not be performed sequentially and may alternate with other steps or with sub-steps or stages of other steps.
In one implementation, the following steps are performed for OST calibration:
Each device is powered on and supplied normally, and the six-axis motors reset to their preset initial positions, so that the setting surface of the stereoscopic target faces the industrial cameras.
Open the OST calibration software on the industrial personal computer and enter the model of the AR head-mounted display device under test.
Turn on the AR display screen of the device under test, place the AR head-mounted display device on the head mold, and adjust the headband so that the device is worn firmly without looseness.
Enter in the software interface the internal parameters of the two industrial cameras and of the tracking camera, the types and feature-point parameters of the virtual target (image target) and the stereoscopic target, the image resolution of the virtual target, the pixel coordinates of the initial feature points, and other parameters.
And clicking an automatic calibration button, and starting running OST calibration software.
The OST calibration software controls the telescopic rod to extend, raising the baffle, so that the industrial cameras can clearly photograph the AR display screen.
The OST calibration software controls the six-axis motors so that each of the two industrial cameras can completely and clearly photograph the virtual target in the AR display screen.
The OST calibration software controls the industrial cameras to collect and store the virtual target images (virtual target data) at the left- and right-eye positions.
The OST calibration software controls the AR head-mounted display device to turn off the screen.
OST calibration software controls the telescopic rod to retract, so that the baffle plate descends.
The OST calibration software controls the industrial camera to collect and store an image of the current setting surface of the stereoscopic target.
The OST calibration software runs the six-axis end-pose calculation algorithm: the pose change of the six-axis motor end is computed from the hand-eye calibration external parameters between the setting surface of the stereoscopic target and the six-axis motor end, the external parameters of the current setting surface relative to the industrial camera, and the external parameters of the setting surface relative to the industrial camera under the ideal OST calibration pose.
As shown in fig. 7, wherein:
P is the origin of the base coordinate system of the six-axis adjusting device (six-axis motor) of the stereoscopic target;
a0 is the current position of the end of the six-axis adjusting device of the stereoscopic target, and b0 is the corresponding position of the origin of the coordinate system of the setting surface of the stereoscopic target;
a1 is the end position of the six-axis adjusting device at the target pose of the stereoscopic target, and b1 is the corresponding position of the origin of the coordinate system of the setting surface;
the left dashed outline shows the initial position of the industrial camera, and the position below it is where the industrial camera collects the virtual target data; c is the origin of the industrial camera coordinate system.
The corresponding relationships can be read from the figure: Tc_b0 between c and b0, Tc_b1 between c and b1, Ta_b between a0 and b0 (and equally between a1 and b1, since the setting surface is rigidly fixed to the motor end), and Ta0_a1 between a0 and a1.
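Under the usual convention that T_x_y denotes the pose of frame y expressed in frame x, the end-pose increment follows from composing the transforms listed above; because the setting surface is rigidly fixed to the motor end, Ta_b is identical at both positions. The following is a minimal sketch of that composition, illustrative rather than the application's actual algorithm.

```python
import numpy as np

def end_pose_increment(T_c_b0, T_c_b1, T_a_b):
    """Pose increment of the six-axis end (a0 -> a1) needed to move the
    target setting surface from b0 to b1, expressed in the end frame:
        T_a0_a1 = T_a_b @ inv(T_c_b0) @ T_c_b1 @ inv(T_a_b),
    where T_a_b (hand-eye extrinsic, motor end -> setting surface) is
    constant because the target is rigidly fixed to the motor end."""
    return T_a_b @ np.linalg.inv(T_c_b0) @ T_c_b1 @ np.linalg.inv(T_a_b)
```

The derivation: T_b0_b1 = inv(T_c_b0) @ T_c_b1, and conjugating by T_a_b converts this setting-surface motion into the equivalent motor-end motion.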
The OST calibration software runs the six-axis motion control algorithm and drives the six-axis motor according to the end-pose change, moving the stereoscopic target to the ideal OST calibration pose.
The OST calibration software controls the industrial camera and the tracking camera to acquire and store the current stereoscopic target images (physical target data).
The OST calibration software runs the OST calibration algorithm: using the correspondences of the feature points in the stereoscopic target collected by the industrial camera and the tracking camera, the OST external parameters and projection matrix of the AR head-mounted display device are computed; the OST calibration results are saved locally and visualized on the display.
Close the OST calibration software or click the reset button; the motors reset to their preset initial positions.
Take the AR head-mounted display device off the head mold, close the industrial personal computer, and power off the equipment.
The OST automatic calibration system thus completes all calibration items, including the OST external parameters and the projection matrix.
As shown in figs. 8 and 9, the number in the upper left corner of fig. 8 is the average re-projection error and the number above each feature point is that point's re-projection error, in arcminutes; in the accuracy verification results, the maximum re-projection error of the application does not exceed 2 arcminutes.
The OST calibration system has a high degree of automation, improves OST calibration efficiency, and lowers the operating threshold. Because human factors have little influence, the calibration results are consistent and easy to standardize; the system also has good extensibility, can perform OST calibration on most products, and achieves high calibration accuracy.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination contains no contradiction, it should be considered within the scope of this description. When technical features of different embodiments appear in the same drawing, the drawing may be regarded as also disclosing a combination of the embodiments concerned.
The above examples express only a few embodiments of the application; their description is detailed but should not be construed as limiting the scope of the claims. It should be noted that those skilled in the art can make several variations and improvements without departing from the concept of the application, all of which fall within its scope of protection. Accordingly, the scope of protection of the application should be determined by the appended claims.