
CN119672123A - OST calibration system and OST calibration method - Google Patents


Info

Publication number
CN119672123A
Authority
CN
China
Prior art keywords
ost
target
calibration
information
industrial camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411657524.8A
Other languages
Chinese (zh)
Inventor
袁逸钊
张本好
黄菊
蒋坤君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shengyi Optical Sensing Technology Co ltd
Sunny Optical Zhejiang Research Institute Co Ltd
Original Assignee
Zhejiang Shengyi Optical Sensing Technology Co ltd
Sunny Optical Zhejiang Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shengyi Optical Sensing Technology Co ltd, Sunny Optical Zhejiang Research Institute Co Ltd filed Critical Zhejiang Shengyi Optical Sensing Technology Co ltd
Priority to CN202411657524.8A priority Critical patent/CN119672123A/en
Publication of CN119672123A publication Critical patent/CN119672123A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract


The present application relates to an OST calibration system and an OST calibration method. The OST calibration system is used to calibrate an OST-AR device that includes a correspondingly configured AR display screen and tracking camera, and comprises: a three-dimensional target on which multiple groups of feature points are distributed; an industrial camera used to simulate the human eye; a baffle having a first position that shields the three-dimensional target from the industrial camera and a second position that clears the optical path; and an industrial personal computer used to acquire virtual-target data and physical-target data and to compute the calibration parameters of the OST-AR device accordingly, wherein the virtual-target data is obtained, with the baffle in the first position, by the industrial camera photographing the image target displayed on the AR display screen, and the physical-target data is obtained, with the baffle in the second position, by the industrial camera photographing the three-dimensional target through the AR display screen and by the tracking camera photographing the three-dimensional target.

Description

OST calibration system and OST calibration method
Technical Field
The application relates to the technical field of OST calibration, in particular to an OST calibration system and an OST calibration method.
Background
Augmented Reality (AR) is an interactive real-time technology that allows users to perceive virtual objects as existing among real objects in the physical world. Currently, AR head-mounted displays use two main display modes: video see-through (VST) and optical see-through (OST). Although VST is easy to deploy, the human eye receives a video stream in which the camera's observation viewpoint does not coincide with the eye's actual viewpoint; the OST display mode therefore provides a more natural visual experience than VST.
With the progress of technology, the hardware devices of the OST head-mounted display become smaller and lighter, and with the addition of various sensors and high performance processors, virtual images can be calculated and rendered in real time. Currently, OST head mounted displays are widely used in entertainment, education, medical, and other fields.
However, in an OST head-mounted display it is difficult to align virtual content correctly with the real world, because the rendered content must be adapted to the eye position; OST calibration is therefore required.
At present, OST calibration mainly uses the single-point active alignment method: several markers are fixed in the real scene, the user observes them through the projection screen, and a mouse-controlled on-screen sight is aligned with each marker. However, recalibration is required once the device slips or the user changes. A common refinement therefore splits single-point active alignment into an off-line part and an on-line part, known as the two-step method.
When the existing two-step method performs off-line calibration, the physical target must be adjusted manually, making the calibration process tedious and time-consuming; in addition, the calibration reprojection error, which is affected by the physical target structure and the calibration algorithm, still leaves room for optimization.
Disclosure of Invention
The application provides an OST calibration system and an OST calibration method, which can improve the degree of automation and the universality.
The application discloses an OST calibration system, which is used for calibrating OST-AR equipment, wherein the OST-AR equipment comprises an AR display screen and a tracking camera which are correspondingly configured, and the OST calibration system comprises:
a three-dimensional target, on which a plurality of groups of feature points are distributed;
an industrial camera, for simulating the human eye;
a baffle, having a first position in which it shields the three-dimensional target from the industrial camera and a second position in which it clears the optical path;
an industrial personal computer, for acquiring virtual-target data and physical-target data and computing the calibration parameters of the OST-AR device accordingly, wherein the virtual-target data is obtained, with the baffle in the first position, by photographing the image target displayed on the AR display screen with the industrial camera, and the physical-target data is obtained, with the baffle in the second position, by photographing the three-dimensional target through the AR display screen with the industrial camera and by photographing the three-dimensional target with the tracking camera.
Several alternatives are provided below. They are not additional limitations on the overall scheme above but only further additions or preferences; each may be combined individually with the overall scheme, or multiple alternatives may be combined with one another, provided there is no technical or logical contradiction.
In one embodiment, the stereoscopic target includes a plurality of planar targets facing different directions and intersecting each other, each set of feature points is distributed on the corresponding planar target, and each feature point is circular.
In one embodiment, the OST calibration system further comprises a carrier device for mounting the OST-AR device, the industrial camera being mounted to the carrier device;
the carrier device is provided with an adjusting mechanism controlled by the industrial personal computer for changing the three-dimensional spatial attitude of the carrier device.
In one embodiment, there are two industrial cameras, which move independently of each other.
In one embodiment, the stereoscopic target and the industrial camera are respectively provided with six-axis adjusting devices controlled by the industrial personal computer and used for changing the respective three-dimensional space postures.
In one embodiment, the baffle is configured with a switching mechanism controlled by the industrial personal computer, and the switching mechanism is used for driving the baffle to switch between the first position and the second position.
The application also provides an OST calibration method, which comprises the following steps:
Step S100, providing, respectively, an OST-AR device to be calibrated, an industrial camera, a three-dimensional target and a baffle, wherein feature points are provided on the three-dimensional target, the OST-AR device comprises a correspondingly configured AR display screen and tracking camera, and the baffle has a first position in which it shields the three-dimensional target from the industrial camera and a second position in which it clears the optical path;
Step S200, photographing the image target displayed on the AR display screen with the industrial camera when the baffle is in the first position, to obtain virtual target data;
Step S300, photographing the three-dimensional target through the AR display screen with the industrial camera, and photographing it with the tracking camera, when the baffle is in the second position, to obtain physical target data;
Step S400, computing the calibration parameters of the OST-AR device according to the virtual target data and the physical target data.
In one embodiment, in step S400, computing the calibration parameters of the OST-AR device includes:
Step S410, extracting feature points from the virtual target data to obtain first information, wherein the first information is the two-dimensional coordinate information of the virtual feature points in the industrial camera imaging coordinate system;
Step S420, obtaining the homography between the imaging planes of the AR display screen and the industrial camera according to the first information and the feature point information in the image target;
Step S430, extracting feature points from the physical target data to obtain second information and third information, wherein the second information is the two-dimensional coordinate information of the physical feature points in the industrial camera imaging coordinate system, and the third information is the two-dimensional coordinate information of the physical feature points in the tracking camera imaging coordinate system;
Step S440, obtaining fourth information according to the second information and the homography relation, wherein the fourth information is the two-dimensional coordinate information of the physical feature points in the AR display screen imaging coordinate system;
Step S450, obtaining the three-dimensional coordinate information of the physical feature points in the tracking camera imaging coordinate system according to the third information, the physical dimensions of the physical feature points, and the internal and distortion parameters of the tracking camera;
Step S460, obtaining the calibration parameters according to the fourth information and the three-dimensional coordinate information.
In one embodiment, the feature points are circular, and step S410 further includes reading the internal and distortion parameters of the industrial camera and performing a de-distortion operation on the virtual feature points according to these parameters to obtain the first information.
In one embodiment, in step S460, the calibration parameters include the external parameters and the projection matrix of the OST-AR device, where the projection matrix is an asymmetric projection matrix.
The OST calibration system and OST calibration method described above can be used to calibrate AR head display devices. A clearer image can be obtained by means of the baffle; moreover, the baffle is easy to reposition, which adapts the system to different calibration scenarios and improves the universality of the equipment.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments or the conventional techniques of the present application, the drawings required for the descriptions of the embodiments or the conventional techniques will be briefly described below, and it is apparent that the drawings in the following descriptions are only some embodiments of the present application, and other drawings may be obtained according to the drawings without inventive effort for those skilled in the art.
FIG. 1 is a block diagram of an OST calibration system in accordance with one embodiment of the present application;
FIG. 2 is a schematic diagram of a portion of hardware of an OST calibration system according to an embodiment of the present application;
FIG. 3 is a schematic view of the three-dimensional target of FIG. 2;
FIG. 4 is a schematic diagram of functional modules of OST calibration software according to an embodiment of the present application;
FIG. 5 is a flow chart of an OST calibration method according to an embodiment of the application;
FIG. 6 is a flowchart showing the step S400 in FIG. 4;
FIG. 7 is a schematic diagram of a three-dimensional target movement pose calculation;
FIG. 8 is an interface schematic diagram of the re-projection error after implementing the OST calibration method of the present application;
Fig. 9 is an enlarged view of one set of feature points in fig. 8.
The reference numerals of the elements are as follows:
1. display; 2. industrial personal computer; 3. six-axis motor; 4. six-axis motor; 5. telescopic rod; 6. six-axis motor; 7. industrial camera; 8. industrial camera; 9. AR head display device; 10. tracking camera; 11. baffle; 12. three-dimensional target.
Detailed Description
In order that the above objects, features and advantages of the application will be readily understood, a more particular description of the application will be rendered by reference to the appended drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may be embodied in many other forms than described herein and similarly modified by those skilled in the art without departing from the spirit of the application, whereby the application is not limited to the specific embodiments disclosed below.
It will be understood that when an element is referred to as being "mounted" or "disposed" on another element, it can be directly on the other element or intervening elements may also be present. When a component is considered to be "connected" to another component, it can be directly connected to the other component or intervening components may also be present. The terms "vertical", "horizontal", "upper", "lower", "left", "right" and the like are used in the description of the present application for the purpose of illustration only and do not represent the only embodiment.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
In the present application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first feature is in direct contact with the second feature, or that the first and second features are in indirect contact through an intermediate medium. Moreover, a first feature being "above," "over" or "on" a second feature may mean that the first feature is directly above or obliquely above the second feature, or merely that the first feature is at a higher level than the second feature (or in a state of use, or at some viewing angle of the drawing). A first feature being "below," "beneath" or "under" a second feature may mean that the first feature is directly below or obliquely below the second feature, or merely that the first feature is at a lower level than the second feature (or in a state of use, or at some viewing angle of the drawing).
Unless defined otherwise, all technical and scientific terms used in the specification of the present application have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The term "and/or" as used in the description of the present application includes any and all combinations of one or more of the associated listed items.
In the prior art, during OST off-line calibration the target must lie simultaneously within the fields of view of the AR head display device's tracking camera and the industrial camera so that the feature points can be extracted. At present these operations are completed under manual control: the process is tedious, depends on operator experience, and cannot guarantee that every adjustment meets the calibration requirement. Automated calibration systems and equipment for OST off-line calibration are still at the development stage, and system-level automated calibration equipment is lacking.
The following is a description of related terms:
VR: Virtual Reality.
AR: Augmented Reality.
OST: Optical See-Through. By means of a half-transmissive, half-reflective optical system, a virtual image is reflected into the field of view and overlaid on it while the real world remains visible.
VST: Video See-Through. A camera captures a video stream of the scene, virtual information is superimposed onto the stream, and the processed stream is finally rendered on the display frame by frame.
OST calibration: the tracking camera and the AR display screen of an OST-AR device are calibrated to ensure that virtual images are aligned with and accurately projected onto the real world.
Referring to fig. 1 to 3, an embodiment of the present application provides an OST calibration system for calibrating an OST-AR device (i.e., the AR head display device 9). The OST-AR device includes a correspondingly configured AR display screen and tracking camera 10; the tracking camera 10 is used to photograph the stereoscopic target for data acquisition during OST calibration, the OST-AR device supports a screen-casting function, and an image target captured by the tracking camera 10 can be projected onto the AR display screen.
The OST calibration system of the present embodiment includes a stereoscopic target 12, industrial cameras, and a baffle 11. A carrier device may further be provided to mount the OST-AR device during calibration; it may adopt a head die or another supporting structure and, to change the three-dimensional spatial attitude of the OST-AR device, may be provided with an adjusting mechanism controlled by the industrial personal computer 2.
The industrial cameras are used to simulate the human eyes. In this embodiment there are two, simulating the left and right eye respectively: for example, industrial camera 7 acquires the virtual- and physical-target data at the left-eye position during OST calibration, and industrial camera 8 acquires the virtual- and physical-target data at the right-eye position.
The two industrial cameras are each provided with a six-axis motor: industrial camera 7 with six-axis motor 3, and industrial camera 8 with six-axis motor 4. Each six-axis motor consists of a three-axis translation motor and a three-axis rotation motor and controls the corresponding industrial camera in six degrees of freedom. Both industrial cameras may be mounted on the carrier device, i.e., they follow the carrier device together with the OST-AR device as a whole, while each camera can still move independently relative to the carrier device.
The stereoscopic target 12 has a substantially cubic structure and is provided with a six-axis motor 6 controlled by the industrial personal computer 2 for changing its three-dimensional spatial attitude. Three adjacent, mutually perpendicular faces of the stereoscopic target 12 carry planar targets of the same type, so that the industrial camera and the tracking camera 10 can acquire image information of multiple planar targets simultaneously.
Since the stereoscopic target 12 has three adjacent planar targets, one of them may be designated as the target setting surface for convenience of calculation, as shown by the dashed frame in fig. 3. A group of feature points is distributed on each planar target. As one improvement, the feature points adopted in the present application are circular and arranged in rows and columns, with the feature points of adjacent rows staggered; circular feature points are more favorable for recognition, computation, and optimization of the final calibration parameters, so the reprojection error can be further reduced.
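As an illustration of the staggered circular layout described above, the following minimal NumPy sketch (the function name and parameters are hypothetical, not taken from the patent) generates the planar object points of such a grid, with the circles of adjacent rows offset by half a spacing:

```python
import numpy as np

def asymmetric_circle_grid(rows, cols, spacing):
    """Object points for a staggered ("asymmetric") circle grid.

    Circles in adjacent rows are offset by half a spacing, matching the
    staggered row/column layout described for each planar target.
    rows, cols: grid dimensions; spacing: centre-to-centre distance.
    Returns an (rows*cols, 3) array of planar (z = 0) coordinates.
    """
    pts = []
    for r in range(rows):
        for c in range(cols):
            x = c * spacing + (spacing / 2 if r % 2 else 0.0)  # stagger odd rows
            pts.append((x, r * spacing, 0.0))
    return np.asarray(pts, dtype=float)
```

Such object points, paired with the detected circle centres in each image, give the 2-D/3-D correspondences used throughout the calibration.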
The baffle 11 is used to shield objects behind the AR display screen of the AR head display device 9. The baffle 11 may be provided with a switching mechanism controlled by the industrial personal computer 2, for example a telescopic rod 5 or another drive controlling its up-and-down motion, so that the baffle 11 can switch between a first position and a second position. In the first position the baffle 11 shields the objects behind the AR display screen, including at least the stereoscopic target 12, preventing the optical image of the stereoscopic target 12 from passing directly through the AR display screen into the industrial camera; this helps the industrial camera photograph the image target displayed on the AR display screen and acquire clear virtual-target data at the left-eye and right-eye positions.
In the second position the baffle 11 clears the stereoscopic target 12, so that the optical image of the stereoscopic target 12 passes directly through the AR display screen into the industrial camera, making it convenient for the industrial camera to photograph the stereoscopic target 12 through the AR display screen and acquire clear physical-target data.
The baffle 11 adopted in this embodiment is simple in structure and flexible in operation. There is, of course, no strict limitation on its shape or on the motion used to switch positions; a six-axis motor may also be provided if necessary to adapt to attitude changes of the AR head display device 9. This guarantees the shielding effect and benefits the clarity of image acquisition; the baffle 11 and the AR head display device 9 can follow each other adaptively yet be separated for easy adjustment, further improving the universality of the equipment.
The industrial personal computer 2 serves as the carrier for control, data acquisition, storage, command transmission, data processing and the software algorithms of the cameras and the various moving parts. It may be connected to a display 1, through which camera previews, the software interface, data visualization and so on are provided.
In the calibration process, the industrial personal computer 2 acquires virtual target data and physical target data and correspondingly calculates to acquire calibration parameters of the OST-AR equipment.
Taking the left-eye calibration as an example (with the components shown in fig. 2), the tracking camera 10 and the industrial camera 7 must acquire stereoscopic target images simultaneously; from the correspondence of the feature points, the OST external parameters (i.e., the external parameters of the OST-AR device) from the tracking camera 10 to the left eye point, and the projection matrix of the left eye point onto the AR projection screen, are calibrated.
The virtual-target data is obtained by photographing the image target displayed on the AR display screen with the industrial camera when the baffle 11 is in the first position; the physical-target data is obtained by photographing the stereoscopic target through the AR display screen with the industrial camera, and photographing the stereoscopic target with the tracking camera 10, when the baffle 11 is in the second position.
The industrial personal computer 2 is configured with corresponding software modules — the OST calibration software of fig. 4 — whose functional modules may include:
Logic control (module): sends the control instructions of all cameras, controls the extension and retraction of the telescopic rod, and switches the screen-casting of the AR head display device on and off.
Motion control (module): resolves the pose of each six-axis motor and controls its motion accordingly.
Algorithm calling (module): feeds the acquired data into the OST calibration algorithm for computation.
Result visualization (module): visualizes the results of the calibration algorithm on the software side for the user to view.
The OST calibration system can provide a planar-target mode and a stereoscopic-target mode to suit the tracking cameras of different AR head display devices, and thus has good extensibility: for example, the stereoscopic-target mode suits cases where the distortion of the AR head display device's tracking camera is small, while the planar-target mode suits cases where it is large.
Automatic control of the stereoscopic target's movement greatly reduces manual operation and improves calibration efficiency; for example, the six-axis end-pose increment that moves the stereoscopic target to the target pose can be computed directly through pose transformation, realizing automatic adjustment of the calibration.
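The pose-transformation step just mentioned can be sketched as a composition of homogeneous transforms. This illustrative snippet (not the patent's own algorithm) computes the increment that carries the target from its current end pose to a desired pose:

```python
import numpy as np

def pose_increment(T_current, T_target):
    """Six-axis end-pose increment taking the mechanism from its current
    pose to the desired pose, expressed in the current end frame:

        T_target = T_current @ T_delta   =>   T_delta = inv(T_current) @ T_target

    Both inputs are 4x4 homogeneous transforms (rotation + translation).
    """
    return np.linalg.inv(T_current) @ T_target
```

The translation part of the result gives the three translational motor increments directly; the rotation part would be converted to the three rotational axes by whatever angle convention the motor controller uses.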
The industrial cameras adopted by the OST calibration system support lenses with different FOVs, and each camera has an independent motion mechanism for pose adjustment, so the system can adaptively calibrate devices with optical engines of different FOVs and different form factors, giving a high degree of device adaptability. The two industrial cameras also have independent pose-control modules, allowing calibration equipment of different forms to be accommodated.
Referring to fig. 5 to fig. 6, an embodiment of the present application further provides an oss calibration method, which may be implemented by the oss calibration system of any one of the above embodiments, where the oss calibration method specifically includes:
Step S100, providing, respectively, an OST-AR device to be calibrated, an industrial camera, a three-dimensional target and a baffle, wherein feature points are provided on the three-dimensional target, the OST-AR device comprises a correspondingly configured AR display screen and tracking camera, and the baffle has a first position in which it shields the three-dimensional target from the industrial camera and a second position in which it clears the optical path.
Step S200, photographing the image target displayed on the AR display screen with the industrial camera when the baffle is in the first position, to obtain virtual target data.
Step S300, photographing the three-dimensional target through the AR display screen with the industrial camera, and photographing it with the tracking camera, when the baffle is in the second position, to obtain physical target data.
Step S400, computing the calibration parameters of the OST-AR device according to the virtual target data and the physical target data.
In one embodiment, in step S400, computing the calibration parameters of the OST-AR device includes:
Step S410, extracting feature points from the virtual target data to obtain first information, where the first information is the two-dimensional coordinate information of the virtual feature points in the industrial camera imaging coordinate system.
In this step, to further optimize the calibration effect, the internal and distortion parameters of the industrial camera are also read, and a de-distortion operation is performed on the virtual feature points according to these parameters to obtain the first information.
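The de-distortion operation mentioned here is commonly implemented as a fixed-point iteration that inverts the radial/tangential lens-distortion model. The NumPy sketch below shows one plausible form; the parameter layout (k1, k2, p1, p2) is an assumption for illustration, not taken from the patent:

```python
import numpy as np

def undistort_points(pts, K, dist, iterations=10):
    """Iteratively remove radial/tangential distortion from pixel points.

    pts: (N, 2) distorted pixel coordinates; K: 3x3 intrinsics;
    dist: (k1, k2, p1, p2) distortion coefficients (assumed layout).
    Returns (N, 2) undistorted pixel coordinates.
    """
    k1, k2, p1, p2 = dist
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    # Normalize to the camera plane.
    x = (pts[:, 0] - cx) / fx
    y = (pts[:, 1] - cy) / fy
    x0, y0 = x.copy(), y.copy()
    for _ in range(iterations):  # fixed-point iteration on the model inverse
        r2 = x * x + y * y
        radial = 1 + k1 * r2 + k2 * r2 * r2
        dx = 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
        dy = p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
        x = (x0 - dx) / radial
        y = (y0 - dy) / radial
    return np.stack([x * fx + cx, y * fy + cy], axis=1)
```

For the small distortions typical of calibrated industrial lenses, a few iterations converge to sub-pixel accuracy.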
Step S420, according to the first information and the feature point information in the image target, obtain the homography between the imaging planes of the AR display screen and the industrial camera, which can be written as:
P_2D(display) = H(testCam→display) · P_2D(testCam)
wherein:
P_2D(display) is the feature point information in the image target, i.e., the pixel coordinates of the virtual feature points in the imaging plane of the AR display screen;
P_2D(testCam) is the first information;
H(testCam→display) is the homography from the imaging plane of the industrial camera to the imaging plane of the AR display screen.
To adapt to different AR devices, the selected industrial camera supports carrying lenses with different FOVs; and since the images acquired by the industrial camera are de-distorted in step S410, the OST calibration error is reduced.
Step S430, extracting feature points from the physical target data to obtain second information and third information, where the second information is the two-dimensional coordinate information of the physical feature points in the industrial camera imaging coordinate system, and the third information is the two-dimensional coordinate information of the physical feature points in the tracking camera imaging coordinate system.
Step S440, obtaining fourth information according to the second information and the homography relation, where the fourth information is the two-dimensional coordinate information of the physical feature points in the AR display screen imaging coordinate system.
Step S450, obtaining the three-dimensional coordinate information of the physical feature points in the tracking camera imaging coordinate system according to the third information, the physical dimensions of the physical feature points, and the internal and distortion parameters of the tracking camera.
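Once the tracking-camera pixels are de-distorted, lifting them to 3-D camera coordinates reduces to pinhole back-projection. In the illustrative sketch below the depth Z is supplied directly for brevity, whereas step S450 derives it from the target's known physical dimensions (e.g., via a PnP-style solve); the function name is hypothetical:

```python
import numpy as np

def backproject(pts, K, depth):
    """Lift undistorted pixel points to 3-D camera coordinates at depth Z.

    pts: (N, 2) undistorted pixel coordinates; K: 3x3 intrinsics;
    depth: known Z in the camera frame. Inverts the pinhole model
    u = fx * X / Z + cx, v = fy * Y / Z + cy.
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    X = (pts[:, 0] - cx) * depth / fx
    Y = (pts[:, 1] - cy) * depth / fy
    Z = np.full(len(pts), depth, dtype=float)
    return np.stack([X, Y, Z], axis=1)
```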
Step S460, obtaining the calibration parameters according to the fourth information and the three-dimensional coordinate information. The calibration parameters comprise the external parameters and the projection matrix of the OST-AR device. Taking the left-eye optimization as an example (the right eye is handled in the same way), the following relation holds:

residual = P_2D(leftEye) − proj( M · T(RGB2leftEye) · P_3D(RGB) )

wherein:
residual is the residual, i.e. the re-projection error (0 in the ideal case);
P_2D(leftEye) is the fourth information;
P_3D(RGB) is the three-dimensional coordinate information of the physical feature points in the tracking camera imaging coordinate system;
M is the projection matrix of the OST-AR device;
T(RGB2leftEye) is the external parameter of the OST-AR device, i.e. the pose transformation matrix from the tracking camera coordinate system to the left eye point;
proj(·) denotes the perspective division from homogeneous to pixel coordinates.
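The residual relation above can be sketched numerically. In this illustrative snippet (names are hypothetical; M is taken as a 3×4 projection matrix and T as a 4×4 homogeneous transform, conventions the patent does not spell out), the residual is the difference between the projected physical feature point and its position in the AR display imaging plane:

```python
import numpy as np

def reprojection_residual(p2d_leftEye, p3d_rgb, M, T_rgb2leftEye):
    """residual = P_2D(leftEye) - proj(M @ T(RGB2leftEye) @ P_3D(RGB))."""
    Xh = np.append(p3d_rgb, 1.0)           # homogeneous 3-D point (RGB frame)
    X_eye = T_rgb2leftEye @ Xh             # tracking camera -> left eye point
    uvw = M @ X_eye                        # project with the OST matrix
    return p2d_leftEye - uvw[:2] / uvw[2]  # perspective division, then diff

# Self-check: with consistent inputs the residual is zero (the ideal case).
K = np.array([[800., 0., 640.], [0., 800., 360.], [0., 0., 1.]])
M = np.hstack([K, np.zeros((3, 1))])       # pinhole-like projection
T = np.eye(4); T[:3, 3] = [0.03, 0.0, 0.0] # small eye offset
p3d = np.array([0.1, -0.05, 0.6])
uvw = M @ (T @ np.append(p3d, 1.0))
p2d = uvw[:2] / uvw[2]
res = reprojection_residual(p2d, p3d, M, T)
```

In the actual optimization, M and T(RGB2leftEye) would be adjusted (e.g. by non-linear least squares) to minimize the sum of these residuals over all feature points.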
Because the left and right industrial cameras are controlled independently, their optical axes are not strictly aligned with the normal at the center of the AR projection screen; the improved algorithm therefore optimizes the projection matrix M in the form of an asymmetric projection matrix, reducing the OST calibration error.
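One common way to realize such an asymmetric (off-axis) projection is an OpenGL-style frustum whose left/right and top/bottom extents are derived from a per-eye principal point that need not sit at the screen center. The sketch below is an assumption-laden illustration, not the patent's formula (the symbols fx, fy, cx, cy and the y-up convention are assumed):

```python
import numpy as np

def asymmetric_projection(fx, fy, cx, cy, width, height, near, far):
    """OpenGL-style off-axis projection from pinhole-like eye intrinsics.
    When (cx, cy) is off-center, the frustum — and the matrix — become
    asymmetric, which is exactly the degree of freedom the text describes."""
    l = -cx * near / fx                  # left/right frustum extents at near
    r = (width - cx) * near / fx
    b = -(height - cy) * near / fy       # bottom/top (y-up convention)
    t = cy * near / fy
    return np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

# Centered principal point -> symmetric frustum; off-center -> skew terms.
P_sym = asymmetric_projection(500., 500., 320., 240., 640, 480, 0.1, 100.0)
P_asym = asymmetric_projection(500., 500., 400., 240., 640, 480, 0.1, 100.0)
```

The off-diagonal entries P[0, 2] and P[1, 2] are zero only for a centered principal point; optimizing them is what distinguishes the asymmetric matrix from a fixed symmetric one.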
It should be understood that, although the steps in the related flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily completed at the same moment but may be performed at different times; these sub-steps or stages are likewise not necessarily performed sequentially, but may be performed in turn or in alternation with at least part of the other steps or of their sub-steps or stages.
In one implementation, the following steps are performed for OST calibration:
Each device is powered on and supplied normally, and the six-axis motors are reset to the preset initial position so that the setting surface of the three-dimensional target faces the industrial camera.
Open the OST calibration software on the industrial personal computer and enter the model of the AR head display device under test.
Turn on the AR display screen of the AR head display device under test, place the device on the head mould, and adjust the head band so that the device is worn firmly without loosening.
Enter into the software interface parameters such as the internal parameters of the two industrial cameras and of the tracking camera, the types and feature point parameters of the virtual target (image target) and of the three-dimensional target, the image resolution of the virtual target, and the pixel coordinates of the initial feature points.
Click the automatic calibration button to start the OST calibration software.
The OST calibration software controls the telescopic rod to extend so that the baffle rises and the industrial camera can clearly photograph the AR display screen.
The OST calibration software controls the six-axis motors to adjust until each of the two industrial cameras can completely and clearly photograph the virtual target in the AR display screen.
The OST calibration software controls the industrial cameras to collect and store the virtual target images (virtual target data) at the left- and right-eye positions.
The OST calibration software controls the AR head display device to turn off its screen.
The OST calibration software controls the telescopic rod to retract so that the baffle descends.
The OST calibration software controls the industrial camera to collect and store an image of the setting surface of the current three-dimensional target.
The OST calibration software runs the end-pose calculation algorithm of the six-axis motor: the pose change of the six-axis motor end is calculated from the hand-eye calibration extrinsics between the setting surface of the three-dimensional target and the six-axis motor end, the extrinsics of the current setting surface relative to the industrial camera, and the extrinsics of the setting surface relative to the industrial camera under the ideal OST calibration pose.
As shown in fig. 7, wherein:
P is the origin of the base coordinate system of the six-axis adjusting device (six-axis motor) of the three-dimensional target;
a0 is the current end position of the six-axis adjusting device of the three-dimensional target, and correspondingly b0 is the position of the origin of the setting-surface coordinate system of the three-dimensional target;
a1 is the end position of the six-axis adjusting device under the target pose of the three-dimensional target, and correspondingly b1 is the position of the origin of the setting-surface coordinate system of the three-dimensional target;
the left broken line indicates the initial position of the industrial camera, and the position below it is where the industrial camera collects the virtual target data, with c being the origin of the industrial camera coordinate system.
From the figure the following correspondences can be obtained: Tc_b0 between c and b0, Tc_b1 between c and b1, Ta_b between a0 and b0 (and likewise between a1 and b1, since the hand-eye extrinsic is constant), and Ta0_a1 between a0 and a1.
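The end-pose change Ta0_a1 can be chained from these correspondences. A minimal numpy sketch, under the assumption that each T_x_y is a 4×4 homogeneous transform expressing frame y in frame x (the patent does not fix this convention):

```python
import numpy as np

def end_pose_change(T_c_b0, T_c_b1, T_a_b):
    """Ta0_a1 from: the constant hand-eye extrinsic Ta_b (setting surface b
    expressed in the end frame a), and the setting surface seen from the
    camera at the current (b0) and ideal (b1) poses."""
    T_c_a0 = T_c_b0 @ np.linalg.inv(T_a_b)   # current end pose in camera frame
    T_c_a1 = T_c_b1 @ np.linalg.inv(T_a_b)   # target end pose in camera frame
    return np.linalg.inv(T_c_a0) @ T_c_a1    # motion a0 -> a1

def _T(angle_z, t):
    """Helper: homogeneous transform with a z-rotation and translation t."""
    c, s = np.cos(angle_z), np.sin(angle_z)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = t
    return T

# Synthetic check: a known end motion is recovered from the three extrinsics.
T_a_b = _T(0.1, [0.0, 0.2, 0.05])
T_c_a0 = _T(-0.3, [0.1, 0.0, 1.0])
Ta0_a1_true = _T(0.25, [0.02, -0.03, 0.1])
T_c_b0 = T_c_a0 @ T_a_b
T_c_b1 = (T_c_a0 @ Ta0_a1_true) @ T_a_b
Ta0_a1 = end_pose_change(T_c_b0, T_c_b1, T_a_b)
```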
The OST calibration software runs the six-axis mechanism motion control algorithm and controls the six-axis motor to move according to the end pose change, so that the three-dimensional target is moved to the ideal OST calibration pose.
The OST calibration software controls the industrial camera and the tracking camera to collect and store the current stereoscopic target images (entity target data).
The OST calibration software runs the OST calibration algorithm: using the correspondences of the feature points in the three-dimensional target collected by the industrial camera and the tracking camera, it calculates the OST extrinsics and the projection matrix of the AR head display device, saves the OST calibration result locally, and visualizes it on the display screen.
Close the OST calibration software or click the reset button, and the motors return to the preset initial position.
Remove the AR head display device from the head mould, shut down the industrial personal computer, and power off the equipment.
The OST automatic calibration system thus completes all calibration items, including the OST extrinsics and the projection matrix.
As shown in fig. 8 and 9, the number in the upper left corner of fig. 8 is the average re-projection error and the number above each feature point is that point's re-projection error, the unit being the arcminute; in the accuracy verification results, the maximum re-projection error of the present application does not exceed 2 arcminutes.
The OST calibration system has a high degree of automation: it improves OST calibration efficiency, lowers the operation threshold, and is little affected by human factors; it achieves calibration results with good consistency, is convenient for standardized implementation, has good expandability, can perform OST calibration on most products, and achieves high calibration accuracy.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination contains no contradiction, it should be considered within the scope of this description. When technical features of different embodiments are embodied in the same drawing, the drawing can be regarded as also disclosing a combination of the embodiments concerned.
The above examples illustrate only a few embodiments of the application; their description is specific and detailed but is not therefore to be construed as limiting the scope of the claims. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the spirit of the application, all of which fall within the scope of protection of the application. Accordingly, the scope of protection of the application shall be determined by the appended claims.

Claims (10)

1. An OST calibration system for calibrating an OST-AR device, the OST-AR device including a correspondingly configured AR display and tracking camera, the OST calibration system comprising:
The three-dimensional target is distributed with a plurality of groups of characteristic points;
An industrial camera for simulating a human eye;
the baffle is provided with a first position for shielding the stereoscopic target for the industrial camera and a second position for opposite avoidance;
The industrial personal computer is used for acquiring virtual body target data and entity target data and correspondingly calculating to obtain calibration parameters of the OST-AR equipment, wherein the virtual body target data is obtained by shooting an image target displayed by the AR display screen through the industrial camera when the baffle is at a first position, and the entity target data is obtained by shooting the three-dimensional target through the AR display screen through the industrial camera and shooting the three-dimensional target through the tracking camera when the baffle is at a second position.
2. The OST calibration system of claim 1, wherein the solid target comprises a plurality of planar targets oriented differently and intersecting each other, each set of feature points being distributed on a corresponding planar target and each feature point being circular.
3. The OST calibration system according to claim 1, further comprising a carrier device for mounting an OST-AR device, the industrial camera being mounted to the carrier device;
the bearing equipment is provided with an adjusting mechanism controlled by the industrial personal computer and used for changing the three-dimensional space posture of the bearing equipment.
4. The OST calibration system of claim 3, wherein the industrial camera is two and moves independently of each other.
5. The OST calibration system of claim 1, wherein the stereo target and the industrial camera are respectively configured with six-axis adjusting devices controlled by the industrial personal computer for changing their respective three-dimensional spatial attitudes.
6. The OST calibration system of claim 1 wherein the shutter is configured with a switching mechanism controlled by the industrial personal computer for driving the shutter to switch between the first position and the second position.
7. An OST calibration method, comprising:
Step S100, respectively providing OST-AR equipment to be calibrated, an industrial camera, a three-dimensional target and a baffle, wherein the three-dimensional target is provided with characteristic points, the OST-AR equipment comprises an AR display screen and a tracking camera which are correspondingly configured, and the baffle is provided with a first position for shielding the three-dimensional target by the industrial camera and a second position for avoiding the three-dimensional target relatively;
Step S200, when the baffle is at the first position, shooting the image target displayed by the AR display screen through the industrial camera to obtain virtual target data;
Step S300, when the baffle is at the second position, the three-dimensional target is respectively shot through the AR display screen by the industrial camera and the tracking camera to obtain solid target data;
Step S400, calculating and obtaining the calibration parameters of the OST-AR device according to the virtual target data and the entity target data.
8. The OST calibration method of claim 7, wherein in step S400, calculating the calibration parameters of the OST-AR device comprises:
step S410, extracting feature points of the virtual body target data to obtain first information, wherein the first information is two-dimensional coordinate information of the virtual body feature points in an industrial camera imaging coordinate system;
Step S420, obtaining homography of imaging planes of the AR display screen and the industrial camera according to the first information and the feature point information in the image target;
Step S430, extracting feature points of the entity target data to obtain second information and third information, wherein the second information is two-dimensional coordinate information of the entity feature points in an imaging coordinate system of the industrial camera, and the third information is two-dimensional coordinate information of the entity feature points in an imaging coordinate system of the tracking camera;
Step S440, obtaining fourth information according to the second information and the homography relation, wherein the fourth information is two-dimensional coordinate information of the entity characteristic points in an AR display screen imaging coordinate system;
step S450, obtaining three-dimensional coordinate information of the physical feature points in the imaging coordinate system of the tracking camera according to the third information, the physical dimensions of the physical feature points and the internal parameters and distortion parameters of the tracking camera;
And step S460, obtaining the calibration parameters according to the fourth information and the three-dimensional coordinate information.
9. The OST calibration method according to claim 8, wherein the feature points are circular, and step S410 further comprises reading internal parameters and distortion parameters of the industrial camera, and performing de-distortion operation on the virtual feature points according to the internal parameters and distortion parameters of the industrial camera to obtain the first information.
10. The OST calibration method according to claim 8, wherein in step S460 the calibration parameters include an external parameter of the OST-AR device and a projection matrix, wherein the projection matrix is an asymmetric projection matrix.
CN202411657524.8A 2024-11-19 2024-11-19 OST calibration system and OST calibration method Pending CN119672123A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411657524.8A CN119672123A (en) 2024-11-19 2024-11-19 OST calibration system and OST calibration method


Publications (1)

Publication Number Publication Date
CN119672123A true CN119672123A (en) 2025-03-21

Family

ID=94994053



Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120339415A (en) * 2025-06-16 2025-07-18 杭州秋果计划科技有限公司 Calibration detection method and system for head mounted display device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination