Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The lens distortion correction coefficient calibration method provided by the embodiments of the present application can be applied to a processor. The processor is connected to a camera to be tested in order to acquire images captured by that camera. The camera to be tested is provided with an internal or external rotating device, and the processor may also be connected to the rotating device so as to rotate the camera by controlling that device. Furthermore, in actual use, the camera to be tested can obtain its distortion correction coefficient by adopting the lens distortion correction coefficient calibration method provided by the embodiments of the present application, and the acquired images can then be subjected to distortion correction.
In one embodiment, as shown in FIG. 1, a lens distortion correction coefficient calibration method is provided. The method is applied, by way of example, to a processor in the application environment described above, and includes the following steps:
Step S100, a plurality of first images collected by a camera to be tested are acquired, and the principal point coordinates of the camera to be tested are determined.
Wherein the principal point may be the point where the optical axis intersects the image plane in the ideal case. For example, the image center point may be used as the principal point when calibrating the distortion correction coefficient.
The plurality of first images includes a plurality of first images of vertical light rays and a plurality of first images of horizontal light rays. A target is arranged in the field of view of the camera and is used to reflect the vertical and horizontal light rays so that the reflected light enters the camera. Furthermore, the optical axis of the camera to be tested can be determined from the outer form of the camera, and the target is arranged on a plane perpendicular to the optical axis. A first image of vertical light may be a first image in which light projected onto the target appears as a vertical line in the picture of the camera to be tested; correspondingly, a first image of horizontal light may be a first image in which light projected onto the target appears as a horizontal line in that picture.
Acquiring the plurality of first images collected by the camera to be tested and determining the principal point coordinates of the camera to be tested means determining the principal point coordinates from the plurality of first images of vertical light rays and the plurality of first images of horizontal light rays. For example, the principal point coordinates may be determined according to how the light rays are distorted in the plurality of first images.
It will be appreciated that the picture distortion of a camera is typically due to inherent characteristics of the optical lens: for example, a convex lens converges light rays and a concave lens diverges them, which results in picture distortion. A ray arriving at the principal point, i.e., the point where the optical axis intersects the image plane, travels along the optical axis and strikes the lens surfaces perpendicularly, so its refraction angle is 0°. The principal point therefore exhibits the minimum degree of picture distortion and can be regarded as a point where no distortion exists.
The center of the picture plane can be determined through the determination of the principal point coordinates, so that the determination of the subsequent distortion coefficients is facilitated.
Step S200, controlling the camera to be tested to rotate according to a preset rotation rule, collecting a plurality of second images and recording the rotation angle of each second image.
The plurality of second images comprises a plurality of second images of vertical light rays and a plurality of second images of horizontal light rays. A second image of vertical light may be a second image in which light projected onto the target appears as a vertical line in the picture of the camera to be tested; correspondingly, a second image of horizontal light may be a second image in which light projected onto the target appears as a horizontal line in that picture.
The camera to be tested is controlled to rotate according to a preset rotation rule, wherein the preset rotation rule can be that the camera rotates around one axis or a plurality of axes according to a preset angle sequence or a preset mode, so that as many spatial orientations as possible are covered, and a more comprehensive data set is obtained.
By controlling the camera to rotate and shoot according to a given rule, image data at different viewing angles can be obtained, revealing how the camera picture is distorted at different angles.
The rotation angle of each second image is recorded, so that each second image corresponds to one rotation angle respectively, and the subsequent determination of the distortion correction coefficient is facilitated.
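The capture procedure of step S200 can be sketched as a simple loop. This is an illustrative sketch only, not part of the claimed method: the `rotate_to` and `capture` interfaces are hypothetical stand-ins for the rotating device and the camera to be tested.

```python
# Sketch of the step S200 capture loop, assuming hypothetical
# rotate_to(angle) and capture() interfaces for the rotating device
# and the camera under test (stand-ins, not a real device API).

def collect_second_images(rotate_to, capture, start=0.0, step=5.0, stop=30.0):
    """Rotate the camera in preset steps, grab an image at each angle,
    and record the rotation angle alongside the image."""
    records = []
    angle = start
    while angle <= stop:
        rotate_to(angle)                 # drive the rotating device
        image = capture()                # acquire one frame at this angle
        records.append((angle, image))   # keep the angle with its image
        angle += step
    return records

# Usage with stub devices (no hardware needed):
angles_seen = []
records = collect_second_images(
    rotate_to=angles_seen.append,        # stub: just log the commanded angle
    capture=lambda: "frame",             # stub: dummy image
)
print([a for a, _ in records])  # [0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0]
```

The essential point is only that each second image is stored together with its rotation angle, which the later fitting steps rely on.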
Step S300, determining a distortion correction coefficient of the camera to be tested according to the plurality of second images, the rotation angle of each second image and the principal point coordinates.
The distortion correction coefficient may be a set of values representing the degree of nonlinear distortion caused by the lens of the camera, and may include, for example, a radial distortion correction coefficient that deals with lens bending, a tangential distortion correction coefficient that deals with lens mounting errors, and the like.
The distortion curve is fitted and/or interpolated using the collected data, which comprise the second images at different angles and the corresponding angle values, and an optimization is solved by minimizing the error between predicted values and actual observations, finally yielding the optimal distortion correction coefficient. Further, the distortion curve may be fitted using, for example, the Brown-Conrady model or other existing models; the present embodiment is not limited in this respect.
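The least-squares fitting described above can be illustrated with a minimal sketch. The Brown-Conrady radial form `r_d = r(1 + k1*r^2 + k2*r^4)` is linear in the coefficients, so the fit reduces to linear least squares; the data below are synthetic, not measured values.

```python
import numpy as np

# Illustrative least-squares fit of radial distortion coefficients
# (Brown-Conrady style: r_d = r * (1 + k1*r**2 + k2*r**4)).
# The sample data below are synthetic, not measured values.

def fit_radial_coeffs(r_ideal, r_distorted):
    """Solve for k1, k2 minimizing the residual between predicted and
    observed distorted radii (linear least squares in k1, k2)."""
    r = np.asarray(r_ideal, dtype=float)
    rd = np.asarray(r_distorted, dtype=float)
    # r_d - r = k1*r**3 + k2*r**5  ->  A @ [k1, k2] = (r_d - r)
    A = np.stack([r**3, r**5], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, rd - r, rcond=None)
    return coeffs  # [k1, k2]

# Synthetic check: generate data with known k1, k2 and recover them.
r = np.linspace(0.05, 1.0, 20)
k1_true, k2_true = -0.20, 0.05
rd = r * (1 + k1_true * r**2 + k2_true * r**4)
k1, k2 = fit_radial_coeffs(r, rd)
print(round(k1, 6), round(k2, 6))  # ≈ -0.2 and 0.05
```

In practice the optimization would be driven by the observed principal point displacements rather than synthetic radii, but the minimization structure is the same.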
The obtained distortion correction coefficient is applied to image processing software, so that the problem of image deformation caused by lens distortion can be corrected, and the overall performance of a vision system is improved.
In the above lens distortion correction coefficient calibration method, a plurality of first images collected by the camera to be tested are acquired and the principal point coordinates of the camera are determined, the plurality of first images comprising a plurality of first images of vertical light rays and a plurality of first images of horizontal light rays. The camera to be tested is then controlled to rotate according to a preset rotation rule, a plurality of second images are collected, and the rotation angle of each second image is recorded, the plurality of second images comprising a plurality of second images of vertical light rays and a plurality of second images of horizontal light rays. The distortion correction coefficient of the camera to be tested is determined according to the plurality of second images, the rotation angle of each second image and the principal point coordinates. Because the principal point coordinates are found using the vertical and horizontal light rays, and the distortion correction coefficient is then determined from the second images, the method achieves the effect of improving distortion calibration accuracy under low-resolution conditions.
In one embodiment, acquiring the plurality of first images collected by the camera to be tested and determining the principal point coordinates of the camera to be tested includes:
when the light source device projects vertical light on the target, acquiring a plurality of first sub-images acquired by the camera to be tested at a plurality of angles;
When the light source device projects horizontal light on the target, acquiring a plurality of second sub-images acquired by the camera to be tested at a plurality of angles;
and determining the principal point coordinates of the camera to be tested according to the plurality of first sub-images and the plurality of second sub-images.
The light source device may be a device for generating a specific mode of light, and in this embodiment, the light source device may be a device for generating vertical light or horizontal light, respectively, and projecting the vertical light or the horizontal light onto the target. In a specific embodiment, the light source device may be a line light source.
In this embodiment, the first image of the vertical light may be a first sub-image, and the first image of the horizontal light may be a second sub-image.
It can be understood that the first sub-images and second sub-images may be acquired at multiple angles. For example, the vertical light and the horizontal light may be projected in a translating manner, so that vertical and horizontal light rays at different positions are projected onto the target in sequence, thereby obtaining first sub-images and second sub-images acquired at multiple angles. In another example, the camera may be rotated according to a preset rotation rule so that the positions of the vertical and horizontal light rays in the image move, likewise yielding first sub-images and second sub-images acquired at multiple angles.
The principal point coordinates of the camera to be tested are determined according to the plurality of first sub-images and the plurality of second sub-images; that is, based on how the light rays are distorted in the first and second sub-images, the point least affected by lens distortion is determined as the principal point, and its coordinate position in the picture is taken as the principal point coordinates.
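One way to realize "the point least affected by lens distortion" is to fit a straight line to the projected line's detected points in each sub-image and pick the image with the smallest residual. This is a minimal sketch under that assumption; the point lists are synthetic stand-ins for real line detections.

```python
import numpy as np

# Sketch: fit a straight line x = a*y + b to the detected vertical-line
# points in each sub-image and take the image with the smallest residual;
# its mean x gives the principal point abscissa. Points are synthetic.

def straightness_residual(points):
    """RMS residual of a straight-line fit x = a*y + b."""
    ys = np.array([p[1] for p in points], float)
    xs = np.array([p[0] for p in points], float)
    a, b = np.polyfit(ys, xs, 1)
    return float(np.sqrt(np.mean((xs - (a * ys + b)) ** 2)))

def principal_abscissa(line_point_sets):
    """Return (index of straightest line, its mean x as the abscissa)."""
    residuals = [straightness_residual(pts) for pts in line_point_sets]
    best = int(np.argmin(residuals))
    xs = [p[0] for p in line_point_sets[best]]
    return best, float(np.mean(xs))

# Two synthetic lines: one bowed by distortion, one perfectly straight.
bowed = [(320 + 0.02 * (y - 240) ** 2 / 100, y) for y in range(0, 480, 40)]
straight = [(320.0, y) for y in range(0, 480, 40)]
best, cx = principal_abscissa([bowed, straight])
print(best, cx)  # 1 320.0
```

The same procedure applied to the horizontal-line sub-images gives the ordinate.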
According to the above lens distortion correction coefficient calibration method, the principal point coordinates are determined by combining image data in the vertical and horizontal directions, so that the estimate of the principal point coordinates is more accurate, achieving the effect of improving distortion calibration accuracy under low-resolution conditions.
In one embodiment, acquiring the plurality of first images collected by the camera to be tested and determining the principal point coordinates of the camera to be tested includes:
When the camera to be tested is in a first state and the light source device projects vertical light on the target, controlling the camera to be tested to rotate by a preset rotation angle along a preset rotation direction by taking a preset initial position as a starting position, collecting a plurality of first sub-images and recording the rotation angle of each first sub-image;
When the camera to be tested is in the second state and the light source device projects vertical light on the target, the camera to be tested is controlled to rotate by a preset rotation angle along a preset rotation direction by taking a preset initial position as a starting position, a plurality of second sub-images are collected, and the rotation angle of each second sub-image is recorded.
The first state may be a first placement state, and the second state may be a second placement state.
In this embodiment, the optical axes of the first state and the second state at the preset initial position may be the same.
Rotating by the preset rotation angle along the preset rotation direction may mean that, taking the preset initial position as the starting position, the camera is rotated along the preset rotation direction in steps of the preset rotation angle, an image is acquired at each stepped rotation angle, and the rotation angle corresponding to each image is recorded.
In a specific embodiment, with the position of the optical axis unchanged, the camera to be tested in the first state can be rotated by 90° to enter the second state, or the camera in the second state can be rotated by 90° to enter the first state, so that the vertical light rays on the target become horizontal light rays in the coordinate system of the camera to be tested, thereby acquiring a plurality of images of vertical light rays and a plurality of images of horizontal light rays.
In one embodiment, determining the principal point coordinates of the camera to be tested according to the plurality of first sub-images and the plurality of second sub-images includes:
Determining a first undistorted image in the plurality of first sub-images and a second undistorted image in the plurality of second sub-images;
Determining an abscissa corresponding to the principal point coordinate according to the first undistorted image;
and determining the ordinate corresponding to the principal point coordinate according to the second undistorted image.
The undistorted image may be an image that is not affected by significant lens distortion during photographing.
Determining the first undistorted image among the plurality of first sub-images may mean determining, from the plurality of first sub-images, the image least affected by lens distortion as the first undistorted image, for example by fitting the vertical light ray or by counting the number of pixel columns occupied by the vertical light ray in each image.
Correspondingly, determining the second undistorted image among the plurality of second sub-images may mean determining, from the plurality of second sub-images, the image least affected by lens distortion as the second undistorted image, for example by fitting the horizontal light ray or by counting the number of pixel rows occupied by the horizontal light ray in each image.
It will be appreciated that, since vertical and horizontal lines are projected, when there is no lens distortion the lines in the first and second undistorted images are, in theory, perfectly straight and parallel to the corresponding image edges.
Therefore, determining the abscissa corresponding to the principal point coordinates according to the first undistorted image may mean taking the pixel abscissa of one or more points on the vertical light ray in the first undistorted image as the abscissa value of the principal point coordinates.
Correspondingly, determining the ordinate corresponding to the principal point coordinates according to the second undistorted image may mean taking the pixel ordinate of one or more points on the horizontal light ray in the second undistorted image as the ordinate value of the principal point coordinates.
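Combining the two undistorted images then amounts to averaging the line coordinates. This is a minimal sketch; the pixel coordinates below are assumed detections, not measured values.

```python
# Minimal sketch of combining the two undistorted images: the vertical
# line's column positions give the principal point abscissa and the
# horizontal line's row positions give the ordinate. Values are assumed.

def principal_point(vertical_line_xs, horizontal_line_ys):
    """Average the detected line coordinates to obtain (cx, cy)."""
    cx = sum(vertical_line_xs) / len(vertical_line_xs)
    cy = sum(horizontal_line_ys) / len(horizontal_line_ys)
    return cx, cy

# Assumed detections: the vertical line sits near column 321 in the first
# undistorted image, the horizontal line near row 239 in the second.
cx, cy = principal_point([321.0, 320.8, 321.2], [239.0, 238.9, 239.1])
print(round(cx, 3), round(cy, 3))  # 321.0 239.0
```

Averaging several points along each line suppresses per-pixel detection noise, which matters most at low resolution.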
According to the above calibration method for the lens distortion correction coefficient, the abscissa and ordinate of the principal point coordinates are determined separately using the undistorted images, so that the principal point of the camera can be located more accurately. Accurately measuring and analyzing line features in the images effectively reduces errors caused by lens distortion and, in particular, improves calibration precision under low-resolution conditions. At the same time, determining the principal point coordinates from the first and second undistorted images simplifies the calculation: the principal point coordinates can be obtained by processing only a small number of key images, achieving the effect of improving both distortion calibration accuracy and calibration efficiency under low-resolution conditions.
In one embodiment, the preset rotation rule includes a preset initial position, a preset rotation direction and a preset rotation angle, and controlling the camera to be tested to rotate according to the preset rotation rule, collecting a plurality of second images and recording the rotation angle of each second image includes:
When the camera to be tested is in a first state and the light source device projects vertical light on the target, controlling the camera to be tested to rotate by a preset rotation angle along a preset rotation direction by taking a preset initial position as a starting position, collecting a plurality of third sub-images and recording the first rotation angle of each third sub-image;
When the camera to be tested is in the second state and the light source device projects vertical light on the target, the camera to be tested is controlled to rotate by a preset rotation angle along a preset rotation direction by taking a preset initial position as a starting position, a plurality of fourth sub-images are collected, and the second rotation angle of each fourth sub-image is recorded.
The first state may be a first placement state, and the second state may be a second placement state.
In this embodiment, the optical axes of the first state and the second state at the preset initial position may be the same.
Rotating by the preset rotation angle along the preset rotation direction may mean that, taking the preset initial position as the starting position, the camera is rotated along the preset rotation direction in steps of the preset rotation angle, an image is acquired at each stepped rotation angle, and the rotation angle corresponding to each image is recorded. In the first state, the rotation angle corresponding to a recorded image is a first rotation angle; in the second state, it is a second rotation angle.
In a specific embodiment, with the optical axis of the camera unchanged, the camera may be rotated 90° clockwise or counterclockwise so that the vertical light ray rotates 90° in the image of the camera to be tested. The camera in the first state and in the second state can thus be rotated in the preset rotation direction while the position of the vertical light ray on the target remains unchanged, thereby obtaining third sub-images and fourth sub-images acquired at multiple angles.
It can be understood that the vertical light ray may appear in the camera picture as a vertical line, a horizontal line, or an oblique line. Rotating the camera to be tested produces further oblique lines in the picture, and moving the camera according to the preset rotation rule yields the distortion behavior of the same oblique line at different angles.
It will be appreciated that the camera's internal parameters can be derived from the known rotation angles and the corresponding feature variations in the image. For example, in the case of a vertical light ray, the line behaves differently in the image as the camera rotates, and the intrinsic properties of the camera can be inferred from these differences.
According to the lens distortion correction coefficient calibration method, rotation, image acquisition and rotation angle recording are respectively carried out on the camera to be tested in the first state and the second state, so that the optical characteristics of the camera can be determined, further, a more accurate distortion correction coefficient can be obtained, and the effects of improving distortion calibration accuracy and calibration efficiency under the condition of low resolution are achieved.
In one embodiment, determining the distortion correction coefficient of the camera to be tested according to the plurality of second images, the rotation angle of each second image, and the principal point coordinates includes:
determining a target position corresponding to the principal point coordinates in each second image according to the second images and the principal point coordinates;
And determining a distortion correction coefficient of the camera to be tested according to the target position corresponding to each second image and the rotation angle corresponding to each second image.
It can be understood that determining the principal point coordinates simultaneously determines the position of the optical axis in the image of the camera to be tested and aligns the optical axis with the vertical light ray on the target. Once the principal point coordinates are determined, the principal point can therefore be identified with a point on the vertical light ray in the actual scene.
After the principal point coordinates are determined, the point of the vertical light ray on the target corresponding to the principal point may be used as a reference point. Because the position of the optical axis in the image of the camera to be tested is not affected by lens distortion, when the camera rotates from 0° to 30° the optical axis also rotates by 30°. The distortion of the image at 30° can then be determined from the distance through which the principal point moves between the images acquired at 0° and at 30°, and the distortion correction coefficient determined accordingly.
Determining the target position corresponding to the principal point coordinates in each second image according to the plurality of second images and the principal point coordinates may proceed as follows: the position of the principal point on the light ray is known; after rotation, the coordinates of the principal point in a second image are located by means of the light ray in that image, and those coordinates on the light ray are taken as the target position.
It will be appreciated that, as the camera rotates, the position of the principal point in the image does not necessarily correspond to the theoretical position implied by the actual rotation angle. For example, in the case of fisheye distortion, within a certain distance of the principal point the image distance between the principal point and the center of the second image may be greater than the actual distance, while beyond that distance it may be smaller, exhibiting a "near magnified, far reduced" distortion effect. The position of the principal point in the current second image can be determined from the position of the vertical light ray in that image together with the previously determined principal point coordinates.
Determining the distortion correction coefficient of the camera to be tested according to the target position corresponding to each second image and the rotation angle corresponding to each second image may mean determining, from the rotation angle corresponding to a second image, the actual distance between the image center point and the principal point in the actual scene, and then generating the distortion correction coefficient of the camera at that position from the image distance and the actual distance.
Further, the distortion may be expressed in polynomial form. For a conventional low-pixel lens, the distortion correction coefficient may be determined using, for example, the Brown-Conrady model; for a low-pixel wide-angle lens, it may be determined using, for example, the Kannala-Brandt model.
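The two polynomial forms can be sketched side by side. The coefficient values below are illustrative only; Brown-Conrady expresses the distorted radius in terms of the ideal image radius, while Kannala-Brandt expresses it in terms of the incidence angle, which suits wide-angle lenses.

```python
import math

# Side-by-side sketch of the two radial models mentioned above.
# Coefficient values are illustrative, not calibrated.

def brown_conrady_radius(r, k1, k2):
    """Distorted radius for a conventional lens:
    r_d = r * (1 + k1*r**2 + k2*r**4)."""
    return r * (1 + k1 * r**2 + k2 * r**4)

def kannala_brandt_radius(theta, k1, k2):
    """Distorted radius for a wide-angle/fisheye lens as an odd
    polynomial in the incidence angle theta:
    r_d = theta + k1*theta**3 + k2*theta**5."""
    return theta + k1 * theta**3 + k2 * theta**5

print(brown_conrady_radius(0.5, -0.2, 0.05))
print(kannala_brandt_radius(math.radians(30), -0.01, 0.001))
```

In both cases the calibration task is the same: estimate the k coefficients from the observed angle-to-position data collected in the steps above.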
According to the lens distortion correction coefficient calibration method, the distortion characteristics are extracted from the data collected under the multiple angles, and the accurate distortion correction coefficient is calculated according to the distortion characteristics, so that the effects of improving the distortion calibration accuracy and the calibration efficiency under the condition of low resolution can be achieved.
In one embodiment, determining the distortion correction coefficient of the camera to be tested according to the plurality of second images, the rotation angle of each second image, and the principal point coordinates may further include:
determining at least one rotation angle to be interpolated based on a preset resolution;
determining, for each rotation angle to be interpolated, the two second images whose rotation angles are adjacent to that angle;
Determining principal point coordinates under the rotation angle to be interpolated based on the relative positions of the rotation angle to be interpolated and the rotation angles of the two second images and the principal point coordinates of the two second images;
and determining the distortion correction coefficient of the camera to be tested according to the plurality of second images, the rotation angle and principal point coordinates of each second image, and the rotation angle to be interpolated together with its principal point coordinates.
It can be understood that under limited calibration conditions, for example limited calibration time or limited rotation precision, the lens distortion correction coefficient calibration method provided by this embodiment can appropriately relax the requirements on the calibration conditions by means of interpolation and re-fitting. When calibration time is limited, a smaller amount of data suffices for fitting while still obtaining a fairly accurate distortion correction coefficient; when calibration conditions are ample, interpolation can further improve the accuracy of the distortion correction coefficient, achieving the effect of improving distortion calibration accuracy.
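The interpolation step can be sketched as follows: the principal point position at an unmeasured angle is estimated linearly between the two second images whose rotation angles bracket it. The angles and positions below are synthetic illustrations.

```python
# Sketch of the interpolation step: estimate the principal point position
# at an angle that was not measured, by linear interpolation between the
# two second images whose rotation angles bracket it. Values are synthetic.

def interp_principal_point(angle, a0, p0, a1, p1):
    """Linearly interpolate the (x, y) principal point position between
    two bracketing rotation angles a0 < angle < a1."""
    t = (angle - a0) / (a1 - a0)
    return (p0[0] + t * (p1[0] - p0[0]),
            p0[1] + t * (p1[1] - p0[1]))

# Measured at 10 deg and 20 deg; estimate the position at 15 deg.
x, y = interp_principal_point(15.0, 10.0, (350.0, 240.0), 20.0, (380.0, 240.0))
print(x, y)  # 365.0 240.0
```

Higher-order interpolation could be substituted when more bracketing samples are available; linear interpolation is the minimal case.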
In one embodiment, the determining, according to the plurality of second images and the principal point coordinates, the target position corresponding to the principal point coordinates in each of the second images includes:
searching, based on the ordinate of the principal point coordinates, for the pixel point in the third sub-image with the same ordinate and the maximum pixel value, as a first target position;
and searching, based on the abscissa of the principal point coordinates, for the pixel point in the fourth sub-image with the same abscissa and the maximum pixel value, as a second target position.
In this embodiment, at the preset initial position of the camera in the first state, the vertical light ray may take the form of a vertical line in the image and may be located at the left edge, the center, the right edge, or another position set according to actual requirements. At the preset initial position of the camera in the second state, the vertical light ray may take the form of a horizontal line in the image and may be located at the upper edge, the center, the lower edge, or another position set according to actual requirements. It will be appreciated that the camera may be rotated 90° clockwise or counterclockwise with the optical axis unchanged, from the first state to the second state or from the second state to the first state.
The vertical light ray appears as a bright line in the image, so in the first state, at a given ordinate, the pixel with the highest brightness value lies in the image region occupied by the vertical light ray. The abscissa and ordinate positions can therefore be determined from the pixel with the maximum pixel value, yielding the first target position.
Likewise, in the second state, at a given abscissa, the pixel with the highest brightness value lies in the image region occupied by the vertical light ray, so the abscissa and ordinate positions can be determined from the pixel with the maximum pixel value, yielding the second target position.
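The brightest-pixel search described above reduces to an arg-max along one row (or column). This is a minimal sketch; the image below is a synthetic stand-in for a captured frame.

```python
import numpy as np

# Sketch of the target-position search: along the row given by the
# principal point ordinate, the brightest pixel marks where the projected
# line crosses that row. The image is a synthetic stand-in.

def find_target_x(image, cy):
    """Return the column of the maximum pixel value in row cy
    (the first target position's abscissa in the first state)."""
    row = image[int(round(cy)), :]
    return int(np.argmax(row))

# Synthetic 5x7 image with a bright vertical line at column 4.
img = np.zeros((5, 7), dtype=np.uint8)
img[:, 4] = 255
print(find_target_x(img, cy=2.0))  # 4
```

The second-state search is symmetric: an arg-max along the column given by the principal point abscissa. No curve fitting per image is required, which is what makes this step fast.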
According to the above lens distortion correction coefficient calibration method, the pixel with the maximum pixel value at the abscissa or ordinate of the principal point coordinates is taken as the target position, so that the actual position of the principal point in the image can be determined accurately. The corresponding target position can be determined rapidly without fitting a curve in each second image, achieving the effect of improving distortion calibration accuracy and calibration efficiency under low-resolution conditions.
In one embodiment, determining the distortion correction coefficient of the camera to be tested according to the target position corresponding to each of the second images and the rotation angle corresponding to each of the second images includes:
Performing curve fitting according to the first target position of each third sub-image and the first rotation angle of each third sub-image to obtain a first mapping relation;
Performing curve fitting according to the second target position of each fourth sub-image and the second rotation angle of each fourth sub-image to obtain a second mapping relation;
and determining a distortion correction coefficient of the camera to be tested according to the first mapping relation and the second mapping relation.
Performing curve fitting according to the first target position and the first rotation angle of each third sub-image may mean determining, by curve fitting, the mapping relationship between the target position and the first rotation angle in the first state. When the vertical light ray in the first state appears as a vertical line in the image, the variation of the abscissa of the target position with the angle can be determined by fitting.
Performing curve fitting according to the second target position and the second rotation angle of each fourth sub-image may mean determining, by curve fitting, the mapping relationship between the target position and the second rotation angle in the second state. When the vertical light ray in the second state appears as a horizontal line in the image, the variation of the ordinate of the target position with the angle can be determined by fitting.
Determining the distortion correction coefficient of the camera to be tested according to the first mapping relationship and the second mapping relationship may mean determining, from the mapping relationships in the two states, the adjustment parameters applied to a plurality of pixel points of the camera picture.
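The curve fitting of a mapping relationship can be sketched with a low-order polynomial fit of target position against rotation angle. The angle and position samples below are synthetic, not measured data.

```python
import numpy as np

# Sketch of the curve-fitting step: fit a polynomial mapping from the
# first rotation angle to the first target position's abscissa (first
# state). The samples below are synthetic stand-ins for measurements.

angles = np.array([0.0, 5.0, 10.0, 15.0, 20.0])     # first rotation angles (deg)
xs = np.array([320.0, 352.0, 381.0, 407.0, 430.0])  # first target positions (px)

# A low-order polynomial captures the smooth angle-to-position mapping.
coeffs = np.polyfit(angles, xs, deg=2)
mapping = np.poly1d(coeffs)

# The fitted mapping predicts the abscissa at an unmeasured angle.
print(round(float(mapping(12.5)), 1))  # 394.4
```

The second mapping relationship is fitted the same way with the fourth sub-images' ordinates against the second rotation angles; the two fitted curves together feed the coefficient determination.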
According to the lens distortion correction coefficient calibration method, the target position change logic under different rotation angles can be determined by means of fitting the target position and the rotation angles, so that the corresponding distortion correction coefficient can be further determined, an accurate distortion correction strategy is realized, and the effect of improving the distortion calibration accuracy under the condition of low resolution is achieved.
In one embodiment, determining the distortion correction coefficient of the camera to be tested according to the first mapping relationship and the second mapping relationship includes:
determining a distortion correction coefficient of the camera to be tested in a first direction according to the first mapping relationship;
determining a distortion correction coefficient of the camera to be tested in a second direction according to the second mapping relationship;
and determining a distortion correction coefficient of the camera to be tested in a third direction according to the first mapping relationship and the second mapping relationship.
The first mapping relation describes how the target position corresponding to the principal point coordinates in the third sub-images changes with the rotation angle. By analyzing the first mapping relation, the deviation between the target position and the ideal position at a specific rotation angle can be determined, and a suitable mathematical model can then be used to describe the influence of lens distortion based on the deviation data.
The second mapping relation describes how the target position corresponding to the principal point coordinates in the fourth sub-images changes with the rotation angle. By analyzing the second mapping relation, the deviation between the target position and the ideal position at a specific rotation angle can likewise be determined, and a suitable mathematical model can be used to describe the influence of lens distortion based on the deviation data.
The third direction may include a plurality of directions, namely directions other than those in which the vertical light appears in the image in the first state and the second state. Taking the case where the vertical light appears vertical in the first state and horizontal in the second state, the third direction can be any one or more oblique directions. Determining the distortion correction coefficient of the camera to be tested in the third direction according to the first mapping relation and the second mapping relation means calculating and analyzing, from those two relations, the relationship between the position in the third direction and the preset rotation angle, thereby determining a third mapping relation in the third direction and, from it, the distortion correction coefficient in the third direction.
According to the lens distortion correction coefficient calibration method provided by this embodiment, the distortion correction coefficients of the camera to be tested in the first direction and the second direction are determined from the first mapping relation and the second mapping relation respectively, and the coefficient in the third direction is determined from both relations together. Lens distortion at different positions in the image of the camera to be tested can thus be described more comprehensively and corrected with more accurate coefficients, improving the distortion calibration accuracy under low resolution.
The application further provides a detailed embodiment for more clearly explaining the technical scheme of the application.
In one embodiment, a lens distortion correction coefficient calibration method is provided that can be applied to distortion calibration of a low-resolution wide-angle camera. The method is applied to the distortion correction device shown in fig. 2, which comprises a light source, a target, a camera and a turntable. The light source is used for projecting horizontal or vertical light spots onto the target; the target is a solid color target; the camera is arranged on the turntable, and the center of the camera, or the center of its optical axis, coincides with the center of the turntable, so that when the turntable rotates around its center, the rotation angle of the turntable equals the rotation angle of the camera. The lens distortion correction coefficient calibration method comprises the following steps:
Step S1, determining the principal point coordinates (CX, CY): an external laser is used to determine an undistorted state in the vertical and horizontal directions of the image (no distortion in the camera image, i.e. the detected line does not cross rows or columns), images are acquired by the camera, and centroid detection is performed on the acquired images to determine the principal point coordinates.
In one embodiment, the external laser may employ a Gaussian line source to facilitate finding the brightest sub-pixel point.
Step S11, determining the principal point coordinate CX: external line laser is projected, and the position and angle of the camera are adjusted over several shots to obtain a plurality of first sub-images in which the external line light source images as a vertical straight line; the X coordinate of the image in which the line laser is free of distortion and column-crossing is recorded as CX. As shown in fig. 3, the vertical light rays in some of the graphs are distorted, while in one graph the vertical light ray is a straight line occupying a single column, with no distortion or crossing, so the abscissa where that light ray lies can be taken as the principal point coordinate CX.
Step S12, determining the principal point coordinate CY: external line laser is projected, and the position and angle of the camera are adjusted over several shots to obtain a plurality of second sub-images in which the external line light source images as a horizontal straight line; the Y coordinate of the image in which the line laser is free of distortion and row-crossing is recorded as CY. As shown in fig. 4, the horizontal light rays in some of the graphs are distorted, while in one graph the horizontal light ray is a straight line occupying a single row, with no distortion or crossing, so the ordinate where that light ray lies can be taken as the principal point coordinate CY.
As shown in fig. 5, when determining the principal point coordinates, an image acquired in a state of no distortion in the vertical direction is selected, the abscissa of the principal point coordinates is determined, an image acquired in a state of no distortion in the horizontal direction is selected, and the ordinate of the principal point coordinates is determined, thereby determining the principal point coordinates.
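The selection logic of steps S11 and S12 can be sketched as follows, assuming a dark frame containing a single bright laser line; the helper names and the toy image are hypothetical:

```python
import numpy as np

def principal_point_cx(img, tol=0):
    """If the vertical laser line occupies a single column in every row
    (no distortion, no column-crossing), return that column as CX."""
    cols = img.argmax(axis=1)           # brightest column in each row
    if cols.max() - cols.min() <= tol:  # the line is straight
        return float(cols.mean())
    return None                         # distorted frame: reject it

# Toy 5x9 frame with a perfectly vertical bright line at column 4.
frame = np.zeros((5, 9))
frame[:, 4] = 255.0
cx = principal_point_cx(frame)  # -> 4.0
```

CY is obtained analogously by scanning rows of a frame in which the horizontal line is straight.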
Step S2, rotating and acquiring images: full-angle light source scanning and data acquisition are performed by rotating the automatic turntable left and right through a range of angles.
Step S21, projecting the external laser and acquiring images while the automatic turntable rotates left and right by a certain angle (for example, at 0.1° intervals).
The camera is placed in a first state, external line laser is projected, and the automatic turntable is rotated left and right by a certain angle to acquire images, obtaining a plurality of third sub-images. The rotation may be performed uniformly at a preset angle, for example 0.1° or 0.2° as shown in fig. 6; alternatively, a larger preset angle, for example 1° or 2°, may be used in the middle of the range as shown in fig. 7, with a smaller preset angle, for example 0.1° or 0.2°, near the edges. The preset value of the rotation angle can be adjusted according to actual requirements.
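The two sampling schedules can be sketched as angle lists; the ranges and step sizes below are illustrative only:

```python
import numpy as np

# Uniform schedule: the same fine step over the whole range (fig. 6 style),
# here 0.1-degree steps from -10 to +10 degrees.
uniform = np.round(np.linspace(-10.0, 10.0, 201), 1)

# Mixed schedule: coarse 1-degree steps in the middle, fine 0.1-degree
# steps near the edges of the range (fig. 7 style).
middle = np.linspace(-8.0, 8.0, 17)                     # 1-degree steps
edges = np.concatenate([np.round(np.linspace(-10.0, -8.1, 20), 1),
                        np.round(np.linspace(8.1, 10.0, 20), 1)])
mixed = np.sort(np.concatenate([middle, edges]))
```

The turntable is then commanded to each angle in the chosen list and one sub-image is captured per angle.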
Step S22, projecting the external laser, rotating the camera position by 90 degrees, and acquiring images while the turntable rotates left and right by a certain angle.
With the preset initial position and the optical axis fixed, the camera is rotated 90 degrees clockwise or anticlockwise so that it is placed vertically relative to step S21; external line laser is projected as the second state, making the line laser in the picture horizontal, and the turntable is rotated left and right by a certain angle to acquire images, obtaining a plurality of fourth sub-images. As shown in fig. 8, the preset value of the rotation angle may differ from that used when the camera is horizontally placed.
Step S3, calculating the correspondence between the rotation angle and the pixel position.
Step S31, centroid calculation is performed on the acquired images to obtain the correspondence (X, degreeX) between the horizontal pixel position X and the rotation angle degreeX, and fitting or interpolation is performed on the acquired pixel-position/rotation-angle pairs.
Step S32, the correspondence (Y, degreeY) between the vertical pixel position Y and the rotation angle degreeY is calculated in the same way, and fitting or interpolation is performed on the acquired pixel-position/rotation-angle pairs.
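The fit-or-interpolate step for the horizontal mapping can be sketched as follows; the sample pairs are hypothetical, and linear interpolation via `np.interp` stands in for whichever fitting method is chosen:

```python
import numpy as np

# Hypothetical centroid abscissae X and turntable angles degreeX (degrees),
# one pair per third sub-image.
sample_x = np.array([100.0, 200.0, 320.0, 440.0, 540.0])
sample_degree = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])

def degree_x(x):
    """Interpolated horizontal mapping: pixel position -> rotation angle."""
    return np.interp(x, sample_x, sample_degree)

# The vertical mapping (Y, degreeY) is built the same way from step S22 data.
angle_at_center = degree_x(320.0)  # -> 0.0
```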
Step S4, restoring the distortion correction coefficient from the rotation angle and the pixel position.
The horizontal distortion correction coefficient is directX = sin(degreeX), the vertical distortion correction coefficient is directY = sin(degreeY), and the third distortion correction coefficient is directZ = sqrt(1 − directX × directX − directY × directY).
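A direct transcription of step S4; treating the angles as degrees is an assumption, since the source does not state the unit:

```python
import math

def restore_direction(degree_x, degree_y):
    """Recover the per-pixel coefficients from the two rotation angles:
    directX = sin(degreeX), directY = sin(degreeY),
    directZ = sqrt(1 - directX^2 - directY^2)."""
    direct_x = math.sin(math.radians(degree_x))
    direct_y = math.sin(math.radians(degree_y))
    direct_z = math.sqrt(1.0 - direct_x * direct_x - direct_y * direct_y)
    return direct_x, direct_y, direct_z

dx, dy, dz = restore_direction(0.0, 0.0)  # on-axis pixel -> (0, 0, 1)
```

Note that the three coefficients form a unit vector, i.e. the direction of the incoming ray for that pixel.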
The lens distortion correction coefficient calibration method provided by this embodiment acquires a plurality of first images from the camera to be tested and determines the principal point coordinates of the camera, the plurality of first images including first images of vertical light rays and first images of horizontal light rays; controls the camera to rotate according to a preset rotation rule, acquires a plurality of second images and records the rotation angle of each second image, the plurality of second images including second images of vertical light rays and second images of horizontal light rays; and determines the distortion correction coefficient of the camera according to the second images, the rotation angle of each second image and the principal point coordinates. The search for the principal point coordinates is first completed with the vertical and horizontal light rays, and the distortion correction coefficient is then determined from the second images, improving the distortion calibration accuracy under low resolution.
Determining the principal point coordinates by combining image data in the vertical and horizontal directions makes the estimation of the principal point more accurate. Determining the abscissa and ordinate of the principal point from undistorted images positions the principal point precisely, and accurate measurement and analysis of the line features in the images effectively reduces errors caused by lens distortion, which particularly improves calibration accuracy under low resolution. Since the principal point coordinates are determined only from the first undistorted image and the second undistorted image, the calculation is simplified: only a small number of key images need to be processed.
Acquiring images and recording the rotation angles while the camera rotates helps determine the optical characteristics of the camera, so a more accurate distortion correction coefficient can be obtained, improving both distortion calibration accuracy and calibration efficiency under low resolution. Data are collected from a plurality of angles, distortion characteristics are extracted from them, and accurate distortion correction coefficients are calculated from those characteristics.
Taking the pixel point with the maximum pixel value along the abscissa or ordinate of the principal point coordinates as the target position accurately locates the actual target position in each second image, and the target position is determined quickly without fitting a curve in every second image. Fitting the target positions against the rotation angles reveals how the target position changes at different rotation angles, so the corresponding distortion correction coefficient can be determined and an accurate distortion correction strategy realized. Finally, determining the correction coefficients of the camera in the first and second directions from the first and second mapping relations respectively, and the coefficient in the third direction from both relations together, describes the lens distortion at different positions in the image more comprehensively, thereby improving the distortion calibration accuracy under low resolution.
It should be understood that, although the steps in the flowcharts related to the embodiments described above are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps in these flowcharts may include a plurality of sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; their order of execution is not necessarily sequential, and they may be performed in turns or alternately with at least some of the other steps or stages.
In one embodiment, a distortion correction method is provided for use with a camera, the distortion correction method comprising:
acquiring an image to be corrected and a distortion correction coefficient as described above;
and carrying out distortion correction on the image to be corrected based on the distortion correction coefficient to obtain a distortion correction image.
By means of the distortion correction coefficients according to any of the above embodiments, the camera can perform more accurate distortion correction on the image to be corrected, and a distortion corrected image is obtained.
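How the coefficients are applied to the image is not detailed here; one common form of distortion correction is a remap, sketched below under the assumption that a lookup table of source coordinates has already been derived from the calibrated coefficients. All names and the toy data are hypothetical:

```python
import numpy as np

def remap_nearest(img, src_rows, src_cols):
    """Nearest-neighbour remap: output pixel (i, j) is sampled from the
    distorted image at (src_rows[i, j], src_cols[i, j])."""
    return img[src_rows, src_cols]

# Toy 3x4 image and an identity lookup table (no distortion), as a
# sanity check that the remap leaves an undistorted image unchanged.
img = np.arange(12.0).reshape(3, 4)
rows, cols = np.indices(img.shape)
corrected = remap_nearest(img, rows, cols)  # identical to img
```

In practice the lookup table would be filled from the per-pixel direction coefficients rather than the identity, and bilinear sampling would typically replace nearest-neighbour.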
In one embodiment, as shown in FIG. 9, a lens distortion correction coefficient calibration system is provided, the system comprising a light source device 106, a target 110, a rotation device 108, and a processor 1021;
the camera 104 to be tested is arranged on the rotating device 108;
the target 110 is disposed in the field of view of the camera 104 to be tested, and the target 110 is a solid color target;
the light source device 106 is configured to project light to the target 110;
The rotating device 108 is configured to drive the camera 104 to be tested to rotate;
the processor 1021 is connected to the light source device 106, the rotating device 108, and the camera 104 to be tested, respectively, and is configured to implement the lens distortion correction coefficient calibration method according to any of the embodiments.
Wherein the method comprises the following steps:
Acquiring a plurality of first images acquired by a camera 104 to be detected, and determining the principal point coordinates of the camera 104 to be detected, wherein the plurality of first images comprise a plurality of first images of vertical light rays and a plurality of first images of horizontal light rays;
Controlling the camera 104 to be tested to rotate according to a preset rotation rule, collecting a plurality of second images and recording the rotation angle of each second image, wherein the plurality of second images comprise a plurality of second images of vertical light rays and a plurality of second images of horizontal light rays;
And determining a distortion correction coefficient of the camera 104 to be tested according to the second images, the rotation angle of each second image and the principal point coordinates.
Based on the same inventive concept, the embodiment of the application also provides a lens distortion correction coefficient calibration device for realizing the lens distortion correction coefficient calibration method. The implementation scheme of the device for solving the problem is similar to that described in the above method, so the specific limitation of the embodiment of the device for calibrating the lens distortion correction coefficient provided in the following may be referred to the limitation of the method for calibrating the lens distortion correction coefficient, which is not repeated herein.
In one embodiment, as shown in fig. 10, there is provided a lens distortion correction coefficient calibration apparatus comprising:
The acquisition module 100 is configured to acquire a plurality of first images acquired by a camera to be tested, and determine principal point coordinates of the camera to be tested. The plurality of first images includes a plurality of first images of vertical light rays and a plurality of first images of horizontal light rays.
The control module 200 is configured to control the camera to be tested to rotate according to a preset rotation rule, collect a plurality of second images, and record a rotation angle of each second image, where the plurality of second images include a plurality of second images of vertical light and a plurality of second images of horizontal light.
And the calibration module 300 is used for determining the distortion correction coefficient of the camera to be tested according to the plurality of second images, the rotation angle of each second image and the principal point coordinates.
In one embodiment, the obtaining module 100 is further configured to:
when the light source device projects vertical light on the target, acquiring a plurality of first sub-images acquired by the camera to be tested at a plurality of angles;
When the light source device projects horizontal light on the target, acquiring a plurality of second sub-images acquired by the camera to be tested at a plurality of angles;
and determining the principal point coordinates of the camera to be tested according to the plurality of first sub-images and the plurality of second sub-images.
In one embodiment, the obtaining module 100 is further configured to:
Determining a first undistorted image in the plurality of first sub-images and a second undistorted image in the plurality of second sub-images;
Determining an abscissa corresponding to the principal point coordinate according to the first undistorted image;
and determining the ordinate corresponding to the principal point coordinate according to the second undistorted image.
In one embodiment, the preset rotation rule includes a preset initial position, a preset rotation direction and a preset rotation angle, and the control module 200 is further configured to:
When the camera to be tested is in a first state and the light source device projects vertical light on the target, controlling the camera to be tested to rotate by a preset rotation angle along a preset rotation direction by taking a preset initial position as a starting position, collecting a plurality of third sub-images and recording the first rotation angle of each third sub-image;
When the camera to be tested is in the second state and the light source device projects vertical light on the target, the camera to be tested is controlled to rotate by a preset rotation angle along a preset rotation direction by taking a preset initial position as a starting position, a plurality of fourth sub-images are collected, and the second rotation angle of each fourth sub-image is recorded.
In one embodiment, the calibration module 300 is further configured to:
determining a target position corresponding to the principal point coordinates in each second image according to the second images and the principal point coordinates;
And determining a distortion correction coefficient of the camera to be tested according to the target position corresponding to each second image and the rotation angle corresponding to each second image.
In one embodiment, the calibration module 300 is further configured to:
search, based on the ordinate of the principal point coordinates, for the pixel point in the third sub-image that has the same ordinate and the maximum pixel value, as a first target position;
and search, based on the abscissa of the principal point coordinates, for the pixel point in the fourth sub-image that has the same abscissa and the maximum pixel value, as a second target position.
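The maximum-pixel search performed by the calibration module can be sketched as follows; the toy frame is hypothetical:

```python
import numpy as np

def first_target_position(img, cy):
    """Third sub-image: along the row sharing the principal point's
    ordinate cy, take the column with the maximum pixel value."""
    return int(np.argmax(img[cy, :]))

def second_target_position(img, cx):
    """Fourth sub-image: along the column sharing the principal point's
    abscissa cx, take the row with the maximum pixel value."""
    return int(np.argmax(img[:, cx]))

# Toy frame with a single bright spot at (row 3, column 5).
frame = np.zeros((6, 8))
frame[3, 5] = 255.0
x1 = first_target_position(frame, 3)   # -> 5
y2 = second_target_position(frame, 5)  # -> 3
```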
In one embodiment, the calibration module 300 is further configured to:
Performing curve fitting according to the first target position of each third sub-image and the first rotation angle of each third sub-image to obtain a first mapping relation;
Performing curve fitting according to the second target position of each fourth sub-image and the second rotation angle of each fourth sub-image to obtain a second mapping relation;
and determining a distortion correction coefficient of the camera to be tested according to the first mapping relation and the second mapping relation.
In one embodiment, the calibration module 300 is further configured to:
Determining a distortion correction coefficient of the camera to be measured in a first direction according to the first mapping relation;
determining a distortion correction coefficient of the camera to be detected in a second direction according to the second mapping relation;
and determining a distortion correction coefficient of the camera to be tested in a third direction according to the first mapping relation and the second mapping relation.
All or part of the modules in the lens distortion correction coefficient calibration device can be realized by software, hardware and a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be the camera to be tested; its internal structure may be as shown in fig. 11. The computer device includes a processor, a memory, a communication interface and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory; the non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The processor and the internal memory may be integrated in an image processing chip of the camera to be tested. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless mode can be realized through Wi-Fi, a mobile cellular network, NFC (near field communication) or other technologies. Serving as the camera to be tested, the computer device can be connected with external equipment through the communication interface for instruction transmission and/or image transmission. The computer program is executed by the processor to implement a distortion correction method. The input device of the computer device may be an instruction input device arranged on the shell of the computer device, such as a key, a wheel or a touch layer; an image acquisition device comprising optical imaging components such as a lens, an image sensor or an infrared sensor; a space sensing device such as a radar scanning component or a depth sensor; or a motion sensing device such as a gyroscope.
The original image is acquired by the image acquisition device, and the distortion correction method in the above embodiment is performed on the original image by the processor, thereby obtaining a camera image after distortion correction.
It will be appreciated by those skilled in the art that the structure shown in FIG. 11 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In one embodiment, a computer device is provided, including a memory and a processor, the memory storing a computer program, the processor implementing the distortion correction method described above when executing the computer program:
acquiring an image to be corrected and a distortion correction coefficient as described in any one of the embodiments above;
and carrying out distortion correction on the image to be corrected based on the distortion correction coefficient to obtain a distortion correction image.
The image to be corrected can be acquired by the camera to be tested that was used for establishing the distortion correction coefficient, or by a camera of the same specification as the camera to be tested.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor implements the distortion correction method described above:
acquiring an image to be corrected and a distortion correction coefficient as described in any one of the embodiments above;
and carrying out distortion correction on the image to be corrected based on the distortion correction coefficient to obtain a distortion correction image.
The image to be corrected can be acquired by the camera to be tested that was used for establishing the distortion correction coefficient, or by a camera of the same specification as the camera to be tested.
The user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or sufficiently authorized by each party.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration, and not limitation, RAM can take various forms such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processor referred to in the embodiments provided in the present application may be a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, a data processing logic unit based on quantum computing, or the like, but is not limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the application and are described in detail herein without thereby limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of the application should be assessed as that of the appended claims.