
CN114730472B - Calibration method and related device for external parameters of vehicle-mounted camera - Google Patents


Info

Publication number
CN114730472B
CN114730472B (granted publication of application CN202180006501.9A)
Authority
CN
China
Prior art keywords
coordinate system
vehicle
parallel lines
parallel
camera coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202180006501.9A
Other languages
Chinese (zh)
Other versions
CN114730472A (en)
Inventor
何启盛
李涵
黄海晖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yinwang Intelligent Technology Co Ltd
Original Assignee
Shenzhen Yinwang Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yinwang Intelligent Technology Co Ltd filed Critical Shenzhen Yinwang Intelligent Technology Co Ltd
Publication of CN114730472A
Application granted
Publication of CN114730472B


Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85: Stereo camera calibration
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256: Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a calibration method and a related device for external parameters of a vehicle-mounted camera in the technical field of camera calibration. In the technical scheme, the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera is determined according to the obtained coordinates of M first parallel lines in the actual camera coordinate system and the constraints of M second parallel lines in a bird's-eye view of the first image, where M is an integer greater than or equal to 3 and the M second parallel lines are in one-to-one correspondence with the M first parallel lines. The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is then determined according to the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system. The method improves the flexibility of external parameter calibration of the vehicle-mounted camera.

Description

Calibration method and related device for external parameters of vehicle-mounted camera
Technical Field
The application relates to the technical field of camera calibration, in particular to a calibration method and a related device for external parameters of a vehicle-mounted camera.
Background
Vehicle-mounted cameras are increasingly important sensors for assisted driving and automatic driving. By correlating the environment around the vehicle with the digital images captured by the vehicle-mounted camera, necessary information can be provided for safe driving, and in this correlation process the external parameters of the vehicle-mounted camera play an important role. The external parameters of the vehicle-mounted camera refer to the translation distance and rotation angle of the vehicle-mounted camera relative to the vehicle.
At present, the external parameters of the vehicle-mounted camera are usually calibrated by means of parallel lines, which may be lane lines drawn on a specific site or lane lines on a road. In the calibration process, many prior-art schemes impose specific constraint conditions, for example that the distances between the three parallel lines required for external parameter calibration are equal. Specifically, the pixel-coordinate-system coordinates of the three parallel lines are obtained and converted into camera-coordinate-system coordinates, and the equal spacing of the three parallel lines is used as the constraint condition to solve for the external parameters of the vehicle-mounted camera. However, calibrating the vehicle-mounted camera with equidistant parallel lines makes the calibration method less flexible.
Therefore, how to improve the flexibility of the external parameter calibration method of the vehicle-mounted camera becomes a problem to be solved.
Disclosure of Invention
The application provides a calibration method and a related device for external parameters of a vehicle-mounted camera, which improve the flexibility of the external parameter calibration method of the vehicle-mounted camera.
In a first aspect, the application provides a calibration method for external parameters of a vehicle-mounted camera. The method comprises: obtaining coordinates of M first parallel lines in a first image in an actual camera coordinate system of the vehicle-mounted camera, wherein the first image is an image shot by the vehicle-mounted camera during the driving of the vehicle to which the vehicle-mounted camera belongs on a target road, the target road comprises N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines are in one-to-one correspondence with M first lane lines among the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M; determining the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system and the constraints of M second parallel lines in a bird's-eye view of the first image, wherein the constraints of the M second parallel lines in the bird's-eye view comprise a parallel constraint, a vertical constraint, a spacing ratio constraint and a spacing constraint, the M second parallel lines are in one-to-one correspondence with the M first parallel lines, the parallel constraint requires that the M second parallel lines remain parallel to one another, the vertical constraint requires that the M second parallel lines be perpendicular to the x-axis of the virtual ideal camera coordinate system, the spacing ratio constraint requires that the ratios of the distances between the second parallel lines match the ratios of the actual distances between the corresponding lane lines, and the spacing constraint comprises that the difference between the distance between any two second parallel lines and the actual distance between the lane lines corresponding to those two second parallel lines is smallest; and determining the transformation relation between the actual camera
coordinate system of the vehicle-mounted camera and the world coordinate system according to the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
In the method, the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is first determined according to the acquired coordinates of the M first parallel lines in the actual camera coordinate system and the constraints of the M second parallel lines in the bird's-eye view of the first image, where M is an integer greater than or equal to 3, the M second parallel lines are in one-to-one correspondence with the M first parallel lines, and the constraints of the M second parallel lines in the bird's-eye view comprise the parallel constraint, the vertical constraint, the spacing ratio constraint and the spacing constraint. The transformation relation between the actual camera coordinate system and the world coordinate system is then determined according to the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system. Here, the first image is an image shot by the vehicle-mounted camera during the driving of its vehicle on the target road, the target road contains N parallel lane lines whose mutual distances are known, and N is an integer greater than
or equal to M. Because the method introduces the virtual ideal camera coordinate system and only requires that the distances between the lane lines be known, rather than requiring equidistant parallel lines or another specifically prepared site, the flexibility of the external parameter calibration method of the vehicle-mounted camera is improved.
In one possible implementation manner, determining the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system and the constraints of the M second parallel lines in the bird's-eye view of the first image comprises: determining the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system and the parallel constraint; determining the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system and the vertical constraint; determining the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system and the spacing ratio constraint; and determining the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system and the spacing constraint.
In this implementation manner, the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint; the y-axis rotation angle is determined according to those coordinates and the vertical constraint; the z-axis rotation angle is determined according to those coordinates and the spacing ratio constraint; and the z-axis translation distance is determined according to those coordinates and the spacing constraint. The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is thereby obtained, and its accuracy is improved.
In one possible implementation, the parallel constraint further includes that 0.5·Σ(θ_i - θ_{i+1})² is minimum, where θ_i is the angle between the i-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the angle between the (i+1)-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i is less than or equal to 90 degrees, and θ_{i+1} is less than or equal to 90 degrees.
In this implementation, the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by taking the minimum of 0.5·Σ(θ_i - θ_{i+1})² as the parallel constraint, improving the accuracy of the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system.
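As a non-authoritative sketch of this parallel constraint, the cost below (the function name is illustrative, not from the patent) scores a set of bird's-eye-view line angles; it is zero exactly when all M second parallel lines share one direction:

```python
def parallel_cost(thetas):
    """0.5 * sum of (theta_i - theta_{i+1})^2 over consecutive pairs.

    thetas: angles (degrees) between each second parallel line and the
    x-axis of the virtual ideal camera coordinate system; the cost is
    zero when every line in the bird's-eye view has the same direction.
    """
    return 0.5 * sum((a - b) ** 2 for a, b in zip(thetas, thetas[1:]))
```

In practice, the x-axis rotation angle would be the value that minimises this cost over the angles of the re-projected lines, for example via a one-dimensional search over candidate rotations.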
In one possible implementation, the vertical constraint further includes that 0.5·Σ(θ_j - 90)² is minimum, where θ_j is the angle between the j-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.
In this implementation, the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by taking the minimum of 0.5·Σ(θ_j - 90)² as the vertical constraint, improving the accuracy of the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system.
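The vertical constraint admits the same kind of sketch; again the function name is illustrative, and 90 degrees is the target angle stated in the text:

```python
def vertical_cost(thetas):
    # 0.5 * sum of (theta_j - 90)^2: zero when every second parallel line
    # is perpendicular to the x-axis of the virtual ideal camera system.
    return 0.5 * sum((t - 90.0) ** 2 for t in thetas)
```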
In one possible implementation, the spacing ratio constraint further includes that 0.5·Σ(ω_{s,s+1}/ω_{s+1,s+2} - W_{s,s+1}/W_{s+1,s+2})² is minimum, where ω_{s,s+1} is the distance between the s-th second parallel line and the (s+1)-th second parallel line among the M second parallel lines, ω_{s+1,s+2} is the distance between the (s+1)-th second parallel line and the (s+2)-th second parallel line among the M second parallel lines, W_{s,s+1} is the actual distance between the lane line corresponding to the s-th second parallel line and the lane line corresponding to the (s+1)-th second parallel line, W_{s+1,s+2} is the actual distance between the lane line corresponding to the (s+1)-th second parallel line and the lane line corresponding to the (s+2)-th second parallel line, and s is an integer greater than or equal to 1 and less than M-1.
In this implementation, the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by taking the minimum of 0.5·Σ(ω_{s,s+1}/ω_{s+1,s+2} - W_{s,s+1}/W_{s+1,s+2})² as the spacing ratio constraint, so that the accuracy of the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is improved.
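A hedged sketch of the spacing ratio cost follows; the patent renders the exact functional form as an image, so this ratio-matching reconstruction, like the function name, is an assumption consistent with the symbols defined above:

```python
def spacing_ratio_cost(omegas, widths):
    """0.5 * sum over s of (omega_s/omega_{s+1} - W_s/W_{s+1})^2.

    omegas: gaps between adjacent second parallel lines in the bird's-eye
    view; widths: known gaps between the corresponding lane lines. The
    cost ignores overall scale and penalises only mismatched ratios.
    """
    return 0.5 * sum(
        (omegas[s] / omegas[s + 1] - widths[s] / widths[s + 1]) ** 2
        for s in range(len(omegas) - 1)
    )
```

Because only ratios enter the cost, it can fix the z-axis rotation while leaving the overall scale to the spacing constraint below.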
In one possible implementation, the spacing constraint further includes that 0.5·Σ(ω_{i,i+1} - W_{i,i+1})² is minimum, where ω_{i,i+1} is the distance between the i-th second parallel line and the (i+1)-th second parallel line among the M second parallel lines, W_{i,i+1} is the actual distance between the lane line corresponding to the i-th second parallel line and the lane line corresponding to the (i+1)-th second parallel line, and i is an integer greater than or equal to 1 and less than M.
In this implementation, the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by taking the minimum of 0.5·Σ(ω_{i,i+1} - W_{i,i+1})² as the spacing constraint, so that the accuracy of the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system is improved.
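Because the z-axis translation rescales the bird's-eye view, the spacing constraint fixes the remaining scale. The sketch below pairs the stated cost with a closed-form least-squares scale; the `best_scale` helper is an addition of this rewrite for illustration, not a formula from the patent:

```python
def spacing_cost(omegas, widths):
    # 0.5 * sum of (omega_{i,i+1} - W_{i,i+1})^2 over adjacent gaps.
    return 0.5 * sum((o - w) ** 2 for o, w in zip(omegas, widths))

def best_scale(omegas, widths):
    # Least-squares scale k minimising 0.5 * sum((k*omega - W)^2); the
    # z-axis translation that resizes the bird's-eye view acts as such a
    # scale on the measured gaps.
    return sum(o * w for o, w in zip(omegas, widths)) / sum(o * o for o in omegas)
```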
In a possible implementation manner, the method further comprises: obtaining the y-axis translation amount of the vehicle-mounted camera in the world coordinate system; obtaining positioning information of the vehicle, wherein the positioning information comprises current position information of the vehicle and/or acceleration of the vehicle and/or wheel speed of the vehicle; establishing an observation model of the vehicle according to the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system; obtaining the heading angle of the vehicle at the current moment according to the observation model of the vehicle; compensating the heading angle at the current moment according to the heading angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to obtain a compensated heading angle; and updating the heading angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to the compensated heading angle.
In this implementation, the heading angle of the vehicle at the current moment is obtained from the observation model of the vehicle, which is built from the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system; the heading angle at the current moment is compensated using the heading angle in the external parameters of the vehicle-mounted camera under the world coordinate system to obtain a compensated heading angle; and the heading angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is updated to the compensated heading angle. In this way, when the vehicle does not drive exactly parallel to the lane lines, the influence of the vehicle's heading angle can be compensated, and the accuracy of the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is improved.
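One plausible form of the compensation step, written as a sketch (the patent does not give the formula, so the subtraction below is an assumption): the heading recovered from the lane-line geometry mixes the camera's yaw with the vehicle's own heading, and removing the vehicle heading estimated by the observation model isolates the camera contribution.

```python
def compensate_heading(extrinsic_heading_deg, vehicle_heading_deg):
    # Assumed compensation: subtract the vehicle heading estimated by the
    # observation model from the heading angle stored in the camera-to-world
    # transformation relation; the result replaces the stored heading.
    return extrinsic_heading_deg - vehicle_heading_deg
```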
In a second aspect, the present application provides a calibration device for external parameters of an on-board camera, which device may comprise respective modules for implementing the method in the first aspect, which modules may be implemented in software and/or hardware.
In a third aspect, the present application provides a calibration apparatus for external parameters of an in-vehicle camera. The apparatus may include a processor coupled to a memory. Wherein the memory is for storing program code and the processor is for executing the program code in the memory to implement the method of the first aspect or any one of the implementations.
Optionally, the apparatus may further comprise the memory.
In a fourth aspect, the present application provides a chip comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a wire, the at least one processor being adapted to run a computer program or instructions to perform a method as described in the first aspect or any one of the possible implementations thereof.
In a fifth aspect, the present application provides a computer readable medium storing program code for execution by a device, the program code comprising instructions for performing the method of the first aspect or any one of the possible implementations thereof.
In a sixth aspect, the application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method according to the first aspect or any one of the possible implementations thereof.
In a seventh aspect, the present application provides a computing device comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the communication interface being in communication with a target system, the at least one processor being operable to execute a computer program or instructions to perform a method as described in the first aspect or any one of the possible implementations thereof.
In an eighth aspect, the present application provides a computing system comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the communication interface being in communication with a target system, the at least one processor being operable to execute a computer program or instructions to perform a method as described in the first aspect or any one of the possible implementations thereof.
Drawings
FIG. 1 is a schematic diagram of a pixel coordinate system;
FIG. 2 is a schematic diagram of a camera coordinate system;
FIG. 3 is a schematic diagram of an actual camera coordinate system and a virtual ideal camera coordinate system according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a world coordinate system according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an application scenario according to an embodiment of the present application;
FIG. 6 is a flowchart of a method for calibrating external parameters of a vehicle-mounted camera according to an embodiment of the present application;
FIG. 7 is a schematic illustration of a first image according to an embodiment of the present application;
FIG. 8 is a schematic view of an aerial view of an embodiment of the present application;
FIG. 9 is a schematic diagram of heading angle compensation according to an embodiment of the application;
FIG. 10 is a flow chart of another method for calibrating external parameters of an onboard camera according to an embodiment of the application;
FIG. 11 is a schematic block diagram of an external parameter calibration apparatus of an in-vehicle camera according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an external parameter calibration device of a vehicle-mounted camera according to another embodiment of the present application.
Detailed Description
The following describes the technical solutions in the embodiments of the present application with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
Fig. 1 is a schematic diagram of a pixel coordinate system. As shown in fig. 1, the origin O_p of the pixel coordinate system is the vertex at the upper left corner of the image, the u-axis points horizontally to the right, and the v-axis points vertically downward.
Pixel coordinates refer to the location of a pixel in an image. In the pixel coordinate system, the coordinates of any pixel point can be represented as (u_i, v_i). This representation does not reflect the physical dimensions of objects in the image.
Fig. 2 is a schematic diagram of a camera coordinate system. As shown in fig. 2, the camera coordinate system takes the optical axis of the camera as the Z_c axis, and its origin O_c is the center of the light beam in the camera's optical system, i.e., the center of the lens. The horizontal axis X_c and the vertical axis Y_c of the camera coordinate system are parallel to the u-axis and the v-axis of the pixel coordinate system, respectively.
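A minimal sketch of the pinhole relation between the two coordinate systems, assuming known intrinsics (fx, fy, cx, cy); this back-projection is standard camera geometry rather than a step defined in the patent:

```python
def pixel_to_camera_ray(u, v, fx, fy, cx, cy):
    # Back-project pixel (u, v) through the pinhole model to the point on
    # its viewing ray at unit depth Z_c = 1 in the camera coordinate system.
    return ((u - cx) / fx, (v - cy) / fy, 1.0)
```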
Fig. 3 is a schematic diagram of an actual camera coordinate system and a virtual ideal camera coordinate system according to an embodiment of the present application. As shown in fig. 3, L1, L2 and L3 are three parallel lines on the road surface on which the vehicle is driving. Following the selection rules and positional relationships of the origin, X-axis, Y-axis and Z-axis of the camera coordinate system shown in fig. 2, the center of the camera lens is selected as the origin O_r of the actual camera coordinate system, and the optical axis of the camera is selected as the Z_r axis; Z_r is parallel to the road surface, and the direction in front of the camera is the positive direction of Z_r. The direction perpendicular to Z_r and parallel to the road surface is selected as the X_r axis, with the direction from L2 to L3 as the positive direction of X_r. The direction perpendicular to the road surface is selected as the Y_r axis (not shown in the figure), with the direction pointing into the road surface as the positive direction of Y_r.
The origin of the virtual ideal camera coordinate system coincides with the origin O_r of the actual camera coordinate system, and the Y-axis of the virtual ideal camera coordinate system coincides with Y_r in both position and direction. Rotating Z_r and X_r of the actual camera coordinate system around Y_r by an angle β until X_r is perpendicular to the three parallel lines on the road surface yields the Z-axis and X-axis directions of the virtual ideal camera coordinate system.
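The rotation about Y_r described above can be sketched as a standard rotation matrix (a textbook construction, not code from the patent):

```python
import math

def rot_y(beta):
    # Rotation by angle beta (radians) about the Y-axis; applied to the
    # actual camera axes Z_r and X_r it yields the Z- and X-axes of the
    # virtual ideal camera coordinate system.
    c, s = math.cos(beta), math.sin(beta)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]
```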
FIG. 4 is a schematic diagram of a world coordinate system according to an embodiment of the present application. As shown in fig. 4, L1, L2 and L3 are three parallel lines on the road surface on which the vehicle is driving, and ZOX is the virtual ideal camera coordinate system of the camera. Any point on L2 is selected as the origin O_w of the world coordinate system; the direction of X_w of the world coordinate system is consistent with the Z-axis direction of the virtual ideal camera coordinate system; Y_w of the world coordinate system is perpendicular to L2 and parallel to the road surface, and the direction from L2 to L1 is selected as the positive direction of Y_w.
Fig. 5 is a schematic diagram of an application scenario according to an embodiment of the present application. The scene shown in fig. 5 is one in which the external parameters of the camera are calibrated using three parallel lines L1, L2 and L3 with known spacing on the road surface on which the vehicle is traveling. The calibration of the external parameters of the camera is completed by driving the vehicle from one end of the three parallel lines to the other. The three parallel lines L1, L2 and L3 with known spacing may be lane lines drawn on a dedicated site or lane lines on a normal road.
It will be appreciated that the scenario shown in fig. 5 is only an example, and the technical solution of the present application may also be applied to other scenarios, as long as the scenario involves calibration of external parameters of the camera. For example, the technical scheme of the application can be also applied to scenes such as calibration of external parameters of a camera in the intelligent robot.
Fig. 6 is a flowchart of a method for calibrating external parameters of a vehicle-mounted camera according to an embodiment of the application. As shown in fig. 6, the method includes at least S601 to S603.
S601, acquiring coordinates of M first parallel lines in a first image in an actual camera coordinate system of a vehicle-mounted camera, wherein the first image is an image shot by the vehicle-mounted camera during the driving of the vehicle to which the vehicle-mounted camera belongs on a target road, the target road comprises N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines are in one-to-one correspondence with M first lane lines among the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M.
In one possible implementation manner, during the driving of the vehicle to which the vehicle-mounted camera belongs on the target road, the vehicle-mounted camera photographs M lane lines among the N lane lines on the target road to obtain the first image. The first image comprises M first parallel lines, which are the imaged counterparts of the M photographed lane lines, and the M first parallel lines in the first image are in one-to-one correspondence with those M lane lines among the N lane lines on the target road.
Fig. 7 is a schematic diagram of a first image according to an embodiment of the present application. As shown in fig. 7, there are 3 first parallel lines L1, L2, and L3 in the first image, which correspond to 3 lane lines out of N lane lines in the target road, respectively, where M is equal to 3 and N is greater than or equal to 3.
The M first parallel lines are extracted from the first image to obtain their coordinates in the pixel coordinate system, and the coordinates of the M first parallel lines in the pixel coordinate system are then converted into their coordinates in the actual camera coordinate system of the vehicle-mounted camera. One example of the pixel coordinate system is the pixel coordinate system shown in fig. 1, and one example of the actual camera coordinate system is the actual camera coordinate system shown in fig. 3.
As an example, the parallel line regions and contours in the first image are extracted by a segmentation algorithm, and then edge extraction at the sub-pixel level is performed to obtain M first parallel lines. For example, the segmentation algorithm includes a watershed algorithm, etc.
As another example, straight lines in the first image are extracted by the Hough transform, and the M first parallel lines are then obtained by clustering and filtering. The Hough transform is a feature extraction technique in image processing that detects objects of a specific shape through a voting scheme; the clustering step groups line segments with similar slopes and intercepts, and the filtering step screens the region of interest using the nominal installation angle and position of the camera.
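The slope-and-intercept clustering step described above can be sketched as follows. This is an illustrative example rather than code from the embodiment; the function name, tolerance values, and sample segments are all assumptions.

```python
def cluster_lines(segments, slope_tol=0.05, intercept_tol=10.0):
    """Greedily group (slope, intercept) pairs with similar parameters."""
    clusters = []  # each cluster: list of (slope, intercept) pairs
    for slope, intercept in segments:
        for cluster in clusters:
            s0, b0 = cluster[0]
            if abs(slope - s0) < slope_tol and abs(intercept - b0) < intercept_tol:
                cluster.append((slope, intercept))
                break
        else:
            clusters.append([(slope, intercept)])
    # average each cluster into one representative lane-line candidate
    return [
        (sum(s for s, _ in c) / len(c), sum(b for _, b in c) / len(c))
        for c in clusters
    ]

# hypothetical Hough output: two near-duplicate pairs plus one lone segment
segments = [(1.00, 5.0), (1.02, 6.0), (-1.00, 120.0), (-0.99, 121.0), (0.0, 60.0)]
lines = cluster_lines(segments)
```

With the sample input, the five detected segments collapse into three lane-line candidates.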
S602, determining the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system and the constraints on M second parallel lines in a bird's eye view of the first image, where the constraints on the M second parallel lines in the bird's eye view include a parallel constraint, a vertical constraint, a spacing-ratio constraint, and a spacing constraint, and the M second parallel lines correspond one-to-one to the M first parallel lines.
The transformation relationship between the actual camera coordinate system and the virtual ideal camera coordinate system consists of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle, and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system. One example of the actual camera coordinate system of the vehicle-mounted camera is the actual camera coordinate system shown in fig. 3, and one example of the virtual ideal camera coordinate system is the virtual ideal camera coordinate system shown in fig. 3.
In one possible implementation, according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera, an inverse perspective transformation method is adopted to obtain a bird's eye view of the M first parallel lines. In the aerial view, there are M second parallel lines, and the M second parallel lines are in one-to-one correspondence with the M first parallel lines.
Fig. 8 is a schematic diagram of a bird's eye view according to an embodiment of the present application. As shown in fig. 8, the bird's eye view contains 3 second parallel lines (M equals 3), denoted L1, L2, and L3, where θ1, θ2, and θ3 are the angles between L1, L2, and L3, respectively, and the x-axis direction of the virtual ideal camera coordinate system, ω12 is the distance between L1 and L2, and ω23 is the distance between L2 and L3.
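The inverse perspective transformation mentioned above maps image points onto the ground plane to form the bird's eye view; a minimal sketch using a 3x3 homography is shown below. The matrix H here is a made-up placeholder, not a calibrated mapping from the embodiment.

```python
def apply_homography(H, x, y):
    """Map an image point (x, y) through a 3x3 homography H (inverse
    perspective mapping onto the ground plane)."""
    denom = H[2][0] * x + H[2][1] * y + H[2][2]
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / denom
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / denom
    return u, v

# placeholder homography with mild perspective, for illustration only
H = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.001, 1.0]]
u, v = apply_homography(H, 100.0, 200.0)
```

Applying the same mapping to the endpoints of each first parallel line yields the M second parallel lines in the bird's eye view.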
The parallel constraint requires the M second parallel lines to be parallel to one another; the vertical constraint requires each of the M second parallel lines to be perpendicular to the x-axis direction of the virtual ideal camera coordinate system; the spacing-ratio constraint requires the ratio of the distances between any two pairs of the M second parallel lines to equal the ratio of the actual distances between the corresponding lane lines; and the spacing constraint requires the difference between the distance between any two second parallel lines and the actual distance between the corresponding lane lines to be minimized.
In one possible implementation, the transformation relationship between the actual camera coordinate system and the virtual ideal camera coordinate system is optimized sequentially, in the order of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle, and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system.
According to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint, the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization or a similar method.
Illustratively, the parallel constraint includes minimizing the loss function 0.5 Σ (θi − θi+1)², where θi is the angle between the i-th of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θi+1 is the angle between the (i+1)-th second parallel line and the x-axis direction, i is an integer greater than or equal to 1 and less than M, and both θi and θi+1 are less than or equal to 90 degrees.
According to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the vertical constraint, the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization or a similar method.
Illustratively, the vertical constraint includes minimizing the loss function 0.5 Σ (θj − 90)², where θj is the angle between the j-th of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θj is less than or equal to 90 degrees.
According to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing-ratio constraint, the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization or a similar method.
Illustratively, the spacing-ratio constraint includes minimizing the loss function 0.5 Σ (ωs,s+1/ωs+1,s+2 − Ws,s+1/Ws+1,s+2)², where ωs,s+1 is the distance between the s-th and (s+1)-th of the M second parallel lines, ωs+1,s+2 is the distance between the (s+1)-th and (s+2)-th second parallel lines, Ws,s+1 is the actual distance between the lane lines corresponding to the s-th and (s+1)-th second parallel lines, Ws+1,s+2 is the actual distance between the lane lines corresponding to the (s+1)-th and (s+2)-th second parallel lines, and s is an integer greater than or equal to 1 and less than M − 1.
According to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing constraint, the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined by optimization or a similar method.
Illustratively, the spacing constraint includes minimizing the loss function 0.5 Σ (ωi,i+1 − Wi,i+1)², where ωi,i+1 is the distance between the i-th and (i+1)-th of the M second parallel lines, Wi,i+1 is the actual distance between the lane lines corresponding to the i-th and (i+1)-th second parallel lines, and i is an integer greater than or equal to 1 and less than M.
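The four loss terms above can be written compactly as follows. This is an illustrative sketch, not code from the embodiment; the spacing-ratio term follows the formula as reconstructed here, and the function names and sample values are assumptions.

```python
def parallel_loss(thetas):
    """0.5 * sum of (theta_i - theta_{i+1})^2 over consecutive angle pairs."""
    return 0.5 * sum((a - b) ** 2 for a, b in zip(thetas, thetas[1:]))

def vertical_loss(thetas):
    """0.5 * sum of (theta_j - 90)^2: lines perpendicular to the x-axis."""
    return 0.5 * sum((t - 90.0) ** 2 for t in thetas)

def ratio_loss(omegas, ws):
    """0.5 * sum of squared differences between observed and actual
    spacing ratios of consecutive line gaps."""
    return 0.5 * sum(
        (omegas[s] / omegas[s + 1] - ws[s] / ws[s + 1]) ** 2
        for s in range(len(omegas) - 1)
    )

def spacing_loss(omegas, ws):
    """0.5 * sum of (omega_{i,i+1} - W_{i,i+1})^2: observed vs actual gaps."""
    return 0.5 * sum((o - w) ** 2 for o, w in zip(omegas, ws))

# hypothetical bird's-eye-view measurements for M = 3 lines
thetas = [89.0, 91.0, 90.0]   # angles to the x-axis, degrees
omegas = [3.5, 3.6]           # measured gaps in the bird's eye view
ws = [3.5, 3.5]               # known lane-line spacings
```

Each term is minimized in turn over its own rotation or translation parameter, following the optimization order described in this implementation.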
By optimizing in the order of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle, and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system, this implementation avoids errors introduced by the coupling of Euler angles and improves the accuracy of the transformation relationship between the actual camera coordinate system and the virtual ideal camera coordinate system of the vehicle-mounted camera.
In another possible implementation, the constraints on the M second parallel lines in the bird's eye view include minimizing the loss function α Σ (θn − 90)² + β Σ (ωm,m+1 − Wm,m+1)², where α and β are weight parameters, θn is the angle between the n-th of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, ωm,m+1 is the distance between the m-th and (m+1)-th second parallel lines, Wm,m+1 is the actual distance between the lane lines corresponding to the m-th and (m+1)-th second parallel lines, n is an integer greater than or equal to 1 and less than or equal to M, and m is an integer greater than or equal to 1 and less than M.
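A minimal sketch of the joint weighted loss of this implementation is given below; the function name, the default weights, and the sample values are placeholders.

```python
def joint_loss(thetas, omegas, ws, alpha=1.0, beta=1.0):
    """alpha * sum (theta_n - 90)^2 + beta * sum (omega - W)^2:
    verticality and spacing penalized in one weighted objective."""
    return (alpha * sum((t - 90.0) ** 2 for t in thetas)
            + beta * sum((o - w) ** 2 for o, w in zip(omegas, ws)))
```

Minimizing this single objective over all four parameters at once is what, as noted below, risks converging to a local optimum when the initial values are poorly chosen.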
The constraint method provided by the implementation mode can be combined with the optimization and other methods to simultaneously obtain the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system, but the local optimization value, rather than the global optimization value, of the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is easy to obtain due to improper initial value selection.
S603, determining the transformation relation between the actual camera coordinate system and the world coordinate system of the vehicle-mounted camera according to the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
The transformation relationship between the virtual ideal camera coordinate system and the world coordinate system consists of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle, and the z-axis translation distance of the world coordinate system relative to the virtual ideal camera coordinate system of the vehicle-mounted camera. The transformation relationship between the actual camera coordinate system and the world coordinate system constitutes the external parameters of the vehicle-mounted camera, and likewise consists of the x-axis, y-axis, and z-axis rotation angles and the z-axis translation distance of the world coordinate system relative to the actual camera coordinate system; the three rotation angles may also be referred to as the roll angle (roll), pitch angle (pitch), and yaw angle (yaw). One example of the actual camera coordinate system is the actual camera coordinate system shown in fig. 3, one example of the virtual ideal camera coordinate system is the virtual ideal camera coordinate system shown in fig. 3 or fig. 4, and one example of the world coordinate system is the world coordinate system shown in fig. 4.
In one possible implementation, a transformation matrix is used to compose the transformation relationship between the actual camera coordinate system and the virtual ideal camera coordinate system with the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system, thereby obtaining the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, i.e., the external parameters of the vehicle-mounted camera.
Positioning information of the vehicle is acquired from sensors of the vehicle to which the vehicle-mounted camera belongs. These sensors include a wheel speed meter, an inertial measurement unit (IMU), a global positioning system (GPS) receiver, and the like; accordingly, the positioning information of the vehicle includes the current position of the vehicle and/or the acceleration of the vehicle and/or the wheel speed of the vehicle, and the like.
An observation model of the vehicle is established according to the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system. The course angle of the vehicle at the current moment is obtained from the observation model, this course angle is compensated using the course angle in the obtained external parameters of the vehicle-mounted camera to obtain a compensated course angle, and the course angle in the external parameters of the vehicle-mounted camera is updated to the compensated course angle.
Fig. 9 is a schematic diagram of course angle compensation according to an embodiment of the present application. As shown in fig. 9, the vehicle course angle is the course angle of the vehicle at the current moment obtained from the observation model of the vehicle, and the camera course angle is the course angle in the determined external parameters of the vehicle-mounted camera. Subtracting the vehicle course angle from the camera course angle yields the camera-to-vehicle course angle, which may also be called the compensated course angle, and the course angle in the external parameters of the vehicle-mounted camera is updated to this compensated course angle. This compensates for the influence of the vehicle course angle obtained from vehicle positioning.
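The subtraction shown in fig. 9 can be sketched as follows; the wrap of the result into the (−180, 180] degree range is an assumption of this example rather than a detail stated in the embodiment.

```python
def compensate_heading(camera_yaw_deg, vehicle_yaw_deg):
    """Camera-to-vehicle course angle: camera course angle minus
    vehicle course angle, wrapped into (-180, 180] degrees."""
    yaw = camera_yaw_deg - vehicle_yaw_deg
    while yaw <= -180.0:
        yaw += 360.0
    while yaw > 180.0:
        yaw -= 360.0
    return yaw
```

The returned value replaces the course angle in the camera's external parameters, so that the calibration no longer depends on the vehicle driving exactly parallel to the lane lines.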
In another possible implementation, the course-angle-compensated external parameters of the vehicle-mounted camera are cached and filtered to obtain filtered external parameters, and the external parameters of the vehicle-mounted camera are updated to the filtered external parameters.
By way of example, a plurality of course-angle-compensated external parameters of the vehicle-mounted camera are cached, and a periodic optimal estimation is performed. If, by the time the vehicle drives out of the calibration site, the number of cached external parameters does not exceed a preset threshold, all cached external parameters are used for the optimal estimation to complete the calibration; if the number exceeds the preset threshold, cached external parameters are selected at a fixed interval (for example, every 10 frames) for the optimal estimation. The calibration site is the area containing the N lane lines of the target road used for external parameter calibration of the vehicle-mounted camera.
Preferably, a separate data series is cached for each type of course-angle-compensated external parameter, that is, one series each for the roll angle, the pitch angle, the course angle, and the translation distance, and a kernel density estimation method is used to obtain the most probable value in each series, which is the optimal estimate of that type of external parameter.
Specifically, the kernel density estimation method fits a kernel function to the statistical histogram of the data, finding the function that best fits the histogram of the known data, and can thereby resist a certain amount of noise interference. For each type of external parameter, a kernel function (preferably Gaussian) is fitted to the corresponding series of values and the peak of the fitted function is located; the abscissa of the peak (where the density is maximal) is the optimal value of that type of external parameter.
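A minimal sketch of locating the density peak over one series of cached external parameter values is shown below, assuming a Gaussian kernel evaluated on a fixed grid; the bandwidth, grid resolution, and sample values are placeholders, not values from the embodiment.

```python
import math

def kde_mode(samples, bandwidth=0.5, grid_steps=200):
    """Return the grid point where a Gaussian kernel density estimate
    over `samples` peaks (the most probable parameter value)."""
    lo = min(samples) - 3 * bandwidth
    hi = max(samples) + 3 * bandwidth
    best_x, best_d = lo, -1.0
    for k in range(grid_steps + 1):
        x = lo + (hi - lo) * k / grid_steps
        # unnormalized Gaussian KDE; normalization does not move the peak
        d = sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples)
        if d > best_d:
            best_x, best_d = x, d
    return best_x

# hypothetical cached roll-angle estimates with one outlier
rolls = [1.0, 1.1, 0.9, 1.05, 1.0, 5.0]
mode = kde_mode(rolls)
```

Unlike a plain mean, the density peak stays near the dominant cluster of estimates even when a few outlier frames are cached.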
Alternatively, a four-dimensional kernel density estimation method can be used to estimate the optimal values of the four types of external parameters simultaneously.
Optionally, a data series is cached for each type of external parameter, and median filtering is used: the median of each series is the optimal value of that type of external parameter.
Optionally, a data series is cached for each type of external parameter, and mean filtering is used: the average of each series is the optimal value of that type of external parameter.
When the standard deviation of the obtained optimal value of each type of external parameter is smaller than a specific threshold, the calibration result has converged; the calibration stops and the external parameters of the vehicle-mounted camera are obtained.
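The convergence test described above can be sketched as a standard deviation check over the recent optimal estimates of one parameter; the function name and the handling of very short histories are assumptions of this example.

```python
def converged(history, threshold):
    """True when the standard deviation of the cached optimal estimates
    of one external parameter falls below the convergence threshold."""
    n = len(history)
    if n < 2:
        return False  # not enough estimates to judge convergence
    mean = sum(history) / n
    std = (sum((v - mean) ** 2 for v in history) / n) ** 0.5
    return std < threshold
```

Calibration would stop once this check passes for every parameter type (roll, pitch, course angle, and translation distance).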
In the technical solution above, the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system is determined from the coordinates of the M first parallel lines of the first image in the actual camera coordinate system and from the parallel, vertical, spacing-ratio, and spacing constraints on the M second parallel lines in the bird's eye view of the first image, following the optimization order of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle, and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system. The transformation relationship between the actual camera coordinate system and the world coordinate system, i.e., the external parameters of the vehicle-mounted camera, is then determined from this relationship together with the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system. This improves both the flexibility of the external parameter calibration method and the accuracy of the calibrated external parameters.
Fig. 10 is a flowchart of another calibration method of external parameters of an in-vehicle camera according to an embodiment of the application. As shown in fig. 10, the method includes at least S1001 to S1011.
S1001, acquiring coordinates of M first parallel lines in a first image in the actual camera coordinate system of a vehicle-mounted camera, where the first image is captured by the vehicle-mounted camera while the vehicle to which the camera belongs is driving on a target road. The target road includes N parallel lane lines, and the distance between any two of the N lane lines is known. The M first parallel lines correspond one-to-one to M lane lines among the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M.
S1002, determining the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system and the constraints on M second parallel lines in a bird's eye view of the first image, where the constraints include a parallel constraint, a vertical constraint, a spacing-ratio constraint, and a spacing constraint, and the M second parallel lines correspond one-to-one to the M first parallel lines.
It should be noted that, for S1001 to S1002, reference may be made to S601 to S602; details are not repeated here.
S1003, determining a first external parameter of the vehicle-mounted camera according to a transformation relationship between an actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera and a transformation relationship between the virtual ideal camera coordinate system and a world coordinate system.
The transformation relationship between the virtual ideal camera coordinate system and the world coordinate system consists of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle, and the z-axis translation distance of the world coordinate system relative to the virtual ideal camera coordinate system of the vehicle-mounted camera. The transformation relationship between the actual camera coordinate system and the world coordinate system constitutes the external parameters of the vehicle-mounted camera, and likewise consists of the x-axis, y-axis, and z-axis rotation angles and the z-axis translation distance of the world coordinate system relative to the actual camera coordinate system; the three rotation angles may also be referred to as the roll angle (roll), pitch angle (pitch), and yaw angle (yaw). One example of the actual camera coordinate system is the actual camera coordinate system shown in fig. 3, one example of the virtual ideal camera coordinate system is the virtual ideal camera coordinate system shown in fig. 3 or fig. 4, and one example of the world coordinate system is the world coordinate system shown in fig. 4.
In one possible implementation, a transformation matrix is used to compose the transformation relationship between the actual camera coordinate system and the virtual ideal camera coordinate system with the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system, thereby obtaining a first transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, i.e., the first external parameters of the vehicle-mounted camera.
S1004, acquiring the y-axis translation amount of the vehicle-mounted camera in the world coordinate system according to the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
As an example, in the world coordinate system shown in fig. 4, the position of the vehicle-mounted camera on the Y w axis of the world coordinate system is the Y-axis translation amount of the vehicle-mounted camera in the world coordinate system.
S1005, according to the positioning information of the vehicle to which the vehicle-mounted camera belongs and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system, an observation model of the vehicle is established, and the observation model comprises the influence of the y-axis translation amount on the state quantity.
In one possible implementation, the positioning information of the vehicle is obtained from sensors of the vehicle to which the vehicle-mounted camera belongs, including a wheel speed meter, an IMU, a GPS receiver, and the like; accordingly, the positioning information includes the current position of the vehicle and/or the acceleration of the vehicle and/or the wheel speed of the vehicle, and the like.
S1006, acquiring the course angle of the current moment of the vehicle according to the observation model of the vehicle.
In one possible implementation, the observation model of the vehicle is updated by filtering, and the heading angle of the vehicle at the current moment is obtained.
S1007, compensating the course angle of the current moment of the vehicle according to the course angle in the first external parameter of the vehicle-mounted camera, and obtaining the compensated course angle.
In one possible implementation, as shown in the course angle compensation diagram of fig. 9, the vehicle course angle is the course angle of the vehicle at the current moment obtained from the observation model of the vehicle, and the camera course angle is the course angle in the determined external parameters of the vehicle-mounted camera. Subtracting the vehicle course angle from the camera course angle yields the camera-to-vehicle course angle, which may also be called the compensated course angle; the course angle in the external parameters of the vehicle-mounted camera is updated to this compensated course angle, compensating for the influence of the vehicle course angle obtained from vehicle positioning.
And S1008, updating the course angle in the first external parameters of the vehicle-mounted camera to the compensated course angle.
The first external parameters of the vehicle-mounted camera include a first roll angle, a first pitch angle, a first course angle, and a first translation distance. After the first course angle is updated to the compensated course angle, the updated first external parameters include the first roll angle, the first pitch angle, the compensated course angle, and the first translation distance.
S1009, caching a plurality of first external parameters of the in-vehicle camera.
In one possible implementation, during the calibration of the external parameters of the vehicle-mounted camera, the camera captures a plurality of first images from the time the vehicle enters the calibration site until it exits, a plurality of first external parameters of the vehicle-mounted camera are obtained from these first images, and the plurality of first external parameters are cached. The calibration site is the area containing the N lane lines of the target road used for external parameter calibration of the vehicle-mounted camera.
S1010, filtering the cached first external parameters of the vehicle-mounted camera to obtain second external parameters of the vehicle-mounted camera.
In one possible implementation, a periodic optimal estimation is made of a plurality of first external parameters of the cached onboard camera.
Optionally, if the number of first external parameters cached during the period from entering the calibration site to exiting it does not exceed a preset threshold, all cached first external parameters are used for the optimal estimation to complete the calibration; if the number exceeds the preset threshold, cached first external parameters are selected at a fixed interval (for example, every 10 frames) for the optimal estimation.
Preferably, a separate data series is cached for each type of external parameter among the plurality of first external parameters of the vehicle-mounted camera, that is, one series each for the roll angle, the pitch angle, the course angle, and the translation distance, and a kernel density estimation method is used to obtain the most probable value in each series, which is the optimal estimate of that type of external parameter.
Specifically, the kernel density estimation method fits a kernel function to the statistical histogram of the data, finding the function that best fits the histogram of the known data, and can thereby resist a certain amount of noise interference. For each type of external parameter, a kernel function (preferably Gaussian) is fitted to the corresponding series of values and the peak of the fitted function is located; the abscissa of the peak (where the density is maximal) is the optimal value of that type of external parameter.
Alternatively, a four-dimensional kernel density estimation method can be used to estimate the optimal values of the four types of external parameters simultaneously.
Optionally, a data series is cached for each type of external parameter, and median filtering is used: the median of each series is the optimal value of that type of external parameter.
Optionally, a data series is cached for each type of external parameter, and mean filtering is used: the average of each series is the optimal value of that type of external parameter.
When the standard deviation of the obtained optimal value of each type of external parameter is smaller than a specific threshold, the calibration result has converged; the calibration stops and the second external parameters of the vehicle-mounted camera are obtained.
S1011, updating the external parameters of the vehicle-mounted camera to the second external parameters.
In the technical solution provided by the application, the course angle of the vehicle at the current moment is compensated using the course angle in the external parameters of the vehicle-mounted camera under the world coordinate system to obtain the compensated course angle, and the course angle in the transformation relationship between the actual camera coordinate system and the world coordinate system is updated to the compensated course angle. Thus, even when the vehicle does not drive completely parallel to the lane lines, the influence of the vehicle course angle can be compensated, improving the accuracy of the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system.
Fig. 11 is a schematic structural diagram of an external parameter calibration device of an in-vehicle camera according to an embodiment of the present application. As shown in fig. 11, the apparatus 1100 may include an acquisition module 1101 and a processing module 1102. Apparatus 1100 may be used to implement the method shown in any of the embodiments described above.
In one possible implementation, the apparatus 1100 may be used to implement the method illustrated in fig. 6 described above. For example, the acquisition module 1101 is used to implement S601, and the processing module 1102 is used to implement S602 and S603.
In another possible implementation, the apparatus 1100 further includes a compensation module, an update module, and a cache module. The apparatus 1100 in this implementation may be used to implement the method illustrated in fig. 10 described above. For example, the acquisition module 1101 is used to implement S1001, S1004, and S1006, the processing module 1102 is used to implement S1002, S1003, S1005, and S1010, the compensation module is used to implement S1007, the update module is used to implement S1008 and S1011, and the buffer module is used to implement S1009.
Fig. 12 is a schematic structural diagram of an external parameter calibration device of an in-vehicle camera according to another embodiment of the present application. The apparatus 1200 shown in fig. 12 may be used to perform the calibration method of the external parameters of the in-vehicle camera shown in any of the above embodiments.
As shown in fig. 12, the apparatus 1200 of the present embodiment includes a memory 1201, a processor 1202, a communication interface 1203, and a bus 1204. Wherein the memory 1201, the processor 1202 and the communication interface 1203 are communicatively coupled to each other via a bus 1204.
The memory 1201 may be a read-only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM). The memory 1201 may store a program, and when the program stored in the memory 1201 is executed by the processor 1202, the processor 1202 is configured to perform the steps of the methods shown in fig. 6 and fig. 10.
The processor 1202 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for executing related programs, to implement the method for calibrating the external parameters of the vehicle-mounted camera in the method embodiments of the present application.
The processor 1202 may also be an integrated circuit chip with signal processing capabilities. In implementation, various steps of methods of various embodiments of the application may be performed by integrated logic circuitry in hardware or by instructions in software in processor 1202.
The processor 1202 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The steps of the methods disclosed in connection with the embodiments of the present application may be directly performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 1201, and the processor 1202 reads the information in the memory 1201 and, in combination with its hardware, completes the functions required by the methods in the embodiments of the present application, for example, performs the steps/functions of the embodiments shown in fig. 6 and fig. 10.
The communication interface 1203 uses a transceiver apparatus such as, but not limited to, a transceiver to enable communication between the apparatus 1200 and other devices or communication networks.
The bus 1204 may include a path to transfer information between various components of the apparatus 1200 (e.g., the memory 1201, the processor 1202, the communication interface 1203).
It should be understood that the apparatus 1200 shown in the embodiment of the present application may be an electronic device, or may be a chip configured in an electronic device.
It should be appreciated that the processor in the embodiments of the present application may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It should also be appreciated that the memory in the embodiments of the present application may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM), and direct rambus random access memory (DR RAM).
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, the above embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. When the computer instructions or computer programs are loaded or executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired or wireless (e.g., infrared, radio, or microwave) manner. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or a data center that contains one or more sets of available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium. The semiconductor medium may be a solid-state disk.
It should be understood that the term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may mean that A exists alone, both A and B exist, or B exists alone, where A and B may be singular or plural. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects, but may also indicate an "and/or" relationship, which may be understood with reference to the context.
In the present application, "at least one" means one or more, and "a plurality of" means two or more. "At least one of" the following items or a similar expression means any combination of these items, including any combination of a single item or plural items. For example, "at least one of a, b, and c" may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may be singular or plural.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The storage medium includes various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory, a random access memory, a magnetic disk or an optical disk.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (17)

1. A method for calibrating external parameters of a vehicle-mounted camera, wherein the method comprises:
obtaining coordinates, in an actual camera coordinate system of the vehicle-mounted camera, of M first parallel lines in a first image, wherein the first image is an image captured by a vehicle to which the vehicle-mounted camera belongs while the vehicle is driving on a target road, the target road comprises N parallel lane lines, a distance between any two of the N lane lines is known, the M first parallel lines are in one-to-one correspondence with M first lane lines of the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M;
determining a transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and constraints on M second parallel lines in a bird's-eye view of the first image, wherein the M second parallel lines are in one-to-one correspondence with the M first parallel lines, the constraints on the M second parallel lines in the bird's-eye view comprise a parallel constraint, a vertical constraint, a spacing ratio constraint, and a spacing constraint, the parallel constraint comprises that the M second parallel lines are parallel, the vertical constraint comprises that any one of the M second parallel lines is perpendicular to the x-axis direction of the virtual ideal camera coordinate system, the spacing ratio constraint comprises that the distance ratio between any two of the M second parallel lines is the same as the distance ratio between the lane lines corresponding to the any two second parallel lines, and the spacing constraint comprises that the difference between the distance between the any two second parallel lines and the distance between the lane lines corresponding to the any two second parallel lines is minimized; and
determining a transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and a world coordinate system according to the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system and a transformation relationship between the virtual ideal camera coordinate system and the world coordinate system.

2. The method according to claim 1, wherein the determining the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints on the M second parallel lines in the bird's-eye view of the first image comprises:
determining an x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint;
determining a y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the vertical constraint;
determining a z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing ratio constraint; and
determining a z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing constraint.

3. The method according to claim 2, wherein the parallel constraint further comprises: |θ_k − θ_(k+1)| is minimized, where θ_k is the angle between the k-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_(k+1) is the angle between the (k+1)-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, k is an integer greater than or equal to 1 and less than M, θ_k is less than or equal to 90 degrees, and θ_(k+1) is less than or equal to 90 degrees.

4. The method according to any one of claims 1 to 3, wherein the vertical constraint further comprises: |90° − θ_j| is minimized, where θ_j is the angle between the j-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.

5. The method according to claim 4, wherein the spacing ratio constraint further comprises: |d_p/d_(p+1) − D_p/D_(p+1)| is minimized, where d_p is the distance between the p-th and the (p+1)-th second parallel lines of the M second parallel lines, d_(p+1) is the distance between the (p+1)-th and the (p+2)-th second parallel lines of the M second parallel lines, D_p is the actual distance between the lane line corresponding to the p-th second parallel line and the lane line corresponding to the (p+1)-th second parallel line, D_(p+1) is the actual distance between the lane line corresponding to the (p+1)-th second parallel line and the lane line corresponding to the (p+2)-th second parallel line, and p is an integer greater than or equal to 1 and less than M−1.

6. The method according to any one of claims 1 to 3 and 5, wherein the spacing constraint further comprises: |d_q − D_q| is minimized, where d_q is the distance between the q-th and the (q+1)-th second parallel lines of the M second parallel lines, D_q is the actual distance between the lane line corresponding to the q-th second parallel line and the lane line corresponding to the (q+1)-th second parallel line, and q is an integer greater than or equal to 1 and less than M.

7. The method according to any one of claims 1 to 3 and 5, wherein the method further comprises:
obtaining a y-axis translation of the vehicle-mounted camera in the world coordinate system;
obtaining positioning information of the vehicle, wherein the positioning information of the vehicle comprises current position information of the vehicle and/or an acceleration of the vehicle and/or a wheel speed of the vehicle;
establishing an observation model of the vehicle according to the positioning information of the vehicle and the y-axis translation of the vehicle-mounted camera in the world coordinate system;
obtaining a heading angle of the vehicle at a current moment according to the observation model of the vehicle;
compensating the heading angle of the vehicle at the current moment according to the heading angle in the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, to obtain a compensated heading angle; and
updating the heading angle in the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to the compensated heading angle.

8. An apparatus for calibrating external parameters of a vehicle-mounted camera, wherein the apparatus comprises:
an acquisition module, configured to obtain coordinates, in an actual camera coordinate system of the vehicle-mounted camera, of M first parallel lines in a first image, wherein the first image is an image captured by a vehicle to which the vehicle-mounted camera belongs while the vehicle is driving on a target road, the target road comprises N parallel lane lines, a distance between any two of the N lane lines is known, the M first parallel lines are in one-to-one correspondence with M first lane lines of the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M; and
a processing module, configured to determine a transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and constraints on M second parallel lines in a bird's-eye view of the first image, wherein the M second parallel lines are in one-to-one correspondence with the M first parallel lines, the constraints on the M second parallel lines in the bird's-eye view comprise a parallel constraint, a vertical constraint, a spacing ratio constraint, and a spacing constraint, the parallel constraint comprises that the M second parallel lines are parallel, the vertical constraint comprises that any one of the M second parallel lines is perpendicular to the x-axis direction of the virtual ideal camera coordinate system, the spacing ratio constraint comprises that the distance ratio between any two of the M second parallel lines is the same as the distance ratio between the lane lines corresponding to the any two second parallel lines, and the spacing constraint comprises that the difference between the distance between the any two second parallel lines and the distance between the lane lines corresponding to the any two second parallel lines is minimized;
wherein the processing module is further configured to determine a transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and a world coordinate system according to the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system and a transformation relationship between the virtual ideal camera coordinate system and the world coordinate system.

9. The apparatus according to claim 8, wherein the processing module is specifically configured to:
determine an x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint;
determine a y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the vertical constraint;
determine a z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing ratio constraint; and
determine a z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing constraint.

10. The apparatus according to claim 9, wherein the parallel constraint further comprises: |θ_k − θ_(k+1)| is minimized, where θ_k is the angle between the k-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_(k+1) is the angle between the (k+1)-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, k is an integer greater than or equal to 1 and less than M, θ_k is less than or equal to 90 degrees, and θ_(k+1) is less than or equal to 90 degrees.

11. The apparatus according to any one of claims 8 to 10, wherein the vertical constraint further comprises: |90° − θ_j| is minimized, where θ_j is the angle between the j-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.

12. The apparatus according to claim 11, wherein the spacing ratio constraint further comprises: |d_p/d_(p+1) − D_p/D_(p+1)| is minimized, where d_p is the distance between the p-th and the (p+1)-th second parallel lines of the M second parallel lines, d_(p+1) is the distance between the (p+1)-th and the (p+2)-th second parallel lines of the M second parallel lines, D_p is the actual distance between the lane line corresponding to the p-th second parallel line and the lane line corresponding to the (p+1)-th second parallel line, D_(p+1) is the actual distance between the lane line corresponding to the (p+1)-th second parallel line and the lane line corresponding to the (p+2)-th second parallel line, and p is an integer greater than or equal to 1 and less than M−1.

13. The apparatus according to any one of claims 8 to 10 and 12, wherein the spacing constraint further comprises: |d_q − D_q| is minimized, where d_q is the distance between the q-th and the (q+1)-th second parallel lines of the M second parallel lines, D_q is the actual distance between the lane line corresponding to the q-th second parallel line and the lane line corresponding to the (q+1)-th second parallel line, and q is an integer greater than or equal to 1 and less than M.

14. The apparatus according to any one of claims 8 to 10 and 12, wherein the apparatus further comprises a compensation module, and the compensation module is configured to:
obtain a y-axis translation of the vehicle-mounted camera in the world coordinate system;
obtain positioning information of the vehicle, wherein the positioning information of the vehicle comprises current position information of the vehicle and/or an acceleration of the vehicle and/or a wheel speed of the vehicle;
establish an observation model of the vehicle according to the positioning information of the vehicle and the y-axis translation of the vehicle-mounted camera in the world coordinate system;
obtain a heading angle of the vehicle at a current moment according to the observation model of the vehicle;
compensate the heading angle of the vehicle at the current moment according to the heading angle in the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, to obtain a compensated heading angle; and
update the heading angle in the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to the compensated heading angle.

15. An apparatus for calibrating external parameters of a vehicle-mounted camera, comprising a memory and a processor, wherein the memory is configured to store program instructions, and the processor is configured to invoke the program instructions in the memory to perform the method according to any one of claims 1 to 7.

16. A chip, comprising at least one processor and a communication interface, wherein the communication interface and the at least one processor are interconnected through a line, and the at least one processor is configured to run a computer program or instructions to perform the method according to any one of claims 1 to 7.

17. A computer-readable medium, wherein the computer-readable medium stores program code for execution by a computer, and the program code comprises instructions for performing the method according to any one of claims 1 to 7.
CN202180006501.9A 2021-08-31 2021-08-31 Calibration method and related device for external parameters of vehicle-mounted camera Active CN114730472B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/115802 WO2023028880A1 (en) 2021-08-31 2021-08-31 External parameter calibration method for vehicle-mounted camera and related apparatus

Publications (2)

Publication Number Publication Date
CN114730472A CN114730472A (en) 2022-07-08
CN114730472B true CN114730472B (en) 2025-05-02

Family

ID=82235994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180006501.9A Active CN114730472B (en) 2021-08-31 2021-08-31 Calibration method and related device for external parameters of vehicle-mounted camera

Country Status (2)

Country Link
CN (1) CN114730472B (en)
WO (1) WO2023028880A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116012508B (en) * 2023-03-28 2023-06-23 高德软件有限公司 Lane line rendering method, device and storage medium
CN116402900B (en) * 2023-03-31 2025-10-31 深圳市大族数控科技股份有限公司 Method and device for determining compensation precision of camera system and storage medium
CN116704040B (en) * 2023-04-03 2024-03-15 上海保隆汽车科技(武汉)有限公司 Camera calibration method, device, controller, vehicle and storage medium
CN116109698B (en) * 2023-04-11 2023-07-14 禾多科技(北京)有限公司 Method, device and storage medium for determining coordinate value of target virtual parking space
CN116664669B (en) * 2023-05-30 2025-10-17 浙江海康智联科技有限公司 Target center positioning compensation method
CN117011391A (en) * 2023-06-05 2023-11-07 东软睿驰汽车技术(武汉)有限公司 Self-calibration method and device for vehicle surround view camera suitable for basement environment
CN117315041B (en) * 2023-09-04 2025-09-23 黑龙江惠达科技股份有限公司 Calibration method, device and drone
CN116934847B (en) * 2023-09-15 2024-01-05 蓝思系统集成有限公司 Unloading methods, devices, electronic equipment and storage media
CN117274384A (en) * 2023-09-27 2023-12-22 中汽创智科技有限公司 A camera pose correction method, device, computer equipment and storage medium
CN118115601A (en) * 2024-03-26 2024-05-31 深圳看到科技有限公司 Camera lens rotation calibration method, device and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104494598A (en) * 2014-11-23 2015-04-08 北京联合大学 Road-crossing driving control method for intelligent vehicles
CN106651963A (en) * 2016-12-29 2017-05-10 清华大学苏州汽车研究院(吴江) Mounting parameter calibration method for vehicular camera of driving assistant system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2544898B2 (en) * 1993-11-25 1996-10-16 住友電気工業株式会社 In-vehicle camera attitude parameter calculator
US9286678B2 (en) * 2011-12-28 2016-03-15 Pelco, Inc. Camera calibration using feature identification
CN108805934B (en) * 2017-04-28 2021-12-28 华为技术有限公司 External parameter calibration method and device for vehicle-mounted camera
JP7223978B2 (en) * 2018-05-23 2023-02-17 パナソニックIpマネジメント株式会社 Calibration device and calibration method
GB2579843A (en) * 2018-12-18 2020-07-08 Continental Automotive Gmbh Method and apparatus for calibrating the extrinsic parameter of an image sensor
CN112509054B (en) * 2020-07-20 2024-05-17 重庆兰德适普信息科技有限公司 Camera external parameter dynamic calibration method
CN112184830B (en) * 2020-09-22 2021-07-09 深研人工智能技术(深圳)有限公司 Camera internal parameter and external parameter calibration method and device, computer equipment and storage medium
CN112365549B (en) * 2021-01-12 2021-04-09 腾讯科技(深圳)有限公司 Attitude correction method and device for vehicle-mounted camera, storage medium and electronic device
CN113284193B (en) * 2021-06-22 2024-02-02 智道网联科技(北京)有限公司 Calibration method, device and equipment of RS equipment


Also Published As

Publication number Publication date
WO2023028880A1 (en) 2023-03-09
CN114730472A (en) 2022-07-08

Similar Documents

Publication Publication Date Title
CN114730472B (en) Calibration method and related device for external parameters of vehicle-mounted camera
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
CN110147382B (en) Lane line update method, device, device, system and readable storage medium
WO2020097840A1 (en) Systems and methods for correcting a high-definition map based on detection of obstructing objects
CN113256719B (en) Parking navigation positioning method, device, electronic equipment and storage medium
CN112219225B (en) Positioning method, system and movable platform
CN110766760B (en) Method, device, equipment and storage medium for camera calibration
CN109871739B (en) Automatic target detection and space positioning method for mobile station based on YOLO-SIOCTL
CN115236643B (en) Sensor calibration method, system, device, electronic equipment and medium
CN112634141B (en) License plate correction method, device, equipment and medium
CN115523929B (en) SLAM-based vehicle-mounted integrated navigation method, device, equipment and medium
CN115164885B (en) Image inverse perspective transformation method, device, electronic device, and storage medium
CN119124215A (en) A method, device, equipment and medium for optimizing visual inertial odometer of unmanned aerial vehicle
CN113147746A (en) Method and device for detecting ramp parking space
CN115908486B (en) Vehicle speed prediction method and device
CN114972494B (en) Map construction method and device for memorizing parking scene
CN113034538B (en) Pose Tracking Method and Device for Visual Inertial Navigation Equipment, and Visual Inertial Navigation Equipment
CN111811501B (en) Trunk feature-based unmanned aerial vehicle positioning method, unmanned aerial vehicle and storage medium
US12523752B2 (en) Online sensor alignment using feature registration
CN116697975A (en) Locomotive-mounted monocular distance measurement method based on infrared phase
CN115861442A (en) A camera external parameter self-calibration method and device
CN117804442B (en) Method, device, electronic device and storage medium for determining position and posture of aircraft
CN120472006B (en) Road target object visual positioning system and method based on city streetscape
US20240112363A1 (en) Position estimation system, position estimation method, and program
CN114812574B (en) Vehicle positioning method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20241106

Address after: 518129 Huawei Headquarters Office Building 101, Wankecheng Community, Bantian Street, Longgang District, Shenzhen, Guangdong

Applicant after: Shenzhen Yinwang Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Applicant before: HUAWEI TECHNOLOGIES Co.,Ltd.

Country or region before: China

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant