Disclosure of Invention
The application provides a calibration method and a related device for external parameters of a vehicle-mounted camera, which improve the flexibility of the external parameter calibration method of the vehicle-mounted camera.
In a first aspect, the application provides a calibration method for external parameters of a vehicle-mounted camera. The method comprises: obtaining coordinates of M first parallel lines in a first image in an actual camera coordinate system of the vehicle-mounted camera, wherein the first image is an image obtained by the vehicle-mounted camera shooting a target road during the driving of the vehicle to which the vehicle-mounted camera belongs on the target road, the target road comprises N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines are in one-to-one correspondence with M first lane lines in the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M; determining a transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and constraints of M second parallel lines in a bird's-eye view of the first image, wherein the M second parallel lines are in one-to-one correspondence with the M first parallel lines, and the constraints of the M second parallel lines in the bird's-eye view comprise a parallel constraint, a vertical constraint, a pitch proportion constraint and a spacing constraint, the parallel constraint comprises that the M second parallel lines are parallel to one another, the vertical constraint comprises that any one of the M second parallel lines is perpendicular to the x-axis direction of the virtual ideal camera coordinate system, the pitch proportion constraint comprises that the ratio of the distances between any two of the M second parallel lines is the same as the ratio of the distances between the lane lines corresponding to those two second parallel lines, and the spacing constraint comprises that the difference between the distance between any two second parallel lines and the distance between the lane lines corresponding to those two second parallel lines is the smallest; and determining the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system according to the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
In the method, firstly, the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera is determined according to the acquired coordinates of M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints of M second parallel lines in a bird's-eye view of a first image, wherein M is an integer greater than or equal to 3, the M second parallel lines are in one-to-one correspondence with the M first parallel lines, and the constraints of the M second parallel lines in the bird's-eye view comprise a parallel constraint, a vertical constraint, a pitch proportion constraint and a spacing constraint. Then, the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is determined according to the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system. The first image is an image obtained by the vehicle-mounted camera shooting a target road during the driving of the vehicle to which the vehicle-mounted camera belongs on the target road, the target road contains N parallel lane lines, the distance between any two of the N lane lines is known, and N is an integer greater than or equal to M. Because the method introduces the constraints of the second parallel lines in the bird's-eye view and requires only parallel lane lines with known spacing rather than a dedicated calibration scene, the flexibility of the external parameter calibration method of the vehicle-mounted camera is improved.
In one possible implementation manner, determining the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints of the M second parallel lines in the bird's-eye view of the first image comprises: determining the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint; determining the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the vertical constraint; determining the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the pitch proportion constraint; and determining the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the spacing constraint.
In this implementation manner, the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint; the y-axis rotation angle is then determined according to the coordinates and the vertical constraint; the z-axis rotation angle is then determined according to the coordinates and the pitch proportion constraint; and the z-axis translation distance is then determined according to the coordinates and the spacing constraint. The transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is thereby obtained in sequence, which improves the accuracy of the transformation relationship between the two coordinate systems.
In one possible implementation, the parallel constraint further includes that 0.5∑(θ_i − θ_{i+1})² is minimum, where θ_i is the angle between the i-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the angle between the (i+1)-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i is less than or equal to 90 degrees, and θ_{i+1} is less than or equal to 90 degrees.
In this implementation, the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined with the minimization of 0.5∑(θ_i − θ_{i+1})² as the parallel constraint, improving the accuracy of the x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system.
In one possible implementation, the vertical constraint further includes that 0.5∑(θ_j − 90°)² is minimum, where θ_j is the angle between the j-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.
In this implementation, the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined with the minimization of 0.5∑(θ_j − 90°)² as the vertical constraint, improving the accuracy of the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system.
In one possible implementation, the pitch proportion constraint further includes that 0.5∑(ω_{s,s+1}/ω_{s+1,s+2} − W_{s,s+1}/W_{s+1,s+2})² is minimum, where ω_{s,s+1} is the distance between the s-th second parallel line and the (s+1)-th second parallel line in the M second parallel lines, ω_{s+1,s+2} is the distance between the (s+1)-th second parallel line and the (s+2)-th second parallel line in the M second parallel lines, W_{s,s+1} is the actual distance between the lane line corresponding to the s-th second parallel line and the lane line corresponding to the (s+1)-th second parallel line, W_{s+1,s+2} is the actual distance between the lane line corresponding to the (s+1)-th second parallel line and the lane line corresponding to the (s+2)-th second parallel line, and s is an integer greater than or equal to 1 and less than M−1.
In this implementation, the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined with the minimization of 0.5∑(ω_{s,s+1}/ω_{s+1,s+2} − W_{s,s+1}/W_{s+1,s+2})² as the pitch proportion constraint, so that the accuracy of the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is improved.
In one possible implementation, the spacing constraint further includes that 0.5∑(ω_{i,i+1} − W_{i,i+1})² is minimum, where ω_{i,i+1} is the distance between the i-th second parallel line and the (i+1)-th second parallel line of the M second parallel lines, W_{i,i+1} is the actual distance between the lane line corresponding to the i-th second parallel line and the lane line corresponding to the (i+1)-th second parallel line, and i is an integer greater than or equal to 1 and less than M.
In this implementation, the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined with the minimization of 0.5∑(ω_{i,i+1} − W_{i,i+1})² as the spacing constraint, so that the accuracy of the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system is improved.
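The four loss functions described above can be sketched in NumPy as follows; the function names and array-based formulation are illustrative and not part of the patent (angles are in degrees, distances in the ground units of the bird's-eye view):

```python
import numpy as np

def parallel_loss(theta):
    # 0.5 * sum((theta_i - theta_{i+1})^2): consecutive angles of the
    # second parallel lines in the bird's-eye view should be equal
    return 0.5 * np.sum(np.diff(theta) ** 2)

def vertical_loss(theta):
    # 0.5 * sum((theta_j - 90)^2): each second parallel line should be
    # perpendicular to the x-axis of the virtual ideal camera frame
    return 0.5 * np.sum((theta - 90.0) ** 2)

def pitch_ratio_loss(omega, W):
    # Ratios of adjacent line spacings in the bird's-eye view should
    # match the ratios of the known lane-line spacings
    r_obs = omega[:-1] / omega[1:]
    r_true = W[:-1] / W[1:]
    return 0.5 * np.sum((r_obs - r_true) ** 2)

def spacing_loss(omega, W):
    # 0.5 * sum((omega_{i,i+1} - W_{i,i+1})^2): spacings should match
    # the known lane-line spacings in absolute value
    return 0.5 * np.sum((omega - W) ** 2)
```

For a perfectly rectified bird's-eye view with correctly scaled spacings, all four losses evaluate to zero.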
In a possible implementation manner, the method further comprises the steps of obtaining the y-axis translation amount of the vehicle-mounted camera in the world coordinate system, obtaining positioning information of a vehicle, wherein the positioning information of the vehicle comprises current position information of the vehicle and/or acceleration of the vehicle and/or wheel speed of the vehicle, establishing an observation model of the vehicle according to the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system, obtaining a course angle of the current moment of the vehicle according to the observation model of the vehicle, compensating the course angle of the current moment of the vehicle according to the course angle in a transformation relation between an actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to obtain a compensated course angle, and updating the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system to be the compensated course angle.
According to the implementation mode, the course angle of the current moment of the vehicle is obtained according to the observation model of the vehicle, which is built by the positioning information of the vehicle and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system, the course angle of the current moment of the vehicle is compensated by using the course angle of the vehicle-mounted camera in the external parameter under the world coordinate system, the compensated course angle is obtained, and the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is updated to be the compensated course angle, so that when the vehicle does not run completely parallel to the lane line, the influence of the course angle of the vehicle can be compensated, and the accuracy of the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is improved.
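As an illustration only, the heading-angle compensation step might look like the following sketch; the subtraction-and-wrap rule is an assumption, since the text does not give the exact compensation formula, and the function name is hypothetical:

```python
def compensate_heading(extrinsic_yaw_deg, vehicle_heading_deg):
    # Remove the vehicle's current heading (estimated by the observation
    # model) from the heading angle in the camera-to-world transform, so
    # that driving not perfectly parallel to the lane lines does not bias
    # the calibrated extrinsics; wrap the result into [-180, 180) degrees.
    compensated = extrinsic_yaw_deg - vehicle_heading_deg
    return (compensated + 180.0) % 360.0 - 180.0
```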
In a second aspect, the present application provides a calibration device for external parameters of an on-board camera, which device may comprise respective modules for implementing the method in the first aspect, which modules may be implemented in software and/or hardware.
In a third aspect, the present application provides a calibration apparatus for external parameters of an in-vehicle camera. The apparatus may include a processor coupled to a memory. Wherein the memory is for storing program code and the processor is for executing the program code in the memory to implement the method of the first aspect or any one of the implementations.
Optionally, the apparatus may further comprise the memory.
In a fourth aspect, the present application provides a chip comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a wire, the at least one processor being adapted to run a computer program or instructions to perform a method as described in the first aspect or any one of the possible implementations thereof.
In a fifth aspect, the present application provides a computer readable medium storing program code for execution by a device, the program code comprising instructions for performing the method of the first aspect or any one of the possible implementations thereof.
In a sixth aspect, the application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method according to the first aspect or any one of the possible implementations thereof.
In a seventh aspect, the present application provides a computing device comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the communication interface being in communication with a target system, the at least one processor being operable to execute a computer program or instructions to perform a method as described in the first aspect or any one of the possible implementations thereof.
In an eighth aspect, the present application provides a computing system comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a line, the communication interface being in communication with a target system, the at least one processor being operable to execute a computer program or instructions to perform a method as described in the first aspect or any one of the possible implementations thereof.
Detailed Description
The following description of the technical solutions according to the embodiments of the present application will be given with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Fig. 1 is a schematic diagram of a pixel coordinate system. As shown in fig. 1, the vertex at the upper left corner of the pixel coordinate system is the origin O_p, the horizontal rightward direction is the u-axis, and the vertical downward direction is the v-axis.
Pixel coordinates refer to the location of a pixel in an image. In the pixel coordinate system, the coordinates of any pixel point may be represented as (u_i, v_i). This representation does not reflect the physical dimensions of objects in the image.
Fig. 2 is a schematic diagram of a camera coordinate system. As shown in fig. 2, the camera coordinate system uses the optical axis of the camera as the Z_c axis, and the center of the light beam in the optical system of the camera, which is in practice the center of the lens, is the origin O_c. The horizontal axis X_c and the vertical axis Y_c of the camera coordinate system are parallel to the u-axis and v-axis, respectively, of the pixel coordinate system.
Fig. 3 is a schematic diagram of an actual camera coordinate system and a virtual ideal camera coordinate system according to an embodiment of the present application. As shown in fig. 3, L1, L2 and L3 are three parallel lines on the vehicle running road surface. According to the selection rules and positional relationships of the origin, X-axis, Y-axis and Z-axis of the camera coordinate system shown in fig. 2, the center of the camera lens is selected as the origin O_r of the actual camera coordinate system; the optical axis of the camera is selected as the Z_r axis of the actual camera coordinate system, Z_r is parallel to the vehicle running road surface, and the front of the camera is the positive direction of Z_r; the direction perpendicular to Z_r and parallel to the road surface is selected as the X_r axis, with the direction from L2 to L3 as the positive direction of X_r; and the direction perpendicular to the road surface is selected as the Y_r axis (not shown in the figure), with the direction pointing into the road surface as the positive direction of Y_r.
The origin of the virtual ideal camera coordinate system coincides with the origin O_r of the actual camera coordinate system, and the Y axis of the virtual ideal camera coordinate system coincides with Y_r and has the same direction. The Z_r and X_r axes of the actual camera coordinate system are rotated by an angle β around Y_r until X_r is perpendicular to the three parallel lines on the vehicle running road surface, so as to obtain the Z-axis and X-axis directions of the virtual ideal camera coordinate system.
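As an illustration, the rotation relating the two coordinate systems can be written as a standard rotation matrix about the Y axis; the helper name `rot_y` is hypothetical:

```python
import numpy as np

def rot_y(beta):
    # Rotation matrix about the Y axis by angle beta (radians).
    # Applying it to the actual camera axes yields the virtual ideal
    # camera axes: the origins coincide and the Y axis is unchanged.
    c, s = np.cos(beta), np.sin(beta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])
```

For β = 0 the two coordinate systems coincide, and any vector along the Y axis is left fixed by the rotation.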
FIG. 4 is a diagram of a world coordinate system according to an embodiment of the present application. As shown in fig. 4, L1, L2 and L3 are three parallel lines on the road surface of the vehicle, and ZOX is the virtual ideal camera coordinate system of the camera. Any point on L2 is selected as the origin O_w of the world coordinate system; the direction of the X_w axis of the world coordinate system is consistent with the Z-axis direction of the virtual ideal camera coordinate system; the Y_w axis of the world coordinate system is perpendicular to L2 and parallel to the road surface, and the direction from L2 to L1 is selected as the positive direction of Y_w.
Fig. 5 is a schematic diagram of an application scenario according to an embodiment of the present application. The scene shown in fig. 5 is a scene in which the external parameters of the camera are calibrated by means of three parallel lines L1, L2 and L3 with known spacing on the road surface on which the vehicle is traveling. The calibration of the external parameters of the camera is completed by driving the vehicle from one end of the three parallel lines to the other. The three parallel lines with known spacing can be lane lines drawn on a dedicated site, or lane lines on an ordinary road.
It will be appreciated that the scenario shown in fig. 5 is only an example, and the technical solution of the present application may also be applied to other scenarios, as long as the scenario involves calibration of external parameters of the camera. For example, the technical scheme of the application can be also applied to scenes such as calibration of external parameters of a camera in the intelligent robot.
Fig. 6 is a flowchart of a method for calibrating external parameters of an on-vehicle camera according to an embodiment of the application. As shown in fig. 6, the method includes at least S601 to S603.
S601, acquiring coordinates of M first parallel lines in a first image in an actual camera coordinate system of a vehicle-mounted camera, wherein the first image is an image obtained by the vehicle-mounted camera shooting a target road during the driving of the vehicle to which the vehicle-mounted camera belongs on the target road, the target road comprises N parallel lane lines, the distance between any two of the N lane lines is known, the M first parallel lines are in one-to-one correspondence with M first lane lines in the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M.
In one possible implementation manner, during the driving of the vehicle to which the onboard camera belongs on a target road, the onboard camera photographs M lane lines of the N lane lines on the target road to obtain a first image. The first image comprises M first parallel lines, the M first parallel lines are obtained by the vehicle-mounted camera shooting M lane lines of the N lane lines on the target road, and the M first parallel lines in the first image are in one-to-one correspondence with the M lane lines of the N lane lines on the target road.
Fig. 7 is a schematic diagram of a first image according to an embodiment of the present application. As shown in fig. 7, there are 3 first parallel lines L1, L2, and L3 in the first image, which correspond to 3 lane lines out of N lane lines in the target road, respectively, where M is equal to 3 and N is greater than or equal to 3.
The M first parallel lines in the first image are extracted to obtain coordinates of the M first parallel lines in a pixel coordinate system, and the coordinates of the M first parallel lines in the pixel coordinate system are converted into coordinates in the actual camera coordinate system of the vehicle-mounted camera. One example of the pixel coordinate system may be the pixel coordinate system shown in fig. 1, and one example of the actual camera coordinate system may be the actual camera coordinate system shown in fig. 3.
As an example, the parallel line regions and contours in the first image are extracted by a segmentation algorithm, and then edge extraction at the sub-pixel level is performed to obtain M first parallel lines. For example, the segmentation algorithm includes a watershed algorithm, etc.
As another example, straight lines in the first image are extracted through Hough transformation, and then the M first parallel lines in the first image are obtained by methods such as clustering and filtering. The Hough transformation is a feature extraction technique in image processing that can detect objects with specific shapes through a voting algorithm; the clustering method groups straight line segments with similar slopes and intercepts, and the filtering method screens the region of interest according to the nominal installation angle and position of the camera.
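A minimal sketch of the clustering step follows, assuming line segments in (x1, y1, x2, y2) form such as those produced by OpenCV's `cv2.HoughLinesP`; the greedy grouping and tolerance values are illustrative only:

```python
import numpy as np

def cluster_segments(segments, slope_tol=0.1, intercept_tol=20.0):
    # Group line segments (x1, y1, x2, y2) by similar slope and
    # intercept; each cluster then approximates one lane line.
    # Greedy single-pass clustering, for illustration only.
    clusters = []  # each entry: [slope, intercept, list_of_segments]
    for x1, y1, x2, y2 in segments:
        if x2 == x1:
            continue  # skip exactly vertical segments for simplicity
        m = (y2 - y1) / (x2 - x1)
        b = y1 - m * x1
        for c in clusters:
            if abs(c[0] - m) < slope_tol and abs(c[1] - b) < intercept_tol:
                c[2].append((x1, y1, x2, y2))
                break
        else:
            clusters.append([m, b, [(x1, y1, x2, y2)]])
    return clusters
```

Two collinear segments fall into one cluster, while a parallel segment with a different intercept starts a new one, so the number of clusters approximates the number of lane lines.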
S602, determining a transformation relation between an actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints of M second parallel lines in the bird's-eye view of the first image, wherein the constraints of the M second parallel lines in the bird's-eye view comprise a parallel constraint, a vertical constraint, a pitch proportion constraint and a spacing constraint, and the M second parallel lines are in one-to-one correspondence with the M first parallel lines.
The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system of the vehicle-mounted camera relative to the actual camera coordinate system of the vehicle-mounted camera. One example of the actual camera coordinate system of the vehicle-mounted camera may be the actual camera coordinate system shown in fig. 3, and one example of the virtual ideal camera coordinate system of the vehicle-mounted camera may be the virtual ideal camera coordinate system shown in fig. 3.
In one possible implementation, according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera, an inverse perspective transformation method is adopted to obtain a bird's eye view of the M first parallel lines. In the aerial view, there are M second parallel lines, and the M second parallel lines are in one-to-one correspondence with the M first parallel lines.
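The inverse perspective transformation maps image points to the ground plane; a minimal sketch of applying such a mapping with a 3×3 homography is shown below. How H is derived from the camera intrinsics and a nominal pose is not shown, and the helper name is hypothetical:

```python
import numpy as np

def apply_homography(H, pts):
    # Map image points to the bird's-eye-view ground plane with a
    # 3x3 homography H; pts is an (N, 2) array of pixel coordinates.
    pts = np.asarray(pts, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    p = np.hstack([pts, ones]) @ H.T   # to homogeneous coordinates
    return p[:, :2] / p[:, 2:3]        # perspective divide
```

Mapping the endpoints of the M first parallel lines through H yields the M second parallel lines, whose angles and spacings feed the constraints of S602.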
Fig. 8 is a schematic diagram illustrating a bird's-eye view according to an embodiment of the present application. As shown in fig. 8, there are 3 second parallel lines in the bird's-eye view (M is equal to 3), namely L1, L2 and L3, where θ_1 is the angle between the second parallel line L1 and the x-axis direction of the virtual ideal camera coordinate system, θ_2 is the angle between the second parallel line L2 and the x-axis direction of the virtual ideal camera coordinate system, θ_3 is the angle between the second parallel line L3 and the x-axis direction of the virtual ideal camera coordinate system, ω_{1,2} is the distance between the second parallel lines L1 and L2, and ω_{2,3} is the distance between the second parallel lines L2 and L3.
The parallel constraint comprises that the M second parallel lines are parallel to one another, the vertical constraint comprises that any one of the M second parallel lines is perpendicular to the x-axis direction of the virtual ideal camera coordinate system, the pitch proportion constraint comprises that the ratio of the distances between any two of the M second parallel lines is the same as the ratio of the distances between the lane lines corresponding to those two second parallel lines, and the spacing constraint comprises that the difference between the distance between any two second parallel lines and the distance between the lane lines corresponding to those two second parallel lines is the smallest.
In one possible implementation, the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is solved by optimizing, in sequence, the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system.
The x-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system is determined through methods such as optimization according to the coordinates of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the parallel constraint.
Illustratively, the parallel constraint includes minimizing the loss function 0.5∑(θ_i − θ_{i+1})², where θ_i is the angle between the i-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, θ_{i+1} is the angle between the (i+1)-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, i is an integer greater than or equal to 1 and less than M, θ_i is less than or equal to 90 degrees, and θ_{i+1} is less than or equal to 90 degrees.
And determining the y-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system through methods such as optimization according to the coordinates and vertical constraints of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera.
Illustratively, the vertical constraint includes minimizing the loss function 0.5∑(θ_j − 90°)², where θ_j is the angle between the j-th second parallel line of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, j is an integer greater than or equal to 1 and less than or equal to M, and θ_j is less than or equal to 90 degrees.
And determining the z-axis rotation angle of the virtual ideal camera coordinate system relative to the actual camera coordinate system through methods such as optimization according to the coordinate and space proportion constraint of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera.
Illustratively, the pitch proportion constraint includes minimizing the loss function 0.5∑(ω_{s,s+1}/ω_{s+1,s+2} − W_{s,s+1}/W_{s+1,s+2})², where ω_{s,s+1} is the distance between the s-th second parallel line and the (s+1)-th second parallel line in the M second parallel lines, ω_{s+1,s+2} is the distance between the (s+1)-th second parallel line and the (s+2)-th second parallel line in the M second parallel lines, W_{s,s+1} is the actual distance between the lane line corresponding to the s-th second parallel line and the lane line corresponding to the (s+1)-th second parallel line, W_{s+1,s+2} is the actual distance between the lane line corresponding to the (s+1)-th second parallel line and the lane line corresponding to the (s+2)-th second parallel line, and s is an integer greater than or equal to 1 and less than M−1.
And determining the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system through methods such as optimization according to the coordinates and the space constraint of the M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera.
Illustratively, the spacing constraint includes minimizing the loss function 0.5∑(ω_{i,i+1} − W_{i,i+1})², where ω_{i,i+1} is the distance between the i-th second parallel line and the (i+1)-th second parallel line of the M second parallel lines, W_{i,i+1} is the actual distance between the lane line corresponding to the i-th second parallel line and the lane line corresponding to the (i+1)-th second parallel line, and i is an integer greater than or equal to 1 and less than M.
By using the optimization sequence of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system, which is provided by the implementation manner, errors introduced by coupling of Euler angles can be avoided, and the accuracy of the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is improved.
In another possible implementation, the constraint on the M second parallel lines in the bird's eye view includes minimizing a loss function α·Σ_n (θ_n − 90)² + β·Σ_m (ω_{m,m+1} − W_{m,m+1})², where α and β are weight parameters, θ_n is the angle between the n-th of the M second parallel lines and the x-axis direction of the virtual ideal camera coordinate system, ω_{m,m+1} is the distance between the m-th and the (m+1)-th of the M second parallel lines, W_{m,m+1} is the actual distance between the lane line corresponding to the m-th second parallel line and the lane line corresponding to the (m+1)-th second parallel line, n is an integer greater than or equal to 1 and less than or equal to M, and m is an integer greater than or equal to 1 and less than M.
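A sketch of the combined weighted loss in Python (function name and default weights are illustrative):

```python
def combined_loss(thetas_deg, omegas, widths, alpha=1.0, beta=1.0):
    """alpha * sum(theta_n - 90)^2 + beta * sum(omega - W)^2:
    jointly penalizes non-verticality of the second parallel lines and
    spacing mismatch with the known lane-line distances."""
    angle_term = sum((t - 90.0) ** 2 for t in thetas_deg)
    spacing_term = sum((o - w) ** 2 for o, w in zip(omegas, widths))
    return alpha * angle_term + beta * spacing_term
```

All four degrees of freedom can be optimized against this single objective at once, at the cost of sensitivity to the initial values noted below.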
The constraint method provided by this implementation can be combined with optimization or similar methods to obtain the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system simultaneously; however, an improper choice of initial values easily yields a local, rather than global, optimum of the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera.
S603, determining the transformation relation between the actual camera coordinate system and the world coordinate system of the vehicle-mounted camera according to the transformation relation between the actual camera coordinate system and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
The transformation relationship between the virtual ideal camera coordinate system and the world coordinate system is an x-axis rotation angle, a y-axis rotation angle, a z-axis rotation angle and a z-axis translation distance of the world coordinate system relative to the virtual ideal camera coordinate system of the vehicle-mounted camera. The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is an external parameter of the vehicle-mounted camera, and comprises an x-axis rotation angle, a y-axis rotation angle, a z-axis rotation angle and a z-axis translation distance of the world coordinate system relative to the actual camera coordinate system of the vehicle-mounted camera, and the transformation relation can also be called a roll angle (roll), a pitch angle (pitch), a yaw angle (yaw) and the translation distance. One example of the actual camera coordinate system of the vehicle-mounted camera may be the actual camera coordinate system of the vehicle-mounted camera shown in fig. 3, one example of the virtual ideal camera coordinate system of the vehicle-mounted camera may be the virtual ideal camera coordinate system of the vehicle-mounted camera shown in fig. 3 or fig. 4, and one example of the world coordinate system may be the world coordinate system shown in fig. 4.
In one possible implementation, a transformation matrix is used to combine the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera with the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system, obtaining the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, namely the external parameters of the vehicle-mounted camera.
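The specific transformation matrix is not reproduced in this text; as an illustrative sketch, chaining the two relationships can be expressed as multiplication of 4x4 homogeneous transforms (function names are illustrative):

```python
import numpy as np

def rot_z(angle_rad):
    """Homogeneous 4x4 transform rotating about the z-axis (helper for the
    example below)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def compose_extrinsics(T_world_from_virtual, T_virtual_from_actual):
    """Chain: p_world = T_world_from_virtual @ T_virtual_from_actual @ p_actual,
    yielding the extrinsic transform from the actual camera frame to the
    world frame."""
    return T_world_from_virtual @ T_virtual_from_actual
```

For example, composing two 90-degree z-rotations gives a 180-degree z-rotation.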
Positioning information of the vehicle is acquired from sensors of the vehicle to which the vehicle-mounted camera belongs, wherein the sensors of the vehicle include a wheel speed meter, an inertial measurement unit (IMU), a global positioning system (GPS) and the like; accordingly, the positioning information of the vehicle includes current position information of the vehicle and/or acceleration of the vehicle and/or wheel speed of the vehicle and the like.
The method comprises the steps of establishing an observation model of a vehicle according to positioning information of the vehicle and the y-axis translation amount of a vehicle-mounted camera in a world coordinate system, obtaining a course angle of the vehicle at the current moment according to the observation model of the vehicle, compensating the course angle of the vehicle at the current moment according to the obtained course angle in external parameters of the vehicle-mounted camera to obtain a compensated course angle, and updating the course angle in external parameters of the vehicle-mounted camera to be the compensated course angle.
Fig. 9 is a schematic diagram of course angle compensation according to an embodiment of the present application. As shown in fig. 9, the vehicle course angle is the course angle of the vehicle at the current moment obtained from the observation model of the vehicle, and the camera course angle is the determined course angle in the external parameters of the vehicle-mounted camera. Subtracting the vehicle course angle from the camera course angle yields the camera-to-vehicle course angle, which may also be referred to as the compensated course angle. The determined course angle in the external parameters of the vehicle-mounted camera is updated to the compensated course angle, so that the influence of the vehicle course angle obtained by positioning the vehicle is compensated.
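The subtraction shown in fig. 9 can be sketched as follows (function name is illustrative):

```python
def compensate_heading(camera_yaw_deg, vehicle_yaw_deg):
    """Camera-to-vehicle course angle: subtract the vehicle's course angle
    (from the observation model) from the calibrated camera course angle,
    so that driving not perfectly parallel to the lane lines does not bias
    the extrinsic yaw."""
    return camera_yaw_deg - vehicle_yaw_deg
```

If the vehicle drifts 2 degrees relative to the lane lines while the camera course angle was calibrated at 5 degrees, the compensated course angle is 3 degrees.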
In another possible implementation manner, the external parameters of the vehicle-mounted camera after the course angle compensation are cached, the external parameters of the vehicle-mounted camera after the course angle compensation are filtered, the external parameters of the vehicle-mounted camera after the filtering are obtained, and the external parameters of the vehicle-mounted camera are updated to the external parameters of the vehicle-mounted camera after the filtering.
By way of example, a plurality of cached external parameters of the vehicle-mounted camera after course angle compensation undergo periodic optimal estimation. If the number of cached compensated external parameters does not exceed a preset threshold during the period from the vehicle entering the calibration site to driving out of it, all of the cached compensated external parameters are used for the optimal estimation and the calibration is completed; if the number of cached compensated external parameters exceeds the preset threshold when the vehicle drives out of the calibration site, compensated external parameters cached at a certain interval (for example, every 10 frames) are selected for the optimal estimation. The calibration site is the area where the N lane lines in the target road used for external parameter calibration of the vehicle-mounted camera are located.
Preferably, a group of data is cached for each type of external parameter after course angle compensation, that is, a group of data is cached for each of the roll angle, the pitch angle, the course angle and the translation distance in the compensated external parameters of the vehicle-mounted camera, and a kernel density estimation method is used to obtain the external parameter value with the maximum probability in each group of data, namely the optimal estimate of that type of external parameter.
Specifically, the kernel density estimation method fits a Gaussian function to the statistical histogram of the data, finding the Gaussian function that best fits the histogram of the known data, and can resist a certain amount of noise interference. For each type of external parameter, a fitted kernel function (preferably a Gaussian kernel) is estimated from the group of external parameter values corresponding to that type, and the peak of the kernel function is found; the corresponding abscissa value (where the probability of the Gaussian function is maximum) is the optimal value of that type of external parameter.
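A self-contained sketch of Gaussian kernel density estimation for one parameter group (bandwidth and grid size are illustrative choices, not values from the application):

```python
import numpy as np

def kde_mode(samples, bandwidth=0.1, grid_points=512):
    """Evaluate a Gaussian kernel density estimate on a grid and return the
    abscissa with maximum density, i.e. the most probable extrinsic value
    in the cached group; robust to a few outlier samples."""
    samples = np.asarray(samples, dtype=float)
    grid = np.linspace(samples.min() - 3 * bandwidth,
                       samples.max() + 3 * bandwidth, grid_points)
    # Sum of Gaussian kernels centered on each sample, evaluated on the grid.
    density = np.exp(
        -0.5 * ((grid[:, None] - samples[None, :]) / bandwidth) ** 2
    ).sum(axis=1)
    return grid[np.argmax(density)]
```

An outlier such as a single mis-calibrated frame shifts a mean badly but barely moves the density peak.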
Alternatively, the four-dimensional kernel density estimation method can be used to estimate the optimal value of each of the four types of external parameters simultaneously.
Optionally, a group of data is respectively cached for each type of external parameters, and a median filtering method is adopted to obtain the median of each type of external parameters, namely the optimal value of each type of external parameters.
Optionally, a group of data is respectively cached for each type of external parameter, and an average value of each group of data is calculated by adopting an average value filtering method, so that the obtained average value of each group of data is the optimal value of each type of external parameter.
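The median and mean filtering alternatives above can be sketched with the standard library (function names are illustrative):

```python
import statistics

def optimal_by_median(values):
    """Median filtering: the median of a group of cached values of one
    external parameter type is taken as its optimal value."""
    return statistics.median(values)

def optimal_by_mean(values):
    """Mean filtering: the average of a group of cached values of one
    external parameter type is taken as its optimal value."""
    return statistics.fmean(values)
```

The median variant is more robust to occasional outlier frames, while the mean is cheaper and smoother when the cached values are already well clustered.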
When the standard deviation of the obtained optimal value of each type of external parameter is smaller than a specific threshold, the calibration result has converged, and the calibration is stopped to obtain the external parameters of the vehicle-mounted camera.
According to the technical scheme, the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera is determined from the coordinates, in the actual camera coordinate system, of the M first parallel lines in the first image acquired by the vehicle-mounted camera and from the parallel constraint, the vertical constraint, the spacing proportion constraint and the spacing constraint of the M second parallel lines in the bird's-eye view of the first image, following the optimization sequence of the x-axis rotation angle, the y-axis rotation angle, the z-axis rotation angle and the z-axis translation distance of the virtual ideal camera coordinate system relative to the actual camera coordinate system. The external parameters of the vehicle-mounted camera are then determined from this transformation relationship together with the transformation relationship between the virtual ideal camera coordinate system of the vehicle-mounted camera and the world coordinate system. This improves the flexibility of the external parameter calibration method of the vehicle-mounted camera, and improves the accuracy of the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system (namely, the external parameters of the vehicle-mounted camera).
Fig. 10 is a flowchart of another calibration method of external parameters of an in-vehicle camera according to an embodiment of the application. As shown in fig. 10, the method includes at least S1001 to S1011.
S1001, acquiring coordinates of M first parallel lines in a first image in an actual camera coordinate system of a vehicle-mounted camera, wherein the first image is an image obtained by shooting a vehicle to which the vehicle-mounted camera belongs in the driving process of the vehicle on a target road, the target road comprises N parallel lane lines, the distance between any two lane lines in the N lane lines is known, the M first parallel lines are in one-to-one correspondence with the M first lane lines in the N lane lines, M is an integer greater than or equal to 3, and N is an integer greater than or equal to M.
S1002, determining a transformation relation between an actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera according to the coordinates of M first parallel lines in the actual camera coordinate system of the vehicle-mounted camera and the constraints of M second parallel lines in the aerial view of the first image, wherein the constraints of the M second parallel lines in the aerial view comprise a parallel constraint, a vertical constraint, a spacing proportion constraint and a spacing constraint, and the M second parallel lines are in one-to-one correspondence with the M first parallel lines.
It should be noted that S1001 to S1002 may refer to S601 to S602, and will not be described here.
S1003, determining a first external parameter of the vehicle-mounted camera according to a transformation relationship between an actual camera coordinate system of the vehicle-mounted camera and a virtual ideal camera coordinate system of the vehicle-mounted camera and a transformation relationship between the virtual ideal camera coordinate system and a world coordinate system.
The transformation relationship between the virtual ideal camera coordinate system and the world coordinate system is an x-axis rotation angle, a y-axis rotation angle, a z-axis rotation angle and a z-axis translation distance of the world coordinate system relative to the virtual ideal camera coordinate system of the vehicle-mounted camera. The transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is an external parameter of the vehicle-mounted camera, and comprises an x-axis rotation angle, a y-axis rotation angle, a z-axis rotation angle and a z-axis translation distance of the world coordinate system relative to the actual camera coordinate system of the vehicle-mounted camera, and the transformation relation can also be called a roll angle (roll), a pitch angle (pitch), a yaw angle (yaw) and the translation distance. One example of the actual camera coordinate system of the vehicle-mounted camera may be the actual camera coordinate system of the vehicle-mounted camera shown in fig. 3, one example of the virtual ideal camera coordinate system of the vehicle-mounted camera may be the virtual ideal camera coordinate system of the vehicle-mounted camera shown in fig. 3 or fig. 4, and one example of the world coordinate system may be the world coordinate system shown in fig. 4.
In one possible implementation, a transformation matrix is used to combine the transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera with the transformation relationship between the virtual ideal camera coordinate system and the world coordinate system, obtaining a first transformation relationship between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system, namely a first external parameter of the vehicle-mounted camera.
S1004, acquiring the y-axis translation amount of the vehicle-mounted camera in the world coordinate system according to the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the virtual ideal camera coordinate system of the vehicle-mounted camera and the transformation relation between the virtual ideal camera coordinate system and the world coordinate system.
As an example, in the world coordinate system shown in fig. 4, the position of the vehicle-mounted camera on the Y w axis of the world coordinate system is the Y-axis translation amount of the vehicle-mounted camera in the world coordinate system.
S1005, according to the positioning information of the vehicle to which the vehicle-mounted camera belongs and the y-axis translation amount of the vehicle-mounted camera in the world coordinate system, an observation model of the vehicle is established, and the observation model comprises the influence of the y-axis translation amount on the state quantity.
In one possible implementation, the positioning information of the vehicle is obtained from a sensor of the vehicle to which the vehicle-mounted camera belongs, wherein the sensor of the vehicle to which the vehicle-mounted camera belongs comprises a wheel speed meter, an IMU, a GPS and the like, and accordingly the positioning information of the vehicle comprises current position information of the vehicle and/or acceleration of the vehicle and/or wheel speed of the vehicle and the like.
S1006, acquiring the course angle of the current moment of the vehicle according to the observation model of the vehicle.
In one possible implementation, the observation model of the vehicle is updated by filtering, and the heading angle of the vehicle at the current moment is obtained.
S1007, compensating the course angle of the current moment of the vehicle according to the course angle in the first external parameter of the vehicle-mounted camera, and obtaining the compensated course angle.
In one possible implementation, as shown in the schematic diagram of course angle compensation in fig. 9, the vehicle course angle is the course angle of the vehicle at the current moment obtained from the observation model of the vehicle, and the camera course angle is the determined course angle in the external parameters of the vehicle-mounted camera. Subtracting the vehicle course angle from the camera course angle yields the camera-to-vehicle course angle, which may also be referred to as the compensated course angle. The determined course angle in the external parameters of the vehicle-mounted camera is updated to the compensated course angle, so that the influence of the vehicle course angle obtained by positioning the vehicle is compensated.
And S1008, updating the course angle in the first external parameters of the vehicle-mounted camera to the compensated course angle.
The first external parameters of the vehicle-mounted camera comprise a first rolling angle, a first pitch angle, a first course angle and a first translation distance. And updating a first course angle in the first external parameters of the vehicle-mounted camera to be a compensated course angle, wherein the updated first external parameters of the vehicle-mounted camera comprise a first rolling angle, a first pitch angle, the compensated course angle and a first translation distance.
S1009, caching a plurality of first external parameters of the in-vehicle camera.
In one possible implementation, during a calibration process of the external parameters of the vehicle-mounted camera, the vehicle to which the vehicle-mounted camera belongs captures a plurality of first images during the period from entering the calibration site to exiting the calibration site, a plurality of first external parameters of the vehicle-mounted camera are obtained according to the plurality of first images, and the plurality of first external parameters of the vehicle-mounted camera are cached. The calibration site is the area where the N lane lines in the target road used for external parameter calibration of the vehicle-mounted camera are located.
S1010, filtering the cached first external parameters of the vehicle-mounted camera to obtain second external parameters of the vehicle-mounted camera.
In one possible implementation, a periodic optimal estimation is made of a plurality of first external parameters of the cached onboard camera.
Optionally, if the number of the first external parameters of the vehicle-mounted camera cached is not more than a preset threshold value in the period from the time of entering the calibration site to the time of exiting the calibration site, the first external parameters of the vehicle-mounted camera cached are all used for performing optimal estimation and completing calibration, and if the number of the first external parameters of the vehicle-mounted camera cached is more than the preset threshold value in the period from the time of entering the calibration site to the time of exiting the calibration site, the first external parameters of the vehicle-mounted camera cached at a certain interval (for example, 10 frames) are selected for performing optimal estimation.
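The threshold-and-stride selection described above can be sketched as follows (function name, threshold, and stride are illustrative):

```python
def select_for_estimation(cached_extrinsics, threshold, stride=10):
    """If no more than `threshold` extrinsic samples were cached between
    entering and leaving the calibration site, use all of them; otherwise
    keep every `stride`-th cached sample for the optimal estimation."""
    if len(cached_extrinsics) <= threshold:
        return list(cached_extrinsics)
    return cached_extrinsics[::stride]
```

This bounds the cost of the periodic optimal estimation regardless of how long the vehicle stays in the calibration site.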
Preferably, a group of data is respectively cached for each type of external parameters in the plurality of first external parameters of the vehicle-mounted camera, namely, the roll angle, the pitch angle, the course angle and the translation distance in the plurality of first external parameters of the vehicle-mounted camera are respectively cached for a group of data, and a kernel density estimation method is adopted to obtain an external parameter value with the highest probability in each group of data, namely, the optimal estimation of the type of external parameter is achieved.
Specifically, the kernel density estimation method fits a Gaussian function to the statistical histogram of the data, finding the Gaussian function that best fits the histogram of the known data, and can resist a certain amount of noise interference. For each type of external parameter, a fitted kernel function (preferably a Gaussian kernel) is estimated from the group of external parameter values corresponding to that type, and the peak of the kernel function is found; the corresponding abscissa value (where the probability of the Gaussian function is maximum) is the optimal value of that type of external parameter.
Alternatively, the four-dimensional kernel density estimation method can be used to estimate the optimal value of each of the four types of external parameters simultaneously.
Optionally, a group of data is respectively cached for each type of external parameters, and a median filtering method is adopted to obtain the median of each type of external parameters, namely the optimal value of each type of external parameters.
Optionally, a group of data is respectively cached for each type of external parameter, and an average value of each group of data is calculated by adopting an average value filtering method, so that the obtained average value of each group of data is the optimal value of each type of external parameter.
When the standard deviation of the obtained optimal value of each type of external parameter is smaller than a specific threshold, the calibration result has converged, and the calibration is stopped to obtain a second external parameter of the vehicle-mounted camera.
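The convergence test can be sketched as below (function name and the dict layout of the four parameter groups are illustrative assumptions):

```python
import statistics

def calibration_converged(groups, threshold):
    """groups: mapping from each extrinsic type (roll/pitch/yaw/translation)
    to its list of cached optimal values. Calibration is considered
    converged when every group's standard deviation is below the
    threshold."""
    return all(statistics.pstdev(v) < threshold for v in groups.values())
```

A single unstable parameter type (for example, a drifting yaw) keeps the calibration running even if the others have settled.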
S1011, updating the external parameter of the on-board camera to the second external parameter.
According to the technical scheme provided by the application, the course angle of the vehicle at the current moment is compensated by using the course angle of the vehicle-mounted camera in the external parameters under the world coordinate system, so that the compensated course angle is obtained, and the course angle in the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is updated to be the compensated course angle, so that when the vehicle does not run completely parallel to the lane line, the influence of the course angle of the vehicle can be compensated, and the accuracy of the transformation relation between the actual camera coordinate system of the vehicle-mounted camera and the world coordinate system is improved.
Fig. 11 is a schematic structural diagram of an external parameter calibration device of an in-vehicle camera according to an embodiment of the present application. As shown in fig. 11, the apparatus 1100 may include an acquisition module 1101 and a processing module 1102. Apparatus 1100 may be used to implement the method shown in any of the embodiments described above.
In one possible implementation, the apparatus 1100 may be used to implement the method illustrated in fig. 6 described above. For example, the acquisition module 1101 is used to implement S601, and the processing module 1102 is used to implement S602 and S603.
In another possible implementation, the apparatus 1100 further includes a compensation module, an update module, and a cache module. The apparatus 1100 in this implementation may be used to implement the method illustrated in fig. 10 described above. For example, the acquisition module 1101 is used to implement S1001, S1004, and S1006, the processing module 1102 is used to implement S1002, S1003, S1005, and S1010, the compensation module is used to implement S1007, the update module is used to implement S1008 and S1011, and the buffer module is used to implement S1009.
Fig. 12 is a schematic structural diagram of an external parameter calibration device of an in-vehicle camera according to another embodiment of the present application. The apparatus 1200 shown in fig. 12 may be used to perform the calibration method of the external parameters of the in-vehicle camera shown in any of the above embodiments.
As shown in fig. 12, the apparatus 1200 of the present embodiment includes a memory 1201, a processor 1202, a communication interface 1203, and a bus 1204. Wherein the memory 1201, the processor 1202 and the communication interface 1203 are communicatively coupled to each other via a bus 1204.
The memory 1201 may be a read-only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (random access memory, RAM). The memory 1201 may store a program, and the processor 1202 may be configured to perform the steps of the methods shown in fig. 6 and 10 when the program stored in the memory 1201 is executed by the processor 1202.
The processor 1202 may employ a general-purpose central processing unit (central processing unit, CPU), a microprocessor, an application-specific integrated circuit (application specific integrated circuit, ASIC), or one or more integrated circuits for executing related programs to implement the method for calibrating external parameters of the vehicle-mounted camera according to the method embodiments of the present application.
The processor 1202 may also be an integrated circuit chip with signal processing capabilities. In implementation, various steps of methods of various embodiments of the application may be performed by integrated logic circuitry in hardware or by instructions in software in processor 1202.
The processor 1202 may also be a general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (field programmable gate array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the disclosed methods, steps, and logic blocks in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in the execution of a hardware decoding processor, or in the execution of a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in the memory 1201, and the processor 1202 reads information in the memory 1201 and in combination with its hardware performs functions necessary for performing the respective methods in the embodiments of the present application, for example, the steps/functions of the embodiments shown in fig. 6 and 10 may be performed.
The communication interface 1203 uses a transceiver apparatus, such as but not limited to a transceiver, to enable communication between the apparatus 1200 and other devices or communication networks.
The bus 1204 may include a path to transfer information between various components of the apparatus 1200 (e.g., the memory 1201, the processor 1202, the communication interface 1203).
It should be understood that the apparatus 1200 shown in the embodiment of the present application may be an electronic device, or may be a chip configured in an electronic device.
It should be appreciated that the processor in embodiments of the present application may be a central processing unit (central processing unit, CPU), and may also be another general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (field programmable gate array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It should also be appreciated that the memory in embodiments of the present application may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (read-only memory, ROM), a programmable ROM (programmable ROM, PROM), an erasable programmable ROM (erasable PROM, EPROM), an electrically erasable programmable ROM (electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (random access memory, RAM), which acts as an external cache. By way of example, and not limitation, many forms of random access memory (RAM) are available, such as static random access memory (static RAM, SRAM), dynamic random access memory (dynamic RAM, DRAM), synchronous dynamic random access memory (synchronous DRAM, SDRAM), double data rate synchronous dynamic random access memory (double data rate SDRAM, DDR SDRAM), enhanced synchronous dynamic random access memory (enhanced SDRAM, ESDRAM), synchronous link dynamic random access memory (synchlink DRAM, SLDRAM), and direct rambus random access memory (direct rambus RAM, DR RAM).
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. When the computer instructions or computer program are loaded or executed on a computer, the processes or functions described in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that contains one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium. The semiconductor medium may be a solid state disk.
It should be understood that the term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may mean that A exists alone, that both A and B exist, or that B exists alone, where A and B may be singular or plural. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects, but may also indicate an "and/or" relationship, as may be understood from the context.
In the present application, "at least one" means one or more, and "a plurality" means two or more. "At least one of" the following items or similar expressions means any combination of these items, including any combination of a single item or plural items. For example, "at least one of a, b, or c" may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or plural.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not imply an order of execution; the order of execution of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application essentially, or the part thereof contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present application. The storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The foregoing is merely a specific implementation of the present application, and the protection scope of the present application is not limited thereto. Any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.