CN109624851B - Augmented reality-based driving assistance method and system and readable storage medium - Google Patents
Augmented reality-based driving assistance method and system and readable storage medium
- Publication number
- CN109624851B CN109624851B CN201811406314.6A CN201811406314A CN109624851B CN 109624851 B CN109624851 B CN 109624851B CN 201811406314 A CN201811406314 A CN 201811406314A CN 109624851 B CN109624851 B CN 109624851B
- Authority
- CN
- China
- Prior art keywords
- vehicle
- coordinate system
- image
- obstacle
- driving
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/302—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8086—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for vehicle path indication
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Mechanical Engineering (AREA)
- Traffic Control Systems (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
Abstract
The invention discloses an augmented reality-based driving assistance method, system, and storage medium. The method comprises the following steps: establishing a correspondence between a world coordinate system and an image coordinate system according to pre-acquired internal and external parameters of a camera; when the camera acquires a lane line image and/or an obstacle image, drawing a driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system; drawing a lane line stereo marker and/or an obstacle stereo marker in the image coordinate system by an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system; and presenting the driving trajectory line, the lane line stereo marker, and/or the obstacle stereo marker of the vehicle to the user. Compared with the prior art, the invention improves driving safety and comfort.
Description
Technical Field
The invention relates to the technical field of driving assistance, and in particular to an augmented reality-based driving assistance method and system and a readable storage medium.
Background
As a convenient and efficient means of transport in modern society, the automobile is used by more and more people; however, the complexity of the driving environment and the varying skill levels of drivers also bring many problems.
A traditional automobile equipped only with two side rearview mirrors has the inherent drawback of a limited field of view, which easily leads to safety problems.
The invention patent with application number CN201810484966.5, entitled "A mixed reality road display system", applies AR (augmented reality) technology but requires relay stations disposed along the road to store real-space model information and virtual mixed-model information.
The invention patent with application number CN201611245215.5, entitled "A reversing auxiliary line setting method and reversing auxiliary system", only adjusts the reversing auxiliary line according to the distance of an obstacle behind the vehicle while drawing that line; the actual distance between the obstacle and the rear of the vehicle is neither intuitive nor concrete.
The invention patent with application number CN201711271533.3, entitled "A driving assistance system based on augmented reality head-up display and multi-screen voice interaction", applies augmented reality technology, but it mainly superimposes navigation and warning information on the real scene, does not provide distance information for the driver to refer to, and offers no augmented reality display while reversing. In addition, it uses two cameras and multiple screens simultaneously, which increases the overall cost.
Therefore, a solution for effectively improving driving safety and comfort is needed.
Disclosure of Invention
The main object of the present invention is to provide an augmented reality-based driving assistance method and system and a readable storage medium, so as to improve driving safety and comfort.
In order to achieve the above object, the present invention provides an augmented reality-based assistant driving method, which is applied to a vehicle assistant driving system, wherein cameras are installed at the head and the tail of a vehicle, and the method comprises the following steps:
establishing a corresponding relation between a world coordinate system and an image coordinate system according to internal parameters and external parameters of cameras at the head and the tail of the vehicle, which are acquired in advance;
when a camera at the head or the tail of the vehicle acquires a lane line image and/or an obstacle image, drawing a driving track line of the vehicle in an image coordinate system according to the driving speed of the vehicle, the width of the vehicle body, the turning angle of a steering wheel and the corresponding relation between the world coordinate system and the image coordinate system;
according to the corresponding relation between the world coordinate system and the image coordinate system, drawing lane line three-dimensional markers and/or obstacle three-dimensional markers in the image coordinate system by adopting an augmented reality method;
presenting the driving trajectory line, the lane line stereo marker, and/or the obstacle stereo marker of the vehicle to a user.
A further technical solution of the present invention is that, before the step of drawing the lane line three-dimensional markers and/or the obstacle three-dimensional markers in the image coordinate system by using an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system, the method includes:
judging whether the distance between the lane line and/or the obstacle and the vehicle is smaller than or equal to a preset distance;
if so, drawing the lane line three-dimensional markers and/or the obstacle three-dimensional markers in the image coordinate system by adopting an augmented reality method according to the corresponding relation between the world coordinate system and the image coordinate system.
A further technical solution of the present invention is that, if yes, the step of drawing the obstacle three-dimensional marker in the image coordinate system by using an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system includes:
judging whether more than two obstacle images are acquired by a camera at the head or the tail of the vehicle;
and if more than two obstacle images are acquired by the camera at the head or the tail of the vehicle, drawing an obstacle three-dimensional marker of an obstacle closest to the vehicle in the image coordinate system by adopting an augmented reality method according to the corresponding relation between the world coordinate system and the image coordinate system.
A further technical solution of the present invention is that, when the camera at the rear of the vehicle acquires a lane line image and/or an obstacle image, the step of drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the width of the vehicle body, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system includes:
when a camera at the tail of the vehicle acquires a lane line image and/or an obstacle image, analyzing an image acquired by the camera at the tail of the vehicle;
if the image collected by the camera at the tail of the vehicle includes a lane line image and/or an obstacle image, drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system;
if the image collected by the camera at the tail of the vehicle includes a parking space image, a lane line image, and/or an obstacle image, establishing a parking trajectory line, and/or drawing a parking space stereo marker in the image coordinate system by an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system, and drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system.
A further technical solution of the present invention is that, when the camera at the head of the vehicle acquires a lane line image and/or an obstacle image, the step of drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the width of the vehicle body, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system includes:
when a camera of the head of the vehicle acquires a lane line image and/or an obstacle image, analyzing an image acquired by the camera of the head of the vehicle;
if the images acquired by the camera of the vehicle head comprise lane line images and/or obstacle images, drawing a driving track line of the vehicle in an image coordinate system according to the driving speed of the vehicle, the width of the vehicle body, the turning angle of a steering wheel and the corresponding relation between the world coordinate system and the image coordinate system;
if the image collected by the camera at the head of the vehicle includes an image of another vehicle, a lane line image, and/or an obstacle image, drawing a stereo marker of the other vehicle in the image coordinate system by an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system, and drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system.
A further technical solution of the present invention is that, when the camera of the head of the vehicle acquires an image of an obstacle, the step of drawing a driving trajectory line of the vehicle in the image coordinate system according to a driving speed of the vehicle, a width of a vehicle body, a steering wheel angle, and a correspondence between the world coordinate system and the image coordinate system further includes:
judging whether the obstacle images acquired by the camera at the head of the vehicle include an image of an obstacle obliquely in front of the vehicle head;
if so, acquiring the distance between the oblique-front obstacle and the vehicle head, and the distances between the obstacle and the two sides of the vehicle respectively;
drawing a driving assistance line according to the driving trajectory line of the vehicle, the distance between the oblique-front obstacle and the vehicle head, and the distances between the obstacle and the two sides of the vehicle;
the step of presenting the driving trajectory line, the lane line stereo marker, and/or the obstacle stereo marker of the vehicle to a user comprises:
presenting the driving assistance line, the lane line three-dimensional markers, and/or the obstacle three-dimensional markers of the vehicle to a user.
A further technical solution of the present invention is that, when the camera at the rear of the vehicle acquires an image of an obstacle, the step of drawing a driving trajectory line of the vehicle in the image coordinate system according to a driving speed of the vehicle, a width of the vehicle body, a steering wheel angle, and a correspondence between the world coordinate system and the image coordinate system further includes:
acquiring the distance between an obstacle at the tail of the vehicle and the vehicle tail;
comparing the distance between the obstacle at the tail of the vehicle and the tail of the vehicle with a preset distance;
if the distance between the obstacle at the tail of the vehicle and the tail of the vehicle is less than or equal to the preset distance, drawing a driving auxiliary line according to the driving trajectory line of the vehicle and the distance between the obstacle at the tail of the vehicle and the tail of the vehicle;
the step of presenting the driving trajectory line, the lane line stereo marker, and/or the obstacle stereo marker of the vehicle to a user comprises:
presenting the driving assistance line, the lane line three-dimensional markers, and/or the obstacle three-dimensional markers of the vehicle to a user.
According to a further technical scheme, the step of establishing the corresponding relation between the world coordinate system and the image coordinate system according to the internal parameters and the external parameters of the cameras at the head and the tail of the vehicle, which are obtained in advance, comprises the following steps:
and calibrating the cameras at the head and the tail of the vehicle by adopting a checkerboard to obtain the internal parameters and the external parameters of the cameras at the head and the tail of the vehicle.
In order to achieve the above object, the present invention further provides an augmented reality-based driving assistance system, which includes cameras installed at a head and a tail of a vehicle, a memory, a processor, and a computer program stored on the memory, wherein the computer program implements the steps of the method when being executed by the processor.
To achieve the above object, the present invention further proposes a computer-readable storage medium having an augmented reality-based assistant driving program stored thereon, which when executed by a processor implements the steps of the method as described above.
According to the auxiliary driving method and system based on augmented reality and the readable storage medium, by the technical scheme, the corresponding relation between a world coordinate system and an image coordinate system is established according to the internal parameters and the external parameters of the cameras at the head and the tail of the vehicle, which are acquired in advance; when a camera at the head or the tail of the vehicle acquires a lane line image and/or an obstacle image, drawing a driving track line of the vehicle in an image coordinate system according to the driving speed of the vehicle, the width of the vehicle body, the turning angle of a steering wheel and the corresponding relation between the world coordinate system and the image coordinate system; according to the corresponding relation between the world coordinate system and the image coordinate system, drawing lane line three-dimensional markers and/or obstacle three-dimensional markers in the image coordinate system by adopting an augmented reality method; the driving track line of the vehicle, the lane line three-dimensional marker and/or the obstacle three-dimensional marker are presented to a user, and compared with the prior art, the driving safety and the driving comfort are improved.
Drawings
FIG. 1 is a schematic flow chart of a first embodiment of an augmented reality-based driving assistance method of the present invention;
FIG. 2 is a schematic flow chart of a second embodiment of the augmented reality-based driving assistance method of the present invention;
fig. 3 is a detailed flowchart of step S30 in the second embodiment of the augmented reality-based assistant driving method according to the present invention;
FIG. 4 is a schematic flow chart of a third embodiment of the augmented reality-based driving assistance method of the present invention;
FIG. 5 is a schematic flow chart of a fourth embodiment of the augmented reality-based driving assistance method of the present invention;
FIG. 6 is a schematic flow chart of a fifth embodiment of the augmented reality-based driving assistance method of the present invention;
FIG. 7 is a schematic diagram of a method for assisting driving based on augmented reality according to various embodiments of the present invention;
fig. 8 is a schematic diagram of a driving trajectory line drawn in each embodiment of the augmented reality-based aided driving method of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Aiming at the drawback in the prior art that, while the vehicle moves forward or backward, the distance between the vehicle and a lane line, an obstacle, or a preceding vehicle is not intuitive, the invention provides a solution that lets the driver judge the distance to lane lines, obstacles, and vehicles ahead or behind more intuitively and accurately, making the driving process safer.
Referring to fig. 1 to 8, the present invention provides an assisted driving method based on augmented reality.
Specifically, referring to fig. 1, fig. 1 is a flowchart illustrating a driving assistance method based on augmented reality according to a first embodiment of the present invention.
The augmented reality-based assistant driving method is applied to a vehicle assistant driving system, wherein cameras are mounted at the head and the tail of the vehicle.
As shown in fig. 1, the augmented reality-based driving assistance method proposed by this embodiment includes the following steps:
and step S10, establishing a corresponding relation between a world coordinate system and an image coordinate system according to the internal parameters and the external parameters of the cameras at the head and the tail of the vehicle, which are acquired in advance.
The focal length, offset and distortion of the camera itself have a large influence on the light entering the lens, and these parameters are called internal parameters of the camera. The internal reference of the camera determines the projection relation of the camera from a three-dimensional space to a two-dimensional image.
The external parameters of the camera, which determine the relationship between the camera coordinates and world coordinates, are influenced by the mounting position and orientation (e.g., azimuth, pitch, and roll) of the camera, and are relatively independent of the internal parameters.
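As an illustration of how such a correspondence can be used, the sketch below projects a world-coordinate point onto the image with OpenCV; the intrinsic matrix, distortion coefficients, and extrinsic pose values are assumed placeholders, not values taken from this patent.

```python
import numpy as np
import cv2

# Minimal sketch: map a point expressed in the world (road) coordinate system into
# image (pixel) coordinates using intrinsic matrix K, distortion coefficients, and
# the extrinsic rotation/translation (rvec, tvec). All numbers are illustrative.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])       # assumed intrinsic matrix
dist = np.zeros(5)                          # assumed zero lens distortion
rvec = np.array([0.1, 0.0, 0.0])            # assumed camera orientation (Rodrigues vector)
tvec = np.array([0.0, 1.2, 0.0])            # assumed camera offset from the world origin (m)

world_point = np.array([[0.0, 0.0, 5.0]])   # e.g. a point 5 m ahead on the road plane
image_point, _ = cv2.projectPoints(world_point, rvec, tvec, K, dist)
print(image_point.ravel())                  # pixel position where a marker would be drawn
```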
Step S20, when the camera at the head or the tail of the vehicle collects a lane line image and/or an obstacle image, drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system.
When the driving trajectory line is drawn, it can be rendered in red, yellow, and green segments from near to far from the vehicle body, and the spacing between its two lines equals the vehicle body width.
Step S30, drawing the lane line stereo markers and/or the obstacle stereo markers in the image coordinate system by an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system.
The lane line stereo markers and the obstacle stereo markers can be drawn in different styles: for example, the lane line stereo markers as semi-transparent cylinders with alternating yellow and white, and the obstacle stereo markers as semi-transparent cylinders with alternating red and white. Yellow-and-white cylinders mark lane lines that may be crossed, while red-and-white cylinders mark obstacles that must not be crossed. This distinction improves driving safety.
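Purely as an illustration of this style distinction (the names below are hypothetical, not from the patent), a rendering routine could look up the marker appearance from a small table:

```python
# Hypothetical style table for the two marker types described above.
MARKER_STYLE = {
    "lane_line": {"colors": ("yellow", "white"), "crossable": True},   # may be crossed
    "obstacle":  {"colors": ("red", "white"),    "crossable": False},  # must not be crossed
}
```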
Step S40, presenting the driving trajectory line of the vehicle, the lane line stereo markers, and/or the obstacle stereo markers to the user.
It can be understood that, since the stereo markers are drawn at the positions of the lane lines and obstacles using augmented reality (AR) technology, the driving trajectory line of the vehicle may overlap the lane line stereo markers and/or the obstacle stereo markers; when the distance is greater than the preset distance, the distance can be judged from the color of the overlapping portion of the trajectory line and the markers. Unlike existing methods that display the driving trajectory line on a flat image, this lets the driver grasp the distance between the vehicle and an obstacle more intuitively and drive more safely.
Referring to fig. 2, fig. 2 is a flowchart illustrating an assisted driving method based on augmented reality according to a second embodiment of the present invention.
The present embodiment is different from the first embodiment shown in fig. 1 in that, in step S30, the step of drawing the lane line stereo marker and/or the obstacle stereo marker in the image coordinate system by using the augmented reality method according to the correspondence between the world coordinate system and the image coordinate system includes:
s301, judging whether the distance between the lane line and/or the obstacle and the vehicle is smaller than or equal to a preset distance.
If yes, executing step S30, and drawing lane line three-dimensional markers and/or obstacle three-dimensional markers in the image coordinate system by using an augmented reality method according to the corresponding relation between the world coordinate system and the image coordinate system.
Further, referring to fig. 3, fig. 3 is a detailed flowchart of step S30 in the present embodiment.
As shown in fig. 3, the step S30 of drawing the lane line stereo marker and/or the obstacle stereo marker in the image coordinate system by using the augmented reality method according to the corresponding relationship between the world coordinate system and the image coordinate system includes the following steps:
step S302, judging whether more than two obstacle images are acquired by the camera at the head or the tail of the vehicle;
step S303, if more than two obstacle images are acquired by the camera at the head or the tail of the vehicle, drawing an obstacle three-dimensional marker of an obstacle closest to the vehicle in the image coordinate system by adopting an augmented reality method according to the corresponding relation between the world coordinate system and the image coordinate system.
This avoids clutter among the drawn obstacle stereo markers during display.
Referring to fig. 4, fig. 4 is a flowchart illustrating an assisted driving method based on augmented reality according to a third embodiment of the present invention.
The present embodiment is different from the first embodiment shown in fig. 1 in that, in the step S20, when the camera at the tail of the vehicle acquires the lane line image and/or the obstacle image, the step of drawing the driving track line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence relationship between the world coordinate system and the image coordinate system includes:
step S201, when the camera at the vehicle tail of the vehicle acquires the lane line image and/or the obstacle image, analyzing the image acquired by the camera at the vehicle tail of the vehicle.
Step S202, if the image collected by the camera at the tail of the vehicle includes a lane line image and/or an obstacle image, performing the step of drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system.
Step S203, if the image collected by the camera at the tail of the vehicle includes a parking space image, a lane line image, and/or an obstacle image, establishing a parking trajectory line, and/or drawing a parking space stereo marker in the image coordinate system by an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system, and drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system.
It can be understood that, in the embodiment, when the parking trajectory line coincides with the driving trajectory line, the vehicle can be well parked in the parking space.
Referring to fig. 5, fig. 5 is a flowchart illustrating an auxiliary driving method based on augmented reality according to a fourth embodiment of the present invention.
The present embodiment is different from the first embodiment shown in fig. 1 in that, in step S20, when the camera at the head of the vehicle acquires the lane line image and/or the obstacle image, the step of drawing the trajectory line of the vehicle in the image coordinate system according to the traveling speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system includes:
and S204, when the camera at the head of the vehicle acquires the lane line image and/or the obstacle image, analyzing the image acquired by the camera at the head of the vehicle.
Step S205, if the image acquired by the camera of the vehicle head includes a lane line image and/or an obstacle image, then drawing a driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the width of the vehicle body, the steering wheel angle, and the corresponding relationship between the world coordinate system and the image coordinate system.
Step S206, if the image collected by the camera of the head of the vehicle comprises the image of other vehicle, the image of lane line and/or the image of obstacle, drawing the three-dimensional marker of other vehicle in the image coordinate system by adopting an augmented reality method according to the corresponding relation between the world coordinate system and the image coordinate system, and drawing the driving track line of the vehicle in the image coordinate system according to the driving speed, the width of the vehicle body, the turning angle of the steering wheel and the corresponding relation between the world coordinate system and the image coordinate system.
The stereo markers of other vehicles can be drawn as car icons. Drawing the car icon in the image coordinate system lets the user perceive the distance between the other vehicle and the host vehicle more intuitively, further improving driving safety.
Further, as an embodiment, the step S20, when the camera at the head of the vehicle captures the image of the obstacle, further includes, after the step of drawing the trajectory line of the vehicle in the image coordinate system, according to the traveling speed of the vehicle, the width of the vehicle body, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system:
judging whether the obstacle images acquired by the camera at the head of the vehicle include an image of an obstacle obliquely in front of the vehicle head;
if so, acquiring the distance between the oblique-front obstacle and the vehicle head, and the distances between the obstacle and the two sides of the vehicle respectively;
drawing a driving assistance line according to the driving trajectory line of the vehicle, the distance between the oblique-front obstacle and the vehicle head, and the distances between the obstacle and the two sides of the vehicle.
In the step S40, the step of presenting the driving trajectory line, the lane line stereo marker, and/or the obstacle stereo marker of the vehicle to the user includes:
presenting the driving assistance line, the lane line three-dimensional markers, and/or the obstacle three-dimensional markers of the vehicle to a user.
As another embodiment, in step S20, when the camera at the tail of the vehicle captures an image of an obstacle, the step of drawing the trajectory line of the vehicle in the image coordinate system according to the traveling speed of the vehicle, the vehicle body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system further includes:
acquiring the distance between an obstacle at the tail of the vehicle and the vehicle tail;
comparing the distance between the obstacle at the tail of the vehicle and the tail of the vehicle with a preset distance;
and if the distance between the obstacle at the tail of the vehicle and the tail of the vehicle is less than or equal to the preset distance, drawing a driving auxiliary line according to the driving trajectory line of the vehicle and the distance between the obstacle at the tail of the vehicle and the tail of the vehicle.
In the step S40, the step of presenting the driving trajectory line, the lane line stereo marker, and/or the obstacle stereo marker of the vehicle to the user includes:
presenting the driving assistance line, the lane line three-dimensional markers, and/or the obstacle three-dimensional markers of the vehicle to a user.
Thus, the present embodiment further improves driving safety by presenting the driving assistance line of the vehicle, the lane line three-dimensional marker, and/or the obstacle three-dimensional marker to the user.
Referring to fig. 6, fig. 6 is a schematic flow chart of a driving assistance method based on augmented reality according to a fifth embodiment of the present invention.
The present embodiment is different from the first embodiment shown in fig. 1 in that, in the step S10, the step of establishing the correspondence between the world coordinate system and the image coordinate system according to the internal parameters and the external parameters of the cameras at the head and the tail of the vehicle, which are acquired in advance, includes:
and S100, calibrating the cameras at the head and the tail of the vehicle by adopting a checkerboard to acquire internal parameters and external parameters of the cameras at the head and the tail of the vehicle.
The augmented reality-based driving assistance method of the present invention is further described in detail by way of example below.
1. For the rendering of the stereo markers:
firstly, a checkerboard is used for calibrating the front and rear vehicle-mounted cameras to obtain the internal and external parameters of the cameras. And establishing a corresponding relation between a world coordinate system (namely an actual object) and an image coordinate system through the obtained internal and external parameters of the camera.
When the vehicle moves forward: the camera at the vehicle head collects the information ahead of the vehicle in real time while the vehicle is driving. Lane lines, obstacles (an obstacle here is defined as any object on the trajectory ahead of the vehicle that is more than 10 cm above the road surface, other than lane lines and vehicles ahead), and preceding vehicles are detected in the image. At the detected lane line positions, semi-transparent cylinders with alternating yellow and white, with a diameter equal to the lane line width and an actual height of 0.5 m, are drawn in the image coordinate system using AR technology, according to the camera's internal and external parameters obtained by static calibration and the correspondence between the world coordinate system and the image coordinate system. The first cylinder appears at the position on the lane line (or its extension in the image) closest to the vehicle body, and the farthest cylinder is drawn 10 m from the first one; the spacing between cylinders is 0.5 m within the 0-5 m range and 1 m within the 5-10 m range. At the detected obstacle positions, semi-transparent cylinders with alternating red and white, 0.2 m in diameter, spaced 0.2 m apart, 0.5 m high, and with a red-to-white proportion of 7, are drawn in the image coordinate system along the obstacle contour on the side close to the vehicle body, again according to the statically calibrated internal and external parameters and the correspondence between the world coordinate system and the image coordinate system. When the positions of other vehicles are detected, a semi-transparent car icon is drawn in the image coordinate system using AR technology in the same way. To avoid clutter among the drawn stereo markers during display, only the marker closest to the vehicle body is drawn for lane lines, obstacles, and preceding vehicles detected in the same direction. A stereo marker is drawn for a preceding vehicle or obstacle only when its distance from the vehicle head is less than 10 m; no marker is drawn when that distance exceeds 10 m.
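The cylinder spacing rule described above can be summarized in a short helper; this is only a sketch of the placement distances, with the rendering itself omitted.

```python
def lane_marker_distances():
    """Distances (in metres) from the first cylinder at which lane-line cylinders are
    placed: 0.5 m steps out to 5 m, then 1 m steps out to the farthest cylinder at 10 m."""
    near = [i * 0.5 for i in range(11)]      # 0.0, 0.5, ..., 5.0
    far = [5.0 + i for i in range(1, 6)]     # 6.0, 7.0, ..., 10.0
    return near + far

print(lane_marker_distances())
```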
When the vehicle reverses: the drawing method is the same as when the vehicle drives forward. When lane lines, obstacles, and vehicles directly behind are detected in the field of view, the stereo markers are drawn with the same method used for forward driving. At the detected lane line positions, semi-transparent cylinders with alternating yellow and white, with a diameter equal to the lane line width and an actual height of 0.5 m, are drawn in the image coordinate system using AR technology, according to the statically calibrated internal and external camera parameters and the correspondence between the world coordinate system and the image coordinate system. The first cylinder appears at the position on the lane line (or its extension in the image) closest to the vehicle body, and the farthest cylinder is drawn 10 m from the first; the spacing between cylinders is 0.5 m within the 0-5 m range and 1 m within the 5-10 m range. At the detected obstacle positions, semi-transparent cylinders with alternating red and white, 0.2 m in diameter, spaced 0.2 m apart, 0.5 m high, and with a red-to-white proportion of 7, are drawn in the image coordinate system along the obstacle contour on the side close to the vehicle body. When the positions of other vehicles are detected, a semi-transparent car icon is drawn in the image coordinate system using AR technology in the same way. To avoid clutter among the drawn stereo markers during display, only the marker closest to the vehicle tail is drawn for lane lines, obstacles, and following vehicles detected in the same direction. A stereo marker is drawn for a following vehicle or obstacle only when its distance from the vehicle tail is less than 10 m; it is not drawn when that distance exceeds 10 m. Meanwhile, parking space frame lines are also detected. If complete parking space information is detected, a stereo marker is drawn in the image coordinate system, using AR technology together with the statically calibrated parameters and the coordinate correspondence, for the first complete parking space closest to the vehicle. For that parking space, no stereo marker is drawn on the frame line closest to the vehicle tail, while the other three frame lines are drawn with stereo markers consistent with the lane line markers.
If no parking space is detected, the parking space stereo marker described in the previous step is not drawn.
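As a sketch of the frame-line selection rule above (function and argument names are hypothetical), the side of the detected parking space closest to the vehicle tail can be skipped as follows:

```python
def parking_space_marker_sides(corners, tail_xy):
    """Given the four corner points of the nearest complete parking space (world
    coordinates, ordered around the space) and the vehicle-tail position, return the
    three frame lines on which stereo markers are drawn, skipping the side whose
    midpoint is closest to the tail."""
    sides = [(corners[i], corners[(i + 1) % 4]) for i in range(4)]

    def midpoint_distance(side):
        (x1, y1), (x2, y2) = side
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        return ((mx - tail_xy[0]) ** 2 + (my - tail_xy[1]) ** 2) ** 0.5

    nearest = min(sides, key=midpoint_distance)
    return [s for s in sides if s is not nearest]
```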
Stereo markers of different styles are used for different objects so that the driver can easily distinguish road condition information: yellow-and-white cylinders mark lane lines that may be crossed, while red-and-white cylinders and car icons mark non-traversable obstacles and other vehicles. This distinction improves driving safety.
Referring to fig. 7, fig. 7 shows the stereo markers drawn for an obstacle appearing ahead of the vehicle on the right side as an example; the distance between the vehicle and the obstacle is indicated by the overlap of the driving trajectory line with the stereo markers. As can be seen from the figure, the driving trajectory of the vehicle overlaps the stereo markers between 3 m and 4 m, so the driver can intuitively perceive that the obstacle is 3 to 4 m ahead of the vehicle. The driver can then accurately judge the next operation to perform, which improves driving safety and reduces collisions and scrapes.
2. Drawing of a driving auxiliary line:
when the vehicle moves ahead: and drawing a track line of the vehicle in the running process according to the running speed, the width of the vehicle body and the steering wheel angle of the vehicle, and simultaneously according to the relation that the internal and external parameters of the camera and the points in the world coordinate system are mapped into the image coordinate system in the static calibration process. The trajectory line within 1m from the vehicle body is drawn with red, 1 to 3m with yellow, and 3 to 5m with green. The width of the two lines is the same as the width of the vehicle body. And when the distance between the vehicle and the front three-dimensional marker is less than 0.5m, displaying the distance between the vehicle and the front object in real time. Because the AR technology is used for drawing the three-dimensional markers at the positions of the lane line, the barrier and the front vehicle, the running track line of the vehicle can be overlapped with the three-dimensional markers at the moment, and when the distance is more than 0.5m, the distance can be judged according to the color of the overlapped part of the track line and the three-dimensional markers. Different from the existing display method of the driving trajectory line on the plane image, the method enables the driver to more intuitively obtain the distance between the vehicle and the front vehicle or the obstacle, and enables the driver to drive more safely.
The distances between the vehicle and the lane lines and obstacles on its left and right sides are also detected during forward travel (a vehicle from another lane entering the current lane is treated as an obstacle). The distances to objects on both sides are displayed in real time at the lowest part of the acquired image (the part closest to the vehicle head), at a position corresponding to 2 m ahead of the vehicle, so the driver can judge the distance to objects on both sides of the vehicle and driving safety is improved. The driving trajectory line, combined with this distance display and the side-distance display, forms the driving assistance line.
The distance display combined with AR technology makes judging the vehicle distance more intuitive for the driver. In particular, when the distance between the vehicle and a stereo marker ahead is less than 0.5 m, the distance display of the driving assistance line lets the driver obtain the distance information more accurately and intuitively, improving driving safety.
When the vehicle backs: and drawing a track line of the vehicle in the process of backing according to the running speed, the width of the vehicle body and the steering wheel angle of the vehicle, and simultaneously according to the relation that the internal and external parameters of the camera and the points under the world coordinate system obtained in the static calibration process are mapped into the image coordinate system. Traces within 1m of the vehicle's tail are plotted in red, traces from 1 to 3m in yellow, and traces from 3 to 5m in green. The width of the two lines is the same as the width of the vehicle body. And when the distance between the tail of the vehicle and the rear obstacle is less than 0.5m, displaying the distance between the vehicle and the rear obstacle in real time. Because the AR technology is used for drawing the three-dimensional markers at the positions of the lane line, the barrier and the rear vehicle, the track line of the vehicle backing at the moment can be overlapped with the three-dimensional markers, and when the distance is more than 0.5m, the distance can be judged according to the color of the overlapped part of the track line and the three-dimensional markers. Different from the existing display method of the backing track line on a plane image, the stereoscopic overlapping method enables a driver to more intuitively obtain the distance between the vehicle and a rear vehicle or an obstacle, and enables the driver to drive more safely. The distance between the two sides of the display vehicle body is adjusted to the position of the vehicle tail and the position 1m away from the vehicle tail. The vehicle speed is generally slower when backing a car, so that the distance between two sides of the vehicle body is displayed at 1m more safely. When backing, the vision behind the vehicle can be observed only through the rearview mirror, and the vision is not as wide as that of the front view. By using the method, the driver can judge the distance behind the vehicle more visually, and the safety of backing the vehicle is improved. The reversing track line is combined with the distance display and the distance display on the two sides of the vehicle body to form a reversing auxiliary line.
The trajectory calculation is described in detail below with reference to fig. 8, taking forward driving as an example.
Assume that the rectangle in fig. 8 represents the car. Initially the trajectory of the vehicle makes an angle θ with the X-axis, the vehicle speed is v, and the steering wheel angle is φ; a time interval Δt is taken, and the trajectory of the vehicle is shown as the solid line in the figure (v and φ may vary over time). If the time interval is small, the vehicle speed is constant, and the steering wheel angle is kept constant, the trajectory drawn is a standard circular arc; the time intervals in the illustration are taken slightly larger for convenience of presentation. Let the coordinates of the center point between the two front wheels at this moment be (X0, Y0). According to the relationship in the figure, after the first interval Δt the coordinates of the front-wheel center are:
at the next Δ t moment, the center coordinates of the two front wheels of the vehicle are:
at the 3 rd time delta t, the center coordinates of two front wheels of the vehicle are as follows:
and the central coordinate formulas of the two front wheels at each moment in the future are analogized. The number of coordinates to be solved is determined according to the vehicle speed and the length of the trajectory line to be drawn. After the coordinates of the central points of the two front wheels are obtained, two trajectory lines with the same width as the vehicle body can be drawn according to the width of the vehicle body and the distance between the wheels and the vehicle head.
When the parking space closest to the vehicle is detected during reversing, a suggested parking trajectory is drawn according to the relative position of the vehicle and the parking space, the vehicle body width, the vehicle body length, the vehicle speed, and the steering wheel angle. If the driver's actual reversing trajectory coincides with the suggested parking trajectory, the vehicle can be parked neatly into the corresponding space.
In summary, according to the augmented reality-based driving assistance method, by the technical scheme, the corresponding relation between the world coordinate system and the image coordinate system is established according to the pre-acquired internal parameters and external parameters of the cameras at the head and the tail of the vehicle; when a camera at the head or the tail of the vehicle acquires a lane line image and/or an obstacle image, drawing a driving track line of the vehicle in an image coordinate system according to the driving speed of the vehicle, the width of the vehicle body, the turning angle of a steering wheel and the corresponding relation between the world coordinate system and the image coordinate system; according to the corresponding relation between the world coordinate system and the image coordinate system, drawing lane line three-dimensional markers and/or obstacle three-dimensional markers in the image coordinate system by adopting an augmented reality method; the driving track line of the vehicle, the lane line three-dimensional marker and/or the obstacle three-dimensional marker are presented to a user, and compared with the prior art, the driving safety and the driving comfort are improved.
In order to achieve the above object, the present invention further provides an augmented reality-based driving assistance system, which includes cameras installed at a head and a tail of a vehicle, a memory, a processor, and a computer program stored in the memory, wherein the computer program is executed by the processor to implement the steps of the method according to the above embodiments, and the specific steps and the obtained technical effects are described in detail above, and are not described again here.
In order to achieve the above object, the present invention further provides a computer-readable storage medium, where an augmented reality-based assistant driving program is stored, and when the augmented reality-based assistant driving program is executed by a processor, the steps of the method are implemented, and the specific steps and the obtained technical effects are described above in detail, and are not described herein again.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (9)
1. An augmented reality-based aided driving method is applied to a vehicle aided driving system, cameras are mounted at the head and the tail of a vehicle, and the method comprises the following steps:
establishing a corresponding relation between a world coordinate system and an image coordinate system according to internal parameters and external parameters of cameras at the head and the tail of the vehicle, which are acquired in advance;
when a camera at the head or the tail of the vehicle acquires a lane line image and/or an obstacle image, drawing a driving track line of the vehicle in an image coordinate system according to the driving speed of the vehicle, the width of the vehicle body, the turning angle of a steering wheel and the corresponding relation between the world coordinate system and the image coordinate system;
according to the corresponding relation between the world coordinate system and the image coordinate system, drawing lane line three-dimensional markers and/or obstacle three-dimensional markers in the image coordinate system by adopting an augmented reality method;
presenting a driving trajectory line of the vehicle, the lane line stereo marker, and/or the obstacle stereo marker to a user;
the step of drawing the lane line three-dimensional markers and/or the obstacle three-dimensional markers in the image coordinate system by adopting an augmented reality method according to the corresponding relation between the world coordinate system and the image coordinate system comprises the following steps:
judging whether the distance between the lane line and/or the obstacle and the vehicle is smaller than or equal to a preset distance;
if so, drawing the lane line three-dimensional markers and/or the obstacle three-dimensional markers in the image coordinate system by adopting an augmented reality method according to the corresponding relation between the world coordinate system and the image coordinate system.
2. The augmented reality-based driving assistance method according to claim 1, wherein if the determined result is yes, the step of drawing the obstacle stereoscopic marker in the image coordinate system by using an augmented reality method according to the corresponding relationship between the world coordinate system and the image coordinate system is executed, and the step of drawing the obstacle stereoscopic marker in the image coordinate system by using an augmented reality method includes:
judging whether more than two obstacle images are acquired by a camera at the head or the tail of the vehicle;
and if more than two obstacle images are acquired by the camera at the head or the tail of the vehicle, drawing an obstacle three-dimensional marker of an obstacle closest to the vehicle in the image coordinate system by adopting an augmented reality method according to the corresponding relation between the world coordinate system and the image coordinate system.
3. The augmented reality-based driving assistance method according to claim 1, wherein the step of, when the camera at the tail of the vehicle acquires a lane line image and/or an obstacle image, drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system comprises:
when the camera at the tail of the vehicle acquires a lane line image and/or an obstacle image, analyzing the image acquired by the camera at the tail of the vehicle;
if the image acquired by the camera at the tail of the vehicle includes a lane line image and/or an obstacle image, drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system;
if the image acquired by the camera at the tail of the vehicle includes a parking space image, a lane line image, and/or an obstacle image, establishing a parking trajectory line, and/or drawing a parking space three-dimensional marker in the image coordinate system by an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system, and drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system.
4. The augmented reality-based driving assistance method according to claim 1, wherein the step of, when the camera at the head of the vehicle acquires a lane line image and/or an obstacle image, drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system comprises:
when the camera at the head of the vehicle acquires a lane line image and/or an obstacle image, analyzing the image acquired by the camera at the head of the vehicle;
if the image acquired by the camera at the head of the vehicle includes a lane line image and/or an obstacle image, drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system;
if the image acquired by the camera at the head of the vehicle includes an image of another vehicle, a lane line image, and/or an obstacle image, drawing a three-dimensional marker of the other vehicle in the image coordinate system by an augmented reality method according to the correspondence between the world coordinate system and the image coordinate system, and drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system.
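Claims 1, 3 and 4 recite drawing the driving trajectory line from the driving speed, body width and steering wheel angle. The patent does not fix a motion model; the sketch below assumes a kinematic bicycle model and a fixed steering ratio (both hypothetical) to sample left and right body-edge points in the vehicle frame, which could then be mapped into the image coordinate system via the correspondence of claim 1.

```python
import math

WHEELBASE_M = 2.7        # assumed wheelbase
STEERING_RATIO = 16.0    # assumed steering-wheel-to-road-wheel ratio
BODY_WIDTH_M = 1.8       # vehicle body width
HORIZON_S = 2.0          # how far ahead (in time) the trajectory is drawn


def trajectory_edges(speed_mps: float, steering_wheel_deg: float, steps: int = 20):
    """Return sampled (x, z) points of the left and right trajectory edges in the vehicle frame."""
    wheel_angle = math.radians(steering_wheel_deg) / STEERING_RATIO
    left, right = [], []
    x = z = heading = 0.0
    dt = HORIZON_S / steps
    for _ in range(steps):
        # Kinematic bicycle model: advance the vehicle reference point one time step.
        heading += speed_mps * math.tan(wheel_angle) / WHEELBASE_M * dt
        x += speed_mps * math.sin(heading) * dt
        z += speed_mps * math.cos(heading) * dt
        # Offset half the body width to each side, perpendicular to the heading.
        ox = math.cos(heading) * BODY_WIDTH_M / 2
        oz = -math.sin(heading) * BODY_WIDTH_M / 2
        left.append((x - ox, z - oz))
        right.append((x + ox, z + oz))
    return left, right


left_edge, right_edge = trajectory_edges(speed_mps=5.0, steering_wheel_deg=90.0)
print(left_edge[-1], right_edge[-1])   # end points of the predicted trajectory edges
```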
5. The augmented reality-based driving assistance method according to claim 1, wherein, when the camera at the head of the vehicle acquires an obstacle image, the step of drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system further comprises:
determining whether the obstacle image acquired by the camera at the head of the vehicle includes an image of an obstacle obliquely in front of the vehicle head;
if so, acquiring the distance between the obstacle and the vehicle head, and the distances between the obliquely located obstacle and the two sides of the vehicle;
drawing a driving auxiliary line according to the driving trajectory line of the vehicle, the distance between the obstacle and the vehicle head, and the distances between the obliquely located obstacle and the two sides of the vehicle;
and the step of presenting the driving trajectory line of the vehicle, the lane line three-dimensional marker, and/or the obstacle three-dimensional marker to a user comprises:
presenting the driving auxiliary line of the vehicle, the lane line three-dimensional marker, and/or the obstacle three-dimensional marker to a user.
6. The augmented reality-based driving assistance method according to claim 1, wherein, when the camera at the tail of the vehicle acquires an obstacle image, the step of drawing the driving trajectory line of the vehicle in the image coordinate system according to the driving speed of the vehicle, the body width, the steering wheel angle, and the correspondence between the world coordinate system and the image coordinate system further comprises:
acquiring the distance between an obstacle at the tail of the vehicle and the vehicle tail;
comparing the distance between the obstacle at the tail of the vehicle and the vehicle tail with a preset distance;
if the distance between the obstacle at the tail of the vehicle and the vehicle tail is less than or equal to the preset distance, drawing a driving auxiliary line according to the driving trajectory line of the vehicle and the distance between the obstacle at the tail of the vehicle and the vehicle tail;
and the step of presenting the driving trajectory line of the vehicle, the lane line three-dimensional marker, and/or the obstacle three-dimensional marker to a user comprises:
presenting the driving auxiliary line of the vehicle, the lane line three-dimensional marker, and/or the obstacle three-dimensional marker to a user.
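A minimal sketch of the tail-camera gating in claim 6, assuming the distance to the rear obstacle is already available in metres; the preset distance and the sample values are hypothetical.

```python
PRESET_TAIL_DISTANCE_M = 3.0   # assumed preset distance for the tail camera


def draw_auxiliary_line(obstacle_to_tail_m: float) -> bool:
    """Return True when the driving auxiliary line should be drawn."""
    return obstacle_to_tail_m <= PRESET_TAIL_DISTANCE_M


print(draw_auxiliary_line(2.4))   # True  -> draw the auxiliary line with the trajectory line
print(draw_auxiliary_line(4.8))   # False -> present only the trajectory line
```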
7. The augmented reality-based driving assistance method according to any one of claims 1 to 6, wherein the step of establishing the correspondence between the world coordinate system and the image coordinate system according to the pre-acquired internal parameters and external parameters of the cameras at the head and the tail of the vehicle comprises:
calibrating the cameras at the head and the tail of the vehicle with a checkerboard to obtain the internal parameters and the external parameters of the cameras at the head and the tail of the vehicle.
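Claim 7 recites checkerboard calibration of the head and tail cameras to obtain their internal and external parameters. The standard OpenCV checkerboard workflow is one way to do this; in the sketch below the image folder, board dimensions and square size are assumptions, not values from the patent.

```python
import glob
import cv2
import numpy as np

BOARD_COLS, BOARD_ROWS = 9, 6      # inner corner counts of the checkerboard (assumed)
SQUARE_SIZE_M = 0.025              # assumed square size in metres

# 3-D corner positions of the checkerboard in its own (world) coordinate system.
objp = np.zeros((BOARD_COLS * BOARD_ROWS, 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD_COLS, 0:BOARD_ROWS].T.reshape(-1, 2) * SQUARE_SIZE_M

obj_points, img_points = [], []
image_size = None

for path in glob.glob("calib_front/*.jpg"):          # hypothetical image folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, (BOARD_COLS, BOARD_ROWS))
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]                # (width, height)

if obj_points:
    # Internal parameters: camera matrix and distortion coefficients.
    # External parameters: per-view rotation and translation vectors.
    rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    print("RMS reprojection error:", rms)
    print("internal parameters (camera matrix):\n", camera_matrix)
    print("external parameters of first view:", rvecs[0].ravel(), tvecs[0].ravel())
```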
8. An augmented reality-based driving assistance system, comprising cameras mounted at the head and the tail of a vehicle, a memory, a processor, and a computer program stored on the memory, wherein the computer program, when executed by the processor, implements the steps of the method according to any one of claims 1 to 7.
9. A computer-readable storage medium, characterized in that an augmented reality-based driving assistance program is stored thereon, which, when executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811406314.6A CN109624851B (en) | 2018-11-23 | 2018-11-23 | Augmented reality-based driving assistance method and system and readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109624851A CN109624851A (en) | 2019-04-16 |
CN109624851B (en) | 2022-04-19
Family
ID=66069288
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811406314.6A Active CN109624851B (en) | 2018-11-23 | 2018-11-23 | Augmented reality-based driving assistance method and system and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109624851B (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109975035B (en) * | 2019-04-22 | 2019-12-27 | 中国汽车工程研究院股份有限公司 | Whole-vehicle-level in-loop test bench system of L3-level automatic driving vehicle |
CN110641366B (en) * | 2019-10-12 | 2021-10-19 | 爱驰汽车有限公司 | Obstacle tracking method and system during driving, electronic device and storage medium |
CN110901527A (en) * | 2019-11-29 | 2020-03-24 | 长城汽车股份有限公司 | Vehicle alarm method and device |
CN111243034A (en) * | 2020-01-17 | 2020-06-05 | 广州市晶华精密光学股份有限公司 | Panoramic auxiliary parking calibration method, device, equipment and storage medium |
CN112092731B (en) * | 2020-06-12 | 2023-07-04 | 合肥长安汽车有限公司 | Self-adaptive adjusting method and system for automobile reversing image |
US11562576B2 (en) * | 2020-08-05 | 2023-01-24 | GM Global Technology Operations LLC | Dynamic adjustment of augmented reality image |
CN114111809A (en) * | 2020-09-01 | 2022-03-01 | 阿里巴巴集团控股有限公司 | Navigation method, device, equipment and storage medium for augmented reality |
CN113589820A (en) * | 2021-08-12 | 2021-11-02 | 广州小鹏自动驾驶科技有限公司 | Auxiliary processing method, device and system for remote driving |
DE102022200409A1 (en) * | 2022-01-14 | 2023-07-20 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for deriving a controlled variable for lateral control of a motor vehicle |
CN115214626B (en) * | 2022-04-28 | 2024-04-26 | 广州汽车集团股份有限公司 | Parking control method, parking control device, vehicle and storage medium |
CN115320584A (en) * | 2022-10-13 | 2022-11-11 | 之江实验室 | Vehicle remote driving assistance method and remote driving system considering obstacle early warning |
CN120027819A (en) * | 2023-11-22 | 2025-05-23 | 华为技术有限公司 | Navigation arrow display method, vehicle-mounted device, readable storage medium and chip |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101309818A (en) * | 2005-11-17 | 2008-11-19 | 爱信精机株式会社 | Parking assisting device and parking assisting method |
CN102653272A (en) * | 2011-03-03 | 2012-09-05 | 富士重工业株式会社 | Vehicle driving support apparatus |
CN104374391A (en) * | 2014-11-17 | 2015-02-25 | 深圳市中天安驰有限责任公司 | Vehicle travelling track calculation system and vehicle travelling track calculation method |
CN106598055A (en) * | 2017-01-19 | 2017-04-26 | 北京智行者科技有限公司 | Intelligent vehicle local path planning method, device thereof, and vehicle |
CN107310475A (en) * | 2017-05-17 | 2017-11-03 | 广州小鹏汽车科技有限公司 | A kind of display methods and system of intelligent automobile warning function |
CN108025767A (en) * | 2015-09-17 | 2018-05-11 | 索尼公司 | System and method for driving auxiliary for offer of overtaking other vehicles safely |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102007009745A1 (en) * | 2007-02-28 | 2008-09-04 | Continental Automotive Gmbh | Method for controlling vehicle steering during parking process, involves measuring parking place selected for parking vehicle and establishing orientation field, where orientation field determines number of support points |
CN103105174B (en) * | 2013-01-29 | 2016-06-15 | 四川长虹佳华信息产品有限责任公司 | A kind of vehicle-mounted outdoor scene safety navigation method based on AR augmented reality |
CN103950409B (en) * | 2014-04-24 | 2016-04-27 | 中国科学院深圳先进技术研究院 | parking assistance method and system |
KR20150140449A (en) * | 2014-06-05 | 2015-12-16 | 팅크웨어(주) | Electronic apparatus, control method of electronic apparatus and computer readable recording medium |
KR102214604B1 (en) * | 2014-09-05 | 2021-02-10 | 현대모비스 주식회사 | Driving support image display method |
CN106364403A (en) * | 2016-10-14 | 2017-02-01 | 深圳市元征科技股份有限公司 | Lane recognizing method and mobile terminal |
CN106515582A (en) * | 2016-10-26 | 2017-03-22 | 深圳市元征科技股份有限公司 | Safe driving early warning method and device |
KR20180071137A (en) * | 2016-12-19 | 2018-06-27 | 엘지전자 주식회사 | Display device and operating method thereof |
JP6551384B2 (en) * | 2016-12-26 | 2019-07-31 | トヨタ自動車株式会社 | Vehicle alert device |
CN107228681A (en) * | 2017-06-26 | 2017-10-03 | 上海驾馥电子科技有限公司 | A kind of navigation system for strengthening navigation feature by camera |
CN108725319B (en) * | 2017-10-31 | 2021-05-04 | 无锡职业技术学院 | A video-based reversing guidance method |
2018-11-23 CN CN201811406314.6A patent/CN109624851B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN109624851A (en) | 2019-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109624851B (en) | Augmented reality-based driving assistance method and system and readable storage medium | |
US10818180B2 (en) | Parking support device | |
CN103619636B (en) | Method and apparatus for parked vehicle | |
CN107651015B (en) | Method and device for assisting in driving, computer-readable storage medium and vehicle | |
US10179608B2 (en) | Parking assist device | |
KR102205144B1 (en) | Parking assistance method and parking assistance device | |
EP2910423B1 (en) | Surroundings monitoring apparatus and program thereof | |
US20010008992A1 (en) | Periphery monitoring device for motor vehicle and recording medium containing program for determining danger of collision for motor vehicle | |
US11601621B2 (en) | Vehicular display system | |
CN107021018A (en) | A kind of commerial vehicle visible system | |
JP2001163132A (en) | Image converting device for device for monitoring back of vehicle | |
CN110730735B (en) | Parking assist method and parking assist apparatus | |
US10988144B2 (en) | Driver assistance system and method for a motor vehicle for displaying an augmented reality | |
JP2004240480A (en) | Driving support device | |
JP7059525B2 (en) | Parking support method and parking support device | |
JP5617396B2 (en) | Driving assistance device | |
JP4156181B2 (en) | Parking assistance device | |
CN113496601B (en) | Vehicle driving assisting method, device and system | |
US20220219678A1 (en) | Parking Assist Method and Parking Assist Apparatus | |
JP2002029314A (en) | Parking support device | |
US20220086368A1 (en) | Vehicular display system | |
CN117601844A (en) | Method for searching parking space and parking by vehicle autonomous advancing exploration | |
JP7452479B2 (en) | display control device | |
US20250095121A1 (en) | Device and method for surround view camera system with reduced manhattan effect distortion | |
JP2024114056A (en) | Vehicle route display method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||