CN118494331A - Lane lighting method and system based on vehicle forward vision perception - Google Patents
- Publication number
- CN118494331A (application CN202410642772.9A)
- Authority
- CN
- China
- Prior art keywords
- lane
- coordinate system
- image coordinate
- vehicle
- lane line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/24—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for lighting other areas than only the way ahead
- B60Q1/247—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for lighting other areas than only the way ahead for illuminating the close surroundings of the vehicle, e.g. to facilitate entry or exit
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Lighting Device Outwards From Vehicle And Optical Signal (AREA)
Abstract
The application belongs to the technical field of intelligent illumination, and particularly relates to a lane illumination method and system based on vehicle forward vision perception. The method comprises the following steps: performing on-site calibration based on the vehicle forward vision perception system to obtain the transformation relation between the camera image coordinate system and the vehicle lamp image coordinate system; obtaining the lane line detection result of each frame in the camera image coordinate system through vehicle forward vision perception; judging whether the vehicle is located between the left and right lane lines and, if so, triggering the lane illumination function; converting the lane lines of each frame from the camera image coordinate system to the vehicle lamp image coordinate system through the transformation relation between the two coordinate systems; and controlling, by the controller, the HD matrix headlight to project light over the lane range according to the lane lines in the vehicle lamp image coordinate system. By combining the lane line detection result with HD matrix headlight technology, the application performs accurate beam control according to the lane shape, avoids visual blind areas, and improves driving safety.
Description
Technical Field
The invention belongs to the technical field of intelligent illumination, and particularly relates to a lane illumination method and system based on vehicle forward vision perception.
Background
With the continuous development of automatic driving technology, vehicle forward vision perception systems play an increasingly important role in improving driving safety and comfort. The front low-beam headlights, as an important component of vehicle safety, improve the visibility of the road and enhance the driver's field of view. However, conventional vehicle road lighting systems often lack the ability to perceive the road environment and cannot adjust automatically to road conditions; at a curve, the headlights illuminate straight ahead, so a blind area easily forms on the inside of the curve, reducing driving safety.
Disclosure of Invention
The invention aims to solve the following technical problem: existing vehicle headlights cannot adjust automatically to road conditions, and blind areas easily appear on the inside of road curves, reducing driving safety.
Therefore, the invention provides a lane illumination method and system based on vehicle forward vision perception.
The technical scheme adopted for solving the technical problems is as follows:
A lane illumination method based on vehicle forward vision perception comprises the following steps:
Step 1, performing on-site calibration based on the vehicle forward vision perception system to obtain the transformation relation between the camera image coordinate system and the vehicle lamp image coordinate system;
Step 2, obtaining the lane line detection result of each frame in the camera image coordinate system through vehicle forward vision perception;
Step 3, judging whether the vehicle is located between the left and right lane lines and, if so, triggering the lane illumination function;
Step 4, converting the lane lines of each frame from the camera image coordinate system to the vehicle lamp image coordinate system through the transformation relation between the two coordinate systems;
Step 5, controlling, by the controller, the HD matrix headlight to project light over the lane range according to the lane lines in the vehicle lamp image coordinate system.
Further, in the first step, a point P(x, y) in the vehicle lamp image coordinate system and its corresponding point P'(x', y') in the camera image coordinate system satisfy the transformation relation x' = m1·x + m2·y + m3, y' = m4·x + m5·y + m6. Taking 3 corresponding corner points in the camera image coordinate system and the vehicle lamp image coordinate system yields 6 equations, from which the 6 parameters m1–m6 of the affine transformation matrix T are solved; the matrix T allows any target in the view of the vehicle front-view camera to be restored to its pixel position on the projection pattern.
Further, in the fourth step, 4 points are taken on each lane line obtained in the second step, and their corresponding points in the vehicle lamp image coordinate system are calculated using the inverse of the affine transformation matrix T.
Further, in the camera image coordinate system, the spacings between the 4 points on the lane line are equal.
Further, the lane line is described by a cubic equation y = C0 + C1·x + C2·x² + C3·x³; substituting the corresponding points of the 4 lane-line points in the vehicle lamp image coordinate system into this expression yields the coefficients C0, C1, C2, C3, thereby obtaining the equation of the left lane line.
Further, the lamp module receives an image whose pixel count and size match the number of lamp beads in the lamp; where the gray value of a pixel in the image is greater than 0, the controller sets the corresponding lamp bead to a lit state, and otherwise the lamp bead is turned off.
Further, the element values of the projection pattern corresponding to the lane area ahead of the vehicle in the fourth step are set to positive numbers and sent to the controller in the HD matrix headlight module via UART/SPI communication, so that the lane-area image information is delivered to the vehicle lamp module.
A lane illumination system based on vehicle forward vision perception comprises:
a forward vision perception system, for capturing the picture directly in front of the vehicle, the picture including the lane line image in front of the vehicle;
a controller, connected with the forward vision perception system, which converts pixel coordinates in the image acquired by the forward vision perception system into pixel coordinates in the vehicle lamp image so as to obtain the lane line expression in the vehicle lamp image; and
an HD matrix headlight, connected with the controller, for projecting light over the lane range.
The beneficial effects of the application are as follows: by combining the lane line detection result with HD matrix headlight technology, a dynamic light carpet can be accurately projected onto the vehicle's own lane as an additional layer of brightness on top of the original low-beam illumination. Compared with conventional headlamp illumination, this lane illumination not only enhances the visibility of the road environment, but also allows the light beam to be accurately controlled and adjusted according to the shape of the driving lane, avoiding visual blind areas, providing clearer, more uniform and more efficient illumination, displaying the lane information ahead of the vehicle more intuitively, and thereby improving driving safety and comfort.
Drawings
The invention will be further described with reference to the drawings and examples.
Fig. 1 is a schematic structural view of the lane illumination system based on vehicle forward vision perception in the present invention.
Fig. 2 is a schematic control flow diagram of a lane illumination method based on vehicle forward vision perception in the present invention.
Fig. 3 is a schematic diagram of the on-site calibration method in step 1 of the present invention.
Fig. 4 is a schematic diagram of a vehicle-mounted front-view camera according to the present invention.
Detailed Description
The invention will now be described in further detail with reference to the accompanying drawings. The drawings are simplified schematic representations which merely illustrate the basic structure of the invention and therefore show only the structures which are relevant to the invention.
In the description of the present invention, it should be understood that terms such as "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial" and "circumferential" indicate orientations or positional relationships based on those shown in the drawings; they are used merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present invention. Furthermore, features defined as "first" or "second" may explicitly or implicitly include one or more such features. In the description of the present invention, unless otherwise indicated, "a plurality" means two or more.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified and limited, the terms "mounted", "connected" and "coupled" are to be construed broadly; for example, a connection may be fixed, detachable or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or a communication between two elements. The specific meanings of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
A lane lighting system based on vehicle forward vision perception comprises a forward vision perception system, a controller and an HD matrix headlight.
The forward vision perception system and the HD matrix headlight are both signal-connected to the controller. The forward vision perception system includes a front-view camera. Specifically, the front-view camera monitors and analyzes the road condition in front of the vehicle in real time by capturing images and video data of the road; using technologies such as computer vision and deep learning, the forward vision perception system can identify and analyze vehicles, pedestrians, traffic signs, road markings and other information on the road. The controller is an executable program running on a system-on-chip development board; the program receives and analyzes the results of the forward vision perception system, determines the triggering scenario for the vehicle lamp, and sends vehicle lamp control messages. The HD matrix headlight is a lamp module arranged at the front of the vehicle, independent of the low beam and high beam; it adopts a high-definition matrix LED light source and can accurately control and adjust the light beam according to the shape of the driving lane.
The controller comprises a judging module and a computing module. The front-view camera captures the picture in front of the vehicle (including the lane line image ahead of the vehicle) and transmits it to the judging module of the controller. When the judging module determines that the vehicle is located between the left and right lane lines, it triggers the illumination function and passes the received front-view camera picture to the computing module. The computing module converts the pixel coordinates in the front-view camera image into pixel coordinates in the vehicle lamp image, fits the expression of the lane lines in the vehicle lamp image, and transmits the lane line expression information to the HD matrix headlight, which projects the light of the lane range forward.
A lane illumination method based on vehicle forward vision perception, comprising the steps of:
Step 1: and (3) performing field marking on the basis of the vehicle forward vision perception system to obtain a transformation relation between a camera image coordinate system and a car light image coordinate system.
Specifically, one HD matrix headlight module, one projection pattern and one vehicle front-view camera are prepared, and the projection pattern is loaded into the controller in the HD matrix headlight module.
It should be noted that the pixel resolution of the projection pattern is consistent with the resolution of the HD matrix headlight, so that each lamp bead in the vehicle lamp corresponds to one pixel in the image; as shown in fig. 1, the white area corresponds to the lamp beads that need to be lit.
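For illustration only, such a calibration pattern could be generated at the lamp's native bead resolution as in the sketch below; OpenCV is assumed, and the function name and the choice of a centered white block are illustrative rather than prescribed by the method.

```cpp
#include <opencv2/core.hpp>

// Build a calibration pattern whose resolution equals the HD matrix headlight's
// LED bead grid; the lit (white) block provides the corner points used later
// to fit the affine matrix T.
cv::Mat makeCalibrationPattern(int beadRows, int beadCols) {
    cv::Mat pattern = cv::Mat::zeros(beadRows, beadCols, CV_8UC1);  // all beads off
    // Roughly centered white block; its corners serve as calibration targets.
    cv::Rect block(beadCols / 4, beadRows / 4, beadCols / 2, beadRows / 2);
    pattern(block).setTo(cv::Scalar(255));
    return pattern;
}
```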
If a real vehicle is available, mount the HD matrix headlight and the vehicle front-view camera on it; on level ground and in a dim environment, project the projection pattern forward with the HD matrix headlight, then capture and save the picture with the vehicle front-view camera. If no real vehicle is available, set up a bench to fix the relative positions of the HD matrix headlight module and the vehicle front-view camera.
As shown in fig. 2, the HD matrix headlight and the camera are placed stably on the bench, where panel a in fig. 2 is a front view of the projection pattern projected directly ahead by the matrix headlight, panel b is a plan view of the light pattern when the matrix headlight projects it, and panel c is the front view of the projection pattern as captured by the camera.
Combining the pixel positions of any 3 corner points of the white area in the HD matrix headlight's projection pattern and in the camera's captured picture (the upper-left corner of the image is the origin, horizontally to the right is the positive x-axis direction, and vertically downward is the positive y-axis direction), an affine transformation matrix T is fitted, as shown in the following formula:
x' = m1·x + m2·y + m3, y' = m4·x + m5·y + m6, wherein P is a corner point of the white area on the HD matrix headlight's projection pattern with pixel coordinates (x, y), and P' is the corresponding corner point of the white area in the camera's captured picture with pixel coordinates (x', y'). The 3 pairs of corner points yield 6 equations, from which the 6 parameters m1–m6 of the affine transformation matrix T are solved; the matrix T allows any target in the view of the vehicle front-view camera to be restored to its pixel position on the projection pattern.
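As a minimal sketch of this corner-point fitting (assuming OpenCV; the function name is illustrative), the 6 parameters m1–m6 can be obtained from the 3 point pairs in one call:

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

// Fit the 2x3 affine matrix T relating a corner point P(x, y) on the projection
// pattern (vehicle lamp image) to its counterpart P'(x', y') in the front-view
// camera picture, from the 3 corresponding white-area corner points.
cv::Mat fitLampToCameraAffine(const cv::Point2f lampCorners[3],
                              const cv::Point2f cameraCorners[3]) {
    // getAffineTransform solves the 6 unknowns m1..m6 from the 6 equations
    // given by the 3 point correspondences.
    return cv::getAffineTransform(lampCorners, cameraCorners);  // 2x3, CV_64F
}
```

Its inverse, used in step 4, maps camera-image pixels back onto the projection pattern.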
Step 2: and receiving a lane line detection result of each frame under a camera image coordinate system in vehicle forward vision perception in a software communication mode.
Specifically, through pipe communication (or SPI or socket communication, chosen according to frame rate, message length, wiring length and stability requirements), the per-frame detection result of vehicles and lane lines from the vehicle forward vision perception is output. The detection result includes the pixel position Q(x, y) of a vehicle traveling in the same direction in the own lane (referenced to the center of the leading vehicle's rear), and the cubic curve equations of the two lane lines of the own lane on the picture captured by the front-view camera: y = C0 + C1·x + C2·x² + C3·x³ and y = C4 + C5·x + C6·x² + C7·x³.
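For illustration only, one possible layout of this per-frame message is sketched below; the struct name and fields are assumptions, not the perception system's actual interface.

```cpp
#include <cstdint>

// Hypothetical per-frame lane detection result received from the forward
// vision perception system (e.g. over a pipe, SPI, or socket channel).
struct LaneFrameMsg {
    uint64_t frame_id;        // frame counter / timestamp
    bool     left_valid;      // left ego-lane line detected in this frame
    bool     right_valid;     // right ego-lane line detected in this frame
    bool     line_pressing;   // ego vehicle currently straddling a lane line
    float    qx, qy;          // Q(x, y): rear-center pixel of the leading vehicle
    // Cubic lane-line models in the camera image: y = C0 + C1*x + C2*x^2 + C3*x^3
    float    left_coeff[4];   // C0..C3 for the left ego-lane line
    float    right_coeff[4];  // C4..C7 for the right ego-lane line
};
```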
Step 3: and (3) programming a controller, determining logic for triggering lane illumination, and determining whether to trigger a lane illumination function based on the detection result of each frame of lane line in the step (2).
Specifically, the relevant code is written in the C++ programming language and compiled into a standalone executable program that implements the following functions: (1) judging, from the structured lane line detection result, whether the vehicle is located between the left and right lane lines; (2) triggering the lane illumination function when the vehicle is located between the left and right lane lines.
Specifically, the program receives and parses the lane line detection result of each frame from step 2; if both lane lines of the own lane are received and the vehicle is not currently pressing a line (a state in which it cannot be determined which lane the vehicle belongs to), the lane illumination function is triggered.
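A minimal sketch of this trigger decision, reusing the hypothetical LaneFrameMsg sketched above:

```cpp
// Trigger lane illumination only when both ego-lane lines are reported and the
// vehicle is not pressing a line (otherwise its lane cannot be determined).
bool shouldTriggerLaneIllumination(const LaneFrameMsg& msg) {
    return msg.left_valid && msg.right_valid && !msg.line_pressing;
}
```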
Step 4: if the step 3 triggers the lane illumination function, the lane range projected by the HD matrix headlight is calculated by combining the transformation relation between the camera image coordinate system and the car light image coordinate system in the step 1 and the lane line detection result of each frame under the camera image coordinate system in the step 2.
Specifically, 4 points are sampled at equal intervals on each lane line in the picture captured by the front-view camera in step 2; the affine transformation matrix from step 1 is used to convert these points into the pixel coordinate system of the projection pattern, and 2 cubic curve equations are then fitted in the pixel coordinate system of the projection pattern, the left lane line being y = C0 + C1·x + C2·x² + C3·x³ and the right lane line y = C4 + C5·x + C6·x² + C7·x³; the area between the two curves corresponds to the lane area in which the vehicle ahead is located.
In the camera image coordinate system, four points (x1, y1), (x2, y2), (x3, y3), (x4, y4) are taken at interval A on the left lane line, and each is multiplied by the inverse of the matrix T to obtain 4 points (x1', y1'), (x2', y2'), (x3', y3'), (x4', y4') in the vehicle lamp image coordinate system. Substituting these 4 points into the lane line equation gives y1' = C0 + C1·x1' + C2·x1'² + C3·x1'³, y2' = C0 + C1·x2' + C2·x2'² + C3·x2'³, y3' = C0 + C1·x3' + C2·x3'² + C3·x3'³, y4' = C0 + C1·x4' + C2·x4'² + C3·x4'³; the coefficients C0, C1, C2, C3 are then obtained by matrix elimination, yielding the equation of the left lane line. Similarly, the coefficients C4, C5, C6, C7 of the right lane line equation are obtained, giving the equation of the right lane line, and the coordinates of the point Q in the vehicle lamp image coordinate system are also obtained. More preferably, in this step, a cut-off line L perpendicular to both the left and right lane lines is fitted through the point Q in the vehicle lamp image coordinate system.
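A sketch of this point transformation and coefficient solve, assuming OpenCV and 4 equally spaced camera-image samples of one lane line (the helper name is illustrative):

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Map 4 lane-line points from the camera image into the lamp image with T^-1,
// then solve the cubic y = C0 + C1*x + C2*x^2 + C3*x^3 through them.
cv::Vec4d fitLampLaneCubic(const cv::Mat& T,                        // 2x3 lamp->camera affine
                           const std::vector<cv::Point2f>& camPts)  // 4 camera-image samples
{
    cv::Mat Tinv;
    cv::invertAffineTransform(T, Tinv);           // camera -> lamp

    std::vector<cv::Point2f> lampPts;
    cv::transform(camPts, lampPts, Tinv);         // (xi', yi') in the lamp image

    // Vandermonde system: yi' = C0 + C1*xi' + C2*xi'^2 + C3*xi'^3, one row per point.
    cv::Mat A(4, 4, CV_64F), b(4, 1, CV_64F), coeffs;
    for (int i = 0; i < 4; ++i) {
        double x = lampPts[i].x;
        A.at<double>(i, 0) = 1.0;
        A.at<double>(i, 1) = x;
        A.at<double>(i, 2) = x * x;
        A.at<double>(i, 3) = x * x * x;
        b.at<double>(i, 0) = lampPts[i].y;
    }
    cv::solve(A, b, coeffs, cv::DECOMP_LU);       // matrix elimination for C0..C3
    return cv::Vec4d(coeffs.at<double>(0), coeffs.at<double>(1),
                     coeffs.at<double>(2), coeffs.at<double>(3));
}
```

The same routine applied to the right lane line's samples yields C4–C7.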
Step 5: and the controller sends corresponding signals to a controller in the HD matrix headlight to control the HD matrix headlight to project the light in the lane range forwards.
Specifically, the element values of the projection pattern corresponding to the lane area ahead of the vehicle in step 4 are set to positive numbers and sent to the controller in the HD matrix headlight module via UART/SPI communication. The controller in the headlight module receives the UART/SPI message, which contains an image whose pixel count matches the number of lamp beads in the headlight (i.e. the gray values of all pixels of the image at the resolution shown in fig. 1). This image is bounded by the lane lines calculated in step 4: for pixels between the left and right lane lines the gray value is set greater than 0, and the controller sets the corresponding lamp beads to a lit state, achieving the lane illumination effect; the illumination range of the headlight is bounded by the cut-off line L, as shown in fig. 3. In this way, the light carpet projected by the headlight follows the lane lines, the illumination range of the vehicle lamp is precisely controlled, and the illumination extends only up to the rear of the vehicle ahead, reducing the possibility of affecting the sight of the driver in front.
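The pattern construction described above might look like the following sketch. Assumptions: OpenCV is available, the cubic coefficients are already expressed in the lamp image (step 4), and the cut-off line L is simplified to a single image row; none of the names are prescribed by the method.

```cpp
#include <opencv2/core.hpp>
#include <algorithm>
#include <cmath>

// Evaluate a lamp-image lane-line cubic y = C0 + C1*x + C2*x^2 + C3*x^3.
static double cubicY(const cv::Vec4d& c, double x) {
    return c[0] + x * (c[1] + x * (c[2] + x * c[3]));
}

// Rasterize the projection pattern: pixels lying between the left and right
// lane-line curves get a gray value > 0 (bead lit); everything else stays 0
// (bead off). The cut-off line L is simplified here to one row.
cv::Mat buildLanePattern(int rows, int cols,              // must equal the LED bead grid
                         const cv::Vec4d& left, const cv::Vec4d& right,
                         int cutoffRow) {
    cv::Mat pattern = cv::Mat::zeros(rows, cols, CV_8UC1);
    for (int x = 0; x < cols; ++x) {
        int yA = static_cast<int>(std::lround(cubicY(left,  x)));
        int yB = static_cast<int>(std::lround(cubicY(right, x)));
        if (yA > yB) std::swap(yA, yB);
        yA = std::max({yA, cutoffRow, 0});                // stop at the cut-off / image edge
        yB = std::min(yB, rows - 1);
        for (int y = yA; y <= yB; ++y)
            pattern.at<uchar>(y, x) = 255;                // bead lit
    }
    return pattern;  // gray values then sent to the headlight controller over UART/SPI
}
```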
In summary, by combining the lane line detection result with HD matrix headlight technology, the application can accurately project a dynamic light carpet onto the vehicle's own lane as an additional layer of brightness on top of the original low-beam illumination. Compared with conventional headlamp illumination, this lane illumination not only enhances the visibility of the road environment, but also allows the light beam to be accurately controlled and adjusted according to the shape of the driving lane, avoiding visual blind areas, providing clearer, more uniform and more efficient illumination, displaying the lane information ahead of the vehicle more intuitively, and thereby improving driving safety and comfort.
With the above preferred embodiments of the present invention as an illustration, those skilled in the relevant art can make various changes and modifications without departing from the technical idea of the present invention. The technical scope of the present invention is not limited to the above description, but must be determined according to the scope of the claims.
Claims (8)
1. A lane illumination method based on vehicle forward vision perception, characterized by comprising the following steps:
Step 1, performing on-site calibration based on the vehicle forward vision perception system to obtain the transformation relation between the camera image coordinate system and the vehicle lamp image coordinate system;
Step 2, obtaining the lane line detection result of each frame in the camera image coordinate system through vehicle forward vision perception;
Step 3, judging whether the vehicle is located between the left and right lane lines and, if so, triggering the lane illumination function;
Step 4, converting the lane lines of each frame from the camera image coordinate system to the vehicle lamp image coordinate system through the transformation relation between the two coordinate systems;
Step 5, controlling, by the controller, the HD matrix headlight to project light over the lane range according to the lane lines in the vehicle lamp image coordinate system.
2. The lane illumination method based on vehicle forward vision perception according to claim 1, characterized in that in the first step, a point P(x, y) in the vehicle lamp image coordinate system and its corresponding point P'(x', y') in the camera image coordinate system satisfy the transformation relation x' = m1·x + m2·y + m3, y' = m4·x + m5·y + m6; taking 3 corresponding corner points in the camera image coordinate system and the vehicle lamp image coordinate system yields 6 equations, from which the 6 parameters m1–m6 of the affine transformation matrix T are solved, and the matrix T allows any target in the view of the vehicle front-view camera to be restored to its pixel position on the projection pattern.
3. The lane illumination method based on vehicle forward vision perception according to claim 2, characterized in that in the fourth step, 4 points are taken on each lane line obtained in the second step, and their corresponding points in the vehicle lamp image coordinate system are calculated using the inverse of the affine transformation matrix T.
4. The lane illumination method based on vehicle forward vision perception according to claim 3, characterized in that in the camera image coordinate system, the spacings between the 4 points on the lane line are equal.
5. The lane illumination method based on vehicle forward vision perception according to claim 4, characterized in that the lane line is described by a cubic equation y = C0 + C1·x + C2·x² + C3·x³; the corresponding points of the 4 lane-line points in the vehicle lamp image coordinate system are substituted into this expression to obtain the coefficients C0, C1, C2, C3, thereby obtaining the equation of the left lane line.
6. The lane illumination method based on vehicle forward vision perception according to claim 1, characterized in that the lamp module receives an image whose pixel count and size match the number of lamp beads in the lamp; where the gray value of a pixel in the image is greater than 0, the controller sets the corresponding lamp bead to a lit state, and otherwise the lamp bead is turned off.
7. The lane illumination method based on vehicle forward vision perception according to claim 6, characterized in that the element values of the projection pattern corresponding to the lane area ahead of the vehicle in the fourth step are set to positive numbers and sent to the controller in the HD matrix headlight module via UART/SPI communication, so that the lane-area image information is delivered to the vehicle lamp module.
8. A lane illumination system based on vehicle forward vision perception, characterized by comprising:
a forward vision perception system, for capturing the picture directly in front of the vehicle, the picture including the lane line image in front of the vehicle;
a controller, connected with the forward vision perception system, which converts pixel coordinates in the image acquired by the forward vision perception system into pixel coordinates in the vehicle lamp image so as to obtain the lane line expression in the vehicle lamp image; and
an HD matrix headlight, connected with the controller, for projecting light over the lane range.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410642772.9A CN118494331A (en) | 2024-05-23 | 2024-05-23 | Lane lighting method and system based on vehicle forward vision perception |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410642772.9A CN118494331A (en) | 2024-05-23 | 2024-05-23 | Lane lighting method and system based on vehicle forward vision perception |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118494331A | 2024-08-16
Family
ID=92228852
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410642772.9A (CN118494331A, pending) | Lane lighting method and system based on vehicle forward vision perception | 2024-05-23 | 2024-05-23
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118494331A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118876962A (en) * | 2024-09-29 | 2024-11-01 | 山东科技大学 | Vehicle posture adjustment method and system based on lane line compensation based on lighting curve |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |