CN120583559A - Vehicle light control method, device, vehicle and storage medium - Google Patents

Vehicle light control method, device, vehicle and storage medium

Info

Publication number
CN120583559A
CN120583559A (application CN202510945982.XA)
Authority
CN
China
Prior art keywords
parameter
driver
determining
parameters
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202510945982.XA
Other languages
Chinese (zh)
Inventor
魏延
杨成
陈文�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Changan Automobile Co Ltd filed Critical Chongqing Changan Automobile Co Ltd

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • B60Q1/06 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
    • B60Q1/08 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically
    • B60Q1/085 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically due to special conditions, e.g. adverse weather, type of road, badly illuminated road signs or potential dangers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q3/00 Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
    • B60Q3/80 Circuits; Control arrangements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/11 Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/155 Coordinated control of two or more light sources
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/16 Controlling the light source by timing means
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/165 Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00 Input parameters relating to occupants
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The present application relates to a vehicle light control method, device, vehicle and storage medium, including obtaining driving environment parameters and light environment parameters during vehicle driving; determining field of view parameters based on the light environment parameters; controlling the vehicle light emission based on the field of view when the field of view parameters and the driving environment parameters meet a preset accident scenario; determining target comfort parameters based on the field of view parameters and the driving environment parameters when the field of view parameters and the driving environment parameters do not meet the preset accident scenario, and controlling the vehicle light emission based on the target comfort parameters. The embodiments of the present application can improve the driving safety and comfort of the vehicle.

Description

Car lamp control method and device, car and storage medium
Technical Field
The present application relates to the field of light control technology, and in particular, to a vehicle lamp control method, a vehicle lamp control device, a vehicle, and a computer readable storage medium.
Background
When driving in conditions of poor visibility, such as at night or in rain, the driver may rely on the vehicle lamps for illumination. Current lighting functions, such as automatic headlights, simply turn the lights on in low-visibility scenes, steer the beam position according to road conditions while the lights are on, or adjust the light intensity. However, these methods consider only the illumination function of the light and not how the driver actually uses it, so driving safety and comfort remain poor when driving with the lamps on.
Disclosure of Invention
The application aims to provide a vehicle lamp control method that addresses the prior-art problem of poor driving safety and comfort when driving with the lamps on, and further aims to provide a vehicle lamp control device, a vehicle, and a computer readable storage medium.
In order to achieve the above purpose, the technical scheme adopted by the application is as follows:
in a first aspect of the present application, an embodiment of the present application discloses a vehicle lamp control method, including:
Acquiring a running environment parameter and a light environment parameter in the running process of the vehicle;
determining a field of view parameter based on the light environment parameter;
controlling the light emission of the vehicle lamp based on the visual field range under the condition that the visual field range parameter and the running environment parameter meet the preset accident scene;
and under the condition that the visual field range parameter and the running environment parameter do not meet a preset accident scene, determining a target comfort degree parameter based on the visual field range parameter and the running environment parameter, and controlling the vehicle lamp to emit light based on the target comfort degree parameter.
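The branching among the four steps above can be sketched as follows. The scene-matching thresholds and all function and parameter names are illustrative assumptions, not taken from the patent:

```python
def matches_accident_scene(fov_angle_deg: float, visibility_m: float) -> bool:
    # Hypothetical stand-in for the preset accident scene: a narrow field
    # of view combined with a short visibility distance.
    return fov_angle_deg < 90.0 and visibility_m < 50.0

def choose_lamp_mode(fov_angle_deg: float, visibility_m: float) -> str:
    """Safety-driven lamp control when the field-of-view and environment
    parameters match a preset accident scene; comfort-driven otherwise."""
    if matches_accident_scene(fov_angle_deg, visibility_m):
        return "safety"   # emit light based on the visual field range
    return "comfort"      # emit light based on a target comfort parameter
```

In the "comfort" branch the method goes on to compute a target comfort parameter from the same inputs, as detailed in the optional embodiments below.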
Optionally, the light environment parameters include an obstacle parameter, a reflector parameter, a driver head pose parameter, a driver refraction parameter, and a driver eye coordinate parameter, and determining the view field range parameter based on the light environment parameters includes:
Determining a straight line view parameter based on the obstacle parameter and the driver eye coordinate parameter;
determining a curvilinear field of view parameter based on the driver head pose parameter;
Determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter;
Determining a visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter;
Determining a refraction correction parameter based on the driver refraction parameter;
And determining a visual field range parameter by combining the straight visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region and the refraction correction parameter.
Optionally, the determining a straight line view parameter based on the obstacle parameter and the driver eye coordinate parameter includes:
establishing a three-dimensional environment coordinate system based on the eye coordinate parameters of the driver;
Converting the obstacle parameters into the three-dimensional environment coordinate system, and determining an obstacle coordinate range;
and carrying out light projection on the obstacle coordinate range, and determining a straight line visual field parameter.
Optionally, the determining a curve view parameter based on the driver head pose parameter includes:
based on the head posture parameters of the driver, the center coordinates of eyeballs are determined;
Constructing a sight direction vector based on the eyeball center coordinates;
and interpolating the sight line direction vector to determine that the boundary curved surface range is the curve visual field parameter.
Optionally, the determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter includes:
establishing a specular reflection geometric model based on the reflector parameters and the driver eye coordinate parameters;
And determining that the range parameter of the dazzling viewing angle cone corresponding to the eye coordinate parameter of the driver is a reflection view field parameter based on the specular reflection geometric model.
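A specular reflection geometric model of the kind described can be built from two elementary operations: mirroring a point across the reflector plane, and testing whether the line from the mirrored light source to the driver's eye falls inside a dazzling viewing-angle cone. The sketch below assumes a planar reflector and tuple coordinates; the patent does not prescribe a representation:

```python
import math

def reflect_point(p, plane_point, normal):
    """Mirror point p across the plane given by a point on it and a normal
    vector -- the core step of a specular reflection geometric model."""
    n_len = math.sqrt(sum(c * c for c in normal))
    n = tuple(c / n_len for c in normal)
    d = sum((pc - qc) * nc for pc, qc, nc in zip(p, plane_point, n))
    return tuple(pc - 2.0 * d * nc for pc, nc in zip(p, n))

def in_glare_cone(eye, mirrored_source, axis, half_angle_deg):
    """Whether the ray from the mirrored light source to the eye lies
    within a dazzling viewing-angle cone of the given half angle."""
    v = tuple(e - s for e, s in zip(eye, mirrored_source))
    v_len = math.sqrt(sum(c * c for c in v))
    a_len = math.sqrt(sum(c * c for c in axis))
    cos_ang = sum(vc * ac for vc, ac in zip(v, axis)) / (v_len * a_len)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_ang)))) <= half_angle_deg
```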
Optionally, the determining a visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter includes:
determining a gaze origin based on the driver head pose parameter and the driver eye coordinate parameter;
determining a visual area from the sight starting point to a preset advancing distance;
And equally dividing the visual area to determine a visual equidistant area.
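The equal division of the visual area from the sight starting point out to the preset advancing distance can be sketched as below; the segment count and units are assumptions for illustration:

```python
def visual_equidistant_regions(start_m: float, advance_m: float, n: int):
    """Split the visual area between the gaze origin (start_m) and the
    preset advancing distance (advance_m) into n equal-length regions."""
    step = (advance_m - start_m) / n
    return [(start_m + i * step, start_m + (i + 1) * step) for i in range(n)]
```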
Optionally, the determining the refraction correction parameter based on the driver refraction parameter includes:
Converting the driver refractive parameters into an adjustment amplitude based on a preset refractive compensation model;
Determining the reciprocal of the adjustment amplitude as a focusing distance;
Determining a visual boundary in combination with the focusing distance and the driver refractive parameter;
Based on the visual boundary, refraction correction parameters are determined.
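The reciprocal relation in the steps above matches the optics of refractive power: a refractive error expressed in diopters is the reciprocal of a focal distance in meters. The sketch below assumes a simple linear compensation model; the patent's preset refractive compensation model is not specified:

```python
def focusing_distance_m(refraction_diopter: float, compensation_gain: float = 1.0) -> float:
    """Convert the driver's refractive parameter into an adjustment
    amplitude (assumed linear model), then take its reciprocal as the
    focusing distance, as in the steps above."""
    adjustment = compensation_gain * abs(refraction_diopter)  # assumed model
    if adjustment == 0:
        return float("inf")  # emmetropic eye: no finite focus limit
    return 1.0 / adjustment  # diopters are reciprocal meters
```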
Optionally, the determining the target comfort parameter based on the field of view parameter and the driving environment parameter includes:
And carrying out weighted calculation on the straight line visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region, the refraction correction parameter and the running environment parameter to determine a target comfort degree parameter.
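A minimal sketch of that weighted calculation is below. The six term names mirror the text; the weight values are hypothetical, since the patent does not fix them:

```python
def target_comfort(parameters: dict, weights: dict) -> float:
    """Weighted combination of the straight, curve and reflected view
    parameters, the equidistant-region and refraction-correction terms,
    and the driving environment parameter into one comfort score."""
    return sum(weights[name] * parameters[name] for name in weights)
```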
Optionally, the controlling the lighting of the vehicle lamp based on the target comfort parameter includes:
acquiring a current comfort degree parameter;
determining a light control parameter corresponding to the target comfort level parameter under the condition that the current comfort level parameter is smaller than the target comfort level parameter;
and controlling the light of the car lamp to emit light based on the light control parameter.
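The comparison-then-adjust logic above can be sketched as follows; the proportional mapping from comfort gap to light control parameter is an assumption, since the patent only requires that a control parameter corresponding to the target comfort be derived when current comfort falls short:

```python
def lamp_adjustment(current_comfort: float, target_comfort: float, gain: float = 0.2):
    """Derive a light control parameter only when the current comfort
    parameter is smaller than the target; otherwise leave the lamp as is."""
    if current_comfort < target_comfort:
        return gain * (target_comfort - current_comfort)  # hypothetical mapping
    return None  # no adjustment needed
```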
In a second aspect of the present application, an embodiment of the present application discloses a vehicle lamp control device, including:
the acquisition module is used for acquiring the running environment parameters and the light environment parameters in the running process of the vehicle;
The visual field determining module is used for determining visual field range parameters based on the light environment parameters;
The first control module is used for controlling the car lamp to emit light based on the visual field range under the condition that the visual field range parameter and the running environment parameter meet the preset accident scene;
And the second control module is used for determining a target comfort level parameter based on the visual field range parameter and the running environment parameter and controlling the vehicle lamp to emit light based on the target comfort level parameter under the condition that the visual field range parameter and the running environment parameter do not meet the preset accident scene.
In a third aspect of the present invention, an embodiment of the present invention discloses a vehicle including a processor, a memory, and a computer program stored on the memory and capable of running on the processor, which when executed by the processor, implements the steps of the vehicle lamp control method as described above.
In a fourth aspect of the present invention, an embodiment of the present invention discloses a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the vehicle lamp control method as described above.
The application has the beneficial effects that:
(1) In the running process of the vehicle, the running environment parameter and the light environment parameter are compared, as reference factors, with the preset accident scene. When they satisfy the preset accident scene, a traffic accident is currently likely to occur; the light emission of the vehicle lamps is therefore controlled through the visual field range, and light adjustment is used to shift the driver's point of regard to the area that needs attention when the driver passes through an accident-prone road section, so that the probability of an accident is reduced to a certain extent and running safety is improved.
(2) In the case that the driving environment parameters and the light environment parameters do not meet the preset accident scene, in order to avoid the interference of light on a driver, the driving environment parameters and the light environment parameters are used for determining the target comfort level parameters, the vehicle lamp is controlled to emit light based on the target comfort level parameters, so that the emitted light can enable the driver to be in a visual comfort range, the light comfort of the driver is improved, and the driving fatigue is relieved.
Drawings
FIG. 1 is a flowchart illustrating steps of an embodiment of a method for controlling a vehicle lamp according to the present application;
FIG. 2 is a flowchart showing steps of another embodiment of a lamp control method according to the present application;
FIG. 3 is a hardware architecture diagram of an example of a lamp control method of the present application;
FIG. 4 is a flowchart showing steps of an example of a lamp control method of the present application;
FIG. 5 is a block diagram showing the construction of an embodiment of a lamp control device according to the present application;
FIG. 6 is a block diagram of a vehicle according to an embodiment of the present application;
fig. 7 is a block diagram of a storage medium according to an embodiment of the present application.
Detailed Description
Further advantages and effects of the present application will become readily apparent to those skilled in the art from the disclosure herein, by referring to the accompanying drawings and the preferred embodiments. The application may also be practiced or carried out in other embodiments, and the details of the present description may be modified or varied in various respects without departing from the spirit and scope of the present application. It should be understood that the preferred embodiments are presented by way of illustration only and not by way of limitation.
It should be noted that the illustrations provided in the following embodiments merely illustrate the basic concept of the present application by way of illustration, and only the components related to the present application are shown in the drawings and are not drawn according to the number, shape and size of the components in actual implementation, and the form, number and proportion of the components in actual implementation may be arbitrarily changed, and the layout of the components may be more complicated.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a method for controlling a vehicle lamp according to the present invention may specifically include the steps of:
step 101, acquiring a running environment parameter and a light environment parameter in the running process of a vehicle;
In the running process of the vehicle, namely under the condition that the vehicle is in a power-on state and the vehicle speed is not zero, the running environment parameter and the light environment parameter can be obtained. The driving environment parameters are used for representing the environment outside the vehicle, such as road environment, planned driving path, other traffic participation situations and the like, when the vehicle is driven. The light environment parameter is used for representing the optical environment in which the vision of the driver is located when the vehicle is running.
Step 102, determining a visual field range parameter based on the light environment parameter;
The optical environment parameters can be analyzed to determine the corresponding visual range of the driver under the current optical environment parameters, and the visual field range parameters can be determined. The current visual situation of the driver is represented by the visual field range parameter.
Step 103, controlling the light of the car lamp to emit light based on the visual field range under the condition that the visual field range parameter and the running environment parameter meet the preset accident scene;
The preset accident scene is summarized and determined from historical traffic accidents and is used to represent the environment, the driver state and other conditions under which traffic accidents occur. Whether the corresponding parameter ranges of the preset accident scene are met can be determined jointly from the visual field range parameter and the running environment parameter. When the visual field range parameter and the running environment parameter satisfy the preset accident scene, the current situation is prone to accidents; the vehicle lamps are controlled to emit light through the visual field range so that, when the driver passes through an accident-prone road section, the driver's fixation point is shifted to the area that needs attention, reducing the probability of an accident to a certain extent and improving running safety.
And 104, determining a target comfort level parameter based on the visual field range parameter and the running environment parameter, and controlling the vehicle lamp to emit light based on the target comfort level parameter under the condition that the visual field range parameter and the running environment parameter do not meet a preset accident scene.
In the case where the visual field range parameter and the running environment parameter do not satisfy the preset accident scene, an accident is unlikely in the current situation, but comfortable lighting conditions still need to be provided for the driver. The target comfort parameter may be determined from the visual field range parameter and the driving environment parameter; it represents the level that should be reached for the driver under the current visual field range parameter and driving environment parameter. The vehicle lamp is then controlled to emit light based on the target comfort parameter, so that the emitted light keeps the driver within a visual comfort range, thereby improving the driver's light comfort and relieving driving fatigue.
In summary, the embodiment of the application obtains a running environment parameter and a light environment parameter during vehicle running, determines a visual field range parameter based on the light environment parameter, controls the light emission of the vehicle lamp based on the visual field range when the visual field range parameter and the running environment parameter satisfy a preset accident scene, and otherwise determines a target comfort parameter from the visual field range parameter and the running environment parameter and controls the lamp based on that parameter. When the parameters satisfy the preset accident scene, a traffic accident is currently likely to occur; controlling the lamps through the visual field range shifts the driver's point of regard to the area needing attention on an accident-prone road section, reducing the probability of an accident to a certain extent and improving running safety.
When the driving environment parameters and the light environment parameters do not satisfy the preset accident scene, in order to avoid light interfering with the driver, the target comfort parameter is determined from the driving environment parameters and the light environment parameters and the lamp is controlled accordingly, so that the emitted light keeps the driver within a visual comfort range, improving the driver's light comfort and relieving driving fatigue.
Referring to fig. 2, a flowchart illustrating steps of another embodiment of a vehicle lamp control method according to the present invention may specifically include the steps of:
Step 201, acquiring a running environment parameter and a light environment parameter in the running process of a vehicle, wherein the light environment parameter comprises an obstacle parameter, a reflector parameter, a driver head posture parameter, a driver refraction parameter and a driver eye coordinate parameter;
The driving environment parameter and the light environment parameter can be obtained when the vehicle starts the lighting system to illuminate and is in driving. The light environment parameters comprise obstacle parameters, reflector parameters, head posture parameters of a driver, refraction parameters of the driver and eye coordinate parameters of the driver. The obstacle parameter refers to data describing the position, shape, height, etc. of an obstacle ahead when the vehicle is traveling. When the laser radar is adopted to collect the obstacle, the point cloud set corresponding to the obstacle is the obstacle parameter. And when the vision sensor is used for collecting the obstacle, the pixel set corresponding to the obstacle is the obstacle parameter. The reflector parameters refer to data describing the reflection sources, such as positions, heights and the like of the reflection sources existing inside and outside the vehicle when the vehicle runs. The driver head pose parameter is used to describe the pose of the driver head. The driver refractive parameter is used to describe the driver's eye refractive condition. The driver's eye coordinate parameter is a coordinate value of the center of the eyeball when the driver drives the vehicle.
Step 202, determining a straight line visual field parameter based on the obstacle parameter and the driver eye coordinate parameter;
Based on the eye coordinate parameters of the driver as starting points, the obstacle parameters corresponding to the obstacles such as the upright posts, the steering wheels, the pedestrians and the road traffic facilities which possibly block the view are identified and analyzed, the view condition when the driver observes the obstacles is fitted, and the straight line view parameters are determined. The optimal visual field range of the driver is described by a straight visual field parameter.
In an alternative embodiment of the present application, the determining a straight line view parameter based on the obstacle parameter and the driver eye coordinate parameter includes:
A substep S2021 of establishing a three-dimensional environmental coordinate system based on the driver eye coordinate parameters;
And establishing a three-dimensional environment coordinate system by taking the eye coordinates of the eye coordinate parameters of the driver as an origin.
Sub-step S2022, converting the obstacle parameters to the three-dimensional environment coordinate system, determining an obstacle coordinate range;
and then converting the obstacle parameters into a three-dimensional environment coordinate system, and determining the obstacle coordinate range. Therefore, the obstacle parameters and the eyes of the driver are processed based on the same coordinate system, and the accuracy of processing is ensured.
In a substep S2023, light projection is performed on the obstacle coordinate range, and a straight line view parameter is determined.
Based on the principle of light projection, the origin is used as a starting point to perform light projection on the coordinate range of the obstacle so as to simulate the observation of eyes of a driver on the obstacle and determine the straight line visual field parameters.
An example is used to describe the process of establishing the straight line visual field parameters. (1) Establishing a coordinate system: a three-dimensional environment coordinate system is established with the center point (H point) of the driver eye ellipse, from the driver eye coordinate parameters, as the origin, and the obstacle point cloud data serving as the obstacle parameters are converted into this coordinate system to construct a three-dimensional environment model. (2) Obstruction identification: an algorithm such as ray casting is used to cast clusters of rays from the eye point to the windshield, detecting geometric interference with the pillars (A/B/C pillars), steering wheel rim, instrument panel upper edge, vehicles ahead, and traffic facilities (signal lights, guideboards). The horizontal deflection angle (°) of the unobstructed visual field boundary is calculated with the longitudinal axis of the vehicle as reference: A = arctan(y_obstacle / x_obstacle), where (x_obstacle, y_obstacle) are the projected coordinates of the obstacle in the coordinate system; from this the minimum azimuth angle A_min and maximum azimuth angle A_max are obtained. The elevation angle h is the vertical angle (°) between the ground line and the highest visible obstacle, compensated with pitch angle sensor data, yielding the maximum elevation h_max and minimum elevation h_min. The sight line distance d, the minimum unobstructed sight distance, is determined by nearest-neighbor search over the obstacle surface point cloud. Finally, the field-of-view range parameter set (A_min, A_max), (h_min, h_max), d is output as the straight line visual field parameter.
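The angle computations in the worked example above reduce to arctangents of projected coordinates. The sketch below assumes the vehicle's longitudinal axis is the x-axis and uses `atan2` for quadrant safety; the axis convention is an assumption:

```python
import math

def azimuth_deg(x_obstacle: float, y_obstacle: float) -> float:
    """Horizontal deflection angle of an obstacle relative to the
    longitudinal axis, per the arctangent formula in the example."""
    return math.degrees(math.atan2(y_obstacle, x_obstacle))

def elevation_deg(horizontal_m: float, height_above_eye_m: float) -> float:
    """Vertical angle between the ground line and an obstacle top at the
    given horizontal distance and height above eye level."""
    return math.degrees(math.atan2(height_above_eye_m, horizontal_m))
```

Sweeping these over all occluding obstacle points and taking the extremes yields the (A_min, A_max) and (h_min, h_max) bounds described in the example.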
Step 203, determining a curve view parameter based on the head posture parameter of the driver;
the curved-surface range visible to the driver in this posture may be fitted based on the driver head posture parameters.
In an alternative embodiment of the present application, the determining a curve view parameter based on the driver head pose parameter includes:
A substep S2031 of determining eyeball center coordinates based on the driver head pose parameters;
The state of the driver's head may be determined based on the driver head pose parameters, from which the eye center coordinates are identified.
Step S2032, constructing a sight line direction vector based on the eyeball center coordinates;
And constructing a sight-line direction vector by taking the eyeball center coordinates as the center of the sight-line direction vector.
And substep S2033, interpolating the sight-line direction vector to determine the boundary surface range as the curve view parameter.
Interpolation processing is carried out on the sight-line direction vector, with interpolation modes including but not limited to polynomial interpolation, spline interpolation, and rational function interpolation, so as to determine the view range limitation under the posture; boundary curved-surface ranges such as the maximum gaze area boundary surface and the maximum visual area boundary surface are determined as the curve view parameters.
A process for establishing the curve view parameters is described using an example. (1) Head posture modeling: based on head 6-DoF (six degrees of freedom) data (position + rotation) acquired by an in-vehicle infrared camera as the driver head posture parameters, the eyeball center coordinates (x, y) are calculated from the driver head posture parameters through a rigid-body transformation matrix. (2) Dynamic view boundary calculation. Maximum gaze area calculation: a gaze direction vector v(θ, φ) is constructed with the eyeball center coordinates (x, y) as the center (θ is the horizontal azimuth angle, φ is the vertical azimuth angle), and a boundary curved surface is generated through spherical coordinate interpolation, satisfying a typical value interval (the specific values of the interval can be determined according to requirements or empirical values). Maximum visual field calculation: the field-limiting surface is fitted with eye movement data, with the boundary constrained by the physiological cone angles (horizontal monocular 160°, vertical 70°) and the head rotation range. A gaze region curved surface equation is obtained, together with the visual zone boundary point set in spherical coordinates {(r_j, θ_j, φ_j) | j = 1, 2, …}.
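The boundary-surface generation of step (2) can be sketched as follows. This is a simplified illustration that samples boundary points on a spherical grid centred on the eye, clamped to the stated physiological cone angles (horizontal 160°, vertical 70°); a real implementation would interpolate measured gaze samples, and all names here are our own:

```python
import math

# Physiological limits from the example: horizontal monocular 160°, vertical 70°.
H_LIMIT, V_LIMIT = 160.0, 70.0

def gaze_boundary_points(eye_center, r, n_theta=9, n_phi=5):
    """Sample the gaze-region boundary surface on a spherical grid centred on
    the eye, constrained to the physiological cone angles. Returns a list of
    (x, y, z) points at radius r from eye_center = (x, y, z)."""
    cx, cy, cz = eye_center
    pts = []
    for i in range(n_theta):
        # Horizontal azimuth theta swept symmetrically about the forward axis.
        theta = math.radians(-H_LIMIT / 2 + i * H_LIMIT / (n_theta - 1))
        for j in range(n_phi):
            # Vertical azimuth phi within the vertical cone limit.
            phi = math.radians(-V_LIMIT / 2 + j * V_LIMIT / (n_phi - 1))
            pts.append((cx + r * math.cos(phi) * math.cos(theta),
                        cy + r * math.cos(phi) * math.sin(theta),
                        cz + r * math.sin(phi)))
    return pts
```

Every sampled point lies exactly at radius r from the eye center, so the grid describes a spherical-cap boundary surface inside the physiological cone.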
Step 204, determining a reflected visual field parameter based on the reflector parameter and the driver eye coordinate parameter;
And identifying the visual influence of reflective objects such as an in-vehicle rearview mirror, a left rearview mirror, a right rearview mirror, a vehicle screen, reflective road traffic facilities, a mobile phone screen and the like on the driver based on the position relation between the reflector parameters and the eye coordinate parameters of the driver, and determining the reflective visual field parameters. The reflection is characterized by a reflection field of view parameter.
In an alternative embodiment of the present application, the determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter includes:
Step S2041, based on the reflector parameters and the driver eye coordinate parameters, establishing a specular reflection geometric model;
A specular reflection geometric model may be established based on the positional relationship of the reflector parameters and the driver eye coordinate parameters.
And a substep S2042, determining, based on the specular reflection geometric model, the range parameter of the glare viewing cone corresponding to the driver eye coordinate parameter as the reflection view parameter.
In the specular reflection geometric model, a range parameter of a glare viewing angle cone which causes glare to a driver in a reflecting object is determined by taking eye coordinates in eye coordinate parameters of the driver as a center. And taking the range parameter of the dazzling viewing angle cone as the reflection view parameter.
In addition, when the range parameter of the glare viewing cone reaches a range with glare risk, the driver can be reminded through in-vehicle voice to turn on the anti-glare switch of the rearview mirror, or the vehicle can actively turn on the anti-glare switch of the rearview mirror.
An example of the process for establishing the reflected visual field parameters is described. (1) Determining reflected object parameters: based on multispectral camera data, highlight regions are extracted by a specular separation algorithm, and surfaces satisfying a brightness threshold L_glare > 2000 cd/m² (rearview mirrors, screens, etc.) are taken as the reflected object parameters. (2) Establishing a specular reflection geometric model by combining the reflected object parameters with the driver eye coordinate parameters: the incident vector i (light source → reflecting surface) and the reflection vector r (reflecting surface → eyeball) satisfy the specular reflection law r = i − 2(i·n)n, where n is the normal vector of the reflecting surface. Generating the glare viewing cone: a minimum included-angle threshold γ_min = 1.5° between the reflected light path and the eyeball is calculated, and a glare cone equation Γ(θ, φ, t) (a time-varying solid angle function) is determined. An alarm is triggered when the included angle falls below the threshold γ_min; the risk level is R_glare ∈ [0, 1]. In addition, if a continuous glare risk (> 2 seconds) is identified, an anti-glare opening command may be sent to the corresponding rearview mirror control module over a CAN (Controller Area Network) bus.
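The specular reflection law and the glare-cone test described above can be sketched as follows (illustrative only; the helper names are our own, and the trigger condition is modelled as the reflected ray falling within γ_min of the surface-to-eye direction):

```python
import math

def reflect(incident, normal):
    """Specular reflection law: r = i - 2 (i . n) n, with n a unit normal."""
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2 * d * n for i, n in zip(incident, normal))

def glare_risk(incident, normal, to_eye, gamma_min_deg=1.5):
    """Return True if the reflected ray lies within gamma_min degrees of the
    reflecting-surface-to-eye direction, i.e. inside the glare cone."""
    r = reflect(incident, normal)

    def unit(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    ru, eu = unit(r), unit(to_eye)
    cosang = max(-1.0, min(1.0, sum(a * b for a, b in zip(ru, eu))))
    return math.degrees(math.acos(cosang)) <= gamma_min_deg
```

In a full implementation the test would run per frame, and a timer over consecutive positive frames would implement the "> 2 seconds of continuous risk" condition for the anti-glare command.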
Step 205, determining a visual equidistant region based on the driver head posture parameter and the driver eye coordinate parameter;
the visual equidistant regions along the sight-line distance from the initial position to the designated position can be predicted according to the driver's current head posture and eye coordinates, and the distances between the driver and other objects are determined through the visual equidistant regions.
In an alternative embodiment of the present application, the determining the visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter includes:
A substep S2051 of determining a gaze origin based on the driver head pose parameter and the driver eye coordinate parameter;
The gaze origin may be determined jointly by the driver head pose parameter and the driver eye coordinate parameter.
Sub-step S2052, determining a visual area from the line of sight starting point to a preset advancing distance;
and taking the position of the preset advancing distance as a sight-line end, and determining a visual area from the sight-line start point to the sight-line end. The preset advancing distance may be determined according to requirements, which is not limited in the embodiment of the present application.
And step S2053, equally dividing the visual area to determine a visual equidistant area.
And then equally dividing the visual area to generate a visual equidistant area.
In addition, a movement-time analysis can be performed on the visual equidistant regions: the time for the sight line to move from the initial position to the designated position is calculated, the visual equidistant regions are converted into visual iso-time regions, and the position reached after a given number of time intervals is predicted.
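Substeps S2051-S2053 and the movement-time conversion can be sketched as follows (illustrative; the constant gaze-travel speed and all names are assumptions):

```python
def equidistant_regions(start, end, n=5):
    """Divide the visual area from the gaze start point to the end of the
    preset advancing distance into n equal segments; returns the n + 1
    segment boundary points (works for 2-D or 3-D coordinate tuples)."""
    pts = []
    for k in range(n + 1):
        t = k / n  # fraction of the advance covered at boundary k
        pts.append(tuple(s + t * (e - s) for s, e in zip(start, end)))
    return pts

def region_times(total_distance, gaze_speed, n=5):
    """Movement-time analysis: convert n equal distances into the time spent
    traversing each segment at a constant gaze-travel speed."""
    return [total_distance / n / gaze_speed] * n
```

With a known gaze speed, the cumulative sums of `region_times` predict which boundary point the sight line reaches after a given number of time intervals.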
Step 206, determining refraction correction parameters based on the driver refraction parameters;
The initially calibrated visual range may be refraction-compensated based on the user's driver refractive parameters (e.g., myopia, normal vision, hyperopia), and the maximum range clearly visible at close range without the aid of an external refraction correction device may be determined.
In an alternative embodiment of the present application, the determining the refraction correction parameter based on the driver refraction parameter includes:
Substep S2061, converting the driver refractive parameter to an adjustment amplitude based on a preset refractive compensation model;
A preset refraction compensation model may be acquired first. The preset refraction compensation model can be obtained in advance through experiments or simulation. The preset refraction compensation model is used for representing the influence of refraction conditions on vision. The driver refractive parameters are converted into an adjustment amplitude based on the preset refraction compensation model.
Substep S2062, determining the inverse of the adjustment amplitude as a focus distance;
The inverse of the adjustment amplitude is calculated as the focus distance.
Substep S2063 of determining a visual boundary in combination with said focus distance and said driver refractive parameter;
and combining the focusing distance with the refraction parameters of the driver, performing refraction compensation on the initially calibrated visual range, and determining the visual boundary.
Substep S2064, determining refractive correction parameters based on the visual boundaries.
The visual boundaries are combined to determine refractive correction parameters.
The establishment of the refraction correction parameters is described with one example. (1) Refraction compensation model: the user's diopter D (unit: D, diopter) is input and converted into an accommodative amplitude A (with K as the compensation factor, taking −0.8 for myopia and +1.2 for hyperopia, and A_baseline as the base factor). (2) Clear vision boundary calculation: the minimum focusing distance d_min (unit: m) is calculated according to Snell's law and a corneal curvature model, as the inverse of the accommodative amplitude. A clear-vision envelope surface is generated in the eye coordinate system: a spherical cap centered on the eyeball with radius d_min, intersected with the horizontal plane. The effective visual boundary d_min(D, age) is determined, and the refraction-compensated field of view (relative to the eyeball position) is taken as the refraction correction parameter.
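The exact compensation formula is not reproduced in the text above; the sketch below assumes a simple linear form A = A_baseline + K·D consistent with the stated factors (K = −0.8 for myopia, +1.2 for hyperopia), with an assumed base value, and takes the focusing distance as the inverse of the amplitude per substep S2062:

```python
def accommodation_amplitude(D, a_baseline=10.0):
    """Hypothetical linear form of the compensation model: A = A_baseline + K*D,
    with K = -0.8 for myopia (D < 0) and K = +1.2 for hyperopia (D >= 0).
    a_baseline is an assumed base value, not taken from the source."""
    k = -0.8 if D < 0 else 1.2
    return a_baseline + k * D

def min_focus_distance(D, a_baseline=10.0):
    """Substep S2062: the focusing distance is the inverse of the amplitude."""
    return 1.0 / accommodation_amplitude(D, a_baseline)
```

A myopic input (negative D) raises the amplitude under this assumed form and so shortens the minimum clear focusing distance, which matches the qualitative behaviour described.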
Step 207, determining a field-of-view parameter in combination with the straight-line field-of-view parameter, the curved field-of-view parameter, the reflected field-of-view parameter, the visual equidistant zone, and the refraction correction parameter.
And determining the straight line visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region and the refraction correction parameter as the visual field condition of a driver in the light environment.
Step 208, controlling the light of the car lamp to emit light based on the visual field range when the visual field range parameter and the running environment parameter meet the preset accident scene;
Whether the corresponding parameter range in the preset accident scene is met or not can be determined jointly by combining the visual field range parameter and the running environment parameter. Under the condition that the visual field range parameter and the running environment parameter meet the preset accident scene, the current situation is indicated to easily cause accidents, the lighting of the car lamp is controlled through the visual field range, the light is controlled to irradiate the whole visual field range, and the blind area of the visual field is reduced.
Step 209, determining a target comfort level parameter based on the visual field range parameter and the running environment parameter, and controlling the vehicle lamp to emit light based on the target comfort level parameter when the visual field range parameter and the running environment parameter do not meet a preset accident scene.
In the case where the view range parameter and the running environment parameter do not satisfy the preset accident scene, the target comfort parameter may be determined by the view range parameter and the running environment parameter. The level that the driver needs to reach under the current field of view parameters and the driving environment parameters is characterized by the target comfort parameters. The vehicle lamp lighting is controlled based on the target comfort parameter.
In an alternative embodiment of the present application, the determining the target comfort parameter based on the field of view parameter and the driving environment parameter includes weighting the straight line field of view parameter, the curved field of view parameter, the reflected field of view parameter, the equidistant zone of vision, the refraction correction parameter, and the driving environment parameter to determine the target comfort parameter.
The straight line visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region, the refraction correction parameter and the running environment parameter are multiplied by corresponding weights respectively, and then the weighted values are combined to obtain the target comfort degree parameter.
The weights corresponding to the parameters can be set according to requirements. In different cases, the weights may also be dynamic weight values instead of fixed weight values. For example, deterioration of the light environment parameters (e.g., a limited field of view, an increased risk of glare) requires an increase in the target comfort V (i.e., reduced comfort). Specifically: (1) if the straight-line visual field parameter shows that the field of view is limited, indicating that the driver's visual range is reduced and more visual effort is required, the target comfort V is increased; (2) if the reflection risk in the reflection analysis increases (e.g., > 0.5), indicating a high risk of glare that easily causes visual fatigue, the target comfort V is increased; (3) if refractive error causes difficulty in focusing, the clear vision range is reduced and the target comfort V is increased; (4) if the visual equidistant region analysis shows a reduced equidistant region or an increased movement time, indicating more laborious sight-line switching, the target comfort V is increased. Conversely, if the parameters are within the desired ranges (e.g., a wide field of view, low glare risk), the target comfort V is reduced. In one example, the overall contribution of the light environment parameters to the target comfort V is weighted higher (possibly 50-70% of the total weight), with the specific weights based on actual calibration results. The driving environment parameters include the driving path and the road surrounding environment. The impact of the driving path on the target comfort V involves the static and dynamic properties of the path: (1) static properties, namely road type (e.g., expressway, urban road), path geometry (e.g., radius of curvature, slope angle), and path length and complexity (e.g., number of intersections).
(2) Dynamic properties, namely the vehicle's current speed, acceleration, and steering angle, and real-time changes in the path (e.g., turn commands provided by the navigation system). An increase in the complexity or dynamic load of the driving path may result in an increase in the target comfort V. For example: (1) if the path curvature is large or turns are frequent (e.g., frequent U-shaped bends on urban roads), the driver needs to frequently adjust gaze and head posture, the visual load increases, and the target comfort V increases; (2) if the vehicle speed is high (e.g., > 80 km/h), the demand for visual information processing increases and the target comfort V increases, but if the path is simple (e.g., a straight expressway), the target comfort V decreases; (3) if the path has a steep grade or frequent lane changes, more visual scanning is required and the target comfort V increases. Conversely, if the path is straight, low-speed, and simple (e.g., a suburban road), the target comfort V decreases. In one example, the driving path accounts for 20-30% of the total weight, with the specific weight based on actual calibration results; the path modulates the effect of the light environment by indirectly affecting visual demand. The road surrounding environment includes external environmental factors: (1) the natural environment, such as illumination intensity, weather (e.g., rainfall level, fog concentration), and time (e.g., day/night); (2) the traffic environment, namely surrounding vehicle density, pedestrian activity, and accident-prone zone markings; (3) the facility environment, namely the brightness, density, and location of roadway facilities (e.g., signal lights, guideboards). Deterioration of the environmental conditions may significantly increase the target comfort V.
For example: (1) if the ambient lighting is uncomfortable (e.g., bright daytime light results in a high contrast lmax/min, or nighttime light is low), the burden of visual adjustment increases and the target comfort V increases; (2) if the weather is bad (e.g., a high rainfall level or dense fog), visibility is reduced and the target comfort V increases; (3) if traffic density is high or accident-prone zone markings are present, psychological stress and the visual scanning frequency increase and the target comfort V increases; (4) if facilities are too dense or too bright (e.g., densely packed signal lights), visual disturbance increases and the target comfort V increases. Conversely, if the environment is stable (e.g., moderate lighting, sunny weather, low traffic density), the target comfort V decreases.
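The weighted combination described above can be sketched as follows (the factor names, the normalization of each factor to [0, 1], and the specific weight values are illustrative; the text only constrains the light-environment share to roughly 50-70% of the total weight and the driving path to 20-30%):

```python
def target_comfort(params, weights):
    """Weighted combination of normalized factor scores (each in [0, 1],
    higher = more visual load) into a target comfort value V."""
    assert abs(sum(weights.values()) - 1.0) < 1e-6, "weights must sum to 1"
    return sum(params[k] * weights[k] for k in weights)

# Illustrative weight split: light-environment factors ~60%, path ~20%,
# remaining environment ~20% (within the ranges stated in the text).
weights = {"line_of_sight": 0.20, "curve_view": 0.15, "reflection": 0.15,
           "iso_region": 0.10, "refraction": 0.10, "environment": 0.30}
```

Dynamic weighting, as described above, would amount to recomputing the `weights` dictionary per driving situation before calling `target_comfort`.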
In one example, the target comfort level may be represented in a hierarchical form, so that control can be based on the level, enabling simpler control, reducing the complexity of the control logic, and lowering computing power requirements. As can be seen from Table 1, five levels are established, the corresponding level being determined by the interval range in which the target comfort falls. Here, Xn is a refractive compensation value, which can be determined according to the refractive range.
Reference may be made specifically to table 2.
TABLE 1
Refractive range            Refractive compensation value Xn
-0.5 m⁻¹ ~ -1.0 m⁻¹         X1
-1.0 m⁻¹ ~ -2.0 m⁻¹         X2
-2.0 m⁻¹ ~ -4.0 m⁻¹         X3
-4.0 m⁻¹ ~ -6.0 m⁻¹         X4
TABLE 2
In an optional embodiment of the application, the controlling the light emission of the vehicle lamp based on the target comfort level parameter includes obtaining a current comfort level parameter, determining a light control parameter corresponding to the target comfort level parameter when the current comfort level parameter is smaller than the target comfort level parameter, and controlling the light emission of the vehicle lamp based on the light control parameter.
The current comfort level parameter can be obtained, the current comfort level parameter and the target comfort level parameter are compared, under the condition that the current comfort level parameter is smaller than the target comfort level parameter, the current light comfort level is indicated to not meet the requirement, the light control parameter corresponding to the target comfort level parameter can be determined, and the light brightness, the color temperature and the irradiation angle of the vehicle lamp are controlled by adopting the light control parameter corresponding to the target comfort level parameter.
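The comparison logic of this embodiment can be sketched as follows (illustrative; the mapping from a target comfort level to concrete light control parameters would come from calibration tables such as those referenced above):

```python
def light_control(current_v, target_v, lookup):
    """If the current comfort parameter is below the target, look up the
    light control parameters (brightness, color temperature, angle) for the
    target level; otherwise keep the current settings (return None).
    `lookup` maps a comfort level to its control parameters (illustrative)."""
    if current_v < target_v:
        return lookup[target_v]
    return None
```

Returning `None` when the current level already meets the target corresponds to leaving the lamp settings unchanged.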
The vehicle light control method comprises the steps of: obtaining driving environment parameters and light environment parameters during driving of the vehicle, the light environment parameters comprising obstacle parameters, reflector parameters, driver head posture parameters, driver refractive parameters, and driver eye coordinate parameters; determining straight-line visual field parameters based on the obstacle parameters and the driver eye coordinate parameters; determining curve visual field parameters based on the driver head posture parameters; determining reflection visual field parameters based on the reflector parameters and the driver eye coordinate parameters; determining visual equidistant regions based on the driver head posture parameters and the driver eye coordinate parameters; determining refraction correction parameters based on the driver refractive parameters; determining visual field range parameters by combining the straight-line visual field parameters, the curve visual field parameters, the reflection visual field parameters, the visual equidistant regions, and the refraction correction parameters; controlling the vehicle lamp to emit light based on the visual field range when the visual field range parameters and the driving environment parameters meet a preset accident scene; and determining a target comfort parameter based on the visual field range parameters and the driving environment parameters, and controlling the vehicle lamp to emit light based on the target comfort parameter, when the visual field range parameters and the driving environment parameters do not meet the preset accident scene.
By collecting the obstacle parameters, reflector parameters, driver head posture parameters, driver refractive parameters, and driver eye coordinate parameters and comprehensively analyzing the driver's light environment, the light environment during driving is accurately identified. Taking the driving environment parameters and the light environment parameters as reference factors, they are compared with preset accident scenes. Where the driving environment parameters and the light environment parameters meet a preset accident scene, indicating that traffic accidents are currently likely, the light emission of the vehicle lamps is controlled through the visual field range; by means of optical adjustment, the driver's gaze point is shifted to the region requiring attention when driving through accident-prone road sections, so that the probability of accidents is reduced to a certain extent and driving safety is improved. Where the driving environment parameters and the light environment parameters do not meet a preset accident scene, in order to avoid light interfering with the driver, a target comfort parameter is determined based on the driving environment parameters and the light environment parameters, and the light emission of the vehicle lamp is controlled based on the target comfort parameter; the emitted light can keep the driver within a visually comfortable range, improving the driver's light comfort and relieving driving fatigue.
In order that the procedure of the embodiments of the present application may be apparent to those skilled in the art, the following description is made with reference to a complete example:
The hardware architecture to which this example applies may refer to fig. 3, which includes three major parts: a sensing module, a data processing module, and a control and execution module. The sensing module, the data processing module, and the control and execution module are connected to a common power bus, and exchange data with one another over a CAN (Controller Area Network) network. They can also perform fault detection to realize self-checking. The data processing module and the control and execution module can additionally communicate with each other through an SPI (Serial Peripheral Interface). The sensing module comprises a brightness sensing unit, a visual signal unit, and other information units. The brightness sensing unit comprises a rain-light sensor, a photoresistor, a photodiode, and the like; the visual signal unit comprises sensors such as in-vehicle and exterior cameras; the other information units comprise a lidar, a millimeter-wave radar, and the like. The data processing module comprises an accident prediction unit, a light signal analysis unit, and a communication interface. Specifically, the accident prediction unit is provided with accident database tags, including but not limited to visibility, precipitation conditions, traffic light placement, public transport vehicle conditions (buses, trams, trains, etc.), traffic control conditions, traffic flow conditions, time and place, road spectrum conditions, road traffic sign placement, road surrounding building conditions, and the like. The control and execution module comprises in-vehicle lighting lamps, a matrix pixel headlight, and an adjusting motor.
Specifically, the interior lighting lamps comprise light assemblies such as reading lamps, which can be turned on, turned off, and adjusted in brightness. The matrix pixel headlight is composed of a plurality of lamp-bead modules, each of which can be independently controlled to adjust illumination brightness and color temperature, and the adjusting motor can control the matrix pixel headlight to move transversely or longitudinally, thereby controlling the illumination range. The data processing module divides the forward illumination range into a plurality of control subareas, each subarea mapped one-to-one to a lamp-bead module; when the adjustment parameters transmitted by the data processing module are received, the brightness and color temperature of each control subarea can be adjusted accordingly. The control and execution module changes the illumination range by driving the adjusting motor, and adjusts the light parameters of the corresponding lamp-bead module for each control subarea, so that the light environment reaches the comfort level.
The control process of the light based on the above hardware architecture can refer to fig. 4:
Control is started when the vehicle is running.
The sensing module is started. When the vehicle issues a function start instruction, a vehicle body controller (such as a BCM) sends a start signal to the sensing module, and the sensing module starts. The sensing module senses the light environment parameters: the brightness sensing unit acquires the ambient brightness ratio Lmax/min (cd/cm²), the rain-light sensor unit acquires the ambient average brightness Lave (cd/cm²), the rainfall grade information, the factory-parameter forward light acceptance angle α, and the factory-parameter upper light acceptance angle β, and sends the Lmax/min, Lave, rainfall grade, α, and β information to the data processing module. Meanwhile, the visual signal unit of the sensing module acquires interior and exterior light environment information, specifically including the average brightness lave, maximum brightness lmax, minimum brightness lmin, brightness contrast lmax/min, color C, brightness uniformity lequ, and visual resolution R of the lights (including all lights inside and outside the vehicle), as well as the driver eye coordinates (xi, yi). The other information units of the sensing module acquire the driving path information and the tag information required by other accident databases, and the parameter information such as lave, lmax, lmin, lmax/min, C, lequ, R, and the driver eye coordinates (xi, yi) is sent to the data processing module.
The data processing module analyzes the light environment parameters. After receiving the light environment parameters, the light signal analysis unit performs a preliminary analysis. Specifically, elements that can obstruct the view, such as pillars, the steering wheel, pedestrians, and road traffic facilities, are identified and analyzed, and the optimal view range parameters are given, including straight-line visual field parameters such as azimuth angle (A), elevation angle (h), and sight distance (d). According to the driver head posture recognized by the in-vehicle camera, the view range limitation under that posture is analyzed, and curve visual field parameters such as the maximum gaze area boundary surface and the maximum visual area boundary surface are given. Bright-surface reflections within the field of view are identified and analyzed at the driver's eye coordinates, and the reflection visual field parameters of the glare viewing cone region are given. The initially calibrated visual range is refraction-compensated in combination with the user-entered refraction condition (myopia, normal, hyperopia), and the maximum range clearly visible at close range without an external refraction correction device is determined as the refraction correction parameter. The sight-line distance from the initial position to the designated position is predicted according to the driver's current head posture and eye coordinates, and the visual equidistant regions are generated.
And acquiring the driving environment parameters. The travel path information, the road surrounding environment information, and the tag information required by other accident databases can be acquired and input to the accident prediction unit.
Analysis by the accident prediction unit. The preliminary analysis result of the light environment parameters, the travel path information, the road surrounding environment information, and the tag information required by other accident databases are input to the accident prediction unit.
And judging whether to trigger. The actual information is compared with the accident scene database; if the triggering criterion is met, i.e., the measured data set falls within the interval ranges of the scene database data set, a light early-warning start instruction is sent to the control and execution module. The light early-warning instruction carries light parameters corresponding to the visual field range. If the triggering criterion is not met, i.e., the measured data set is not within the interval ranges of the scene database data set, the data processing module synthesizes the preliminary analysis result of the light environment, the driving path information, and the road surrounding environment information, and calculates the target comfort V. If the current light parameters do not meet the target comfort level, a light comfort instruction is sent to the control and execution module, and the light parameters are adjusted in real time until the current comfort V falls within the interval corresponding to the target comfort level.
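The triggering criterion above (every measured value falling within the corresponding interval of an accident-scene database entry) can be sketched as follows (illustrative key names):

```python
def accident_scene_triggered(measured, scene_intervals):
    """Trigger criterion: every measured value must lie inside the
    corresponding [lo, hi] interval of the accident-scene database entry.
    `measured` maps tag name -> value; `scene_intervals` maps tag name
    -> (lo, hi)."""
    return all(lo <= measured[k] <= hi
               for k, (lo, hi) in scene_intervals.items())
```

If this returns True, the light early-warning path is taken; otherwise the flow falls through to the target-comfort calculation.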
After the light early warning instruction or the light comfort instruction is input to the control and execution module, the corresponding instruction is finished through adjustment such as brightness adjustment, color temperature adjustment, irradiation angle adjustment and the like, and the adjustment of the light environment is realized.
Brightness adjustment: the control and execution module adjusts the brightness level based on the signal received from the data processing module. Specifically, after receiving the related instruction, the control and execution module sends a control instruction to the brightness adjusting module of the interior and exterior light groups (matrix pixel headlight, interior general lighting, rearview-mirror hazard warning lamp, and ambient lamp) to adjust current or voltage so as to change brightness; alternatively, the duty ratio can be adjusted through PWM (pulse width modulation), rapidly switching the light source and exploiting the persistence of human vision so that the average brightness is controlled by the light source's on-time (i.e., the duty ratio). In particular, the in-vehicle screens and HUD (head-up display) can directly adjust screen or display brightness.
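The PWM duty-ratio relationship mentioned above can be sketched as follows (illustrative; average perceived brightness is taken as proportional to duty cycle, and the period value is an assumption):

```python
def pwm_on_time_for_brightness(target_ratio, period_us=1000):
    """Under PWM, average brightness is proportional to the duty cycle
    (persistence of vision): on-time = duty * period. Returns the on-time
    in microseconds for a target brightness ratio, clamped to [0, 1]."""
    duty = max(0.0, min(1.0, target_ratio))
    return duty * period_us
```

The clamp keeps out-of-range requests (e.g., from an over-aggressive comfort adjustment) within the physically meaningful duty range.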
Color temperature adjustment: the control and execution module determines the color temperature level to be adjusted according to the signal received from the data processing module, and sends a color temperature adjusting instruction to the color temperature adjusting module of the interior and exterior lamp group (matrix pixel headlight, interior general lighting lamp, rearview mirror hazard warning lamp and ambient lamp). Taking the matrix pixel headlight as an example, its color temperature can be changed by adjusting the filter parameters of the headlight or the working parameters of the bulb, by switching between bulb combinations of different color temperatures, or by using a light-emitting element with variable color temperature, where the switching mechanism of the control circuit selects different light sources as required. The color temperature of the headlight can also be controlled by adjusting the proportion of fluorescent powder in the headlight, or by using an optical coating whose color temperature is tunable, with an electronic control signal changing its optical characteristics.
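One of the switching schemes mentioned above, blending a warm and a cool source, can be sketched as follows. This is a hedged illustration: the endpoint temperatures and the linear mix in mireds (1e6 / CCT) are common approximations, not values from the disclosure.

```python
# Illustrative sketch: approximate a target correlated color temperature by
# mixing two fixed LED strings, interpolating linearly in mireds.

WARM_K, COOL_K = 2700.0, 6500.0   # assumed endpoint color temperatures

def warm_ratio(target_k: float) -> float:
    """Fraction of warm-source power needed to approximate target_k."""
    target_k = max(WARM_K, min(COOL_K, target_k))
    m_warm, m_cool = 1e6 / WARM_K, 1e6 / COOL_K
    m_target = 1e6 / target_k
    return (m_target - m_cool) / (m_warm - m_cool)

print(round(warm_ratio(2700), 2))  # all warm -> 1.0
print(round(warm_ratio(6500), 2))  # all cool -> 0.0
```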
Irradiation angle adjustment: the control and execution module determines the irradiation angle level to be adjusted according to the vehicle posture signal received from the data processing module, and sends an instruction to the angle adjusting module of the matrix pixel headlight; the irradiation angle of the headlight is changed by adjusting its motor drive device or mechanical transmission part. Information such as the body inclination and steering angle can be detected by sensors mounted on the vehicle, an appropriate irradiation angle adjustment value is calculated by an electronic control algorithm, and the irradiation angle of the headlight is adjusted precisely by a motor or a hydraulic device. Meanwhile, through linkage with the suspension system or the dynamic stability control system of the vehicle, the irradiation angle of the headlight can be adjusted automatically according to the running state of the vehicle, providing the best illumination effect under different road conditions.
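The leveling calculation described above can be sketched as follows. All numbers are assumptions for illustration (nominal aim, actuator travel, sign convention), not values from the disclosure.

```python
# Hedged sketch of headlight leveling: counter-rotate the beam against the
# measured body pitch, then clamp to the actuator's mechanical travel.

def leveling_angle(body_pitch_deg: float,
                   base_aim_deg: float = -1.0,
                   travel_deg: float = 2.5) -> float:
    """Corrected beam aim in degrees; nose-up pitch lowers the beam."""
    target = base_aim_deg - body_pitch_deg
    return max(-travel_deg, min(travel_deg, target))

print(leveling_angle(0.0))   # level body: nominal -1.0 degree aim
print(leveling_angle(3.0))   # strong nose-up pitch: clamped at -2.5
```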
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the application.
Referring to fig. 5, there is shown a block diagram of an embodiment of a lamp control device of the present application, which may specifically include the following modules:
An obtaining module 501, configured to obtain a driving environment parameter and a light environment parameter during a driving process of a vehicle;
a field of view determination module 502 for determining a field of view parameter based on the light environment parameter;
a first control module 503, configured to control the light emission of the vehicle lamp based on the field of view when the field of view parameter and the driving environment parameter satisfy a preset accident scene;
And the second control module 504 is configured to determine a target comfort level parameter based on the field-of-view range parameter and the driving environment parameter, and control the vehicle lamp to emit light based on the target comfort level parameter, when the field-of-view range parameter and the driving environment parameter do not satisfy a preset accident scene.
In an alternative embodiment of the present application, the light environment parameters include an obstacle parameter, a reflector parameter, a driver head pose parameter, a driver refraction parameter, and a driver eye coordinate parameter, and the field of view determining module 502 includes:
A first determination submodule for determining a straight line visual field parameter based on the obstacle parameter and the driver eye coordinate parameter;
A second determination submodule for determining a curve view parameter based on the driver head pose parameter;
A third determination sub-module for determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter;
A fourth determination submodule for determining a visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter;
a fifth determination sub-module for determining a refraction correction parameter based on the driver refraction parameter;
And the combining sub-module is used for combining the straight line visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region and the refraction correction parameter to determine the visual field range parameter.
In an alternative embodiment of the present application, the first determining submodule includes:
the first establishing unit is used for establishing a three-dimensional environment coordinate system based on the eye coordinate parameters of the driver;
An obstacle coordinate range determining unit configured to convert the obstacle parameter to the three-dimensional environment coordinate system, and determine an obstacle coordinate range;
And the straight line visual field parameter determining unit is used for performing light projection on the obstacle coordinate range and determining straight line visual field parameters.
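The ray-projection step performed by this unit can be sketched as follows. The spherical obstacle approximation and the origin-at-eye coordinate convention are illustrative assumptions, not part of the disclosure.

```python
# Sketch of light projection in the driver-eye coordinate system: a sight ray
# from the eye (origin) is blocked when it passes within an obstacle's radius.

import math

def ray_blocked(direction, center, radius) -> bool:
    """Ray from the origin along `direction`; obstacle as sphere at `center`."""
    d = math.sqrt(sum(c * c for c in direction))
    u = [c / d for c in direction]                       # unit ray direction
    t = sum(u_i * c_i for u_i, c_i in zip(u, center))    # projection onto ray
    if t < 0:                                            # obstacle behind eye
        return False
    closest = [t * u_i - c_i for u_i, c_i in zip(u, center)]
    return math.sqrt(sum(c * c for c in closest)) <= radius

print(ray_blocked((1, 0, 0), (5, 0.2, 0), 0.5))   # near-axis sphere: True
print(ray_blocked((1, 0, 0), (5, 3.0, 0), 0.5))   # off-axis sphere: False
```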
In an alternative embodiment of the present application, the second determining submodule includes:
an eyeball center coordinate determination unit configured to determine an eyeball center coordinate based on the driver head posture parameter;
a construction unit configured to construct a line-of-sight direction vector based on the eyeball center coordinates;
And the curve view parameter determining unit is used for interpolating the sight line direction vector and determining the boundary curved surface range as the curve view parameter.
In an alternative embodiment of the present application, the third determining submodule includes:
A second establishing unit, configured to establish a specular reflection geometric model based on the reflector parameter and the driver eye coordinate parameter;
And the reflected visual field parameter determining unit is used for determining that the range parameter of the dazzling visual angle cone corresponding to the eye coordinate parameter of the driver is the reflected visual field parameter based on the specular reflection geometric model.
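The core operation of a specular-reflection geometric model, mirroring the driver's eye point across the reflector plane, can be sketched as below; the glare ("dazzle") viewing cone is then tested against light-source directions from the mirrored point. The plane representation here is an illustrative assumption.

```python
# Sketch of the specular-reflection geometry: reflect the eye point across
# the reflector plane (point on plane + unit normal).

def mirror_point(eye, plane_point, normal):
    """Reflect `eye` across the plane defined by (plane_point, unit normal)."""
    dist = sum((e - p) * n for e, p, n in zip(eye, plane_point, normal))
    return tuple(e - 2 * dist * n for e, n in zip(eye, normal))

print(mirror_point((0, 1, 0), (0, 0, 0), (0, 1, 0)))  # -> (0, -1, 0)
```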
In an alternative embodiment of the present application, the fourth determining submodule includes:
A gaze origin determining unit configured to determine a gaze origin based on the driver head pose parameter and the driver eye coordinate parameter;
A visual area determining unit, configured to determine a visual area from the line of sight starting point to a preset advancing distance;
And the dividing unit is used for equally dividing the visual area and determining a visual equidistant area.
In an alternative embodiment of the present application, the fifth determining submodule includes:
The conversion unit is used for converting the refraction parameters of the driver into adjustment amplitude based on a preset refraction compensation model;
a focusing distance determining unit for determining the inverse of the adjustment amplitude as a focusing distance;
A visual boundary determination unit for determining a visual boundary in combination with the focusing distance and the driver refractive parameter;
and the refraction correction parameter determining unit is used for determining refraction correction parameters based on the visual boundary.
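The focusing-distance step listed above (the reciprocal of the adjustment amplitude) can be sketched directly; the refraction compensation model that produces the amplitude is a stand-in assumption and is not shown.

```python
# Sketch of the refraction-correction step: an adjustment amplitude in
# diopters maps to a focusing distance in metres by taking the reciprocal.

def focus_distance_m(adjustment_diopters: float) -> float:
    """Focusing distance as the reciprocal of the adjustment amplitude."""
    if adjustment_diopters <= 0:
        return float("inf")   # no accommodation demand: far focus
    return 1.0 / adjustment_diopters

print(focus_distance_m(2.0))   # 2 D of accommodation -> 0.5 m
print(focus_distance_m(0.0))   # -> inf
```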
In an alternative embodiment of the present application, the second control module 504 includes:
And the target comfort level parameter determination submodule is used for carrying out weighted calculation on the straight line visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region, the refraction correction parameter and the running environment parameter to determine the target comfort level parameter.
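The weighted calculation performed by this submodule can be sketched as a weighted sum over normalised component scores. The weights and the [0, 1] normalisation are illustrative assumptions; the disclosure does not specify them.

```python
# Sketch of the weighted comfort calculation: the target comfort V is a
# weighted sum of the field-of-view components and the environment score.

WEIGHTS = {
    "straight": 0.25, "curve": 0.15, "reflection": 0.20,
    "equidistant": 0.15, "refraction": 0.10, "environment": 0.15,
}

def target_comfort(scores: dict) -> float:
    """Weighted sum of normalised component scores (each in [0, 1])."""
    return sum(WEIGHTS[k] * scores.get(k, 0.0) for k in WEIGHTS)

v = target_comfort({k: 1.0 for k in WEIGHTS})
print(round(v, 2))  # all components at maximum -> 1.0
```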
In an alternative embodiment of the present application, the second control module 504 includes:
the acquisition sub-module is used for acquiring the current comfort level parameter;
A control parameter determining sub-module, configured to determine a light control parameter corresponding to the target comfort parameter when the current comfort parameter is less than the target comfort parameter;
And the control sub-module is used for controlling the car lamp to emit light based on the lamplight control parameters.
According to the embodiment of the application, a driving environment parameter and a light environment parameter are obtained during the driving of the vehicle, and a field-of-view range parameter is determined based on the light environment parameter. When the field-of-view range parameter and the driving environment parameter satisfy a preset accident scene, the light emission of the vehicle lamp is controlled based on the field-of-view range; when they do not satisfy the preset accident scene, a target comfort level parameter is determined based on the field-of-view range parameter and the driving environment parameter, and the light emission of the vehicle lamp is controlled based on the target comfort level parameter. During driving, the driving environment parameter and the light environment parameter are compared as reference factors against the preset accident scene. When they satisfy the preset accident scene, a traffic accident is currently likely to occur; the light emission of the vehicle lamp is therefore controlled according to the field-of-view range, and light adjustment is used to shift the driver's point of gaze to the region requiring attention while driving through an accident-prone road section, thereby reducing the probability of an accident to a certain extent and improving driving safety.
When the driving environment parameter and the light environment parameter do not satisfy the preset accident scene, in order to avoid light interfering with the driver, the target comfort level parameter is determined based on the driving environment parameter and the light environment parameter, and the light emission of the vehicle lamp is controlled based on the target comfort level parameter. The emitted light keeps the driver within a visually comfortable range, improving the driver's light comfort and relieving driving fatigue.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
Referring to fig. 6, an embodiment of the application also provides a vehicle comprising a processor 601, a memory 602 and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the vehicle lamp control method described above. The vehicle lamp control method comprises the following steps:
Acquiring a running environment parameter and a light environment parameter in the running process of the vehicle;
determining a field of view parameter based on the light environment parameter;
controlling the light emission of the vehicle lamp based on the visual field range under the condition that the visual field range parameter and the running environment parameter meet the preset accident scene;
and under the condition that the visual field range parameter and the running environment parameter do not meet a preset accident scene, determining a target comfort degree parameter based on the visual field range parameter and the running environment parameter, and controlling the vehicle lamp to emit light based on the target comfort degree parameter.
Optionally, the light environment parameters include an obstacle parameter, a reflector parameter, a driver head pose parameter, a driver refraction parameter, and a driver eye coordinate parameter, and determining the view field range parameter based on the light environment parameters includes:
Determining a straight line view parameter based on the obstacle parameter and the driver eye coordinate parameter;
determining a curvilinear field of view parameter based on the driver head pose parameter;
Determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter;
Determining a visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter;
Determining a refraction correction parameter based on the driver refraction parameter;
And determining a visual field range parameter by combining the straight visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region and the refraction correction parameter.
Optionally, the determining a straight line view parameter based on the obstacle parameter and the driver eye coordinate parameter includes:
establishing a three-dimensional environment coordinate system based on the eye coordinate parameters of the driver;
Converting the obstacle parameters into the three-dimensional environment coordinate system, and determining an obstacle coordinate range;
and carrying out light projection on the obstacle coordinate range, and determining a straight line visual field parameter.
Optionally, the determining a curve view parameter based on the driver head pose parameter includes:
based on the head posture parameters of the driver, the center coordinates of eyeballs are determined;
Constructing a sight direction vector based on the eyeball center coordinates;
and interpolating the sight line direction vector to determine that the boundary curved surface range is the curve visual field parameter.
Optionally, the determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter includes:
establishing a specular reflection geometric model based on the reflector parameters and the driver eye coordinate parameters;
And determining that the range parameter of the dazzling viewing angle cone corresponding to the eye coordinate parameter of the driver is a reflection view field parameter based on the specular reflection geometric model.
Optionally, the determining a visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter includes:
determining a gaze origin based on the driver head pose parameter and the driver eye coordinate parameter;
determining a visual area from the sight starting point to a preset advancing distance;
And equally dividing the visual area to determine a visual equidistant area.
Optionally, the determining the refraction correction parameter based on the driver refraction parameter includes:
Converting the driver refractive parameters into an adjustment amplitude based on a preset refractive compensation model;
Determining the reciprocal of the adjustment amplitude as a focusing distance;
Determining a visual boundary in combination with the focusing distance and the driver refractive parameter;
Based on the visual boundary, refraction correction parameters are determined.
Optionally, the determining the target comfort parameter based on the field of view parameter and the driving environment parameter includes:
And carrying out weighted calculation on the straight line visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region, the refraction correction parameter and the running environment parameter to determine a target comfort degree parameter.
Optionally, the controlling the lighting of the vehicle lamp based on the target comfort parameter includes:
acquiring a current comfort degree parameter;
determining a light control parameter corresponding to the target comfort level parameter under the condition that the current comfort level parameter is smaller than the target comfort level parameter;
and controlling the light of the car lamp to emit light based on the light control parameter.
According to the embodiment of the application, a driving environment parameter and a light environment parameter are obtained during the driving of the vehicle, and a field-of-view range parameter is determined based on the light environment parameter. When the field-of-view range parameter and the driving environment parameter satisfy a preset accident scene, the light emission of the vehicle lamp is controlled based on the field-of-view range; when they do not satisfy the preset accident scene, a target comfort level parameter is determined based on the field-of-view range parameter and the driving environment parameter, and the light emission of the vehicle lamp is controlled based on the target comfort level parameter. During driving, the driving environment parameter and the light environment parameter are compared as reference factors against the preset accident scene. When they satisfy the preset accident scene, a traffic accident is currently likely to occur; the light emission of the vehicle lamp is therefore controlled according to the field-of-view range, and light adjustment is used to shift the driver's point of gaze to the region requiring attention while driving through an accident-prone road section, thereby reducing the probability of an accident to a certain extent and improving driving safety.
When the driving environment parameter and the light environment parameter do not satisfy the preset accident scene, in order to avoid light interfering with the driver, the target comfort level parameter is determined based on the driving environment parameter and the light environment parameter, and the light emission of the vehicle lamp is controlled based on the target comfort level parameter. The emitted light keeps the driver within a visually comfortable range, improving the driver's light comfort and relieving driving fatigue.
The memory may include a random access memory (Random Access Memory, abbreviated as RAM) or a non-volatile memory (non-volatile memory), such as at least one disk memory. Optionally, the memory may also be at least one memory device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU) and a network processor (Network Processor, NP); it may also be a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Referring to fig. 7, an embodiment of the present application further provides a computer readable storage medium 701, where the storage medium 701 stores a computer program, and the computer program when executed by a processor performs the steps of the vehicle lamp control method according to any one of the embodiments of the present application. The car lamp control method comprises the following steps:
Acquiring a running environment parameter and a light environment parameter in the running process of the vehicle;
determining a field of view parameter based on the light environment parameter;
controlling the light emission of the vehicle lamp based on the visual field range under the condition that the visual field range parameter and the running environment parameter meet the preset accident scene;
and under the condition that the visual field range parameter and the running environment parameter do not meet a preset accident scene, determining a target comfort degree parameter based on the visual field range parameter and the running environment parameter, and controlling the vehicle lamp to emit light based on the target comfort degree parameter.
Optionally, the light environment parameters include an obstacle parameter, a reflector parameter, a driver head pose parameter, a driver refraction parameter, and a driver eye coordinate parameter, and determining the view field range parameter based on the light environment parameters includes:
Determining a straight line view parameter based on the obstacle parameter and the driver eye coordinate parameter;
determining a curvilinear field of view parameter based on the driver head pose parameter;
Determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter;
Determining a visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter;
Determining a refraction correction parameter based on the driver refraction parameter;
And determining a visual field range parameter by combining the straight visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region and the refraction correction parameter.
Optionally, the determining a straight line view parameter based on the obstacle parameter and the driver eye coordinate parameter includes:
establishing a three-dimensional environment coordinate system based on the eye coordinate parameters of the driver;
Converting the obstacle parameters into the three-dimensional environment coordinate system, and determining an obstacle coordinate range;
and carrying out light projection on the obstacle coordinate range, and determining a straight line visual field parameter.
Optionally, the determining a curve view parameter based on the driver head pose parameter includes:
based on the head posture parameters of the driver, the center coordinates of eyeballs are determined;
Constructing a sight direction vector based on the eyeball center coordinates;
and interpolating the sight line direction vector to determine that the boundary curved surface range is the curve visual field parameter.
Optionally, the determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter includes:
establishing a specular reflection geometric model based on the reflector parameters and the driver eye coordinate parameters;
And determining that the range parameter of the dazzling viewing angle cone corresponding to the eye coordinate parameter of the driver is a reflection view field parameter based on the specular reflection geometric model.
Optionally, the determining a visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter includes:
determining a gaze origin based on the driver head pose parameter and the driver eye coordinate parameter;
determining a visual area from the sight starting point to a preset advancing distance;
And equally dividing the visual area to determine a visual equidistant area.
Optionally, the determining the refraction correction parameter based on the driver refraction parameter includes:
Converting the driver refractive parameters into an adjustment amplitude based on a preset refractive compensation model;
Determining the reciprocal of the adjustment amplitude as a focusing distance;
Determining a visual boundary in combination with the focusing distance and the driver refractive parameter;
Based on the visual boundary, refraction correction parameters are determined.
Optionally, the determining the target comfort parameter based on the field of view parameter and the driving environment parameter includes:
And carrying out weighted calculation on the straight line visual field parameter, the curve visual field parameter, the reflection visual field parameter, the visual equidistant region, the refraction correction parameter and the running environment parameter to determine a target comfort degree parameter.
Optionally, the controlling the lighting of the vehicle lamp based on the target comfort parameter includes:
acquiring a current comfort degree parameter;
determining a light control parameter corresponding to the target comfort level parameter under the condition that the current comfort level parameter is smaller than the target comfort level parameter;
and controlling the light of the car lamp to emit light based on the light control parameter.
According to the embodiment of the application, a driving environment parameter and a light environment parameter are obtained during the driving of the vehicle, and a field-of-view range parameter is determined based on the light environment parameter. When the field-of-view range parameter and the driving environment parameter satisfy a preset accident scene, the light emission of the vehicle lamp is controlled based on the field-of-view range; when they do not satisfy the preset accident scene, a target comfort level parameter is determined based on the field-of-view range parameter and the driving environment parameter, and the light emission of the vehicle lamp is controlled based on the target comfort level parameter. During driving, the driving environment parameter and the light environment parameter are compared as reference factors against the preset accident scene. When they satisfy the preset accident scene, a traffic accident is currently likely to occur; the light emission of the vehicle lamp is therefore controlled according to the field-of-view range, and light adjustment is used to shift the driver's point of gaze to the region requiring attention while driving through an accident-prone road section, thereby reducing the probability of an accident to a certain extent and improving driving safety.
When the driving environment parameter and the light environment parameter do not satisfy the preset accident scene, in order to avoid light interfering with the driver, the target comfort level parameter is determined based on the driving environment parameter and the light environment parameter, and the light emission of the vehicle lamp is controlled based on the target comfort level parameter. The emitted light keeps the driver within a visually comfortable range, improving the driver's light comfort and relieving driving fatigue.
In this specification, each embodiment is described in a progressive manner, each embodiment focusing on its differences from the other embodiments; for identical or similar parts between the embodiments, reference may be made to one another.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the application may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the application.
Finally, it is further noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or terminal device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or terminal device comprising that element.
The vehicle lamp control method, the vehicle lamp control device, the vehicle and the computer-readable storage medium provided by the application have been described above in detail. Specific examples are used herein to explain the principle and implementation of the application, and the description of the above embodiments is only intended to help understand the method of the application and its core idea. Meanwhile, for those skilled in the art, the specific implementation and the application scope may vary according to the idea of the application. In summary, the content of this specification should not be construed as limiting the application.

Claims (12)

1. A vehicle lamp control method, characterized by comprising:
acquiring a driving environment parameter and a light environment parameter during driving of the vehicle;
determining a field of view parameter based on the light environment parameter;
in a case where the field of view parameter and the driving environment parameter satisfy a preset accident scene, controlling the vehicle lamp to emit light based on the field of view; and
in a case where the field of view parameter and the driving environment parameter do not satisfy the preset accident scene, determining a target comfort parameter based on the field of view parameter and the driving environment parameter, and controlling the vehicle lamp to emit light based on the target comfort parameter.
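The two-branch dispatch of claim 1 might be sketched as follows. Every function name, threshold, and parameter shape here is an illustrative assumption; the patent does not disclose the accident-scene predicate or the comfort model.

```python
def control_vehicle_light(driving_env, light_env):
    """Dispatch per the two branches of claim 1 (illustrative)."""
    view = determine_field_of_view(light_env)
    if is_accident_scene(view, driving_env):
        return emit_for_view(view)          # safety-first branch
    comfort = compute_target_comfort(view, driving_env)
    return emit_for_comfort(comfort)        # comfort branch

# Hypothetical stand-ins for the undisclosed sub-steps:
def determine_field_of_view(light_env):
    # Collapse the light environment to a single visibility score.
    return light_env.get("visibility", 1.0)

def is_accident_scene(view, driving_env):
    # Assumed rule: poor visibility at speed counts as an accident scene.
    return view < 0.3 and driving_env.get("speed_kmh", 0) > 60

def emit_for_view(view):
    return {"mode": "max_beam", "view": view}

def compute_target_comfort(view, driving_env):
    # Placeholder blend of visibility into a comfort target.
    return min(1.0, view * 0.7 + 0.3)

def emit_for_comfort(comfort):
    return {"mode": "comfort", "level": comfort}
```

The point of the sketch is only the control flow: the accident-scene check gates whether the lamp is driven for maximum view or for comfort.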
2. The method of claim 1, wherein the light environment parameter includes an obstacle parameter, a reflector parameter, a driver head pose parameter, a driver refractive parameter, and a driver eye coordinate parameter, and wherein determining the field of view parameter based on the light environment parameter comprises:
determining a straight-line field of view parameter based on the obstacle parameter and the driver eye coordinate parameter;
determining a curve field of view parameter based on the driver head pose parameter;
determining a reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter;
determining a visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter;
determining a refraction correction parameter based on the driver refractive parameter; and
determining the field of view parameter by combining the straight-line field of view parameter, the curve field of view parameter, the reflected field of view parameter, the visual equidistant region, and the refraction correction parameter.
3. The method of claim 2, wherein determining the straight-line field of view parameter based on the obstacle parameter and the driver eye coordinate parameter comprises:
establishing a three-dimensional environment coordinate system based on the driver eye coordinate parameter;
converting the obstacle parameter into the three-dimensional environment coordinate system to determine an obstacle coordinate range; and
performing light projection on the obstacle coordinate range to determine the straight-line field of view parameter.
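Claim 3's projection step could be approximated by casting rays from the driver's eye position and testing them against the obstacle coordinate range. The flattened 2-D bounding box, the ray count, and the sampling step below are all assumptions for illustration, not disclosed by the patent.

```python
import math

def straight_line_view(eye, obstacle_box, max_range=100.0, n_rays=181):
    """Cast horizontal rays from the eye and mark which heading angles
    are blocked by the obstacle's bounding box (a crude 2-D stand-in
    for the claimed light projection). Returns one bool per ray:
    True if that heading is clear."""
    (xmin, xmax), (ymin, ymax) = obstacle_box
    clear = []
    for i in range(n_rays):
        ang = math.radians(-90 + i)          # sweep -90..+90 deg ahead
        blocked = False
        # March along the ray in 0.5 m steps and test box membership.
        for t in [d * 0.5 for d in range(1, int(max_range * 2))]:
            x = eye[0] + t * math.sin(ang)   # lateral
            y = eye[1] + t * math.cos(ang)   # forward
            if xmin <= x <= xmax and ymin <= y <= ymax:
                blocked = True
                break
        clear.append(not blocked)
    return clear
```

A real implementation would work in the full 3-D coordinate system of the claim; the 2-D slice is just the smallest model that shows obstructed versus clear headings.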
4. The method of claim 2, wherein determining the curve field of view parameter based on the driver head pose parameter comprises:
determining eyeball center coordinates based on the driver head pose parameter;
constructing a gaze direction vector based on the eyeball center coordinates; and
interpolating the gaze direction vector to determine a boundary surface range as the curve field of view parameter.
5. The method of claim 2, wherein determining the reflected field of view parameter based on the reflector parameter and the driver eye coordinate parameter comprises:
establishing a specular reflection geometric model based on the reflector parameter and the driver eye coordinate parameter; and
determining, based on the specular reflection geometric model, a glare viewing-cone range parameter corresponding to the driver eye coordinate parameter as the reflected field of view parameter.
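One way to read claim 5's glare cone is as a specular lobe around the ideal mirror reflection: reflect the incoming ray about the reflector's normal and test whether the driver's eye falls within a cone around the reflected ray. The horizontal reflector, the (0, 0, 1) normal, and the 10-degree half-angle are illustrative assumptions.

```python
import math

def glare_cone_hit(hit_point, incoming_dir, eye, half_angle_deg=10.0):
    """Reflect an incoming ray about a horizontal reflector's normal
    (0, 0, 1) and test whether the driver's eye lies inside the
    specular glare cone around the reflected ray (illustrative)."""
    # Specular reflection about the z-normal: flip the z component.
    rx, ry, rz = incoming_dir[0], incoming_dir[1], -incoming_dir[2]
    # Vector from the reflection point to the eye.
    ex = eye[0] - hit_point[0]
    ey = eye[1] - hit_point[1]
    ez = eye[2] - hit_point[2]
    dot = rx * ex + ry * ey + rz * ez
    nr = math.sqrt(rx * rx + ry * ry + rz * rz)
    ne = math.sqrt(ex * ex + ey * ey + ez * ez)
    # Angle between the reflected ray and the eye direction.
    ang = math.degrees(math.acos(max(-1.0, min(1.0, dot / (nr * ne)))))
    return ang <= half_angle_deg
```

An arbitrarily oriented reflector only changes the reflection formula (reflect about its unit normal); the cone test is the same.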
6. The method of claim 2, wherein determining the visual equidistant region based on the driver head pose parameter and the driver eye coordinate parameter comprises:
determining a gaze origin based on the driver head pose parameter and the driver eye coordinate parameter;
determining a visual area from the gaze origin to a preset advance distance; and
equally dividing the visual area to determine the visual equidistant region.
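The equal division in claim 6 reduces to splitting the span from the gaze origin to the preset advance distance into equal-depth bands. The band count and the 1-D (depth-only) treatment below are assumptions; the patent does not fix either.

```python
def visual_equidistant_regions(gaze_origin_y, advance_dist=60.0, n=4):
    """Split the visual area from the gaze origin out to a preset
    advance distance into n equal-depth bands, returned as
    (near, far) pairs along the forward axis (illustrative)."""
    step = advance_dist / n
    return [(gaze_origin_y + k * step, gaze_origin_y + (k + 1) * step)
            for k in range(n)]
```

Each band could then be lit or weighted separately by the later comfort calculation.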
7. The method of claim 2, wherein determining the refraction correction parameter based on the driver refractive parameter comprises:
converting the driver refractive parameter into an adjustment amplitude based on a preset refraction compensation model;
determining the reciprocal of the adjustment amplitude as a focusing distance;
determining a visual boundary by combining the focusing distance and the driver refractive parameter; and
determining the refraction correction parameter based on the visual boundary.
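The reciprocal step in claim 7 matches standard optics: a refractive error expressed in diopters has a focal distance of 1/diopters metres (e.g. a -2.0 D myope's far point is 0.5 m). The compensation-model gain and the boundary rule below are assumptions; only the reciprocal relation is standard.

```python
def refraction_correction(diopters, model_gain=1.0):
    """Convert a driver's refractive error (diopters) into an
    adjustment amplitude, take its reciprocal as the focusing
    distance in metres, and derive a visual boundary (illustrative)."""
    amplitude = model_gain * abs(diopters)   # preset compensation model
    if amplitude == 0:
        # Emmetropic driver: no refractive limit on the boundary.
        return {"focus_m": float("inf"), "boundary_m": float("inf")}
    focus = 1.0 / amplitude                  # far point distance
    # Hypothetical boundary rule: myopia caps the boundary at the
    # far point; hyperopia is treated more leniently.
    boundary = focus * (1.0 if diopters < 0 else 2.0)
    return {"focus_m": focus, "boundary_m": boundary}
```

The resulting boundary would then shrink or extend the effective field of view before the comfort weighting.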
8. The method of claim 2, wherein determining a target comfort parameter based on the field of view parameter and the driving environment parameter comprises:
performing a weighted calculation on the straight-line field of view parameter, the curve field of view parameter, the reflected field of view parameter, the visual equidistant region, the refraction correction parameter, and the driving environment parameter to determine the target comfort parameter.
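Claim 8's weighted calculation, with each of the six terms reduced to a scalar, is a plain weighted sum. The equal default weights and the scalar reduction are assumptions; the patent does not disclose the weights.

```python
def weighted_target_comfort(features, weights=None):
    """Weighted combination of the five view-related terms plus the
    driving-environment term (claim 8). Weights are illustrative."""
    keys = ("straight", "curve", "reflect", "equidistant",
            "refraction", "environment")
    # Default: equal weighting across all six terms.
    weights = weights or {k: 1.0 / len(keys) for k in keys}
    return sum(weights[k] * features[k] for k in keys)
```

In practice the weights would presumably be calibrated per vehicle and scene rather than fixed.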
9. The method of claim 1, wherein controlling the vehicle lamp to emit light based on the target comfort parameter comprises:
acquiring a current comfort parameter;
in a case where the current comfort parameter is smaller than the target comfort parameter, determining a light control parameter corresponding to the target comfort parameter; and
controlling the vehicle lamp to emit light based on the light control parameter.
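Claim 9's comparison gates the adjustment: the lamp is only re-driven when the current comfort falls short of the target. The mapping from comfort deficit to a brightness parameter below is a hypothetical control law, not from the patent.

```python
def adjust_light(current_comfort, target_comfort_value):
    """Only determine a light control parameter when the current
    comfort is below the target (claim 9); otherwise leave the
    lamp unchanged. The deficit-to-brightness mapping is assumed."""
    if current_comfort >= target_comfort_value:
        return None                        # no adjustment needed
    deficit = target_comfort_value - current_comfort
    brightness = min(1.0, 0.5 + deficit)   # hypothetical control law
    return {"brightness": brightness}
```

Returning `None` for the no-op case keeps the caller's control loop from re-commanding the lamp every cycle.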
10. A vehicle lamp control device, characterized by comprising:
an acquisition module configured to acquire a driving environment parameter and a light environment parameter during driving of the vehicle;
a field of view determining module configured to determine a field of view parameter based on the light environment parameter;
a first control module configured to control the vehicle lamp to emit light based on the field of view in a case where the field of view parameter and the driving environment parameter satisfy a preset accident scene; and
a second control module configured to, in a case where the field of view parameter and the driving environment parameter do not satisfy the preset accident scene, determine a target comfort parameter based on the field of view parameter and the driving environment parameter, and control the vehicle lamp to emit light based on the target comfort parameter.
11. A vehicle, comprising a processor, a memory, and a computer program stored on the memory and runnable on the processor, wherein the computer program, when executed by the processor, implements the steps of the vehicle lamp control method according to any one of claims 1 to 9.
12. A computer-readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the steps of the vehicle lamp control method as claimed in any one of claims 1 to 9.
CN202510945982.XA 2025-07-09 2025-07-09 Vehicle light control method, device, vehicle and storage medium Pending CN120583559A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202510945982.XA CN120583559A (en) 2025-07-09 2025-07-09 Vehicle light control method, device, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN120583559A true CN120583559A (en) 2025-09-02

Family

ID=96860099

Country Status (1)

Country Link
CN (1) CN120583559A (en)

Similar Documents

Publication Publication Date Title
US20240326726A1 (en) Vehicular control system
JP2783079B2 (en) Light distribution control device for headlamp
US11731558B2 (en) Rearview device simulation
US10889232B2 (en) Vehicle control method that compares a light distribution pattern extracted from an image captured by a camera with a reference light distribution pattern
US11040659B2 (en) Rear-view mirror simulation
JP3619628B2 (en) Driving environment recognition device
US10479269B2 (en) Lighting apparatus for vehicle and vehicle having the same
JP6460058B2 (en) Vehicle control device
US10427588B1 (en) Automatic beam-shaping using an on-car camera system
JP6929481B1 (en) Light distribution control device, light distribution control method and light distribution control program
CN112406687A (en) 'man-vehicle-road' cooperative programmable matrix headlamp system and method
JP3857698B2 (en) Driving environment recognition device
JPH06191344A (en) Headlamp light distribution control device
CN117429342A (en) Light control method, control device, storage medium and vehicle
CN120583559A (en) Vehicle light control method, device, vehicle and storage medium
JP2021146897A (en) Headlight control device
CN117922420A (en) Control method, device, equipment and storage medium for automobile headlamp
CN119893792A (en) High-pixel imaging method and system for car light distribution system
JP7344635B2 (en) heads up display device
CN116691492A (en) A pre-opening system and control method for automobile headlights
CN115593312A (en) Electronic rearview mirror mode switching method based on environment monitoring analysis
JP7501416B2 (en) Vehicle lighting control device
CN120439928A (en) Lighting control method, device, system, electronic device, storage medium, and vehicle
CN120921911A (en) Display control device, method, head-up display device and computer storage medium
CN121240289A (en) Electric vehicle intelligent light adjusting system based on multi-mode data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination