CN109141461A - Automobile digital map navigation control system and method - Google Patents
- Publication number
- CN109141461A (this publication), filed as application CN201710441580.1A
- Authority
- CN
- China
- Prior art keywords
- eyeball
- display
- driver
- processor unit
- central processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Human Computer Interaction (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Abstract
This application discloses a central processor unit for automobile digital map navigation control. The unit includes a first data interface and a second data interface. The first data interface receives data from a detection module that detects the driver's eye movement; the detection module includes an infrared emitter that directs infrared light at the driver's eyes and a camera that captures the driver's facial image together with the infrared signal reflected from the eyes. The second data interface issues instructions to a display to visually present a map layer. Based on the eye-movement tracking data obtained by the detection module while the driver observes the display, received via the first data interface, the central processor unit issues instructions from the second data interface that cause the map layer shown on the display to change accordingly. A system and a method using the central processor unit are also disclosed.
Description
Technical field
The application relates generally to an automobile digital map navigation control method and system based on eye-tracking technology.
Background
In the current automotive field, digital map navigation is ever more widely applied. Typically, a touch-screen display is installed on the dashboard or console of the automobile, and a navigation map is shown on the display. The driver then performs operations such as searching for a destination, planning a route, and zooming the map by sliding a finger on the display screen and/or by following voice prompts.
However, operating the display by finger requires the driver's hand to leave the steering wheel, which creates a potential danger while the automobile is traveling. Voice-prompted operation does not require the driver's hand to leave the steering wheel, but it places higher demands on sound insulation inside the cabin. For example, if excessive wind noise, tire noise, or engine noise enters the cabin while driving, the accuracy of voice operation suffers; in some cases speech-recognition precision is simply not high. Moreover, if the driver speaks a dialect, voice operation becomes even more difficult.
Summary of the invention
In view of the problems mentioned above, the application is directed to an improved automobile digital map navigation control method and system, so that the driver can operate digital map navigation more safely and with less interference from noise, and with better adaptability to different drivers.
According to one aspect of the application, a central processor unit for automobile digital map navigation control is provided. The central processor unit includes a first data interface and a second data interface. The first data interface can receive data from a detection module for detecting the driver's eye movement, wherein the detection module includes an infrared emitter for directing infrared light at the driver's eyes and a camera that can capture the driver's facial image and the infrared signal reflected from the eyes. The second data interface can issue instructions to a display to visually show a map layer. Based on the eye-movement tracking data obtained by the detection module while the driver observes the display, received via the first data interface, the central processor unit issues, from the second data interface to the display, instructions that cause the map layer shown on it to change accordingly.
According to a further aspect of the application, an automobile digital map navigation control system is provided, comprising:

a detection module arranged in the automobile body, which includes an infrared emitter for directing infrared light at the driver's eyes and a camera for capturing the driver's facial image and the infrared signal reflected from the eyes;

a display arranged in the automobile body, fixed in position relative to the detection module, on which a map layer can be visually shown; and

a central processor unit in data connection with the detection module and the display, wherein, when the driver sits in the driver's seat and observes the display, the central processor unit instructs the map layer shown on the display to change accordingly, based on the eye-movement tracking data of the driver obtained by the detection module.
Optionally, the detection module is mounted at a position in the automobile cabin from which the driver's face can easily be observed, and the display is mounted at a position in the cabin convenient for the driver to observe.
Optionally, the central processor unit includes a memory module in which correspondences between the driver's eyeball actions and map layer changes are pre-stored; according to the stored correspondences, the central processor unit instructs the map layer shown on the display to change accordingly.
Optionally, by means of prompt information shown on the display and/or voice prompts, the central processor unit can define a new correspondence between a driver's eyeball action and a map layer change and store the new correspondence in the memory module for subsequent recall.
Optionally, the map layer changes include moving, zooming, and pausing the map layer, and the driver's eyeball actions include opening the eyes wide, staring, and blinking.
Optionally, an eyeball attention area is defined on the display, the attention area lying within the display edge of the map layer; a virtual boundary is also defined on the display, lying between the display edge of the map layer and the attention area; while the gaze focus is located within the attention area, the map layer is not changed.
Alternatively, an eyeball attention area is defined on the display, the attention area lying within the display edge of the map layer; a virtual boundary is also defined on the display, lying between the display edge of the map layer and the attention area; when the gaze focus is detected to lie outside the attention area, the map layer correspondingly performs a display movement as the gaze focus moves toward the virtual boundary.
Optionally, the speed of the map layer's corresponding display movement is proportional to the distance moved by the gaze focus.
Optionally, the direction of the map layer's corresponding display movement is roughly the same as the direction of the gaze focus movement.
Optionally, the detection module is located behind the steering wheel, and the display is located on the automobile center console.
According to a further aspect of the application, an automobile digital map navigation control method is provided, comprising:

providing a detection module and a display, fixed in position relative to each other, in the automobile body, the detection module including an infrared emitter for directing infrared light at the driver's eyes and a camera for capturing the driver's facial image and the infrared signal reflected from the eyes, the display being capable of visually showing a map layer; and

when the driver sits in the driver's seat and observes the display, changing the map layer shown on the display accordingly, based at least in part on the eye-movement tracking data of the driver obtained by the detection module.
With the method and system of the application, the driver's hand need not leave the steering wheel, which improves the safety of the traveling automobile. In addition, the sound-insulation requirements during travel are reduced, so that the manufacturing cost of the automobile need not increase correspondingly.
Brief description of the drawings
The foregoing and other aspects of the application will be more fully understood from the following detailed description taken in conjunction with the accompanying drawings. It should be noted that, for clarity of explanation, the scale of the drawings may vary, but this does not affect the understanding of the application. In the drawings:
Fig. 1 schematically shows the installation in an automobile of a detection module for capturing the driver's eye movement, according to one embodiment of the application;
Fig. 2 schematically shows a system block diagram of the automobile digital map navigation control method according to one embodiment of the application; and
Fig. 3 schematically shows the display screen of a vehicle-mounted digital map navigation display using the automobile digital map navigation control method of the application.
Detailed description
In the drawings of the application, identical or functionally similar features are indicated by the same reference numerals.
The applicant proposes an automobile digital map navigation control system and method based on eye-tracking technology. The system and method are realized by arranging, in the cab of the automobile, a detection module for capturing the driver's eye movement. The detection module may include an infrared emitter (such as an LED infrared emitter) for directing infrared light at the eyes and a camera for capturing the facial image and/or the infrared light reflected from the eyes. By analyzing the reflected infrared signal captured by the camera, information reflecting the various movements of the eyes can be obtained. Those skilled in the art will appreciate that the system and method of the application can be realized with any commercially available detection module capable of capturing the driver's eye movement.
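The combination of an infrared emitter and a camera described above is the basis of pupil-center corneal-reflection (PCCR) gaze estimation, the principle commonly used by commercial IR eye trackers. As a minimal sketch only, under assumed names and an assumed per-driver linear screen calibration (neither is specified in the patent):

```python
# PCCR sketch: the gaze vector is the offset between the pupil centre and the
# glint (the IR emitter's corneal reflection); this offset is roughly invariant
# to small head translations, which is why it is used instead of the raw pupil
# position. The linear mapping to screen coordinates is an illustrative
# assumption standing in for whatever calibration a real tracker performs.

def gaze_vector(pupil_xy, glint_xy):
    """Offset from the corneal glint to the pupil centre, in image pixels."""
    return (pupil_xy[0] - glint_xy[0], pupil_xy[1] - glint_xy[1])

def to_screen(vec, calib):
    """Map a gaze vector to display coordinates with a per-driver linear
    calibration (one gain and one offset per axis)."""
    gx, gy = vec
    ax, bx, ay, by = calib
    return (ax * gx + bx, ay * gy + by)

# Example: pupil at (102, 54), glint at (100, 50); gains of 10 px of screen
# per px of gaze vector, with the screen centre at (640, 360).
calib = (10.0, 640.0, 10.0, 360.0)
point = to_screen(gaze_vector((102, 54), (100, 50)), calib)
```

The two-point correction procedure described later in the text would, in this sketch, amount to fitting the four calibration constants.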
Fig. 1 schematically shows the installation in an automobile of the detection module, for capturing the driver's eye movement, that realizes the automobile digital map navigation control system and method of the application. As shown in Fig. 1, the detection module 100 may be mounted, for example, behind the steering wheel, which makes it convenient for the camera of the detection module 100 to capture the driver's facial features and/or the infrared signal reflected from the eyes. Optionally, a fill light may be provided for the camera to facilitate imaging in a dim environment. The detection module 100 may be in data connection with the trip computer by wiring or wirelessly. In addition, the map navigation display on the console 200 of the automobile (hidden in Fig. 1) is also in data connection with the trip computer. In this way, after processing the data from the detection module 100, the trip computer can correspondingly trigger operations on the display interface of the navigation display.
The position of the detection module 100 is not limited to the form of Fig. 1. For example, the detection module 100 may also be arranged below the interior rear-view mirror, directly on the center console in front of the front passenger seat, or on an A-pillar of the automobile facing the driver, or at any other position convenient for capturing the driver's facial features. What matters is that, wherever the detection module 100 is arranged in the cabin, the position between the detection module 100 and the navigation display must be fixed. For the same reason, besides the console, the navigation display may also be set in the instrument panel or integrated with it, or even projected as a virtual image in front of the driver via the windshield using a head-up display device. In this way, when the driver's eyes observe the navigation display, the eye-movement information captured by the detection module 100 allows the focal position of the driver's gaze on the display to be inferred.
Fig. 2 schematically shows the block diagram of the automobile digital map navigation control method based on eye-tracking technology according to one embodiment of the application. Those skilled in the art will appreciate that the method of the application can be performed as computer instructions in the trip computer of the automobile or in another independent central processor unit, wherein the central processor unit may be equipped with a first data interface and a second data interface: the first data interface can receive data from the detection module 100, and the second data interface can issue instructions to the display to visually show the map layer.
First, in step S10, the automobile digital map navigation control method based on eye-tracking technology is started. For example, this step may start after the driver enters the cabin and starts the engine. Then, in step S20, the camera of the detection module 100 captures the driver's facial features, for example the driver's entire face image or a partial face image. In step S30, it is judged from the facial features captured in step S20 whether this driver is driving the automobile for the first time. If the judgment is "yes", then in step S40 the infrared emitter and camera of the detection module 100 track and record the driver's eye movement. Then, in step S50, prompts shown on the console display and voice prompts elicit corresponding eyeball actions from the driver, so that the trip computer can correct subsequent navigation control actions according to the driver's eye-movement habits. Merely as an example, two points may be shown on the console display, and a voice prompt may ask the driver to move the gaze focus from one point to the other, thereby correcting the eye-tracking result and improving the precision of subsequent navigation control.
Then, in step S60, it is judged whether the above correction has been completed. If the judgment is "yes", the method goes to step S70; if "no", it returns to step S40. If the judgment in step S30 mentioned above is "no", the method goes directly to step S70.
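The S10–S70 flow can be sketched as a simple loop: an unknown face triggers the two-point calibration, which repeats until it converges, after which the operating state is entered. All names, and the idea of modeling convergence as a callback, are illustrative assumptions, not the patent's implementation:

```python
# Flow sketch: S20 capture face -> S30 first-time check -> (S40 track,
# S50 two-point correction, S60 done?) loop -> S70 operate the map.

def navigation_control_flow(face_id, known_drivers, calibrate_once):
    """Return the sequence of states visited. `calibrate_once` stands in for
    one S40/S50 pass and reports whether the S60 check passed."""
    states = ["S20_capture_face"]
    if face_id not in known_drivers:          # S30: first drive in this car?
        done = False
        while not done:                       # loop until S60 says complete
            states.append("S40_track_eyeball")
            states.append("S50_two_point_correction")
            done = calibrate_once()
        known_drivers.add(face_id)            # remember this driver's face
    states.append("S70_operate_map")
    return states

attempts = iter([False, True])                # correction converges on pass 2
trace = navigation_control_flow("driver-A", set(), lambda: next(attempts))
```

A returning driver whose face is already stored skips straight from S20 to S70.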
In step S70, the driver can operate the navigation display on the console 200 through predefined or customized eyeball actions. As a schematic and non-limiting example, the predefined eyeball actions may be as listed in Table 1 below.
Table 1
Those skilled in the art will appreciate that the above merely shows a non-limiting example of the correspondence between map operations and eyeball actions. The map operations in Table 1 may also be realized by other eyeball actions, or the map operations and eyeball actions in Table 1 may be interchanged with one another; for example, the operation of enlarging the map may be realized by staring, and the operation of shrinking the map by opening the eyes wide.
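Since the body of Table 1 is not reproduced in this text, the pairings below are purely illustrative; the point is only that a simple lookup table makes the "interchangeable" remark concrete, since swapping two values re-binds the actions:

```python
# Hypothetical action -> operation table; none of these bindings are claimed
# to match the patent's Table 1.
ACTION_TO_OPERATION = {
    "stare": "zoom_in",       # e.g. staring at a point enlarges the map there
    "eyes_wide": "zoom_out",  # e.g. opening the eyes wide shrinks it
    "blink": "pause",         # blinking pauses movement or zooming
}

def swap(mapping, action_a, action_b):
    """Interchange the operations bound to two eyeball actions."""
    mapping[action_a], mapping[action_b] = mapping[action_b], mapping[action_a]

# The text's example: enlarging via staring, shrinking via wide-open eyes.
swap(ACTION_TO_OPERATION, "stare", "eyes_wide")
```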
Fig. 3 shows the display interface of the automobile digital map navigation display. The interface includes a map layer 210 at the bottom of the display. The map layer may, for example, use existing online and/or offline map data, such as Baidu Maps or Google Maps. Taking map movement as an example, the map operation is briefly described below. First, an eyeball attention area 220 is defined roughly at the center of the display. In addition, a virtual boundary 230 is defined on the display around the periphery of the attention area 220, closer to the edge of the map layer 210. The definition data of the attention area 220 and the virtual boundary 230 may, for example, be stored only in the trip computer and never shown on the display interface. After the trip computer judges, from the data captured by the detection module 100, that the driver's gaze focus has entered the attention area 220, any movement of the gaze focus within the attention area 220 does not cause the map layer 210 to move. After the trip computer judges, from the data captured by the detection module 100, that the gaze focus lies outside the attention area 220, the map shown in the map layer 210 moves in the corresponding direction as the gaze focus moves ever closer to the virtual boundary 230. Furthermore, the speed of the map layer's movement may be set proportional to the distance the gaze focus has moved from the attention area 220 toward the virtual boundary 230, and/or the direction of the map layer's movement may be set roughly the same as the direction of the gaze movement. For example, as shown in Fig. 3, when the gaze focus moves out of the attention area 220 to point P1 of the map (a horizontal distance L1), the map correspondingly moves left, so that point P1 moves toward the center O of the attention area 220. Likewise, when the gaze focus moves out of the attention area 220 to point P2 of the map (a horizontal distance L2), the map correspondingly moves left, so that point P2 moves toward the center O. Since L1 is less than L2, the map moves more slowly in the former case than in the latter. In another alternative embodiment, the movement speed of the map after each gaze movement may instead be set proportional to the distance from the center O.
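The panning rule of Fig. 3 can be sketched under stated assumptions: the attention area 220 is modeled as a circle of radius `R_ATTN` around the center O, and the pan speed is made proportional to the gaze point's distance from O (the alternative embodiment just described); the gain constant and circular geometry are illustrative, not taken from the patent:

```python
import math

O = (0.0, 0.0)      # centre of the eyeball attention area (point O in Fig. 3)
R_ATTN = 100.0      # assumed attention-area radius: no panning inside it
GAIN = 0.05         # assumed pan speed per unit of gaze displacement

def pan_velocity(gaze_xy):
    """Velocity to apply to the map layer for the current gaze point.
    Inside the attention area the map must not change (zero velocity);
    outside, the map pans so that the gazed point drifts toward O, and a
    farther gaze point (L2 > L1) yields a faster pan."""
    dx, dy = gaze_xy[0] - O[0], gaze_xy[1] - O[1]
    dist = math.hypot(dx, dy)
    if dist <= R_ATTN:
        return (0.0, 0.0)
    speed = GAIN * dist                              # proportional to distance
    return (-speed * dx / dist, -speed * dy / dist)  # directed toward O

v1 = pan_velocity((150.0, 0.0))   # gaze at P1, horizontal distance L1 = 150
v2 = pan_velocity((300.0, 0.0))   # gaze at P2, horizontal distance L2 = 300
```

With the gaze to the right of O, both velocities point left (negative x), matching the text's "the map correspondingly moves left", and `v2` is twice as fast as `v1`.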
As another example, according to Table 1, when it is detected that the driver has stared at a certain point on the map layer 210 for a period of time, such as 1 second or more, and the driver's eyes are then detected to open wide, the map layer is correspondingly enlarged centered on that point; the speed of enlargement may be proportional to how wide the driver's eyes are opened. Also according to Table 1, when it is detected that the driver has stared at a certain point on the map layer 210 for a period of time, such as 1 second or more, and the driver's gaze is then held for a certain time, such as 1 second, 2 seconds, or another suitable time, the map layer is correspondingly shrunk centered on that point; a shrinking speed that does not tire the human eye is preferred. For example, after the driver's eyes, while staring at the display and in particular at the attention area 220, blink once, the map layer 210 can pause its movement or zooming. Of course, it may also be specified, as needed, that the map layer 210 pauses movement or zooming only after the eyes blink twice or more.
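The zoom and pause rules above can be condensed into one decision function. The dwell threshold, the "eyes wide" ratio, and both rate constants are invented for illustration; only the ordering of the rules (blink overrides, dwell arms zooming, wide eyes zoom in, a held stare zooms out gently) follows the text:

```python
DWELL = 1.0   # assumed seconds of fixation required before zooming is armed

def zoom_command(fixation_s, eyes_wide_ratio, stare_s, blinked):
    """Return (operation, rate). Zoom-in rate scales with how far beyond the
    baseline the eyes are opened; zoom-out uses a fixed gentle rate so the
    shrinking display does not tire the eyes; a blink pauses everything."""
    if blinked:
        return ("pause", 0.0)
    if fixation_s < DWELL:
        return ("none", 0.0)          # not stared long enough at one point
    if eyes_wide_ratio > 1.0:         # eyes opened wider than baseline
        return ("zoom_in", 0.5 * (eyes_wide_ratio - 1.0))
    if stare_s >= 1.0:                # gaze held e.g. 1 s, 2 s, ...
        return ("zoom_out", 0.1)      # deliberately slow
    return ("none", 0.0)
```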
Optionally, in addition, a plurality of icon areas 240 may be virtually shown on the navigation display, each icon area 240 representing a map operation function or an automotive media or air-conditioning control function. For example, one icon area 240 may represent showing traffic conditions: when the driver's gaze focus is detected to enter that area, the trip computer shows real-time traffic on the map layer 210 according to data obtained over the wireless network. As another example, an icon area 240 may represent associating a driver-customized eyeball action with a map operation. When the driver's gaze focus is detected to enter that area, the trip computer can, by voice prompt, have the driver look at and select one of the map operation actions shown on the display; then, again by voice prompt, the driver performs a customized eyeball action. After the driver confirms, the map operation action is associated with the driver's customized eyeball action and recorded in the trip computer's database. Thereafter, whenever the driver performs the same eyeball action, the corresponding map operation action is invoked.
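The customized-binding flow amounts to a small per-driver registry: a gesture is recorded against a chosen operation once the voice-guided confirmation succeeds, and the same gesture later recalls the operation. Everything below (class, method names, the confirmation flag standing in for the voice dialogue) is an illustrative assumption:

```python
class GestureRegistry:
    """Stand-in for the trip computer's database of customized bindings."""

    def __init__(self):
        self._bindings = {}

    def register(self, driver, gesture, operation, confirmed):
        """Record gesture -> operation for this driver, but only after the
        driver has confirmed (the voice-prompt step in the text)."""
        if confirmed:
            self._bindings[(driver, gesture)] = operation
        return confirmed

    def invoke(self, driver, gesture):
        """Later, the same gesture from the same driver recalls the
        associated map operation; unknown gestures recall nothing."""
        return self._bindings.get((driver, gesture))

reg = GestureRegistry()
reg.register("driver-A", "double_blink", "show_traffic", confirmed=True)
op = reg.invoke("driver-A", "double_blink")
```

Keying bindings by `(driver, gesture)` keeps one driver's customizations from affecting another driver recognized by the face-capture step.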
In conclusion according to an embodiment of the present application, providing a kind of central processing for the control of automobile digital map navigation
Device unit, the central processor unit include the first data-interface and the second data-interface, the first data-interface energy
Enough data for receiving the detection module from the eyeball movement for detecting driver, wherein the detection module includes to be used for
To the RF transmitter and camera of the eyeball of driver transmitting infrared ray, the camera can capture driver's
Facial image and infrared signal from ocular reflex, second data-interface can be issued to a display and visually be shown
The instruction of pictorial map figure layer, by the inspection when based on driver's observation display received via first data-interface
The tracking data for the eyeball movement that module obtains is surveyed, the central processor unit is from second data-interface to the display
Device issues the instruction that the map layer shown on it is correspondingly changed.
In addition, according to an embodiment of the application, an automobile digital map navigation control system is provided, comprising:

a detection module arranged in the automobile body, which includes an infrared emitter for directing infrared light at the driver's eyes and a camera for capturing the driver's facial image and the infrared signal reflected from the eyes;

a display arranged in the automobile body, fixed in position relative to the detection module, on which a map layer can be visually shown; and

a central processor unit in data connection with the detection module and the display, wherein, when the driver sits in the driver's seat and observes the display, the central processor unit instructs the map layer shown on the display to change accordingly, based on the eye-movement tracking data of the driver obtained by the detection module.
In addition, according to an embodiment of the application, an automobile digital map navigation control method is provided, comprising:

providing a detection module and a display, fixed in position relative to each other, in the automobile body, the detection module including an infrared emitter for directing infrared light at the driver's eyes and a camera for capturing the driver's facial image and the infrared signal reflected from the eyes, the display being capable of visually showing a map layer; and

when the driver sits in the driver's seat and observes the display, instructing the map layer shown on the display to change accordingly, based on the eye-movement tracking data of the driver obtained by the detection module.
With the method and system of the application, the driver's hand need not leave the steering wheel, which improves the safety of the traveling automobile. In addition, the sound-insulation requirements during travel are reduced, so that the manufacturing cost of the automobile need not increase correspondingly. The method and system of the application may be used alone in the automobile, or as a supplement to the touch and/or voice-prompt navigation methods and systems used in existing automobiles, further improving the convenience of the driver's map operation.
Although specific implementations of the application are described in detail here, they are given for the purpose of explanation only and are not to be considered as limiting the scope of the application. Various replacements, changes, and transformations can be conceived without departing from the spirit and scope of the application.
Claims (12)
1. A central processor unit for automobile digital map navigation control, the central processor unit comprising a first data interface and a second data interface, the first data interface being capable of receiving data from a detection module for detecting a driver's eye movement, wherein the detection module comprises an infrared emitter for directing infrared light at the driver's eyes and a camera capable of capturing the driver's facial image and the infrared signal reflected from the eyes, and the second data interface being capable of issuing instructions to a display to visually show a map layer, wherein, based on eye-movement tracking data obtained by the detection module while the driver observes the display and received via the first data interface, the central processor unit issues, from the second data interface to the display, instructions that cause the map layer shown on it to change accordingly.
2. The central processor unit according to claim 1, wherein the central processor unit comprises a storage module in which correspondences between the driver's eyeball movements and map-layer changes are pre-stored, and, according to the stored correspondences, the central processor unit instructs the map layer shown on the display to change correspondingly.
3. The central processor unit according to claim 1 or 2, wherein the map-layer changes include movement, scaling, and pausing of the map layer, and the driver's eyeball movements include opening the eyes, staring, and blinking.
4. The central processor unit according to any one of claims 1 to 3, wherein the detection module is mounted in the vehicle cabin at a position from which the driver's face can easily be observed, and the display is mounted in the vehicle cabin at a position convenient for the driver to observe.
5. The central processor unit according to any one of claims 1 to 4, wherein, by means of prompt information shown on the display and/or voice prompts, the central processor unit can define a new correspondence between the driver's eyeball movements and map-layer changes and store the new correspondence in the storage module for subsequent use.
6. The central processor unit according to any one of claims 1 to 5, wherein an eyeball attention area is defined on the display, the eyeball attention area being located within the display edge of the map layer, and a virtual boundary is also defined on the display, the virtual boundary being located between the display edge of the map layer and the eyeball attention area; when the eyeball focus is detected to be located within the eyeball attention area, the map layer is not changed.
7. The central processor unit according to any one of claims 1 to 5, wherein an eyeball attention area is defined on the display, the eyeball attention area being located within the display edge of the map layer, and a virtual boundary is also defined on the display, the virtual boundary being located between the display edge of the map layer and the eyeball attention area; when the eyeball focus is detected to be located outside the eyeball attention area, as the eyeball focus is detected to move toward the virtual boundary, the map layer is correspondingly moved in the display.
8. The central processor unit according to claim 7, wherein the speed at which the map layer is correspondingly moved in the display is proportional to the distance moved by the eyeball focus.
9. The central processor unit according to claim 8, wherein the direction in which the map layer is correspondingly moved in the display is substantially the same as the direction in which the eyeball focus moves.
10. The central processor unit according to any one of claims 1 to 9, wherein the detection module is located behind the steering wheel and the display is located on the vehicle's center console.
11. An automobile map navigation control system, comprising:
a detection module arranged in the vehicle body, comprising an infrared emitter for emitting infrared rays toward the driver's eyeballs and a camera capable of capturing the facial image of the driver and the infrared signals reflected from the eyeballs;
a display arranged in the vehicle body, fixed in position relative to the detection module, on which a map layer can be visually shown; and
a central processor unit according to any one of claims 1 to 10.
12. An automobile map navigation control method, comprising:
providing a detection module and a display in the vehicle body with their positions fixed relative to each other, the detection module comprising an infrared emitter for emitting infrared rays toward the driver's eyeballs and a camera for capturing the facial image of the driver and the infrared signals reflected from the eyeballs, and the display being capable of visually showing a map layer; and
configuring the vehicle with a central processor unit according to any one of claims 1 to 10, and connecting the central processor unit with the detection module and the display for data communication.
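The display geometry of claims 6 to 9 (an inner attention area where gaze leaves the map untouched, a virtual boundary between that area and the display edge, and panning whose speed is proportional to the gaze-focus displacement and whose direction matches it) can be sketched as follows. This is a minimal illustration under assumed geometry; the function and constant names, a circular attention area, and the gain value are all hypothetical, since the claims do not fix any of them:

```python
import math

# Hypothetical sketch of claims 6-9. A circular "attention area" of radius
# ATTENTION_RADIUS is centered on the display; gaze inside it leaves the
# map layer unchanged (claim 6). Outside it, the map pans with a velocity
# proportional to the focus displacement (claim 8) and in the same
# direction as that displacement (claim 9). Both constants are assumptions.

ATTENTION_RADIUS = 100.0  # px; gaze inside this radius -> no change
PAN_GAIN = 0.5            # pan speed per px of focus displacement

def pan_vector(center, prev_focus, focus):
    """Return the (vx, vy) pan velocity for the map layer, or (0, 0)."""
    cx, cy = center
    fx, fy = focus
    # Inside the attention area: the map layer is not changed (claim 6).
    if math.hypot(fx - cx, fy - cy) <= ATTENTION_RADIUS:
        return (0.0, 0.0)
    dx, dy = fx - prev_focus[0], fy - prev_focus[1]
    # Speed proportional to the distance the focus moved (claim 8),
    # direction the same as the focus movement (claim 9).
    return (PAN_GAIN * dx, PAN_GAIN * dy)
```

A fuller implementation would also clamp panning once the focus crosses the virtual boundary near the display edge, which this sketch omits.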
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710441580.1A CN109141461A (en) | 2017-06-13 | 2017-06-13 | Automobile digital map navigation control system and method |
| DE102018208833.0A DE102018208833A1 (en) | 2017-06-13 | 2018-06-05 | System and method for controlling the map navigation of a vehicle |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710441580.1A CN109141461A (en) | 2017-06-13 | 2017-06-13 | Automobile digital map navigation control system and method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN109141461A true CN109141461A (en) | 2019-01-04 |
Family
ID=64332937
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201710441580.1A Pending CN109141461A (en) | 2017-06-13 | 2017-06-13 | Automobile digital map navigation control system and method |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN109141461A (en) |
| DE (1) | DE102018208833A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108613683A (en) * | 2018-06-26 | 2018-10-02 | 威马智慧出行科技(上海)有限公司 | On-vehicle navigation apparatus, method and automobile |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN119261758A (en) * | 2023-07-05 | 2025-01-07 | 佛吉亚歌乐电子(厦门)有限公司 | A method and device for displaying an image outside a vehicle |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101950200A (en) * | 2010-09-21 | 2011-01-19 | 浙江大学 | Camera based method and device for controlling game map and role shift by eyeballs |
| CN102902468A (en) * | 2012-10-23 | 2013-01-30 | 陈婉莹 | Map browsing method and device of mobile terminal |
| CN103063224A (en) * | 2011-10-18 | 2013-04-24 | 罗伯特·博世有限公司 | Method of operating navigation system |
| CN103500061A (en) * | 2013-09-26 | 2014-01-08 | 三星电子(中国)研发中心 | Method and equipment for controlling displayer |
| CN104598138A (en) * | 2014-12-24 | 2015-05-06 | 三星电子(中国)研发中心 | Method and device for controlling electronic map |
| CN105128862A (en) * | 2015-08-18 | 2015-12-09 | 上海擎感智能科技有限公司 | Vehicle terminal eyeball identification control method and vehicle terminal eyeball identification control system |
| CN105843383A (en) * | 2016-03-21 | 2016-08-10 | 努比亚技术有限公司 | Application starting device and application starting method |
| US20160370860A1 (en) * | 2011-02-09 | 2016-12-22 | Apple Inc. | Gaze detection in a 3d mapping environment |
| US20170068387A1 (en) * | 2013-03-15 | 2017-03-09 | Elwha, Llc | Systems and methods for parallax compensation |
Also Published As
| Publication number | Publication date |
|---|---|
| DE102018208833A1 (en) | 2018-12-13 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10207716B2 (en) | Integrated vehicle monitoring system | |
| US10908677B2 (en) | Vehicle system for providing driver feedback in response to an occupant's emotion | |
| US20210078408A1 (en) | System and method for correlating user attention direction and outside view | |
| JP6976089B2 (en) | Driving support device and driving support method | |
| CN106218506B (en) | Vehicle display device and vehicle including the vehicle display device | |
| US10471894B2 (en) | Method and apparatus for controlling vehicular user interface under driving circumstance | |
| US10067341B1 (en) | Enhanced heads-up display system | |
| US20180229654A1 (en) | Sensing application use while driving | |
| US10666901B1 (en) | System for soothing an occupant in a vehicle | |
| CN104660980B (en) | In-vehicle image processing device and semiconductor device | |
| CN110696613A (en) | Passenger head-up displays for vehicles | |
| JP2017007652A (en) | Method for recognizing context for language control, method for determining a language control signal for language control, and apparatus for implementing the method | |
| WO2007105792A1 (en) | Monitor and monitoring method, controller and control method, and program | |
| US20150125126A1 (en) | Detection system in a vehicle for recording the speaking activity of a vehicle occupant | |
| US20160307056A1 (en) | Arrangement for creating an image of a scene | |
| KR20080020956A (en) | How to operate a night-view system of a vehicle and the corresponding night-view system | |
| US11308721B2 (en) | System for detecting the face of a driver and method associated thereto | |
| US11115587B2 (en) | Recording reproduction apparatus, recording reproduction method, and program | |
| US20150124097A1 (en) | Optical reproduction and detection system in a vehicle | |
| CN110171357A (en) | Vehicle and its control method | |
| US11881054B2 (en) | Device and method for determining image data of the eyes, eye positions and/or a viewing direction of a vehicle user in a vehicle | |
| JP6805974B2 (en) | Driving support device and computer program | |
| KR101980966B1 (en) | Method and device for representing the surroundings of a motor vehicle | |
| CN109141461A (en) | Automobile digital map navigation control system and method | |
| JP2012011810A (en) | Face image pickup apparatus for vehicle |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |