CN114041069A - Image control device, image control method, and program - Google Patents
Image control device, image control method, and program
- Publication number: CN114041069A
- Application number: CN202080043548.8A
- Authority
- CN
- China
- Prior art keywords
- vehicle
- estimated
- orientation
- deviation
- azimuth
- Prior art date
- Legal status: Pending (assumed status; not a legal conclusion)
Classifications
- G01C21/18—Stabilised platforms, e.g. by gyroscope
- G01C21/165—Dead reckoning by integrating acceleration or speed, combined with non-inertial navigation instruments
- G01C21/28—Navigation in a road network with correlation of data from several navigational instruments
- G01C21/365—Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
- G01S19/49—Determining position by combining or switching between satellite-derived and inertial position solutions, e.g. loosely-coupled
- G01S19/53—Determining attitude
- G01S19/396—Determining accuracy or reliability of position or pseudorange measurements
- H04W4/027—Services making use of location information using movement velocity, acceleration information
- H04W4/029—Location-based management or tracking services
- H04W4/40—Services specially adapted for vehicles, e.g. vehicle-to-pedestrians [V2P]
Abstract
The video control device (10) includes: a 1st acquisition unit (11) that acquires the position of a vehicle measured by a satellite positioning system; a 2nd acquisition unit (12) that acquires an estimated position of the vehicle estimated by dead reckoning; an orientation deviation estimation unit (13) that estimates a deviation of the orientation of the vehicle from the position measured by the satellite positioning system and the estimated position obtained by dead reckoning; an orientation calculation unit (14) that calculates an estimated orientation of the vehicle on the basis of the estimated deviation of the orientation; and an output unit (15) that outputs the estimated orientation of the vehicle to a video display device (60) that displays information based on the orientation of the vehicle.
Description
Technical Field
The present disclosure relates to a video control device, a video control method, and a program.
Background
(Prior art documents)
(Patent documents)
Patent document 1: Japanese Laid-Open Patent Publication No. 7-257228
Patent document 2: Japanese Patent Laid-Open Publication No. 2018-045103
Route information to a destination displayed on a video display device such as a head-up display (HUD) may include information based on the orientation of the vehicle (the direction the vehicle itself is facing) for guiding the vehicle to the destination. Such information is, for example, an arrow extending from the vehicle in the traveling direction (see Fig. 9 described later). Generating this information requires estimating the orientation of the vehicle; when the estimated orientation deviates from the actual orientation, however, a vehicle occupant who sees the displayed information may find it unnatural. For example, an arrow extending from the vehicle in the traveling direction may appear to extend off the road because the estimated orientation of the vehicle is deviated.
Disclosure of Invention
Accordingly, the present disclosure provides a video control device and the like capable of improving the accuracy of estimating the orientation of a vehicle.
A video control device according to an aspect of the present disclosure includes: a 1st obtaining unit that obtains a position of a vehicle measured by a satellite positioning system; a 2nd obtaining unit that obtains an estimated position of the vehicle estimated by dead reckoning; an orientation deviation estimating unit that estimates a deviation of the orientation of the vehicle based on the position measured by the satellite positioning system and the estimated position obtained by dead reckoning; an orientation calculation unit that calculates an estimated orientation of the vehicle based on the estimated deviation of the orientation; and an output unit that outputs the estimated orientation of the vehicle to a video display device that displays information based on the orientation of the vehicle.
A video control method according to an aspect of the present disclosure obtains a position of a vehicle measured by a satellite positioning system, obtains an estimated position of the vehicle estimated by dead reckoning, estimates a deviation of the orientation of the vehicle based on the measured position and the estimated position, calculates an estimated orientation of the vehicle based on the estimated deviation, and outputs the estimated orientation to a video display device that displays information based on the orientation of the vehicle.
A program according to an aspect of the present disclosure is a program for causing a computer to execute the above-described video control method.
The video control device and the like of the present disclosure can improve the accuracy of estimating the orientation of the vehicle.
Drawings
Fig. 1 is a block diagram showing an example of the configuration of a video control device according to the embodiment and its peripheral devices.
Fig. 2 is a diagram for explaining a method of calculating the orientation of the vehicle using the gyro sensor.
Fig. 3 is a diagram for explaining the update of the zero point of the gyro sensor.
Fig. 4 is a flowchart showing an example of the operation of the video control device according to the embodiment.
Fig. 5 is a diagram illustrating an example of a method of estimating the deviation of the orientation of the vehicle according to the embodiment.
Fig. 6 is a diagram illustrating another example of the method of estimating the deviation of the orientation of the vehicle according to the embodiment.
Fig. 7 is a diagram for explaining a method of eliminating a drawing delay according to the embodiment.
Fig. 8 is a diagram showing the transition of the deviation of the orientation when the video control device according to the embodiment is applied.
Fig. 9 is a diagram showing an example of the display of the video display device.
Detailed Description
(Embodiment)
The following describes the video control apparatus and the like according to the embodiments with reference to the drawings.
Fig. 1 is a block diagram showing an example of the configuration of the video control device 10 according to the embodiment and its peripheral devices. In addition to the video control device 10, Fig. 1 shows an information processing device 20, an ECU 30, a gyro sensor 40, an acceleration sensor 50, and a video display device 60. These devices are mounted on a vehicle (for example, an automobile).
The information processing device 20 is a device capable of obtaining information such as the position and orientation of the vehicle measured by a satellite positioning system such as GPS (Global Positioning System), and a route to the destination of the vehicle (for example, the shortest route to the destination or a route avoiding traffic congestion).
The ECU 30 is, for example, an ECU that processes the vehicle speed signal and the like, and is a device capable of outputting the current vehicle speed to a CAN (Controller Area Network) bus.
The gyro sensor 40 is a sensor that detects yaw rate information of the vehicle and outputs a detection result. By integrating the detection results, the direction of the vehicle can be calculated.
The acceleration sensor 50 is a sensor that detects the acceleration of the vehicle and outputs a detection result. From the detection result, it can be determined whether the vehicle is stopped.
The video display device 60 is a device that displays information based on the orientation of the vehicle, and is, for example, a HUD, an electronic mirror, or a car navigation system. In the present embodiment, the video display device 60 is a HUD. The information based on the orientation of the vehicle is, for example, an arrow extending from the vehicle in the traveling direction (see Fig. 9 described later). The video display device 60 displays this information based on the route information to the destination obtained from the information processing device 20 and the estimated orientation of the vehicle obtained from the video control device 10. The information based on the orientation of the vehicle may also be the orientation itself; in other words, the video display device 60 can display the orientation of the vehicle.
The video control device 10 calculates an estimated orientation of the vehicle and outputs it to the video display device 60. Since the display content of the video display device 60 (information based on the orientation of the vehicle) changes according to the estimated orientation calculated by the video control device 10, the video control device 10 can also be said to control the display content of the video display device 60. The video control device 10 includes a 1st obtaining unit 11, a 2nd obtaining unit 12, an orientation deviation estimating unit 13, an orientation calculation unit 14, and an output unit 15. The video control device 10 is a computer including a processor, a memory, a communication circuit, and the like. The memory is a ROM (Read Only Memory), a RAM (Random Access Memory), or the like, and can store a program executed by the processor. The memory also stores information obtained by the video control device 10. For example, the processor operates according to the program to realize the functions of the 1st obtaining unit 11, the 2nd obtaining unit 12, the orientation deviation estimating unit 13, the orientation calculation unit 14, and the output unit 15.
The 1st obtaining unit 11 obtains the position of the vehicle measured by the satellite positioning system. Specifically, it obtains this position from the information processing device 20. The 1st obtaining unit 11 obtains a 1st position, which is a position of the vehicle measured by the satellite positioning system, and a 2nd position, which is a position measured by the satellite positioning system after the vehicle has moved from the 1st position. By sequentially obtaining the positions measured by the satellite positioning system, the 1st obtaining unit 11 can recognize the travel trajectory of the vehicle from the 1st position to the 2nd position.
The 2nd obtaining unit 12 obtains an estimated position of the vehicle estimated by dead reckoning. For example, the 2nd obtaining unit 12 obtains the estimated position of the vehicle at the time the 2nd position is measured (specifically, at the timing of the 2nd position measurement) by dead reckoning, using the 1st position, the orientation of the vehicle at the 1st position, the detection results of the gyro sensor 40 provided in the vehicle, and the speed information of the vehicle. Dead reckoning is a technique that measures the position of a moving body relatively, using the orientation of the moving body at a certain location, the detection results of a gyro sensor from that location, the speed information of the moving body, and the like, instead of measuring the position directly as a satellite positioning system does. Based on dead reckoning, the 2nd obtaining unit 12 can recognize the travel trajectory of the vehicle from the 1st position to the estimated position.
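As a rough illustration of dead reckoning as described here, the sketch below propagates a position from a known starting point using yaw-rate and speed samples. This is a minimal sketch under assumed conventions (2-D plane, heading in degrees measured counterclockwise from the +X axis, rectangular integration), not the patent's implementation; all names are illustrative.

```python
import math

def dead_reckon(pos1, heading1_deg, samples):
    """Propagate a position by dead reckoning.

    pos1         -- (x, y) position in meters at the 1st position
    heading1_deg -- orientation of the vehicle at the 1st position
    samples      -- list of (yaw_rate_deg_s, speed_m_s, dt_s) tuples
                    from the gyro sensor and the vehicle-speed signal
    Returns the estimated (x, y) position and the final heading.
    """
    x, y = pos1
    heading = heading1_deg
    for yaw_rate, speed, dt in samples:
        heading += yaw_rate * dt  # integrate yaw rate into the heading
        # advance along the current heading at the current speed
        x += speed * dt * math.cos(math.radians(heading))
        y += speed * dt * math.sin(math.radians(heading))
    return (x, y), heading

# Driving straight along +X for 10 s at 10 m/s moves the estimate 100 m.
pos, hdg = dead_reckon((0.0, 0.0), 0.0, [(0.0, 10.0, 1.0)] * 10)
```

Note that an error in `heading1_deg` rotates the entire estimated trajectory around the starting point, which is exactly the deviation the device later estimates and corrects.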
When the 1st position obtained by the 1st obtaining unit 11 is the same as the 1st position used by the 2nd obtaining unit 12, the 2nd obtaining unit 12 may obtain the 1st position either from the information processing device 20 or from the 1st obtaining unit 11. The 2nd obtaining unit 12 obtains the orientation of the vehicle at the 1st position from the information processing device 20, for example, when the vehicle starts off (when the vehicle starts traveling). In other words, when the vehicle starts traveling, the orientation of the vehicle at the 1st position obtained by the 2nd obtaining unit 12 is the orientation measured by the satellite positioning system. After the vehicle has started traveling, the 2nd obtaining unit 12 obtains the orientation of the vehicle at the 1st position from the orientation calculation unit 14. The 2nd obtaining unit 12 obtains the detection results of the gyro sensor 40 from the gyro sensor 40, and obtains the speed information of the vehicle from the ECU 30 via the CAN bus or the like. The 2nd obtaining unit 12 also obtains, for example, a notification of the timing of the 2nd position measurement from the 1st obtaining unit 11.
Here, a method of calculating the orientation of the vehicle based on the detection results of the gyro sensor 40 is described with reference to Fig. 2.
Fig. 2 is a diagram for explaining a method of calculating the orientation of the vehicle using the gyro sensor 40.
For example, assume the orientation of the vehicle (the initial orientation) at time T0 is known. Adding the integral of the detection results of the gyro sensor 40 (in other words, the yaw rate) between time T0 and time T1 to the initial orientation yields the orientation of the vehicle at time T1. Adding the integral of the detection results between time T1 and time T2 to the orientation at time T1 yields the orientation at time T2. Thus, when the orientation of the vehicle at some moment is known, the orientation can be calculated accurately from that orientation and the integral of the detection results of the gyro sensor 40. However, when the initial orientation includes a deviation, every orientation calculated from it deviates from the actual orientation by the same amount.
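The integration described above can be sketched as follows. This is an illustrative sketch assuming a fixed sample interval and simple rectangular integration; real systems typically use timestamped samples and a better integrator.

```python
def heading_from_gyro(initial_heading_deg, yaw_rates_deg_s, dt_s):
    """Add the integral of the yaw rate to a known initial heading.

    A constant error in the initial heading propagates unchanged into
    every later heading, which is the deviation to be corrected.
    """
    heading = initial_heading_deg
    for rate in yaw_rates_deg_s:
        heading += rate * dt_s  # rectangular integration of yaw rate
    return heading

# Ten samples of 90 deg/s at 0.1 s each turn the vehicle 90 degrees.
h = heading_from_gyro(0.0, [90.0] * 10, 0.1)
```

If `initial_heading_deg` is off by, say, 5 degrees, `h` is also off by exactly 5 degrees, no matter how long the integration runs.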
Even while the vehicle is stopped, the zero point of the gyro sensor 40 (the output of the gyro sensor 40 when the vehicle is stationary) may not be zero, due to gyro drift. In that case, the integrated value of the detection results of the gyro sensor 40 also deviates. The zero point of the gyro sensor 40 is therefore updated. This is described with reference to Fig. 3.
Fig. 3 is a diagram for explaining the update of the zero point of the gyro sensor 40.
As shown in Fig. 3, when the vehicle continues to travel, the zero point of the gyro sensor 40 shifts away from zero due to gyro drift from a certain point in time. Since the integrated value of the detection results of the gyro sensor 40 changes with the fluctuation of the zero point (in other words, with the gyro drift), the orientation of the vehicle calculated using the gyro sensor 40 deviates. The 2nd obtaining unit 12 therefore updates the zero point. Specifically, when the vehicle is stopped (for example, when the vehicle speed has been 0 for the past 3 seconds and the amount of change in the acceleration of the vehicle is at or below a predetermined threshold), the 2nd obtaining unit 12 updates the zero point by taking the average of the detection results of the gyro sensor 40 over the past 3 seconds as the new zero. This keeps the orientation of the vehicle from deviating further. The 2nd obtaining unit 12 obtains the acceleration of the vehicle from the acceleration sensor 50. When the vehicle is stopped and the average of the detection results over the past 3 seconds is already zero (in other words, the zero point has not changed), the 2nd obtaining unit 12 does not update the zero point. In other words, the 2nd obtaining unit 12 updates the zero point only when a fluctuation of the zero point is confirmed.
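The zero-point update described here can be sketched as below. The 3-second window and the "update only when the average differs from the current zero" rule follow the text; the sample rate and class structure are assumptions for illustration.

```python
from collections import deque

class GyroZeroPoint:
    """Track the gyro zero point while the vehicle is stopped.

    sample_hz is an assumed sensor rate; the 3 s window follows the
    example in the text.
    """
    def __init__(self, sample_hz=10, window_s=3.0):
        # keep only the most recent window of raw yaw-rate samples
        self.samples = deque(maxlen=int(sample_hz * window_s))
        self.zero = 0.0

    def add(self, raw_yaw_rate):
        self.samples.append(raw_yaw_rate)

    def maybe_update(self, stopped):
        # Update only when the vehicle is stopped and the window average
        # actually differs from the current zero point (drift confirmed).
        if stopped and self.samples:
            avg = sum(self.samples) / len(self.samples)
            if avg != self.zero:
                self.zero = avg

    def corrected(self, raw_yaw_rate):
        # yaw rate with the estimated drift removed
        return raw_yaw_rate - self.zero
```

For example, if a stationary vehicle keeps reporting a raw yaw rate of 0.5 deg/s, an update sets the zero point to 0.5 and the corrected yaw rate becomes 0.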
Returning to Fig. 1, the orientation deviation estimating unit 13 estimates a deviation of the orientation of the vehicle from the position of the vehicle measured by the satellite positioning system and the estimated position of the vehicle obtained by dead reckoning. The operation of the orientation deviation estimating unit 13 is described in detail later.
The orientation calculation unit 14 calculates the estimated orientation of the vehicle from the deviation of the orientation estimated by the orientation deviation estimating unit 13. The operation of the orientation calculation unit 14 is described in detail later.
The output unit 15 outputs the estimated orientation of the vehicle to the video display device 60.
Next, the operation of the video control device 10 is described.
Fig. 4 is a flowchart showing an example of the operation of the video control device 10 according to the embodiment.
The 1st obtaining unit 11 obtains the position of the vehicle measured by the satellite positioning system (step S11). Specifically, the 1st obtaining unit 11 obtains the 1st position of the vehicle measured by the satellite positioning system and the 2nd position measured by the satellite positioning system after the vehicle has moved from the 1st position.
The 2nd obtaining unit 12 obtains the estimated position of the vehicle estimated by dead reckoning (step S12). The estimated position is the position of the vehicle at the time of the 2nd position measurement, estimated by dead reckoning using the 1st position, the orientation of the vehicle at the 1st position, the detection results of the gyro sensor 40 provided in the vehicle, and the speed information of the vehicle.
The orientation deviation estimating unit 13 estimates a deviation of the orientation of the vehicle from the position measured by the satellite positioning system and the estimated position obtained by dead reckoning (step S13). Specifically, the orientation deviation estimating unit 13 estimates the deviation from three positions: the 1st position of the vehicle measured by the satellite positioning system; the 2nd position measured by the satellite positioning system after the vehicle has moved from the 1st position; and the estimated position of the vehicle at the time of the 2nd position measurement, estimated by dead reckoning using the 1st position, the orientation of the vehicle at the 1st position, the detection results of the gyro sensor provided in the vehicle, and the speed information of the vehicle. The method of estimating the deviation of the orientation of the vehicle is described in detail with reference to Fig. 5.
Fig. 5 is a diagram illustrating an example of a method of estimating the deviation of the orientation of the vehicle according to the embodiment. Fig. 5 schematically shows travel trajectories of the vehicle on the horizontal plane; the vehicle is assumed to move from the lower left (origin side) toward the upper right (positive X-axis and Y-axis sides). The solid line in Fig. 5 is the actual travel trajectory; the video control device 10 and the like cannot know it, but it is shown for comparison with the satellite-positioning trajectory and the dead-reckoning trajectory. The dashed line in Fig. 5 is the travel trajectory measured by the satellite positioning system. The chain line in Fig. 5 is the travel trajectory estimated by dead reckoning.
Position A1 in Fig. 5 is the 1st position measured by the satellite positioning system, and position A2 is the 2nd position measured by the satellite positioning system after the vehicle has moved from position A1. By sequentially obtaining the positions measured by the satellite positioning system, the video control device 10 can recognize the satellite-positioning travel trajectory. However, the position measured by the satellite positioning system includes an error and may deviate from the actual traveling position by about 10 meters.
Position B1 in Fig. 5 is the estimated position obtained by dead reckoning at the time position A2 is measured. The video control device 10 obtains the orientation of the vehicle (the initial orientation) at position A1 and then sequentially obtains the detection results of the gyro sensor 40 and the speed information of the vehicle from position A1 onward, so it can recognize the dead-reckoning travel trajectory. When dead reckoning is performed with high accuracy, changes in the orientation and speed of the vehicle from a given point are accurately reflected in the trajectory; as Fig. 5 shows, the dead-reckoning trajectory is therefore similar in shape to the actual trajectory. However, when the initial orientation used in dead reckoning deviates from the actual orientation of the vehicle, the dead-reckoned trajectory deviates from the actual trajectory by that amount. Position A1 is the position at which the process of estimating the deviation of the orientation starts; it is, for example, the position at which the vehicle starts traveling (the start position of the vehicle) or a position at which the zero point of the gyro sensor 40 was updated. When position A1 is the start position of travel, the initial orientation is, for example, the orientation of the vehicle measured by the satellite positioning system. Since the accuracy of an orientation measured by the satellite positioning system is low, the initial orientation is likely to deviate from the actual orientation of the vehicle.
When position A1 is the position at which the zero point of the gyro sensor 40 was updated, the initial heading is, for example, the heading of the vehicle calculated from the detection result of the gyro sensor 40. Since the zero point of the gyro sensor 40 is updated precisely because it has shifted, the heading calculated from the detection result of the gyro sensor 40 at the time of the update (in other words, the initial heading) is highly likely to deviate from the actual heading of the vehicle.
At the start position of travel or at the position where the zero point of the gyro sensor 40 was updated, the zero point of the gyro sensor 40 is assumed to be freshly updated and not changing. Further, because the vehicle is stopped at these positions, the zero point of the gyro sensor 40 does not fluctuate due to gyro drift for a while after the vehicle starts moving from them. Accordingly, while the vehicle moves from position A1, measured by the satellite positioning system, to position A2, the heading deviation of the vehicle is assumed not to change apart from the deviation already present at position A1.
During a movement over which dead reckoning is performed with high accuracy, the heading deviation of the vehicle can be assumed not to vary, and under this premise the angle formed by the straight line connecting position A1 and position A2 and the straight line connecting position A1 and position B1 can be regarded as the heading deviation of the vehicle. In this way, the heading deviation estimating unit 13 can estimate a deviation in the heading of the vehicle measured by the satellite positioning system, a deviation caused by gyro drift, and the like.
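The angle described above can be computed directly from the three positions. The sketch below (function name hypothetical; positions as (east, north) coordinates) returns the signed angle from the dead-reckoned line A1→B1 to the measured line A1→A2, wrapped to [-180, 180) degrees:

```python
import math

def heading_deviation_deg(a1, a2, b1):
    """Signed angle (deg) from line A1->B1 (dead reckoned) to line A1->A2 (GNSS).

    Points are (east, north) tuples. Positive result means the dead-reckoned
    locus is rotated counterclockwise of the measured one (sign convention assumed).
    """
    ang_gnss = math.atan2(a2[0] - a1[0], a2[1] - a1[1])  # bearing of A1->A2 from north
    ang_dr = math.atan2(b1[0] - a1[0], b1[1] - a1[1])    # bearing of A1->B1 from north
    d = math.degrees(ang_gnss - ang_dr)
    return (d + 180.0) % 360.0 - 180.0                   # wrap to [-180, 180)
```

Applying this value to the gyro-derived heading then cancels the initial-heading error, as in step S14.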
Returning to the explanation of fig. 4, the heading calculation unit 14 calculates the estimated heading of the vehicle from the heading deviation estimated by the heading deviation estimating unit 13 (step S14). For example, the heading calculation unit 14 calculates the estimated heading by correcting the heading calculated from the detection results of the gyro sensor 40 during the movement from position A2 onward, for example by adding or subtracting the estimated heading deviation. Thus, the accuracy of estimating the heading of the vehicle can be improved.
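The correction itself is a simple addition with angle wraparound; a minimal sketch (hypothetical function name, headings in degrees):

```python
def corrected_heading_deg(gyro_heading_deg, deviation_deg):
    # Apply the estimated heading deviation to the gyro-derived heading and
    # wrap the result into [0, 360). Whether the deviation is added or
    # subtracted depends on its sign convention; addition is assumed here.
    return (gyro_heading_deg + deviation_deg) % 360.0
```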
Then, the output unit 15 outputs the estimated heading of the vehicle to the video display device 60 (step S15). Accordingly, the video display device 60 can display information based on a vehicle heading with little deviation.
The heading deviation estimating unit 13 may continue to estimate the heading deviation of the vehicle sequentially, starting from position A1, even after the estimated heading has been calculated. This is because the longer the distance the vehicle has traveled, the higher the accuracy of estimating the heading deviation, as will be described later with reference to fig. 6. However, the longer the vehicle moves without stopping, the more its heading deviates due to gyro drift (in other words, the zero point of the gyro sensor 40 fluctuates); therefore, when the zero point of the gyro sensor 40 has fluctuated and been updated, the process of estimating the heading deviation is restarted from a new 1st position instead of position A1.
As shown in fig. 5, the position of the vehicle measured by the satellite positioning system includes an error, and the greater the error, the more the straight line connecting position A1 and position A2 is skewed, so the accuracy of estimating the heading deviation of the vehicle decreases. On the other hand, the longer the distance between position A1 and position A2 (the distance the vehicle moves in order to estimate the heading deviation), the smaller the influence of this error on the estimate. However, the heading deviation is calculated on the premise that it does not vary during a movement over which dead reckoning is performed with high accuracy, and the longer the vehicle actually moves, the more the accuracy of dead reckoning decreases and the more it is affected by gyro drift. Therefore, in order to maintain the above premise, the distance the vehicle travels should be kept as short as possible.
The heading deviation estimating unit 13 may therefore perform the process of estimating the heading deviation of the vehicle in short sections of, for example, about 10 m, and the heading calculation unit 14 may calculate the estimated heading of the vehicle from the estimation results in the individual sections. Specifically, the heading deviation estimating unit 13 may estimate the heading deviation of the vehicle in each of a plurality of consecutive sections based on the 1st position, the 2nd position, and the estimated position in each section. The heading calculation unit 14 may then calculate the estimated heading of the vehicle from the heading deviations in 2 or more of the consecutive sections. This will be described with reference to fig. 6.
Fig. 6 is a diagram illustrating another example of the method for estimating the heading deviation of the vehicle according to the embodiment. Fig. 6 schematically shows the travel locus of the vehicle on the horizontal plane as in fig. 5, with the vehicle moving from the lower left (origin side) to the upper right (positive X-axis and Y-axis sides). In fig. 6, however, a 1st position, a 2nd position, and an estimated position exist in each of a plurality of consecutive sections (here, 3 sections, the 1st to 3rd sections, are shown as an example). In other words, the process of estimating the heading deviation using the 1st position (position A1), the 2nd position (position A2), and the estimated position (position B1) in fig. 5 is performed in each of the 1st to 3rd sections. In the 1st section, position A1 is the 1st position measured by the satellite positioning system, position A2 is the 2nd position measured after the vehicle has moved from position A1, and position B1 is the position estimated by dead reckoning at the time position A2 was measured. In the 2nd section, position A2 is the 1st position, position A3 is the 2nd position measured after the vehicle has moved from position A2, and position B2 is the position estimated by dead reckoning at the time position A3 was measured. In the 3rd section, position A3 is the 1st position, position A4 is the 2nd position measured after the vehicle has moved from position A3, and position B3 is the position estimated by dead reckoning at the time position A4 was measured.
Here, an example in which the heading deviation is estimated in 3 sections is shown, but the deviation may also be estimated in each section obtained by dividing a distance of several hundred meters into short sections of about 10 m. In a short section of about 10 m, dead reckoning is easily performed with high accuracy, and the premise that the heading deviation does not vary during the movement used for the estimation holds. However, since each section is short, the influence of the error in the position measured by the satellite positioning system is large, and the heading deviation estimated in each individual section tends to scatter widely. The length of each section is not limited to 10 m, and may be about 20 m or 30 m.
In contrast, by using the heading deviations of 2 or more consecutive sections, the heading calculation unit 14 reduces the influence of the error in the position measured by the satellite positioning system. For example, by calculating the average value, the median, or the like of the heading deviations of 2 or more sections, the accuracy of estimating the heading of the vehicle can be further improved even when the deviation estimated in an individual section is large.
For example, the heading calculation unit 14 may calculate the estimated heading of the vehicle from the average value of the heading deviations in each of a plurality of consecutive sections, excluding any section in which the heading calculated from the detection result of the gyro sensor 40 changes by a 1st threshold or more relative to the adjacent preceding section.
For example, in fig. 6, of the adjacent 1st and 2nd sections, the 1st section is the preceding section adjacent to the 2nd section; of the adjacent 2nd and 3rd sections, the 2nd section is the preceding section adjacent to the 3rd section. Suppose that the heading of the vehicle calculated from the detection result of the gyro sensor 40 is north in the 1st section (preceding section) and east in the 2nd section (following section). In this case, the heading changes by about 90 degrees over a distance of about 10 m, so a sharp turn or a right or left turn has most likely occurred around the 2nd section, and the accuracy of the positioning measurement by the satellite positioning system or the accuracy of dead reckoning is likely to be insufficient there. Accordingly, the average value of the heading deviations is calculated over the plurality of consecutive sections excluding any section in which the heading calculated from the detection result of the gyro sensor 40 changes by the 1st threshold or more relative to the adjacent preceding section. Since the average excludes the deviations estimated in sections where the accuracy of the positioning measurement or of dead reckoning is likely to be insufficient, the accuracy of estimating the heading of the vehicle can be further improved.
Although the 1st threshold is, for example, 90 degrees as described above, it is not particularly limited and may be set as appropriate.
For example, the heading calculation unit 14 may calculate the estimated heading of the vehicle from the average value of the heading deviations in each of a plurality of consecutive sections, excluding any section in which the estimated heading deviation is equal to or greater than a 2nd threshold.
For example, in a certain section of the plurality of consecutive sections, the estimated heading deviation may be large (for example, 20 degrees). In such a section, the operation of the satellite positioning system has likely become unstable, and the 2nd position measured by the satellite positioning system has likely deviated greatly from the actual position. Accordingly, the average value of the heading deviations is calculated over the plurality of consecutive sections excluding any section in which the deviation is equal to or greater than the 2nd threshold. Since the average excludes the deviations estimated in sections where the satellite positioning system was likely unstable, the accuracy of estimating the heading of the vehicle can be further improved. Although the 2nd threshold is, for example, 20 degrees as described above, it is not particularly limited and may be set as appropriate.
The heading calculation unit 14 may also calculate the estimated heading of the vehicle from the average value of the heading deviations in each of a plurality of consecutive sections, excluding both kinds of section: those in which the heading calculated from the detection result of the gyro sensor 40 changes by the 1st threshold or more relative to the adjacent preceding section, and those in which the estimated heading deviation is equal to or greater than the 2nd threshold.
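Combining both exclusion rules, the filtered averaging can be sketched as follows. The data layout and function name are assumptions for illustration; the threshold values of 90° and 20° follow the examples in the text.

```python
from statistics import mean

H_THRESH = 90.0   # 1st threshold: heading change vs. the preceding section (deg)
D_THRESH = 20.0   # 2nd threshold: estimated per-section heading deviation (deg)

def angle_diff(a, b):
    """Smallest signed difference a - b in degrees, wrapped to [-180, 180)."""
    return (a - b + 180.0) % 360.0 - 180.0

def filtered_mean_deviation(headings, deviations):
    """headings[i]: gyro-derived heading in section i (deg);
    deviations[i]: heading deviation estimated in section i (deg).
    Excludes sharp-turn sections and outlying deviations, then averages."""
    kept = []
    for i, (h, d) in enumerate(zip(headings, deviations)):
        sharp_turn = i > 0 and abs(angle_diff(h, headings[i - 1])) >= H_THRESH
        if not sharp_turn and abs(d) < D_THRESH:
            kept.append(d)
    return mean(kept) if kept else None  # None: no usable section
```

The 3rd value in a run like `headings=[0, 5, 95, 100]`, `deviations=[3, 4, 25, 5]` is dropped on both grounds (a ~90° turn and a 25° deviation), and the remaining deviations are averaged.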
For example, the heading calculation unit 14 may calculate the estimated heading of the vehicle from a median of deviations of the heading of the vehicle in each of a plurality of consecutive sections.
In a section where the accuracy of the positioning measurement by the satellite positioning system or the accuracy of dead reckoning is insufficient (for example, the sections excluded when calculating the average value above), the heading deviation may be an outlier relative to the deviations in the other sections. When the average includes such outliers, it may be pulled far from the proper value. By calculating the median instead, the accuracy of estimating the heading of the vehicle can be further improved.
For example, the heading calculation unit 14 may calculate the estimated heading of the vehicle from a weighted average of the deviations of the heading of the vehicle in each of the plurality of consecutive sections.
For example, in a section where the accuracy of the positioning measurement by the satellite positioning system or the accuracy of dead reckoning is insufficient (for example, the sections excluded when calculating the average value above), the weight assigned to the heading deviation may be set relatively small (for example, 0.5) when calculating the weighted average, so that the accuracy of estimating the heading of the vehicle can be further improved.
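The median and weighted-average alternatives can be sketched as follows; the deviation values and weights are made up for illustration, with the outlying 24.0° section down-weighted as the text suggests:

```python
from statistics import median

# Per-section heading deviations in degrees; the 3rd section is a suspected outlier.
deviations = [2.8, 3.1, 24.0, 2.9]
weights = [1.0, 1.0, 0.5, 1.0]  # relatively small weight (0.5) for the suspect section

med = median(deviations)  # robust to the outlier
wavg = sum(w * d for w, d in zip(weights, deviations)) / sum(weights)
```

Here the median (3.0°) ignores the outlier entirely, while the weighted average merely reduces its influence relative to the plain mean.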
The output unit 15 outputs the estimated vehicle heading calculated in this way to the video display device 60. However, when the video display device 60 displays information based on the estimated heading, a drawing delay of, for example, about 33.3ms occurs. In other words, even if the video control device 10 calculates the estimated heading with high accuracy, the information displayed on the video display device 60 is based on the estimated heading from about 33.3ms earlier. The output unit 15 may therefore output to the video display device 60 a future estimated heading predicted from the past fluctuation of the estimated heading. This will be described with reference to fig. 7.
Fig. 7 is a diagram for explaining a method of eliminating a drawing delay according to the embodiment.
As shown in the upper part of fig. 7, when the current estimated heading (chain line in fig. 7) is output to the video display device 60, information based on it is displayed on the video display device 60 about 33.3ms later. However, the heading of the vehicle changes from moment to moment and may differ at the time of drawing (in other words, about 33.3ms after the current time) from the current heading (two-dot chain line in fig. 7), so a heading deviation due to the drawing delay can occur.
Accordingly, as shown in the lower part of fig. 7, rather than outputting the estimated heading to the video display device 60 as it is, the output unit 15 predicts the future estimated heading (solid line in fig. 7) from the past fluctuation of the estimated headings calculated so far, and outputs the predicted heading to the video display device 60. For example, the output unit 15 predicts the estimated heading about 33.3ms ahead from the fluctuation over the period from about 16.6ms ago to the present. The difference between the predicted heading used for drawing and the actual heading at the time of drawing thereby becomes small, and so does the heading deviation caused by the drawing delay. That is, information based on the heading of the vehicle is displayed accurately on the video display device 60.
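A linear extrapolation matching this description can be sketched as follows; the 33.3 ms and 16.6 ms figures follow the text, and the function name is hypothetical:

```python
def predict_heading_deg(prev_deg, curr_deg, lookahead_ms=33.3, window_ms=16.6):
    """Extrapolate the heading 'lookahead_ms' into the future from the change
    observed over the last 'window_ms'. Result wrapped to [0, 360)."""
    # Signed heading change over the observation window, wrapped to [-180, 180)
    # so that crossing 0/360 degrees does not produce a spurious jump.
    delta = (curr_deg - prev_deg + 180.0) % 360.0 - 180.0
    return (curr_deg + delta * (lookahead_ms / window_ms)) % 360.0
```

With the default parameters the observed change is simply doubled (33.3/16.6 ≈ 2), so a vehicle turning steadily at drawing time is drawn where it will actually be, not where it was one frame ago.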
Next, the transition of the heading deviation when the video control device 10 of the present embodiment is applied will be described with reference to fig. 8.
Fig. 8 is a diagram showing the transition of the heading deviation when the video control device 10 according to the embodiment is applied. In the present embodiment, the heading deviation of the vehicle is estimated and the heading is corrected, so the overall error in the heading of the vehicle can be reduced.
As shown in fig. 8, for example, the detection results of the gyro sensor 40 are accumulated at the same frequency as the drawing rate of 60Hz to estimate the heading of the vehicle. As the initial heading at the start of travel, the heading of the vehicle measured by the satellite positioning system is used. Since the accuracy of this measured heading is low, the heading calculated from the detection results of the gyro sensor 40 includes the error of the initial heading until the heading deviation is estimated and corrected.
After the vehicle starts moving, the heading deviation is estimated from the position of the vehicle measured by the satellite positioning system and the estimated position based on dead reckoning, and the heading of the vehicle is corrected based on the estimated deviation. Accordingly, the error in the heading of the vehicle can be reduced to, for example, 0.8 degrees or less.
When the vehicle then continues to travel for some time, the heading may again deviate greatly due to gyro drift. The vehicle is therefore stopped once and the zero point of the gyro sensor 40 is corrected so that the heading deviation does not grow. Afterwards, the heading deviation is estimated again from the position measured by the satellite positioning system and the estimated position based on dead reckoning, and the heading is corrected based on the estimated deviation, so the error in the heading can again be reduced to 0.8 degrees or less. In this way, each time the zero point of the gyro sensor 40 has shifted and been corrected, the process of estimating the heading deviation is performed so that the error in the heading of the vehicle is reduced.
As described above, by improving the accuracy of estimating the heading of the vehicle, the information based on the heading can be accurately displayed on the video display device 60.
Fig. 9 is a diagram showing an example of display by the video display device 60. As described above, the image display device 60 is, for example, a HUD, and in fig. 9, the display area D on the front windshield of the vehicle is shown as the display area of the image display device 60.
The information based on the heading of the vehicle is, for example, an arrow mark C extending from the vehicle in the traveling direction. To display arrow mark C in the display area D with the vehicle as its starting point, the heading of the vehicle must be estimated. When the estimated heading deviates from the actual heading, arrow mark C may be displayed crossing the road (for example, pointing toward a sidewalk or a building). Since the video control device 10 according to the present embodiment can improve the accuracy of estimating the heading, arrow mark C can be displayed pointing in the traveling direction of the vehicle.
As described above, the video control device 10 according to the present embodiment includes: a 1st obtaining unit 11 that obtains the position of the vehicle measured by a satellite positioning system; a 2nd obtaining unit 12 that obtains the estimated position of the vehicle estimated based on dead reckoning; a heading deviation estimating unit 13 that estimates the heading deviation of the vehicle based on the position measured by the satellite positioning system and the estimated position based on dead reckoning; a heading calculation unit 14 that calculates the estimated heading of the vehicle based on the estimated heading deviation; and an output unit 15 that outputs the estimated heading to a video display device 60 that displays information based on the heading of the vehicle.
With the satellite positioning system, not only the position of the vehicle but also its heading can be measured, from the temporal change in position. However, positioning by the satellite positioning system is performed only at a frequency of about 1Hz (about once per second), and obtaining the measured heading involves a communication delay; since the heading of a moving vehicle changes from moment to moment, information displayed on the video display device would be based on a heading that is considerably out of date.
In this regard, the heading deviation of the vehicle is estimated based on the position measured by the satellite positioning system and the estimated position based on dead reckoning. The position measured by the satellite positioning system, although subject to some error, is close to the actual position of the vehicle. On the other hand, when the heading used in dead reckoning deviates, that deviation is reflected with a certain accuracy in the estimated position. That is, the deviation between the position measured by the satellite positioning system and the estimated position based on dead reckoning is correlated with the heading deviation of the vehicle, so the heading deviation can be estimated from that positional deviation. Thus, the accuracy of estimating the heading of the vehicle can be improved.
The heading deviation estimating unit 13 may estimate the heading deviation of the vehicle from a 1st position of the vehicle, a 2nd position of the vehicle, and an estimated position of the vehicle, the 1st position being a position measured by the satellite positioning system, the 2nd position being a position measured by the satellite positioning system after the vehicle has moved from the 1st position, and the estimated position being the position of the vehicle at the time of the 2nd position measurement, estimated by dead reckoning using the 1st position, the heading of the vehicle at the 1st position, the detection results of the gyro sensor 40 provided in the vehicle, and the speed information of the vehicle. Specifically, the heading deviation estimating unit 13 estimates the angle formed by the straight line connecting the 1st position and the 2nd position and the straight line connecting the 1st position and the estimated position as the heading deviation of the vehicle.
The 1st position and the 2nd position are close to the actual positions of the vehicle. The estimated position is estimated by dead reckoning using the detection results of the gyro sensor and the like, with the heading of the vehicle at the 1st position as a reference. As the vehicle moves from the 1st position to the 2nd position, the travel locus based on dead reckoning deviates, relative to the travel locus measured by the satellite positioning system, by the amount of the heading deviation at the 1st position. Since the deviation between the 2nd position measured by the satellite positioning system and the position estimated by dead reckoning thus corresponds to the heading deviation at the 1st position, the heading deviation can be estimated from the deviation between the 2nd position and the estimated position. Specifically, since the angle formed by the straight line connecting the 1st position and the 2nd position and the straight line connecting the 1st position and the estimated position can be regarded as the heading deviation at the 1st position, the heading deviation can be easily estimated by calculating this angle.
The heading deviation estimating unit 13 may estimate the heading deviation of the vehicle for each of a plurality of consecutive sections based on the 1st position, the 2nd position, and the estimated position in each section, and the heading calculation unit 14 may calculate the estimated heading of the vehicle based on the heading deviations of 2 or more of the consecutive sections.
For example, the accuracy of estimating the heading of the vehicle can be further improved by calculating the average value or the median of the heading deviations of 2 or more of the consecutive sections.
The heading calculation unit 14 may calculate the estimated heading of the vehicle from the average value of the heading deviations in each of a plurality of consecutive sections, excluding any section in which the heading calculated from the detection result of the gyro sensor 40 changes by the 1st threshold or more relative to the adjacent preceding section.
In a section in which the heading of the vehicle changes by the 1st threshold or more relative to the adjacent preceding section, a sharp turn or a right or left turn is likely, and the accuracy of the positioning measurement by the satellite positioning system or the accuracy of dead reckoning is likely to be insufficient. Therefore, by calculating the average value of the heading deviations over the plurality of consecutive sections excluding such sections, the accuracy of estimating the heading of the vehicle can be further improved.
The heading calculation unit 14 may calculate the estimated heading of the vehicle from the average value of the heading deviations in each of a plurality of consecutive sections, excluding any section in which the estimated heading deviation is equal to or greater than the 2nd threshold.
In a section where the estimated heading deviation is equal to or greater than the 2nd threshold, the operation of the satellite positioning system is unstable, and the 2nd position measured by the satellite positioning system has likely deviated greatly from the actual position. Therefore, by calculating the average value of the heading deviations over the plurality of consecutive sections excluding such sections, the accuracy of estimating the heading of the vehicle can be further improved.
The heading calculation unit 14 may calculate the estimated heading of the vehicle from a median of deviations of the heading of the vehicle in each of the plurality of consecutive sections.
As described above, by using the median of the heading deviations of the plurality of consecutive sections, the influence of an outlying deviation in a section where the accuracy of the positioning measurement by the satellite positioning system or the accuracy of dead reckoning is insufficient can be suppressed, so the accuracy of estimating the heading of the vehicle can be further improved.
The heading calculation unit 14 may calculate the estimated heading of the vehicle from a weighted average of the deviations of the heading of the vehicle in each of the plurality of consecutive sections.
As described above, by assigning a small weight to the heading deviation of a section where the accuracy of the positioning measurement by the satellite positioning system or the accuracy of dead reckoning is insufficient when calculating the weighted average, the influence of the deviation in that section can be suppressed, so the accuracy of estimating the heading of the vehicle can be further improved.
The output unit 15 may output to the video display device 60 a future estimated heading of the vehicle predicted from the past fluctuation of the estimated heading.
When information based on the heading of the vehicle is displayed on the video display device 60, a drawing delay of about 33.3 ms occurs. If the estimated heading were output to the video display device 60 as it is, the heading of the vehicle would change from moment to moment while the displayed information would be based on the estimated heading from about 33.3 ms earlier. By predicting the future estimated heading (about 33.3 ms ahead) from the past fluctuation of the estimated headings calculated so far (for example, the fluctuation from about 16.6 ms ago to the present) and outputting the predicted heading to the video display device 60, the influence of the drawing delay can be suppressed, and information based on the heading of the vehicle can be displayed accurately on the video display device 60.
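The delay compensation above could be sketched as a linear extrapolation. The 33.3 ms drawing delay comes from the description; the 16.6 ms update interval and the function name are assumptions for illustration.

```python
DRAW_DELAY_MS = 33.3   # drawing delay stated in the description
UPDATE_STEP_MS = 16.6  # assumed interval between successive heading estimates

def predicted_future_heading(prev_heading_deg, curr_heading_deg):
    # Linear extrapolation: assume the heading keeps changing at the rate
    # observed over the last update step, and project it past the delay,
    # so the displayed value matches the heading at draw time.
    rate = (curr_heading_deg - prev_heading_deg) / UPDATE_STEP_MS  # deg/ms
    return curr_heading_deg + rate * DRAW_DELAY_MS
```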
(other embodiments)
As described above, the embodiments have been described as an example of the technique according to the present disclosure. However, the technique according to the present disclosure is not limited to this, and can be applied to an embodiment in which changes, substitutions, additions, omissions, and the like are appropriately made. For example, the following modifications are also included in one embodiment of the present disclosure.
For example, although the video control device 10 is provided separately from the information processing device 20 and the video display device 60 in the above embodiment, the present invention is not limited to this. For example, the video control device 10 may be provided integrally with the information processing device 20, may be provided integrally with the video display device 60, or the video control device 10, the information processing device 20, and the video display device 60 may all be provided as a single integrated device.
For example, although the vehicle is described as an automobile in the above embodiment, the vehicle is not limited to an automobile, and may be a two-wheeled vehicle, a construction machine, an agricultural machine, or the like.
The present disclosure can be realized not only as the video control device 10 but also as a video control method including steps (processes) performed by each component constituting the video control device 10.
Specifically, as shown in fig. 4, in the video control method, the position of the vehicle measured by the satellite positioning system is obtained (step S11), the estimated position of the vehicle estimated by dead reckoning is obtained (step S12), the deviation of the heading of the vehicle is estimated from the position measured by the satellite positioning system and the estimated position estimated by dead reckoning (step S13), the estimated heading of the vehicle is calculated from the estimated deviation of the heading (step S14), and the estimated heading is output to the video display device 60, which is a device that displays information based on the heading of the vehicle (step S15).
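The five steps could be sketched as follows. `heading_of`, `video_control_step`, and the position tuples are hypothetical stand-ins; the disclosure does not specify how the deviation is computed, so comparing the headings of the two short tracks is an assumed simplification.

```python
import math

def heading_of(p0, p1):
    # Heading (degrees) of the displacement from p0 to p1 in an x/y plane.
    return math.degrees(math.atan2(p1[1] - p0[1], p1[0] - p0[0]))

def video_control_step(gnss_prev, gnss_curr, dr_prev, dr_curr, dr_heading_deg):
    # S11: position measured by the satellite positioning system (obtained)
    # S12: position estimated by dead reckoning (obtained)
    # S13: deviation of the heading, estimated from the two tracks
    deviation = heading_of(gnss_prev, gnss_curr) - heading_of(dr_prev, dr_curr)
    # S14: estimated heading, calculated from the estimated deviation
    estimated = dr_heading_deg + deviation
    # S15: output to the video display device (the return value stands in)
    return estimated
```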
For example, the steps in the video control method may be executed by a computer (computer system). Further, the present disclosure can be realized as a program for causing a computer to execute the steps included in the video control method. The present disclosure can be realized as a non-transitory computer-readable recording medium such as a CD-ROM on which the program is recorded.
For example, when the present disclosure is implemented by a program (software), each step is executed by running the program using hardware resources such as the CPU, memory, and input/output circuits of a computer. In other words, each step is executed by the CPU acquiring data from the memory, the input/output circuit, or the like, performing an operation, and outputting the operation result to the memory, the input/output circuit, or the like.
The respective components included in the video control apparatus 10 according to the above-described embodiment may be implemented as dedicated or general-purpose circuits.
Each of the components included in the video control device 10 of the above-described embodiment may be realized by an LSI (Large Scale Integration) circuit, which is a type of integrated circuit (IC).
Further, the integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
Furthermore, if an integrated-circuit technology that replaces LSI emerges through progress in semiconductor technology or another derivative technology, the components of the video control device 10 may of course be integrated using that technology.
In addition, the present disclosure also includes forms obtained by making various modifications to the embodiments that those skilled in the art would conceive of, and forms realized by arbitrarily combining the components and functions of the embodiments within a scope that does not depart from the spirit of the present disclosure.
The present disclosure can be applied to, for example, a device that displays information based on the orientation of a vehicle.
Description of the symbols
10 image control device
11 1st obtaining unit
12 2nd obtaining unit
13 heading deviation estimation unit
14 heading calculation unit
15 output part
20 information processing device
30 ECU
40 gyro sensor
50 acceleration sensor
60 image display device
Claims (11)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019116243 | 2019-06-24 | ||
JP2019-116243 | 2019-06-24 | ||
PCT/JP2020/015238 WO2020261696A1 (en) | 2019-06-24 | 2020-04-02 | Video control device, video control method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114041069A (en) | 2022-02-11
Family
ID=74061592
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202080043548.8A Pending CN114041069A (en) | 2019-06-24 | 2020-04-02 | Image control device, image control method, and program |
Country Status (5)
Country | Link |
---|---|
US (1) | US12158343B2 (en) |
JP (1) | JP7361348B2 (en) |
CN (1) | CN114041069A (en) |
DE (1) | DE112020003021T5 (en) |
WO (1) | WO2020261696A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7361348B2 (en) * | 2019-06-24 | 2023-10-16 | パナソニックIpマネジメント株式会社 | Video control device, video control method and program |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000055678A (en) * | 1998-08-04 | 2000-02-25 | Denso Corp | Current position detecting device for vehicle |
US6211798B1 (en) * | 1995-11-14 | 2001-04-03 | Mannesmann Ag | Process and guidance system for ensuring reliable guidance of a vehicle |
US6502033B1 (en) * | 2000-10-05 | 2002-12-31 | Navigation Technologies Corp. | Turn detection algorithm for vehicle positioning |
CN105300395A (en) * | 2014-07-11 | 2016-02-03 | 北京协进科技发展有限公司 | Navigation and positioning method and device |
CN108344426A (en) * | 2018-01-24 | 2018-07-31 | 浙江大学 | A kind of heading angle deviation method of estimation of the water surface/between submarine navigation device and positioning device |
CN108871336A (en) * | 2018-06-20 | 2018-11-23 | 湘潭大学 | A kind of vehicle location estimating system and method |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2616911B2 (en) * | 1986-10-24 | 1997-06-04 | 日本無線株式会社 | Hybrid position measuring device |
JPH07257228A (en) | 1994-03-18 | 1995-10-09 | Nissan Motor Co Ltd | Vehicle display |
JP3645945B2 (en) * | 1994-08-23 | 2005-05-11 | クラリオン株式会社 | Gyro drift correction method and correction circuit |
JPH09196692A (en) * | 1996-01-19 | 1997-07-31 | Yazaki Corp | Vehicle navigation system |
JP4600357B2 (en) * | 2006-06-21 | 2010-12-15 | トヨタ自動車株式会社 | Positioning device |
JP2008157705A (en) * | 2006-12-22 | 2008-07-10 | Clarion Co Ltd | Navigation system and gps position accuracy determination method |
JP4984969B2 (en) * | 2007-03-01 | 2012-07-25 | 株式会社デンソー | Vehicle navigation device |
JP2009058242A (en) * | 2007-08-30 | 2009-03-19 | Alpine Electronics Inc | Method and device for correcting vehicle position-azimuth |
US9008906B2 (en) * | 2011-11-16 | 2015-04-14 | Flextronics Ap, Llc | Occupant sharing of displayed content in vehicles |
US9052203B2 (en) * | 2012-03-17 | 2015-06-09 | Mcube, Inc. | Methods and apparatus for low-cost inertial dead-reckoning using context detection |
US8843314B2 (en) | 2013-01-04 | 2014-09-23 | General Motors Llc | High fidelity horizontal position error estimation for vehicular GPS/DR navigation |
US11751123B2 (en) * | 2013-05-08 | 2023-09-05 | Cellcontrol, Inc. | Context-aware mobile device management |
CN109416256B (en) * | 2016-07-05 | 2022-03-22 | 三菱电机株式会社 | Travel lane estimation system |
JP6569999B2 (en) * | 2016-09-14 | 2019-09-04 | パナソニックIpマネジメント株式会社 | Display device |
US10880716B2 (en) * | 2017-02-04 | 2020-12-29 | Federico Fraccaroli | Method, system, and apparatus for providing content, functionalities, and services in connection with the reception of an electromagnetic signal |
WO2018179644A1 (en) * | 2017-03-27 | 2018-10-04 | ソニー株式会社 | Information processing device, information processing method, and recording medium |
KR102751569B1 (en) * | 2019-02-28 | 2025-01-10 | 삼성전자 주식회사 | Electronic device and method of providing information of electronic device |
JP7036080B2 (en) * | 2019-04-02 | 2022-03-15 | 株式会社豊田中央研究所 | Inertial navigation system |
US11199410B2 (en) * | 2019-04-30 | 2021-12-14 | Stmicroelectronics, Inc. | Dead reckoning by determining misalignment angle between movement direction and sensor heading direction |
US11747142B2 (en) * | 2019-04-30 | 2023-09-05 | Stmicroelectronics, Inc. | Inertial navigation system capable of dead reckoning in vehicles |
JP7361348B2 (en) * | 2019-06-24 | 2023-10-16 | パナソニックIpマネジメント株式会社 | Video control device, video control method and program |
CN114466308B (en) * | 2020-10-22 | 2023-10-10 | 华为技术有限公司 | Positioning method and electronic equipment |
US11897486B1 (en) * | 2021-11-29 | 2024-02-13 | Zoox, Inc. | Sensor consensus monitor |
US20230300567A1 (en) * | 2022-03-16 | 2023-09-21 | Qualcomm Incorporated | User equipment (ue) based relative position |
JP2023142997A (en) * | 2022-03-25 | 2023-10-06 | パナソニックIpマネジメント株式会社 | Display device and display method |
US20240102807A1 (en) * | 2022-09-27 | 2024-03-28 | Caret Holdings, Inc. | Data features integration pipeline |
- 2020-04-02 JP JP2021527393A patent/JP7361348B2/en active Active
- 2020-04-02 CN CN202080043548.8A patent/CN114041069A/en active Pending
- 2020-04-02 WO PCT/JP2020/015238 patent/WO2020261696A1/en active Application Filing
- 2020-04-02 DE DE112020003021.0T patent/DE112020003021T5/en active Pending
- 2021-12-20 US US17/556,552 patent/US12158343B2/en active Active
Non-Patent Citations (3)
Title |
---|
HAN, YQ; CHEN, JB; (...); YIN, JY: "Unscented Kalman Filter for DR/GPS Integrated Navigation System", INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTER SCIENCE AND ENGINEERING, 21 July 2010 (2010-07-21) * |
CUI Lihang; ZHANG Yingmin: "Application of Position Correction Technology in Dead Reckoning", Fire Control & Command Control, no. 08, 15 August 2012 (2012-08-15) * |
HUANG Pan: "Research and Implementation of a Vehicle-Mounted Navigation System Based on Tightly Coupled GPS/DR Integration", China Master's Theses Full-text Database, Engineering Science and Technology II, 15 April 2014 (2014-04-15) * |
Also Published As
Publication number | Publication date |
---|---|
WO2020261696A1 (en) | 2020-12-30 |
US12158343B2 (en) | 2024-12-03 |
JP7361348B2 (en) | 2023-10-16 |
JPWO2020261696A1 (en) | 2020-12-30 |
US20220113138A1 (en) | 2022-04-14 |
DE112020003021T5 (en) | 2022-03-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107077794B (en) | In-vehicle control device, own vehicle position and attitude determination device, in-vehicle display device | |
KR102086270B1 (en) | Control method and traveling control device of the traveling control device | |
US9645250B2 (en) | Fail operational vehicle speed estimation through data fusion of 6-DOF IMU, GPS, and radar | |
CN105934786B (en) | Abnormal traveling portion level detecting apparatus and abnormal traveling location detection method | |
WO2018179616A1 (en) | Vehicle position deduction apparatus | |
EP2664894A2 (en) | Navigation apparatus | |
US20100017180A1 (en) | Method and device for object tracking in a driver assistance system of a motor vehicle | |
JP6342104B1 (en) | Vehicle position estimation device | |
WO2017150106A1 (en) | In-vehicle device and estimation method | |
KR102441073B1 (en) | Apparatus for compensating sensing value of gyroscope sensor, system having the same and method thereof | |
CN114041069A (en) | Image control device, image control method, and program | |
JP6419242B2 (en) | Moving distance measuring device, moving distance measuring method, and moving distance measuring program | |
JP7555662B2 (en) | DISPLAY CORRECTION SYSTEM, DISPLAY SYSTEM, DISPLAY CORRECTION METHOD, AND PROGRAM | |
JP6303418B2 (en) | Vehicle travel guidance device and vehicle travel guidance method | |
CN111982179B (en) | Abnormality detection device, abnormality detection method, and computer-readable medium | |
US11754403B2 (en) | Self-position correction method and self-position correction device | |
JP6413816B2 (en) | Random driving judgment device | |
JP7658337B2 (en) | Vehicle display control device, vehicle display device, vehicle, vehicle display control method, and vehicle display control program | |
JP7313325B2 (en) | Self-localization device | |
JP2006224904A (en) | Vehicle control device | |
JP2012225841A (en) | Correction device, correction method and program for correction device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
Effective date of registration: 20240327
Address after: Kanagawa Prefecture, Japan
Applicant after: Panasonic Automotive Electronic Systems Co.,Ltd.
Country or region after: Japan
Address before: Osaka, Japan
Applicant before: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT Co.,Ltd.
Country or region before: Japan