CN107796415B - Navigation device and display control method thereof - Google Patents
- Publication number
- CN107796415B (application CN201610809418.6A)
- Authority
- CN
- China
- Prior art keywords
- language type
- voice
- unit
- facility
- map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3629—Guidance using speech or audio output, e.g. text-to-speech
Abstract
The invention provides a navigation device and a display control method thereof. The navigation device comprises: a storage unit; a display unit; a route setting unit; a voice navigation unit; a voice language type acquisition unit that acquires the language type of the voice used for voice navigation; a map language type acquisition unit that acquires, from the map information, the language type of the map display name under which a facility in the voice navigation is shown on the map; an actual display language type acquisition unit that acquires the language type of the name actually displayed at the facility in the voice navigation; a language type determination unit that determines whether or not the language type of the voice matches the language type of the map display name and the language type of the actual display name; and a control unit that changes the facility-related display on the display unit, or the facility used in the voice navigation, when the language type determination unit determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name. According to the present invention, the user can easily find the facility referred to in voice navigation.
Description
Technical Field
The present invention relates to a navigation device and a display control method thereof, and more particularly, to a navigation device capable of outputting voice and image information and a display control method thereof.
Background
A current car navigation device typically has: a map display function that displays a map of the vehicle's current position and its surroundings on a display screen; a route search function that searches for a route from a departure place to a destination specified by the user; a route guidance function that guides the vehicle along the searched route; and a facility search function that searches for and displays various facility-related information.
In addition, most navigation devices have a function of outputting various navigation information by voice. For example, during route guidance, when the vehicle approaches an intersection where a route change is required, a guidance voice such as "turn right ahead" is output together with the route guidance map displayed on the screen. Further, to make the guidance more intuitive, the names and locations of landmark facilities on the guidance route are included in the voice output during route navigation.
In general, the language used in the country or region to which the current position belongs, or a language designated by the user, is used both to display map information such as facility and road names on the display screen and as the voice guidance language when route guidance is output by voice.
As shown in fig. 11, when facility names are displayed on the map screen in Chinese and in languages other than Chinese, and a user whose native language is Chinese selects Chinese for the voice output of guidance route information and facility names, the Chinese name "Amushi" is output when voice navigation refers to the landmark facility "HERMES". For example, when the vehicle is traveling along the guidance route 311 and the current position 312 of the vehicle is about to reach the intersection, the navigation device outputs "turn left at Amushi" in Chinese to guide the travel route.
However, when the name of the facility displayed on the map and the name actually displayed on the facility plaque or the outer surface of the building are both the non-Chinese, Roman-alphabet name "HERMES", that is, when the Chinese facility name "Amushi" is not displayed on the facility plaque or the building exterior, a user who knows only Chinese can neither find the corresponding location on the map nor find the facility in reality from voice navigation information containing the Chinese "Amushi". The user may therefore be confused and uneasy about where the facility "Amushi", at which a left turn is required, actually is. This leads to problems such as the user being unable to travel smoothly along the guidance route by following the voice guidance information, and missing route change points.
Disclosure of Invention
In order to solve the above technical problems in the prior art, the present invention provides a navigation device and a display control method thereof that make it easy for the user to find the facility referred to in voice navigation.
In order to solve the above-described problems in the prior art, a navigation device according to the present invention includes: a storage unit that stores map information; a display unit that displays the map information stored in the storage unit; a route setting unit that sets a guidance route; a voice navigation unit that guides, by voice, information related to the guidance route set by the route setting unit; a voice language type acquisition unit that acquires the language type of the voice output by the voice navigation unit; a map language type acquisition unit that acquires, from the map information, the language type of the map display name under which a facility in the voice navigation is shown on the map; an actual display language type acquisition unit that acquires the language type of the name actually displayed at the facility in the voice navigation; a language type determination unit that determines whether or not the language type of the voice matches the language type of the map display name and the language type of the actual display name; and a control unit that changes the facility-related display on the display unit, or the facility used in the voice navigation unit, when the language type determination unit determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name.
According to this navigation device, the user can easily find the facility referred to in voice navigation and can travel smoothly along the guidance route according to the voice navigation information.
The navigation device may further include a language changing unit that changes the language of the actual display name to the same language type as that of the voice, with the control unit displaying the facility name in the changed language on the display unit either together with the corresponding map display name or in place of the map display name. In this way, the facility name displayed on the map is in the same language as the facility name output by voice, so the user can locate the facility referred to in voice navigation from its display on the map and travel smoothly along the guidance route based on the navigation information.
The navigation device may further include a real image acquisition unit that acquires a real image of the facility in the voice navigation. When the language type determination unit determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name, the control unit causes the display unit to display the real image of the facility acquired by the real image acquisition unit. The user can thus see the real image of the facility on the display unit, know which facility is being referred to, find the facility more intuitively and easily, and travel smoothly along the guidance route based on the navigation information.
The navigation device may further include a substitute facility detection unit. When the language type determination unit determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name, the substitute facility detection unit detects, from other facilities in the vicinity of the facility in the voice navigation, a substitute facility for which at least one of the language type of the map display name and the language type of the actual display name is the same as the language type of the voice, and the control unit replaces the facility in the voice navigation with the substitute facility for voice output. Since voice navigation then uses a substitute facility whose map display name or actual display name is in the same language as the navigation voice, the user can find the substitute facility and travel smoothly along the guidance route based on the navigation information.
In the navigation device, the actual display language type acquiring unit may acquire the language type of the actual display name from a street view image.
In addition, in order to solve the above technical problems in the prior art, the present invention further provides a display control method for a navigation device, including: a voice language type acquisition step of acquiring the language type of the voice used in voice navigation; a map language type acquisition step of acquiring, from map information, the language type of the map display name under which a facility in the voice navigation is shown on the map; an actual display language type acquisition step of acquiring the language type of the name actually displayed at the facility in the voice navigation; a language type determination step of determining whether or not the language type of the voice matches the language type of the map display name and the language type of the actual display name; and a control step of changing the facility in the voice navigation, or the facility-related display, when the language type determination step determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name.
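The determination and control steps above can be sketched as follows. This is a hypothetical illustration, not the patent's actual implementation; the facility record layout and function names are assumptions introduced for clarity.

```python
from dataclasses import dataclass

@dataclass
class Facility:
    map_name: str     # name shown on the map
    map_lang: str     # language type of the map display name
    actual_name: str  # name on the facility plaque / building exterior
    actual_lang: str  # language type of the actual display name

def control_step_needed(voice_lang: str, facility: Facility) -> bool:
    """Language type determination step: the control step fires only when the
    voice language matches neither the map display name's language nor the
    actual display name's language."""
    return voice_lang != facility.map_lang and voice_lang != facility.actual_lang

# Example: Chinese voice navigation for a facility whose map and actual
# display names are both French ("HERMES") -> the display must be changed.
hermes = Facility("HERMES", "fr", "HERMES", "fr")
assert control_step_needed("zh", hermes) is True

# A facility already labeled in Chinese on the map needs no change.
local = Facility("Finance Center", "zh", "Finance Center", "zh")
assert control_step_needed("zh", local) is False
```

The control step itself (changing the display or substituting the facility) varies by embodiment, as described below.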
According to the navigation device and the display control method thereof, the user can easily find the related facilities in the voice navigation, and can smoothly travel along the guide route according to the navigation information.
Drawings
Fig. 1 is a block diagram showing a configuration of a navigation device according to a first embodiment of the present invention.
Fig. 2 is a flowchart showing a specific processing method of the navigation device according to the first embodiment of the present invention.
Fig. 3 is a schematic display screen of the navigation device according to the first embodiment of the present invention.
Fig. 4 is a schematic display screen of the navigation device according to the first embodiment of the present invention.
Fig. 5 is a block diagram showing the configuration of a navigation device according to a second embodiment of the present invention.
Fig. 6 is a flowchart showing a specific processing method of a navigation device according to a second embodiment of the present invention.
Fig. 7 is a schematic view of a display screen of a navigation device according to a second embodiment of the present invention.
Fig. 8 is a block diagram showing the configuration of a navigation device according to a third embodiment of the present invention.
Fig. 9 is a flowchart showing a specific processing method of a navigation device according to a third embodiment of the present invention.
Fig. 10 is a schematic view of a display screen of a navigation device according to a third embodiment of the present invention.
Fig. 11 is a schematic view of a display screen of a navigation device in a conventional problem.
Detailed Description
Hereinafter, a navigation device according to a first embodiment of the present invention will be described in detail with reference to the drawings.
Fig. 1 is a block diagram showing the configuration of a car navigation device according to a first embodiment. As shown in the figure, the car navigation device includes: a storage unit 1, a current position detection unit 2, a route setting unit 3, a voice navigation unit 4, a voice language type acquisition unit 5, a map language type acquisition unit 6, an actual display language type acquisition unit 7, a language type determination unit 8, a language change unit 9, a display unit 10, and a control unit 11.
The storage unit 1 may be constituted by a nonvolatile memory such as a flash memory, a volatile memory such as a RAM (Random Access Memory), or a combination of both; it may also be constituted by an optical storage medium such as a DVD-ROM, or by a memory card or the like. The storage unit 1 according to the first embodiment stores information including map data, language data of various kinds, voice data, and real images of facilities.
The current position detecting unit 2 detects the current position of the vehicle by using an autonomous navigation system sensor device such as a gyro sensor and a GPS receiver.
The route setting unit 3 searches for a route between two points in the map data stored in the storage unit 1, and sets the route as a guidance route. For example, a route from the current position detected by the current position detecting unit 2 at the time of departure of the user to the destination set by the user is searched for and set as the guidance route. The guidance route set by the route setting unit 3 is displayed on the display unit 10 together with map information.
The voice navigation unit 4 outputs voice guidance through a speaker based on the guidance route information retrieved by the route setting unit 3 and the guidance information for each intersection node stored in the storage unit 1. The output voice guidance includes the words needed for facility information guidance, such as the facility name and the facility location. The language type of the output voice can be set in several ways: a default language type set when the navigation device is manufactured; a language estimated as the user's native language from personal information registered in advance by the user (for example, nationality or residence) and the coordinates (longitude, latitude) of the place of residence; or a language chosen by the user himself or herself.
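The three setting methods can be read as a precedence chain: an explicit user choice overrides a profile-based estimate, which overrides the factory default. The sketch below illustrates this under assumed field names (`nationality`, the country-to-language table) that the patent does not specify.

```python
FACTORY_DEFAULT_LANG = "en"  # assumed factory default

# Assumed mapping from registered nationality to an estimated native language.
NATIVE_LANG_BY_COUNTRY = {"CN": "zh", "FR": "fr", "JP": "ja"}

def voice_language(user_choice=None, profile=None):
    """User choice > estimate from registered personal information > default."""
    if user_choice:
        return user_choice
    if profile and profile.get("nationality") in NATIVE_LANG_BY_COUNTRY:
        return NATIVE_LANG_BY_COUNTRY[profile["nationality"]]
    return FACTORY_DEFAULT_LANG

assert voice_language() == "en"
assert voice_language(profile={"nationality": "CN"}) == "zh"
assert voice_language(user_choice="fr", profile={"nationality": "CN"}) == "fr"
```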
Here, the frequency of voice output in voice guidance, the voice type, the target of voice guidance, and the like may be set by default or may be set according to the preference of the user.
The voice language type acquisition unit 5 acquires the language type of the voice output by the voice navigation unit 4. It may read the set voice language type from the voice navigation unit 4 or the storage unit 1. Alternatively, the navigation voice may be collected by a voice collecting means such as a microphone and the voice data analyzed in the navigation device, or the language type may be determined by analyzing the voice data using a network search function.
The map language type acquisition unit 6 acquires, from the map information, the language type of the map display name under which the facility in the voice navigation unit 4 is shown on the map. For example, the facility name is stored in the storage unit 1 in association with the position of the facility on the map; using this association, the facility name is retrieved from the storage unit 1, compared with the language data of each type stored in the storage unit 1, and the language type of the facility's map display name is thereby acquired.
Here, the map language type acquisition unit 6 may acquire the language type of the facility's map display name on the display screen as the vehicle travels, or may acquire it before the facility name is output by voice, based on the voice guidance information and the timing of the voice guidance.
The actual display language type acquisition unit 7 acquires the language type of the name actually displayed at the facility in the voice navigation unit 4.
Here, the actual display language type acquisition unit 7 may determine the language type from the actual display names of facilities stored in advance in the storage unit 1, or may directly read a language type of the actual display name stored in advance in the storage unit 1. It may also acquire the language type by recognizing the actual display name of the facility from a Street View image of a satellite map while the user uses the navigation device, or by capturing an image of the actual facility with an on-board camera or the like and recognizing the actual display name. Further, the actual facility referred to by the voice navigation unit 4 can be identified from the position of that facility on the map stored in the storage unit 1.
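The sources above form a fallback chain: pre-stored data first, then street-view recognition, then the on-board camera. The sketch below assumes this ordering; the recognizer callables are hypothetical stand-ins for image-recognition services the patent does not name.

```python
def actual_display_name(facility_id, stored_names, streetview_ocr=None, camera_ocr=None):
    """Return the facility's actual display name from the first available source.

    stored_names:   dict of names stored in advance in the storage unit
    streetview_ocr: optional callable recognizing the name from a street view image
    camera_ocr:     optional callable recognizing the name from an on-board camera shot
    """
    name = stored_names.get(facility_id)
    if name:
        return name
    for recognize in (streetview_ocr, camera_ocr):
        if recognize:
            name = recognize(facility_id)
            if name:
                return name
    return None

assert actual_display_name("f1", {"f1": "HERMES"}) == "HERMES"
assert actual_display_name("f2", {}, streetview_ocr=lambda fid: "HERMES") == "HERMES"
assert actual_display_name("f3", {}) is None
```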
The language type determination unit 8 is formed of an application-specific integrated circuit or a CPU function block. It determines whether or not the language type of the voice navigation matches the language type of the map display name and the language type of the actual display name. In the present invention, the language type determination unit 8 may first determine whether the language type of the voice matches the language type of the facility's actual display name and, only on a mismatch, determine whether it matches the language type of the facility's map display name; alternatively, it may evaluate both comparisons at once.
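Both evaluation orders described above produce the same final verdict; the two-stage form merely short-circuits. The following minimal sketch (function names are illustrative assumptions) makes that equivalence explicit:

```python
def match_two_stage(voice_lang, actual_lang, map_lang):
    # First compare against the actual display name's language; only on a
    # mismatch fall back to comparing against the map display name's language.
    if voice_lang == actual_lang:
        return True
    return voice_lang == map_lang

def match_both(voice_lang, actual_lang, map_lang):
    # Evaluate both comparisons at once.
    return voice_lang == actual_lang or voice_lang == map_lang

# The two orders agree for every combination of match/mismatch.
for case in [("zh", "fr", "fr"), ("zh", "zh", "fr"), ("zh", "fr", "zh"), ("zh", "zh", "zh")]:
    assert match_two_stage(*case) == match_both(*case)
```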
The language changing unit 9 changes the language of the actual display name to the same language type as the voice. In the first embodiment, the names of each facility in the respective language types are stored in association with one another in the storage unit 1, and the language changing unit 9 selects, from these associated names, the name whose language type is identical to that of the voice navigation and adopts it as the facility name. Alternatively, a facility name in the same language type as the voice navigation can be retrieved over the network and used as the changed name.
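The associated-name lookup can be sketched as a per-facility table keyed by language type, as below. The table contents and the network-fallback parameter are illustrative assumptions, not the patent's data format.

```python
# Names of each facility stored in association, keyed by language type.
FACILITY_NAMES = {
    "hermes": {"fr": "HERMES", "zh": "Amushi"},
}

def changed_name(facility_id, voice_lang, network_fallback=None):
    """Select the stored name matching the voice language; optionally fall
    back to a name retrieved over the network (stub parameter here)."""
    names = FACILITY_NAMES.get(facility_id, {})
    name = names.get(voice_lang)
    if name is None and network_fallback:
        name = network_fallback(facility_id, voice_lang)
    return name

assert changed_name("hermes", "zh") == "Amushi"
assert changed_name("hermes", "ja") is None
assert changed_name("hermes", "ja", network_fallback=lambda f, l: "stub") == "stub"
```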
The display unit 10 may have both a display screen and a touch panel, and may further have an operation unit (not shown). The display screen may be formed of a display device such as an LCD (Liquid Crystal Display), an organic EL (Organic Electro-Luminescence) panel, or a plasma screen.
The touch panel of the display unit 10 may be any one of a capacitive type, a resistive type, an infrared type, and a load detection type. The operation options can be manipulated by touching the operation options on the display panel with a finger, a stylus pen, or the like.
The display unit 10 displays information such as a map used for navigation, a current position of the vehicle, a guide route, and a facility name.
The display unit 10 of the present invention may not be provided with a touch panel, and may have only a display panel.
The control unit 11 is a general computer, and includes therein, for example, a CPU, a ROM, an EEPROM, a RAM, an I/O, and a bus (not shown) connecting these components. When the language type determination unit 8 determines that the language type of the voice does not match either or both of the language type of the map display name and the language type of the actual display name, the control unit 11 changes the display related to the facility in the voice navigation unit 4 displayed on the map.
Next, the operation of the first embodiment of the present invention will be specifically described with reference to fig. 2 to 4.
Fig. 2 is a flowchart showing a processing method of the navigation device according to the first embodiment, and fig. 3 and 4 are schematic diagrams showing a display screen of the navigation device according to the first embodiment.
First, in the first embodiment, before voice navigation starts, as shown in fig. 3, the Roman-alphabet facility name "HERMES" and the Chinese-character facility names "Finance Center", "Textile Building", and "Times Square" are displayed on the display screen of the display unit 10, and the user has set Chinese as the language for the voice output of voice navigation.
When the vehicle position approaches a point a predetermined distance (for example, 30 m) before an intersection where a travel route change is required, the voice navigation unit 4 outputs a voice including information such as a road name or a landmark facility name. For example, when the vehicle approaches an intersection near which the facility "HERMES" exists, "turn left at Amushi" is output in Chinese speech.
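The guidance trigger above amounts to a distance threshold against the next route-change point. A minimal sketch, with the 30 m example value treated as a configurable assumption:

```python
TRIGGER_DISTANCE_M = 30.0  # example predetermined distance from the text

def should_start_voice_guidance(distance_to_intersection_m: float) -> bool:
    """Start voice output once the vehicle is within the predetermined
    distance of the intersection requiring a route change."""
    return distance_to_intersection_m <= TRIGGER_DISTANCE_M

assert should_start_voice_guidance(25.0) is True
assert should_start_voice_guidance(120.0) is False
```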
As shown in fig. 2, in step S101, while the vehicle travels along the guidance route, the guidance route 311 and the vehicle's current position 312 are displayed on the display unit 10 of the car navigation device, and it is determined whether to start voice navigation, that is, whether the vehicle, traveling forward from the current position 312 for a predetermined distance or time, will reach an intersection requiring a route change such as a left or right turn. When it is determined that voice navigation of the guidance route is to be started (yes in step S101), the process proceeds to step S102; otherwise (no in step S101), the process in step S101 is repeated.
Next, in step S102, the voice language type acquisition unit 5 acquires the language information of the voice output by the voice navigation unit 4. In the present embodiment, after Chinese is acquired as the language type of the voice, the process proceeds to step S103.
In step S103, as shown in fig. 3, the display name "HERMES" of the facility referred to by the voice navigation unit 4 is displayed on the screen together with the current vehicle position 312 and the guidance route 311, and the map language type acquisition unit 6 compares "HERMES" with the language data stored in the storage unit 1 and acquires French as its language type.
Also in step S103, the actual display language type acquisition unit 7 recognizes, by applying image recognition to the building plaque or building exterior picture obtained through the street view function of the satellite map, that the actual display name of the facility is "HERMES", compares it with the language data stored in the storage unit 1, and acquires French as the language type of the actual display name "HERMES". The process then proceeds to step S104.
In step S104, the language type determination unit 8 determines whether the language type of the voice navigation for the facility "HERMES" matches the language type of the map display name and the language type of the actual display name. In the first embodiment, the facility name output by voice is the Chinese "Amushi", so the language type of the voice navigation is Chinese, while the map display name and the actual display name of the facility are both "HERMES", whose language type is French. The language type determination unit 8 therefore determines that the language type of the voice navigation matches neither the language type of the map display name nor the language type of the actual display name (no in step S104), and the flow proceeds to step S105. When the language type determination unit 8 determines that the language type of the voice matches either the language type of the map display name or the language type of the actual display name (yes in step S104), all processing ends.
In step S105, the language changing unit 9 acquires from the storage unit 1 the Chinese name "Amushi" corresponding to the actual display name "HERMES" of the facility, and the control unit 11 displays "Amushi" together with "HERMES" on the map display screen of the display unit 10, as shown in fig. 4.
Here, the user can instead set the facility name "Amushi" to be displayed on the map display screen in place of "HERMES".
Next, in step S106, as shown in fig. 4, when the vehicle's current position 312 on the guidance route 311 reaches the voice navigation start position, the voice navigation unit 4 performs voice navigation for traveling along the guidance route, such as turning at the next intersection. In the present embodiment, "turn left at Amushi" is output in Chinese speech.
A navigation device according to a second embodiment of the present invention will be described with reference to fig. 5 to 7.
Fig. 5 is a block diagram showing the configuration of the car navigation device according to the second embodiment. Compared with fig. 1 showing a block diagram of the first embodiment, the main difference is that the language changing unit 9 is replaced with a real image acquiring unit 12. The same components as those in fig. 1 are denoted by the same reference numerals, and description thereof will be omitted, and only relevant portions of the real image acquiring unit 12 will be described.
The real image acquisition unit 12 acquires a real image of the facility in the voice navigation unit 4. In the second embodiment, the real image information is acquired from the facility-related information stored in the storage unit 1. Alternatively, a photograph of the facility may be acquired by searching for the facility's information over the network, or the real image may be acquired through the street view function of a satellite map. If an on-board camera is installed, a real image of the facility can also be acquired by photographing the facility with the on-board camera.
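The image sources listed above again form a fallback chain (storage unit, network search, street view, on-board camera). A minimal sketch, where the source callables are hypothetical placeholders for those services:

```python
def acquire_real_image(facility_id, *sources):
    """Return the facility's real image from the first source that has one.

    Each source is a callable taking a facility id and returning image bytes
    or None (e.g. storage lookup, network search, street view, camera).
    """
    for fetch in sources:
        image = fetch(facility_id)
        if image is not None:
            return image
    return None

stored = {"hermes": b"jpeg-bytes"}  # illustrative storage-unit contents
assert acquire_real_image("hermes", stored.get) == b"jpeg-bytes"
assert acquire_real_image("unknown", stored.get, lambda fid: None) is None
```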
The control unit 11 according to the second embodiment displays the real image acquired by the real image acquiring unit 12 on the display unit 10.
Fig. 6 is a flowchart showing a processing method of the navigation device according to the second embodiment. Steps S201 to S204 and S206 are the same as steps S101 to S104 and S106, respectively, as compared with the flowchart of fig. 2 showing the first embodiment, and therefore, the description of these steps is omitted here.
In step S205, the control unit 11 displays the real image acquired by the real image acquisition unit 12 on the display unit 10. In the present embodiment, as shown in fig. 7, a real image of the facility "HERMES" is displayed on the map display screen, together with the guidance route 311 and the vehicle's current position 312.
Here, the control unit 11 may display on the map display screen, together with the real image of the facility, the facility name "Amushi" in the same language type as the voice navigation.
A navigation device according to a third embodiment of the present invention will be described with reference to fig. 8 to 10.
Fig. 8 is a block diagram showing the configuration of the car navigation device according to the third embodiment. Compared with fig. 1 showing a block diagram of the first embodiment, the main difference is that the language changing unit 9 is replaced with a substitute facility detecting unit 13. The same components as those in fig. 1 are denoted by the same reference numerals, and description thereof will be omitted, and only relevant portions of the substitute facility detection unit 13 will be described.
When the language type determination unit 8 determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name, the substitute facility detection unit 13 detects, from other facilities in the vicinity of the facility in the voice navigation unit 4, a substitute facility for which at least one of the language type of the map display name and the language type of the actual display name is the same as the language type of the voice. Specifically, the position of the facility in the voice navigation is detected from the map information stored in the storage unit 1, other facilities in its vicinity are detected, and it is determined from the map information whether the language type of each facility's map display name matches the set voice language type. If it matches, that facility is used as a substitute for the original facility in the voice navigation unit 4. If it does not match, it is determined from the actual display name data stored in the storage unit 1 whether the language type of the detected facility's actual display name matches the language type of the voice; if it matches, that facility is used as the substitute. When there are multiple other facilities in the vicinity, it is sufficient to detect one facility satisfying the above condition.
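The detection described above can be sketched as a scan over nearby facilities, accepting the first one whose map display name language, or failing that actual display name language, matches the navigation voice language. The data layout below is an assumption for illustration.

```python
def find_substitute(voice_lang, nearby_facilities):
    """Return the first nearby facility for which at least one of the map
    display name language and the actual display name language matches the
    voice language, or None if no such facility exists."""
    for f in nearby_facilities:
        if f["map_lang"] == voice_lang or f["actual_lang"] == voice_lang:
            return f
    return None

nearby = [
    {"name": "HERMES",        "map_lang": "fr", "actual_lang": "fr"},
    {"name": "fashion taste", "map_lang": "zh", "actual_lang": "fr"},
]
sub = find_substitute("zh", nearby)
assert sub is not None and sub["name"] == "fashion taste"
assert find_substitute("ja", nearby) is None
```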
In the present embodiment, the control unit 11 replaces the original facility with the substitute facility detected by the substitute facility detection unit 13 and causes the voice guidance unit 4 to perform voice output in the set language type of the guidance voice. Note that the substitute facility is a facility on the guidance route.
Fig. 9 is a flowchart showing a processing method of the navigation device according to the third embodiment of the present invention. Compared with the flowchart of fig. 2 for the first embodiment, steps S301 to S304 are the same as steps S101 to S104, respectively, and their description is therefore omitted here.
In step S305, when the language type determination unit 8 determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name, the substitute facility detection unit 13 detects, from other facilities in the vicinity of the facility in the voice guidance unit 4, a substitute facility in which at least one of the language type of the map display name and the language type of the actual display name is the same as the language type of the voice. In the present embodiment, as shown in fig. 10, another facility in the vicinity of the facility "Hermes" is detected whose map display name ("fashion taste" in the figure) has a language type matching the language type "Chinese" of the guidance voice. Alternatively, if another facility in the vicinity of "Hermes" is detected whose actual display name has a language type matching that of the guidance voice, that facility may be detected as the substitute facility.
Next, in step S306, as shown in fig. 10, when the vehicle current position 312 on the guidance route 311 reaches the voice guidance start position, the voice guidance unit 4, under the control of the control unit 11, performs voice guidance along the guidance route, for example an instruction to turn at the next intersection. In the present embodiment, the instruction to turn left at the substitute facility is output as Chinese speech.
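The trigger in step S306 — announcing only once the current vehicle position reaches the voice guidance start position — can be sketched as below. The distance threshold and the haversine helper are illustrative assumptions; the patent does not specify how proximity to the start position is measured.

```python
import math

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(h))

def maybe_announce(current_pos, start_pos, instruction, speak, threshold_m=30.0):
    """Call speak(instruction) once the vehicle is within threshold_m of the
    voice guidance start position; return whether guidance was issued."""
    if haversine_m(current_pos, start_pos) <= threshold_m:
        speak(instruction)
        return True
    return False
```

Here `speak` stands in for the voice guidance unit's text-to-speech output, which would render the instruction in the set guidance-voice language.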
The present invention is not limited to the in-vehicle navigation device of the embodiments; it is also applicable to a combined system composed of a handheld navigation device, or a mobile terminal running a navigation application, together with an in-vehicle display device, and to various electronic devices capable of running a navigation function, such as mobile phones and tablet computers.
Claims (3)
1. A navigation device, comprising:
a storage unit that stores map information;
a display unit that displays the map information stored in the storage unit;
a route setting unit that sets a guidance route; and
a voice guidance unit that guides, by voice, information related to the guidance route set by the route setting unit, wherein the navigation device further comprises:
a voice language type acquisition unit that acquires a language type of the voice output from the voice guidance unit;
a map language type acquisition unit that acquires, from the map information, a language type of a map display name under which a facility in the voice guidance unit is displayed on a map;
an actual display language type acquisition unit that acquires a language type of an actual display name of a facility in the voice guidance unit;
a language type determination unit configured to determine whether or not the language type of the voice matches the language type of the map display name and the language type of the actual display name; and
a control unit configured to, when the language type determination unit determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name, detect, from other facilities in the vicinity of the facility in the voice guidance unit, a substitute facility in which at least one of the language type of the map display name and the language type of the actual display name is the same as the language type of the voice, replace the facility in the voice guidance unit with the substitute facility, and cause the voice guidance unit to output the voice.
2. The navigation device of claim 1,
wherein the actual display language type acquisition unit acquires the language type of the actual display name from a street view image.
3. A display control method of a navigation device, comprising:
a voice language type acquisition step of acquiring a language type of the voice used in voice guidance;
a map language type acquisition step of acquiring, from map information, a language type of a map display name under which a facility in voice guidance is displayed on a map;
an actual display language type acquisition step of acquiring a language type of an actual display name of a facility in voice guidance;
a language type determination step of determining whether or not the language type of the voice matches the language type of the map display name and the language type of the actual display name; and
a control step of, when the language type determination step determines that the language type of the voice matches neither the language type of the map display name nor the language type of the actual display name, detecting, from other facilities in the vicinity of the facility in voice guidance, a substitute facility in which at least one of the language type of the map display name and the language type of the actual display name is the same as the language type of the voice, replacing the facility in voice guidance with the substitute facility, and outputting the voice.
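Taken together, the claimed steps amount to the following selection logic. This is a sketch of the claimed control flow only; the dictionary keys are illustrative assumptions.

```python
def select_guidance_facility(facility, nearby, voice_lang):
    """Keep the original facility if the voice language matches either of its
    display-name language types; otherwise substitute a nearby facility whose
    map or actual display name language matches the voice language."""
    if voice_lang in (facility["map_lang"], facility["actual_lang"]):
        return facility  # languages agree: no substitution needed
    for f in nearby:     # mismatch: look for a substitute facility
        if voice_lang in (f["map_lang"], f["actual_lang"]):
            return f
    return facility      # no substitute found: fall back to the original
```

The facility this function returns is the one whose name the voice guidance unit then speaks in the set guidance-voice language.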
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610809418.6A CN107796415B (en) | 2016-09-07 | 2016-09-07 | Navigation device and display control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107796415A CN107796415A (en) | 2018-03-13 |
CN107796415B true CN107796415B (en) | 2022-11-18 |
Family
ID=61530932
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610809418.6A Active CN107796415B (en) | 2016-09-07 | 2016-09-07 | Navigation device and display control method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107796415B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110070623B (en) * | 2019-04-16 | 2023-02-24 | 阿波罗智联(北京)科技有限公司 | Guide line drawing prompting method, device, computer equipment and storage medium |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11219105A (en) * | 1998-01-29 | 1999-08-10 | Toyota Motor Corp | Navigation device |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000206980A (en) * | 1999-01-12 | 2000-07-28 | Mitsubishi Electric Corp | Voice interactive navigation device |
JP4292646B2 (en) * | 1999-09-16 | 2009-07-08 | 株式会社デンソー | User interface device, navigation system, information processing device, and recording medium |
JP3908437B2 (en) * | 2000-04-14 | 2007-04-25 | アルパイン株式会社 | Navigation system |
JP4084550B2 (en) * | 2001-07-05 | 2008-04-30 | アルパイン株式会社 | Navigation device |
JP4997796B2 (en) * | 2006-03-13 | 2012-08-08 | 株式会社デンソー | Voice recognition device and navigation system |
JP2009140287A (en) * | 2007-12-07 | 2009-06-25 | Alpine Electronics Inc | Retrieval result display device |
JP5274191B2 (en) * | 2008-10-06 | 2013-08-28 | 三菱電機株式会社 | Voice recognition device |
CN104978015B (en) * | 2014-04-14 | 2018-09-18 | 博世汽车部件(苏州)有限公司 | Navigation system and its control method with languages self application function |
CN105099855B (en) * | 2014-04-30 | 2019-01-04 | 阿尔派株式会社 | The control method for playing back of electronic device and voice messaging |
CN106662918A (en) * | 2014-07-04 | 2017-05-10 | 歌乐株式会社 | In-vehicle interactive system and in-vehicle information appliance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||