
CN104121910A - Navigation method, device, terminal, server and system - Google Patents


Info

Publication number
CN104121910A
CN104121910A (application CN201310157571.1A)
Authority
CN
China
Prior art keywords
information
terminal
navigation
target
determining
Prior art date
Legal status
Pending
Application number
CN201310157571.1A
Other languages
Chinese (zh)
Inventor
邝俊斌
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201310157571.1A priority Critical patent/CN104121910A/en
Priority to CN201710262213.5A priority patent/CN106969774A/en
Priority to TW102144934A priority patent/TWI494542B/en
Publication of CN104121910A publication Critical patent/CN104121910A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3667: Display of a road map
    • G01C21/3673: Labelling using text of road map data items, e.g. road names, POI names
    • G01C21/3679: Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
    • G01C21/3682: Output of POI information on a road map

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a navigation method, a terminal, a server and a system, belonging to the field of the Internet. The navigation method comprises the following steps: acquiring position parameters of a target position; acquiring position parameters of the terminal; determining navigation information according to the position parameters of the target position and the position parameters of the terminal; and displaying the navigation information superimposed on a real-time street view image displayed on the terminal. The method solves the problem that traditional navigation can only indicate a general location region, so that a user who is unfamiliar with the surrounding environment cannot correctly find the target position. Because the navigation information is combined with the real-time street view, the target position guided by the navigation information is a specific location in the real-time street view image, and the user can therefore find the target position accurately.

Description

Navigation method and device, terminal, server and system
Technical Field
The invention relates to the field of internet, in particular to a navigation method, a navigation device, a terminal, a server and a navigation system.
Background
Terminals such as smart phones, tablet computers, palmtop computers and e-book readers have been widely used by users in daily life. Among them, navigation through a terminal is one of the most frequently used functions of a user.
Since most maps are 2D maps based on planes, the current common navigation method is also implemented based on 2D maps. Fig. 1 shows a schematic diagram of a conventional implementation of navigation through a terminal. A 2D map 140 is displayed on the screen 120 of the terminal, a current position 142 and a target position 144 are displayed on the 2D map 140, and a navigation direction and a navigation route are displayed between the current position 142 and the target position 144 through a navigation track 146. The user may move from the current location 142 to the target location 144 as guided by the navigation track 146.
In the process of implementing the invention, the inventor found that the prior art has at least the following problem: since the 2D map is usually displayed at a certain scale, the target position only represents an approximate location area and cannot indicate a specific spot. Even when the user reaches the target position, the user has only arrived at a general area, and if the user is unfamiliar with the surrounding environment, the user still cannot correctly find the target position.
Disclosure of Invention
In order to solve the problem that the conventional navigation method can only represent a general position area and cannot correctly find a target position under the condition that a user is unfamiliar with the surrounding environment, the embodiment of the invention provides a navigation method, a navigation device, a terminal, a server and a navigation system. The technical scheme is as follows:
in a first aspect, a navigation method is provided, where the navigation method includes:
acquiring a position parameter of a target position;
acquiring a position parameter of a terminal;
determining navigation information according to the position parameters of the target position and the position parameters of the terminal;
and displaying the navigation information in an overlapping manner on the real-time street view image displayed by the terminal.
In a second aspect, there is provided a navigation device comprising:
the first acquisition module is used for acquiring the position parameters of the target position;
the second acquisition module is used for acquiring the position parameters of the terminal;
the information determining module is used for determining navigation information according to the position parameter of the target position and the position parameter of the terminal;
and the information display module is used for displaying the navigation information determined by the information determination module on the real-time street view image displayed by the terminal in an overlapping manner.
In a third aspect, a terminal is provided, the terminal comprising the navigation device according to the second aspect.
In a fourth aspect, there is provided a server comprising a navigation device according to the second aspect.
In a fifth aspect, there is provided a navigation system comprising a terminal and a server, the server comprising the navigation device of the second aspect.
The technical scheme provided by the embodiment of the invention has the following beneficial effects:
the navigation information is displayed in an overlapping manner on the real-time street view image displayed by the terminal; this solves the problem that the existing navigation method can only represent a general position area and that a user unfamiliar with the surrounding environment cannot correctly find the target position; and because the navigation information is combined with the real-time street view image, the target position guided by the navigation information is a specific position in the real-time street view image, so the user can accurately find the target position.
Drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a conventional implementation of navigation through a terminal;
FIG. 2 is a flowchart of a navigation method according to an embodiment of the present invention;
FIG. 3 is a flowchart of a navigation method according to a second embodiment of the present invention;
FIG. 4A is a schematic illustration of the determination of navigation directions in accordance with embodiments two and three of the present invention;
fig. 4B is a schematic diagram showing determination of the horizontal display position of the first display contents according to the second and third embodiments of the present invention;
fig. 4C is a schematic view showing determination of the vertical display position of the first display contents according to the second and third embodiments of the present invention;
fig. 4D is another determination diagram of the vertical display position of the first display contents according to the second and third embodiments of the present invention;
fig. 4E is a diagram illustrating the final effect of the first display contents according to the second and third embodiments of the present invention;
fig. 4F is a schematic diagram illustrating the effect of displaying navigation information according to the second and third embodiments of the present invention on a real-time street view image in an overlapping manner;
FIG. 5 is a flowchart of a navigation method provided in the third embodiment of the present invention;
fig. 6 is a block diagram of a navigation device according to a fourth embodiment of the present invention;
fig. 7 is a block diagram of a navigation device according to a fifth embodiment of the present invention;
fig. 8 is a block diagram showing a navigation system according to a sixth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
First, some technical terms related to various embodiments of the present invention are briefly introduced:
street view images are divided into two types: traditional street view images and real-time street view images. The traditional street view image is a street view image which is collected in advance and stored in the server, and the terminal can call the street view image from the server at any time for displaying; the real-time street view image is an image which is acquired in real time through a built-in camera when the terminal is positioned in a certain street view. The latter, i.e. real-time street view images, is mainly referred to herein.
POI (Point of Interest): a POI is a kind of map data, and each POI usually contains four pieces of information: name, category, longitude and latitude, such as "Yellow River Hotel", "restaurant", "east longitude E120° 23′", "north latitude N31° 29′". A POI may also include other additional information such as height information; for example, if the building height of the Yellow River Hotel is 200 meters, the POI also carries the height information "200 m".
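For concreteness, the POI record described above can be modelled as a small data structure. The sketch below (in Python) is purely illustrative: the field names and the optional height attribute are assumptions drawn from the description, not a format defined by the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class POI:
    """One point of interest: name, category, longitude and latitude,
    plus optional extra data such as building height."""
    name: str                         # e.g. "Yellow River Hotel"
    category: str                     # e.g. "restaurant"
    longitude: float                  # e.g. 120.3833 (E120 deg 23')
    latitude: float                   # e.g. 31.4833  (N31 deg 29')
    height_m: Optional[float] = None  # e.g. 200.0 for a 200 m building

yellow_river_hotel = POI("Yellow River Hotel", "restaurant", 120.3833, 31.4833, 200.0)
```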
Example one
Referring to fig. 2, a flowchart of a navigation method according to an embodiment of the invention is shown. The navigation method can be used in a terminal and can also be used in a navigation system comprising the terminal and a server. The navigation method comprises the following steps:
step 202, acquiring a position parameter of a target position;
the position parameter of the target position includes a parameter for indicating a position where the target position is located in the map.
Step 204, acquiring a position parameter of the terminal;
the location parameter of the terminal may include not only a parameter indicating a location where the terminal is located in the map but also a parameter indicating a three-dimensional spatial location and a motion state of the terminal itself.
Step 206, determining navigation information according to the position parameter of the target position and the position parameter of the terminal;
and step 208, overlaying and displaying navigation information on the real-time street view image displayed by the terminal.
The real-time street view image is an image acquired by a camera in the terminal in real time, and the navigation information is displayed on the real-time street view image in an overlapping manner.
In summary, in the navigation method provided by this embodiment, the navigation information is displayed in an overlapping manner on the real-time street view image displayed by the terminal; the problem that the existing navigation method can only represent a general position area and cannot correctly find a target position under the condition that a user is unfamiliar with the surrounding environment is solved; the navigation information is combined with the real-time street view image, and the target position guided by the navigation information is a specific position in the real-time street view image, so that the user can accurately find the target position.
Example two
Referring to fig. 3, a flowchart of a navigation method according to a second embodiment of the present invention is shown. The navigation method can be used in a terminal comprising a camera and a plurality of sensors, and the terminal can be a smart phone, a tablet computer, an electronic book reader, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a laptop computer and the like. The navigation method comprises the following steps:
step 301, acquiring a position parameter of a target position;
the terminal provides real-time street view navigation service for the user. After the user initiates the service, the terminal may ask the user to input the target location. At this time, the present step may include the following sub-steps:
firstly, receiving attribute information of the target position entered by text input or voice input;
if the user inputs the attribute information of the target position in a character mode through a physical keyboard or a virtual keyboard, the terminal receives the attribute information of the target position input through characters;
if the user inputs the attribute information of the target position in a voice mode through the microphone, the terminal receives the attribute information of the target position input in the voice mode, and then the voice signal is recognized as corresponding character information through a voice recognition technology.
The attribute information of the target location is usually the name of the target location, and may be other description information with unique identification.
Secondly, inquiring interest points corresponding to the target position according to the attribute information of the target position;
the terminal can be pre-cached with a POI information base, and after obtaining the attribute information of the target position, the terminal queries the interest point corresponding to the target position in the POI information base according to the attribute information. If the interest point corresponding to the target position is not found, the terminal may require the user to re-input the attribute information of the target position.
Thirdly, the longitude and latitude information in the inquired interest points corresponding to the target position is used as the position parameters of the target position;
and when the terminal inquires the POI corresponding to the target position, using the longitude and latitude information in the POI corresponding to the target position as the position parameter of the target position.
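A minimal sketch of this lookup, assuming the POI information base is cached locally as a simple name-indexed dictionary; the names and the dictionary layout are hypothetical:

```python
# Hypothetical cached POI information base: name -> (latitude, longitude)
poi_base = {
    "Yellow River Hotel": (31.4833, 120.3833),
    "Book Building": (31.4900, 120.3900),
}

def target_position_parameters(attribute_info: str):
    """Return the latitude/longitude of the target position, or None so the
    caller can ask the user to re-enter the attribute information."""
    return poi_base.get(attribute_info)

target = target_position_parameters("Yellow River Hotel")
if target is None:
    print("Target not found, please enter the target position again")
```

If the lookup returns nothing, the terminal prompts the user to re-enter the attribute information, as described above.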
It should be noted that, when the user starts the real-time street view navigation service, the terminal may start to collect and display the real-time street view image. Of course, the terminal may start to acquire and display the real-time street view image after starting the navigation, which is not limited to this.
Step 302, acquiring a position parameter of a terminal;
Different from the position parameter of the target position, the position parameter of the terminal includes not only longitude and latitude information but also orientation information, which represents the direction in which the camera of the terminal faces. If the terminal is in a non-stationary state, the position parameter of the terminal further needs to include at least one of motion information and rotation angle information, where the motion information describes the terminal's movement during translation and the rotation angle information describes the terminal's movement during rotation.
The terminal may acquire the location parameters through at least one sensor built in. Specifically, the method comprises the following steps:
firstly, acquiring longitude and latitude information through a Global Positioning System (GPS) receiver in a terminal, and taking the longitude and latitude information as a part of position parameters;
secondly, acquiring orientation information through an electronic compass sensor in the terminal, and taking the orientation information as the other part of the position parameters;
thirdly, if the position parameters also comprise motion information, acquiring the motion information through a three-axis acceleration sensor in the terminal;
fourthly, if the position parameters also comprise rotation angle information, acquiring the rotation angle information through a gyroscope sensor in the terminal;
It should be noted that the above four steps are only illustrative, and the order in which the pieces of information are acquired is not specifically limited; in fact the terminal usually acquires all of the information in the location parameters at the same time. In addition, the terminal need not acquire the location parameters only once; it may acquire them once every predetermined time interval.
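The collection described in these four steps can be sketched as bundling the sensor readings into one record. The read_* callables below are hypothetical placeholders for whatever GPS, compass, accelerometer and gyroscope APIs the terminal's platform actually provides; this is an outline of the idea, not a platform implementation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TerminalPositionParams:
    latitude: float                  # from the GPS receiver
    longitude: float                 # from the GPS receiver
    heading_deg: float               # from the electronic compass (camera facing direction)
    accel_xyz: Optional[Vec3] = None     # from the 3-axis accelerometer, if moving
    rotation_xyz: Optional[Vec3] = None  # from the gyroscope, if rotating

def collect_position_params(read_gps, read_compass, read_accel=None, read_gyro=None):
    """Read all sensors at (roughly) the same moment and bundle the values.
    The read_* callables stand in for the platform's actual sensor APIs."""
    lat, lon = read_gps()
    return TerminalPositionParams(
        latitude=lat,
        longitude=lon,
        heading_deg=read_compass(),
        accel_xyz=read_accel() if read_accel else None,
        rotation_xyz=read_gyro() if read_gyro else None,
    )

# In practice this runs once per predetermined sampling interval while the
# real-time street view navigation service is active.
```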
Step 303, requesting a map from a server;
The terminal requests a map, which may be existing 2D map data, from the server. It should be noted that if the terminal has already cached an offline map data packet, step 303 and step 304 do not need to be executed. This embodiment is described using the example in which the terminal has not cached an offline map data packet.
Step 304, receiving and storing a map fed back by the server;
and the terminal receives and stores the 2D map data fed back by the server.
Step 305, determining navigation information according to the position parameter of the target position and the position parameter of the terminal;
The terminal determines navigation information according to the acquired position parameters of the target position, the acquired position parameters of the terminal and a locally stored map. The navigation information is navigation information specific to the real-time street view image, and comprises a navigation direction, target annotation information, or a combination of the two. The navigation direction is used for indicating the direction towards the target position in the real-time street view image; the target annotation information is used for identifying the location of the target position in the real-time street view image. The determination of the navigation direction is described first:
and the terminal determines a navigation direction according to the position parameter of the target position and the position parameter of the terminal, wherein the navigation direction is used for indicating the direction towards the target position in the real-time street view image. In particular, the determination of the navigation direction may comprise the sub-steps of:
firstly, calculating a navigation track on a map according to longitude and latitude information of a target position and longitude and latitude information of a terminal;
with reference to fig. 4A, the terminal may locate the location a of the target location on the map 31 according to the latitude and longitude information of the target location, may locate the current location B of the terminal on the map 31 according to the latitude and longitude information of the terminal, and may then calculate a navigation track between the points a and B on the map according to a navigation algorithm, where the navigation track is usually a shortest path between the points a and B when the terminal travels in a certain traffic manner, such as a shortest path during walking, a shortest path of a bus route, or a shortest path during automobile traveling.
And secondly, generating a navigation direction according to the orientation information and the navigation track of the terminal.
If the navigation track is regarded as a curve and the position of the terminal on the map is regarded as a point on that curve, an advancing direction D1 can be determined from the tangential direction of the curve at that point. Together, the advancing direction D1 and the terminal facing direction D2 determine a navigation direction that deviates from the facing direction by N degrees to the left or to the right; as shown in the figure, the navigation direction D3 deviates from the terminal facing direction D2 by N degrees to the right.
To this end, the terminal may determine the navigation direction in the navigation information.
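As a concrete illustration of the two sub-steps above, the sketch below computes the signed offset N between the advancing direction D1 (approximated here as the bearing from the terminal towards the next point of the navigation track) and the terminal facing direction D2 reported by the compass. The bearing formula and the assumption that the track is a list of latitude/longitude points are illustrative choices, not taken from the patent.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def navigation_direction(terminal_lat, terminal_lon, heading_deg, track_points):
    """Signed angle N by which the navigation direction deviates from the
    terminal facing direction D2: positive = right, negative = left."""
    # D1: advancing direction, approximated by the bearing towards the next
    # point of the navigation track after the terminal's current position.
    next_lat, next_lon = track_points[1]
    d1 = bearing_deg(terminal_lat, terminal_lon, next_lat, next_lon)
    return (d1 - heading_deg + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)

# Facing due north (0 degrees) while the track heads roughly north-east:
n = navigation_direction(31.48, 120.38, 0.0, [(31.48, 120.38), (31.49, 120.39)])
```

A positive result means the navigation arrow should point N degrees to the right of the facing direction; a negative result, N degrees to the left.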
Continuing to describe the determination process of the target annotation information:
and the terminal determines target labeling information according to the position parameter of the target position and the position parameter of the terminal, wherein the target labeling information is used for identifying the location of the target position in the real-time street view image. Specifically, the determination of the target annotation information may comprise the following sub-steps:
firstly, determining a visible area of a terminal on a map according to longitude and latitude information and orientation information of the terminal;
the visual area of the terminal is used for representing the area which can be collected by the camera of the terminal, and the visual area is represented as a sector area which is positioned in front of the orientation of the terminal on the map. With continued reference to fig. 4A, on the map 31, the location B of the terminal may be determined according to latitude and longitude information of the terminal, and then the visible area 32 of the terminal may be determined according to the orientation information D2 of the terminal, where the visible area 32 is a sector area, and an included angle between two sides of the sector area is determined by a visible angle of a camera of the terminal, such as 120 degrees; the length of both sides of the sector area may be a preset value, such as 500 meters on the map 31.
Secondly, detecting whether the target position is positioned in the visible area according to the longitude and latitude information of the target position;
the terminal can locate the position a of the target position on the map according to the latitude and longitude information of the target position, and then detect whether the position a of the target position is located in the visible area 32.
Thirdly, if the detection result is that the target position is located in the visible area, generating target labeling information.
And if the detection result is that the target position is located in the visible area, the terminal generates target labeling information. The target annotation information is typically presented in the form of a text box or text bubble displayed on the real-time street view image. The process of generating the target annotation information by the terminal can comprise the following substeps:
1) generating first display content according to the attribute information of the target position;
the terminal first generates the first display content according to the attribute information of the target location, which may be the name, introduction, etc. of the target location, and the attribute information may be obtained from the POI corresponding to the target location queried in step 301. For example, the name "yellow river hotel" of the target location is used as the first display content.
2) Determining a horizontal display position of the first display content on the real-time street view image according to the position identified by the longitude and latitude information of the target position in the visible area;
referring to fig. 4B in combination, the terminal may determine the position of the target position in the horizontal line of sight 33 according to a connection line m between the position a identified by the latitude and longitude information of the target position in the visible area 32 and the position B of the terminal itself. In other words, the intersection C of the connecting line m and the horizontal visual field line 33 corresponds to the position of the target position on the horizontal visual field line 33, which can be represented by the ratio L1/L2 of the line segment L1 and the line segment L2. When the screen display width of the terminal is known, a horizontal coordinate x can be converted according to L1/L2, and the horizontal coordinate x can be used as the horizontal display position of the first display content on the real-time street view image.
3) Determining the vertical display position of the first display content on the real-time street view image according to the height information of the target position;
referring to fig. 4C, when the terminal provides real-time street view navigation, the camera in the terminal is usually toward the front by default, the terminal determines the vertex D of the target position according to the height information H of the target position and the distance m between the terminal and the target position, and after connecting the position a with the vertex D to obtain the connection line n, the position of the target position on the vertical view line 34 can be determined. In other words, the intersection E of the connecting line n with the vertical line of sight 34 corresponds to the position of the target position on the vertical line of sight 34, which can be represented by the ratio L3/L4 of the line segment L3 and the line segment L4. When the screen display height of the terminal is known, a vertical coordinate y can be converted according to L3/L4, and the vertical coordinate y can be used as the vertical display position of the first display content on the real-time street view image.
Assuming that the target location is a building, the vertical coordinate y calculated above causes the first display content to be displayed at the top of the building. If it is desired that the first display content is displayed in the middle of the building, the above proportional relationship may be modified to (L3 + 1/2 × L4)/(1/2 × L4), as shown in fig. 4D. By analogy, if it is desired that the first display content is displayed at the upper 3/4 portion of the building, the above proportional relationship may be modified to (L3 + 1/4 × L4)/(3/4 × L4).
The horizontal view line 33 and the vertical view line 34 are determined by the viewing angle of the camera of the terminal, and may be preset values. In addition, the sub-steps 2) and 3) are only schematic illustrations, and the specific algorithm implementation based on the idea may be different, and is not particularly limited. The height information of the target position may be acquired from the POI corresponding to the target position, and if the POI corresponding to the target position does not include the height information, one preset vertical coordinate may be used as a vertical display position of the first display content on the real-time street view image.
4) And taking the first display content, the horizontal display position and the vertical display position as target labeling information.
Up to this point, the first display content, the horizontal display position of the first display content, and the vertical display position of the first display content may be taken as the target annotation information corresponding to the target position. Taking the first display content displayed in the middle of the building as an example, the finally determined target annotation information can be referred to as shown in fig. 4E.
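The geometry of the sub-steps above can be sketched as follows: a sector test for the visible area, then a mapping of the ratios L1/L2 and L3/L4 onto screen coordinates x and y. This is a flat-earth, pinhole-style approximation with assumed parameters (120° view angle, 500 m view distance, a 90° vertical view angle and a 1080×1920 screen); as the text itself notes, the concrete algorithm may differ.

```python
import math

M_PER_DEG_LAT = 111_320.0  # rough metres per degree of latitude

def offset_m(lat0, lon0, lat1, lon1):
    """Flat-earth east/north offset (metres) from point 0 to point 1."""
    east = (lon1 - lon0) * M_PER_DEG_LAT * math.cos(math.radians(lat0))
    north = (lat1 - lat0) * M_PER_DEG_LAT
    return east, north

def relative_bearing_deg(lat_b, lon_b, heading_deg, lat_a, lon_a):
    """Bearing of target A as seen from terminal B, relative to the camera
    heading: 0 = straight ahead, positive = to the right."""
    east, north = offset_m(lat_b, lon_b, lat_a, lon_a)
    bearing = math.degrees(math.atan2(east, north)) % 360.0
    return (bearing - heading_deg + 180.0) % 360.0 - 180.0

def in_visible_area(rel_bearing, distance_m,
                    view_angle_deg=120.0, view_distance_m=500.0):
    """Sector test corresponding to the visible area 32 in Fig. 4A."""
    return distance_m <= view_distance_m and abs(rel_bearing) <= view_angle_deg / 2

def screen_position(rel_bearing, distance_m, height_m, screen_w, screen_h,
                    view_angle_deg=120.0, vertical_view_angle_deg=90.0):
    """Pinhole-style mapping of the target onto the street view image.

    x realises the horizontal ratio (L1/L2 in Fig. 4B); y realises the vertical
    ratio (L3/L4 in Fig. 4C) using the target height. Origin is top-left."""
    half_h = math.radians(view_angle_deg / 2)
    x = screen_w / 2 * (1 + math.tan(math.radians(rel_bearing)) / math.tan(half_h))
    if height_m is None:                    # POI carries no height information:
        y = screen_h / 2                    # fall back to a preset vertical coordinate
    else:
        half_v = math.radians(vertical_view_angle_deg / 2)
        elevation = math.atan2(height_m, distance_m)  # angle up to vertex D
        y = screen_h / 2 * (1 - math.tan(elevation) / math.tan(half_v))
    return min(max(x, 0.0), screen_w), min(max(y, 0.0), screen_h)

# Terminal at B facing 40 degrees; target "Yellow River Hotel" (200 m tall) at A.
east, north = offset_m(31.48, 120.38, 31.4833, 120.3833)
dist = math.hypot(east, north)
rel = relative_bearing_deg(31.48, 120.38, 40.0, 31.4833, 120.3833)
if in_visible_area(rel, dist):
    x, y = screen_position(rel, dist, 200.0, screen_w=1080, screen_h=1920)
    # ("Yellow River Hotel", x, y) together form the target annotation information.
```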
Since the location parameter of the terminal may be obtained at predetermined time intervals, the target label information is also generated at predetermined time intervals correspondingly. In this process, the terminal may be in a non-stationary state accompanied by displacement or rotation, so when the position parameter of the terminal includes the motion information and/or the rotation angle information, it is preferable to include steps 306 to 307 to update the horizontal display position and/or the vertical display position in the target annotation information. The method comprises the following specific steps:
step 306, determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal; updating the horizontal display position in the target marking information according to the moving speed of the terminal in the horizontal direction;
the motion information is generally acceleration information of the terminal in three spatial directions, and the moving speed of the terminal in the horizontal direction can be calculated according to the acceleration information of the terminal in the horizontal direction.
After the moving speed of the terminal in the horizontal direction is calculated, the horizontal display position in the target annotation information is updated according to that speed, so that the horizontal display position follows the terminal's movement in the horizontal direction. That is, when the terminal moves to the right, the horizontal display position in the target annotation information is adjusted to the left by a corresponding amount; when the terminal moves to the left, the horizontal display position is adjusted to the right by a corresponding amount, so that the target annotation information appears to stay 'stuck' to the target position.
Step 307, determining a rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal; updating a vertical display position in the target marking information according to the rotation angle of the terminal in the vertical direction;
correspondingly, the rotation angle information is generally rotation angle values of the terminal in three spatial directions, and the rotation angle of the terminal in the vertical direction can be calculated according to the rotation angle values of the terminal in the vertical direction.
After the rotation angle of the terminal in the vertical direction is calculated, the vertical display position in the target annotation information is updated according to that angle, so that the vertical display position follows the terminal's movement in the vertical direction. That is, when the terminal rotates upwards, the vertical display position in the target annotation information is adjusted downwards by a corresponding amount; when the terminal rotates downwards, the vertical display position is adjusted upwards by a corresponding amount, so that the target annotation information appears to stay 'stuck' to the target position.
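Steps 306 and 307 amount to shifting the already-computed screen coordinates against the terminal's own motion so that the annotation appears to stay attached to the place it labels. The sketch below shows one simple way to do this; the pixels-per-metre and pixels-per-degree scale factors are hypothetical and would in practice depend on the camera's field of view, the screen size and the distance to the target.

```python
def update_annotation_position(x, y, horizontal_speed_mps, upward_rotation_deg,
                               dt_s, px_per_metre=40.0, px_per_degree=8.0):
    """Shift an annotation opposite to the terminal's own motion so that it
    appears to stay attached to the place it labels (screen origin: top-left).

    horizontal_speed_mps: horizontal speed integrated from the accelerometer;
        positive means the terminal is moving to the right.
    upward_rotation_deg: rotation in the vertical direction from the gyroscope
        over the interval dt_s; positive means the terminal tilts upwards."""
    x_new = x - horizontal_speed_mps * dt_s * px_per_metre  # move right -> label left
    y_new = y + upward_rotation_deg * px_per_degree         # tilt up    -> label down
    return x_new, y_new

# Example: the terminal drifts right at 0.5 m/s and tilts up 2 degrees in 0.1 s.
x, y = update_annotation_position(543.0, 562.0, 0.5, 2.0, 0.1)
```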
308, determining interest point marking information according to the position parameters of the terminal and at least one interest point;
as can be seen from the above description, the target annotation information is generated for subsequent display only when the target position is within the visible area of the terminal. When the target position is located outside the visible area of the terminal, the navigation information only contains navigation directions. Preferably, in order to provide more useful information to the user, the navigation information may further include interest point labeling information for labeling the location of other interest points besides the target location on the real-time street view image. And the terminal determines the interest point marking information according to the position parameter of the terminal and at least one interest point. Specifically, the method may include the following substeps:
firstly, determining a visible area of a terminal on a map according to a position parameter of the terminal;
the visible area can be determined in the map by latitude and longitude information and orientation information of the terminal, and the specific determination process is not repeated.
Secondly, at least one interest point in a visible area of the terminal is inquired, wherein the at least one interest point does not include the interest point corresponding to the target position;
Thirdly, generating at least one piece of interest point annotation information according to the queried interest points.
The process of generating the interest point labeling information is basically the same as the process of generating the target labeling information. May comprise the following sub-steps:
1) acquiring attribute information, longitude and latitude information and height information of the interest points;
2) generating second display content according to the attribute information of the interest points;
3) determining the horizontal display position of the second display content on the real-time street view image according to the position identified by the longitude and latitude information of the interest point in the visible area;
4) determining the vertical display position of the second display content on the real-time street view image according to the height information of the interest point;
5) and taking the second display content, the horizontal display position and the vertical display position as the interest point annotation information.
For example, in the example shown in fig. 4E, if there is a POI "book building" in the visible area 32 of the terminal, then "book building" can be used as the second display content, and the horizontal display position x and the vertical display position y can be determined from the latitude and longitude information and the height information of that POI.
Like the target annotation information, when the position parameter of the terminal includes the motion information and/or the rotation angle information, steps 309 and 310 are preferably included to update the horizontal display position and/or the vertical display position in the interest point annotation information. The method comprises the following specific steps:
step 309, determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal; updating a horizontal display position in the interest point annotation information according to the moving speed of the terminal in the horizontal direction;
step 310, determining a rotation angle of the terminal in a vertical direction according to the rotation angle information of the terminal; and updating the vertical display position in the interest point annotation information according to the rotation angle of the terminal in the vertical direction.
The detailed process can be combined with the process shown in reference to steps 306 to 307, which is not described in detail. It should be noted that, although the generation processes of the target annotation information and the interest point annotation information are described as two parts in this embodiment, in a specific implementation, the two generation processes may be performed in parallel and share a part of steps, such as: the steps of determining the visible area of the terminal on the map according to the position parameters of the terminal, determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal, determining the rotating angle of the terminal in the vertical direction according to the rotating angle information of the terminal and the like can be shared. The execution order and the specific implementation manner of the two generation processes are not particularly limited herein.
And 311, overlaying and displaying navigation information on the real-time street view image displayed by the terminal.
When the terminal displays the real-time street view image, a transparent layer can be additionally arranged above the real-time street view image, and then navigation information is displayed in the transparent layer. Specifically, the terminal normally displays the navigation direction in the navigation information at a position lower than the middle of the real-time street view image, as indicated by an arrow 35 in fig. 4F.
If the navigation information includes the target labeling information, the terminal further displays the first display content 36 in the target labeling information at a first designated position in the real-time street view image, wherein the first designated position is determined according to the horizontal display position and the vertical display position in the target labeling information.
If the navigation information further includes at least one interest point label information, the terminal further displays the second display content 37 in each interest point label information at a second designated position in the real-time street view image, and the second designated position is determined according to the horizontal display position and the vertical display position in the interest point label information. Fig. 4F is illustrated only in the case where the navigation information includes one point of interest annotation information, but actually, the point of interest annotation information may be two or more.
In particular, if the display positions of the target annotation information and a piece of interest point annotation information overlap, the one whose place is farther from the terminal (comparing the distance from the target position to the terminal with the distance from the interest point to the terminal) may be placed on a lower layer or hidden. Likewise, if the display positions of two pieces of interest point annotation information overlap, the one whose interest point is farther from the terminal may be placed on a lower layer or hidden. In addition, the display style of the target annotation information may differ from that of the interest point annotation information so that the target annotation information is more striking and prominent; for example, the interest point annotation information may be displayed in a thin green text box and the target annotation information in a bold red text box.
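The overlap rule in the preceding paragraph can be sketched as a sort by distance: annotations are drawn nearest-on-top, and a farther annotation whose box is covered by a nearer one is hidden (placing it on a lower layer instead would be an equally valid choice). The annotation record and the rectangle test below are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Annotation:
    text: str
    x: float
    y: float
    w: float
    h: float
    distance_m: float        # distance from the terminal to the labelled place
    is_target: bool = False  # target annotation vs. ordinary POI annotation
    hidden: bool = False

def boxes_overlap(a: Annotation, b: Annotation) -> bool:
    return not (a.x + a.w < b.x or b.x + b.w < a.x or
                a.y + a.h < b.y or b.y + b.h < a.y)

def resolve_overlaps(annotations: List[Annotation]) -> List[Annotation]:
    """Hide a farther annotation whose box overlaps a nearer one, and return
    the annotations in back-to-front drawing order (nearest drawn last, on top)."""
    ordered = sorted(annotations, key=lambda a: a.distance_m)  # nearest first
    for i, near in enumerate(ordered):
        if near.hidden:
            continue
        for far in ordered[i + 1:]:
            if not far.hidden and boxes_overlap(near, far):
                far.hidden = True
    return list(reversed(ordered))
```

The target annotation can then be drawn in a more prominent style than ordinary interest point annotations, for example a bold red text box versus a thin green one.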
In a more preferred embodiment, the terminal may also provide a navigation voice to be used in conjunction with the navigation information described above.
In summary, in the navigation method provided by this embodiment, the navigation information is displayed in an overlapping manner on the real-time street view image displayed by the terminal; the problem that the existing navigation method can only represent a general position area and cannot correctly find a target position under the condition that a user is unfamiliar with the surrounding environment is solved; the navigation information is combined with the real-time street view image, and the target position guided by the navigation information is a specific position in the real-time street view image, so that the user can accurately find the target position.
In this embodiment, the display position of the target annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the target annotation information changes correspondingly as the terminal moves or rotates. In other words, even if the terminal moves or rotates, the target annotation information still accurately indicates the target position, so that the user can accurately find the target position by following it in scenarios such as walking or riding in a car.
This embodiment also adds at least one piece of interest point annotation information to the navigation information, so that the navigation information provides more useful information and the user obtains more useful information during real-time street view navigation. Meanwhile, the display position of the interest point annotation information is updated according to the motion information and the rotation angle information of the terminal, so that it changes correspondingly as the terminal moves or rotates; the user can therefore still accurately find the interest point by following the annotation information while walking or riding in a car. In this way, the interest point annotation information is effectively combined with the scene in the real-time street view image.
The above embodiment is described only for the case where the navigation method is applied in a standalone terminal. Because the computing power of terminals varies widely, the navigation method can also be applied in a navigation system comprising a terminal and a server in order to reduce the demands on the terminal: the server undertakes the main computing work, while the terminal is only responsible for collecting its own position parameters and the position parameters of the target position, and for overlaying the navigation information on the real-time street view image while displaying it.
Please refer to fig. 5, which shows a flowchart of a navigation method according to a third embodiment of the present invention. The present embodiment is exemplified in that the navigation method is used in a navigation system including a terminal and a server. The navigation method comprises the following steps:
step 501, a terminal collects attribute information of a target position;
the terminal provides real-time street view navigation service for the user. After the user initiates the service, the terminal may ask the user to input the target location. At this time, the terminal receives attribute information of a target position input by a user through text input or voice input.
If the user inputs the attribute information of the target position in a character mode through a physical keyboard or a virtual keyboard, the terminal receives the attribute information of the target position input through characters;
if the user inputs the attribute information of the target position in a voice mode through the microphone, the terminal receives the attribute information of the target position input in the voice mode, and then the voice signal is recognized as corresponding character information through a voice recognition technology.
The attribute information of the target location is usually the name of the target location, and may be other description information with unique identification.
It should be noted that, when the user starts the real-time street view navigation service, the terminal may start to collect and display the real-time street view image. Of course, the terminal may start to acquire and display the real-time street view image after starting the navigation, which is not limited to this.
502, the terminal sends the attribute information of the target position to a server;
the terminal may transmit the attribute information of the target location to the server through a wireless network or a wired network.
Step 503, the server obtains the position parameter of the target position;
the server acquires the position parameter of the target position by receiving the attribute information of the target position sent by the terminal. Specifically, the present step may include the following sub-steps:
1) the server receives attribute information of a target position sent by the terminal, wherein the attribute information of the target position is information obtained by the terminal receiving character input or voice input;
2) the server inquires interest points corresponding to the target position according to the attribute information of the target position;
The server may pre-cache a POI information base, and after obtaining the attribute information of the target location, the server queries the interest point corresponding to the target location in the POI information base according to that attribute information. If the interest point corresponding to the target position cannot be found, the server may feed back error information to the terminal, and the terminal, after receiving the error information, may require the user to input the attribute information of the target position again.
3) And the server takes the searched longitude and latitude information in the interest point corresponding to the target position as the position parameter of the target position.
And when the server inquires the POI corresponding to the target position, taking the longitude and latitude information in the POI corresponding to the target position as the position parameter of the target position.
Step 504, the terminal collects the position parameters of the terminal;
Different from the position parameter of the target position, the position parameter of the terminal includes not only longitude and latitude information but also orientation information, which represents the direction in which the camera of the terminal faces. If the terminal is in a non-stationary state, the position parameter of the terminal further needs to include at least one of motion information and rotation angle information, where the motion information describes the terminal's movement during translation and the rotation angle information describes the terminal's movement during rotation.
The terminal may acquire the location parameters through at least one sensor built in. Specifically, the method comprises the following steps:
firstly, acquiring longitude and latitude information through a Global Positioning System (GPS) receiver in a terminal, and taking the longitude and latitude information as a part of position parameters;
secondly, acquiring orientation information through an electronic compass sensor in the terminal, and taking the orientation information as the other part of the position parameters;
thirdly, if the position parameters also comprise motion information, acquiring the motion information through a three-axis acceleration sensor in the terminal;
fourthly, if the position parameters also comprise rotation angle information, acquiring the rotation angle information through a gyroscope sensor in the terminal;
It should be noted that the above four steps are only illustrative, and the order in which the pieces of information are acquired is not specifically limited; in fact the terminal usually acquires all of the information in the location parameters at the same time. In addition, the terminal need not acquire the location parameters only once; it may acquire them once every predetermined time interval.
Step 505, the terminal sends the position parameter of the terminal to a server;
The terminal sends the acquired position parameters to the server, and may do so immediately after each acquisition. If the terminal collects the position parameters only once, it sends them to the server only once; if the terminal collects the position parameters once every predetermined time interval, it correspondingly sends them to the server repeatedly, once per interval.
Step 506, the server acquires the position parameters of the terminal;
and the server receives the position parameters sent by the terminal. The location parameters of the terminal may include:
latitude and longitude information and orientation information; or,
latitude and longitude information, orientation information and motion information; or,
latitude and longitude information, orientation information and rotation angle information; or,
latitude and longitude information, orientation information, motion information and rotation angle information;
the latitude and longitude information is information collected by a terminal through a Global Positioning System (GPS) receiver; the orientation information is information acquired by the terminal through an electronic compass sensor; the motion information is information acquired by the terminal through a three-axis acceleration sensor; the rotation angle information is information collected by the terminal through a gyroscope sensor.
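For the exchange in steps 505 and 506, the terminal's position parameters might be serialised into a small message such as the one below. The JSON field names and the transport are purely illustrative; the patent does not specify a wire format, only that the parameters are sent over a wireless or wired network.

```python
import json

# One upload of the terminal's position parameters. The four variants listed
# above collapse into optional fields: acceleration and rotation are included
# only when the terminal is in a non-stationary state.
position_update = {
    "latitude": 31.4833,               # GPS receiver
    "longitude": 120.3833,             # GPS receiver
    "heading_deg": 75.0,               # electronic compass (camera facing direction)
    "acceleration": [0.1, 0.0, 9.8],   # 3-axis accelerometer, optional
    "rotation_deg": [0.0, 2.5, 0.0],   # gyroscope, optional
}

payload = json.dumps(position_update)
# The terminal sends `payload` to the server once per predetermined sampling
# interval over a wireless or wired network.
```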
Step 507, the server determines navigation information according to the position parameter of the target position and the position parameter of the terminal;
The server determines navigation information according to the acquired position parameters of the target position, the acquired position parameters of the terminal and a locally stored map. The navigation information is navigation information specific to the real-time street view image, and comprises a navigation direction, target annotation information, or a combination of the two. The navigation direction is used for indicating the direction towards the target position in the real-time street view image; the target annotation information is used for identifying the location of the target position in the real-time street view image. The determination of the navigation direction is described first:
and the server determines a navigation direction according to the position parameter of the target position and the position parameter of the terminal, wherein the navigation direction is used for indicating the direction towards the target position in the real-time street view image. In particular, the determination of the navigation direction may comprise the sub-steps of:
firstly, calculating a navigation track on a map according to longitude and latitude information of a target position and longitude and latitude information of a terminal;
with reference to fig. 4A, the server may locate the location a of the target location on the map 31 according to the latitude and longitude information of the target location, may locate the current location B of the terminal on the map 31 according to the latitude and longitude information of the terminal, and may then calculate a navigation track between the points a and B on the map according to a navigation algorithm, where the navigation track is usually a shortest path between the points a and B when the terminal travels in a certain traffic manner, such as a shortest path during walking, a shortest path of a bus route, or a shortest path during automobile traveling.
And secondly, generating a navigation direction according to the orientation information and the navigation track of the terminal.
If the navigation track is regarded as a curve and the position of the terminal on the map is regarded as a point on that curve, an advancing direction D1 can be determined from the tangential direction of the curve at that point. Together, the advancing direction D1 and the terminal facing direction D2 determine a navigation direction that deviates from the facing direction by N degrees to the left or to the right; as shown in the figure, the navigation direction deviates from the terminal facing direction D2 by N degrees to the right.
To this end, the server may determine the navigation direction in the navigation information.
Continuing to describe the determination process of the target annotation information:
and the server determines target labeling information according to the position parameter of the target position and the position parameter of the terminal, wherein the target labeling information is used for identifying the location of the target position in the real-time street view image. Specifically, the determination of the target annotation information may comprise the following sub-steps:
firstly, determining a visible area of a terminal on a map according to longitude and latitude information and orientation information of the terminal;
the visual area of the terminal is used for representing the area which can be collected by the camera of the terminal, and the visual area is represented as a sector area which is positioned in front of the orientation of the terminal on the map. With continued reference to fig. 4A, on the map 31, the location B of the terminal may be determined according to latitude and longitude information of the terminal, and then the visible area 32 of the terminal may be determined according to the orientation information D2 of the terminal, where the visible area 32 is a sector area, and an included angle between two sides of the sector area is determined by a visible angle of a camera of the terminal, such as 120 degrees; the length of both sides of the sector area may be a preset value, such as 500 meters on the map 31.
Secondly, detecting whether the target position is positioned in the visible area according to the longitude and latitude information of the target position;
the server can locate the position a of the target position on the map according to the latitude and longitude information of the target position, and then detect whether the position a of the target position is located in the visible area 32.
Thirdly, if the detection result is that the target position is located in the visible area, generating target labeling information.
And if the detection result is that the target position is located in the visible area, the server generates target labeling information. The target annotation information is typically presented in the form of a text box or text bubble displayed on the real-time street view image. The process of generating the target annotation information by the server can comprise the following sub-steps:
1) generating first display content according to the attribute information of the target position;
the server first generates the first display content according to the attribute information of the target location, which may be the name, introduction, etc. of the target location, and the attribute information may be obtained from the POI corresponding to the target location queried in step 403. For example, the name "yellow river hotel" of the target location is used as the first display content.
2) Determining a horizontal display position of the first display content on the real-time street view image according to the position identified by the longitude and latitude information of the target position in the visible area;
Referring to fig. 4B, the server may determine the position of the target position on the horizontal view line 33 according to the connection line m between the position A identified by the latitude and longitude information of the target position within the visible area 32 and the position B of the terminal. In other words, the intersection C of the connection line m with the horizontal view line 33 corresponds to the position of the target position on the horizontal view line 33, which can be represented by the ratio L1/L2 of the line segments L1 and L2. When the screen display width of the terminal is known, a horizontal coordinate x can be derived from L1/L2, and this horizontal coordinate x can be used as the horizontal display position of the first display content on the real-time street view image.
3) Determining the vertical display position of the first display content on the real-time street view image according to the height information of the target position;
Referring to fig. 4C, when the terminal provides real-time street view navigation, the camera of the terminal by default faces straight ahead. The server determines the vertex D of the target position according to the height information H of the target position and the distance between the terminal and the target position; after the position of the terminal and the vertex D are connected to obtain the connection line n, the position of the target position on the vertical view line 34 can be determined. In other words, the intersection E of the connection line n with the vertical view line 34 corresponds to the position of the target position on the vertical view line 34, which can be represented by the ratio L3/L4 of the line segments L3 and L4. When the screen display height of the terminal is known, a vertical coordinate y can be derived from L3/L4, and this vertical coordinate y can be used as the vertical display position of the first display content on the real-time street view image.
Assuming that the target position is a building, the vertical coordinate y calculated in this way places the first display content at the top of the building. If it is desired that the first display content be displayed in the middle of the building, the above proportional relationship may be modified to (L3 + 1/2 × L4)/(1/2 × L4), as shown in fig. 4D. By analogy, if it is desired that the first display content be displayed at the upper 3/4 portion of the building, the above proportional relationship may be modified to (L3 + 1/4 × L4)/(3/4 × L4).
The horizontal view line 33 and the vertical view line 34 are determined by the view angle of the camera of the terminal and may be preset values. In addition, sub-steps 2) and 3) are only schematic illustrations; specific algorithm implementations based on this idea may differ and are not particularly limited here, one possible realization being sketched below. The height information of the target position may be acquired from the POI corresponding to the target position; if that POI does not include height information, a preset vertical coordinate may be used as the vertical display position of the first display content on the real-time street view image.
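As one possible realization of sub-steps 2) and 3), the sketch below maps the bearing to the target onto the screen x coordinate (proportional to where it falls within the horizontal view angle) and the elevation angle to the target's vertex onto the screen y coordinate (proportional to where it falls within the vertical view angle). The view angles, the flat-ground assumption, the top-left screen origin, and all names are illustrative choices rather than anything prescribed by the patent:

```python
import math

def horizontal_position(bearing_to_target_deg, heading_deg,
                        screen_width_px, h_view_angle_deg=120.0):
    """Screen x of the label: 0 at the left edge, screen_width_px at the right edge."""
    offset = (bearing_to_target_deg - heading_deg + 180.0) % 360.0 - 180.0
    ratio = offset / h_view_angle_deg + 0.5        # fraction across the horizontal view line
    return ratio * screen_width_px

def vertical_position(target_height_m, distance_m, screen_height_px,
                      v_view_angle_deg=60.0, anchor=1.0):
    """Screen y of the label, measured from the top of the image (horizon at mid-screen)."""
    elevation_deg = math.degrees(math.atan2(anchor * target_height_m, distance_m))
    ratio = min(1.0, elevation_deg / (v_view_angle_deg / 2.0))  # fraction of the upper half view
    return (0.5 - 0.5 * ratio) * screen_height_px
```

With anchor=1.0 the label lands at the vertex (top) of the target; anchor=0.5 places it at mid-height of the building, in the spirit of the proportional adjustments described above.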
4) And taking the first display content, the horizontal display position and the vertical display position as target labeling information.
At this point, the server may take the first display content, the horizontal display position of the first display content, and the vertical display position of the first display content as the target annotation information corresponding to the target position. Taking the case where the first display content is displayed in the middle of the building as an example, the finally determined target annotation information may be as shown in fig. 4E.
Since the position parameter of the terminal may be obtained at predetermined time intervals, the target annotation information is correspondingly also generated at predetermined time intervals. During this process the terminal may be in a non-stationary state accompanied by displacement or rotation, so when the position parameter of the terminal includes motion information and/or rotation angle information, the method preferably includes steps 508 to 509 to update the horizontal display position and/or the vertical display position in the target annotation information. The specific steps are as follows:
step 508, determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal; updating the horizontal display position in the target marking information according to the moving speed of the terminal in the horizontal direction;
the motion information is generally acceleration information of the terminal in three spatial directions, and the server can calculate the moving speed of the terminal in the horizontal direction according to the acceleration information of the terminal in the horizontal direction.
After calculating the moving speed of the terminal in the horizontal direction, the server updates the horizontal display position in the target annotation information according to this moving speed, so that the horizontal display position in the target annotation information follows the movement of the terminal in the horizontal direction. That is, when the terminal moves to the right, the horizontal display position in the target annotation information is adjusted to the left by a corresponding amount; when the terminal moves to the left, the horizontal display position is adjusted to the right by a corresponding amount, so that the target annotation information appears to 'stick' to the target position.
Step 509, determining a rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal; updating a vertical display position in the target marking information according to the rotation angle of the terminal in the vertical direction;
correspondingly, the rotation angle information is generally rotation angle values of the terminal in three spatial directions, and the server can calculate the rotation angle of the terminal in the vertical direction according to the rotation angle values of the terminal in the vertical direction.
After calculating the rotation angle of the terminal in the vertical direction, the server updates the vertical display position in the target annotation information according to this rotation angle, so that the vertical display position in the target annotation information follows the rotation of the terminal in the vertical direction. That is, when the terminal rotates upwards, the vertical display position in the target annotation information is adjusted downwards by a corresponding amount; when the terminal rotates downwards, the vertical display position is adjusted upwards by a corresponding amount, so that the target annotation information appears to 'stick' to the target position.
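A minimal sketch of both updates, assuming the horizontal speed is obtained by integrating the reported horizontal acceleration over the reporting interval, and assuming simple conversion factors from displacement and rotation to on-screen pixels; the factors and names are illustrative only:

```python
def horizontal_speed(prev_speed_mps, horiz_accel_mps2, dt_s):
    """Integrate the reported horizontal acceleration over one reporting interval."""
    return prev_speed_mps + horiz_accel_mps2 * dt_s

def update_annotation(x_px, y_px, horiz_speed_mps, dt_s, vert_rotation_deg,
                      px_per_meter=40.0, px_per_degree=12.0):
    """Shift a label opposite to the terminal's motion so it appears to stick to the target.

    y is measured from the top of the screen, so rotating the camera upward moves the
    label downward on screen (y increases), and moving right shifts the label left.
    """
    new_x = x_px - horiz_speed_mps * dt_s * px_per_meter
    new_y = y_px + vert_rotation_deg * px_per_degree
    return new_x, new_y

# Example: the terminal drifts right at 0.5 m/s for 0.2 s and tilts up by 2 degrees.
print(update_annotation(400.0, 300.0, 0.5, 0.2, 2.0))   # (396.0, 324.0)
```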
Step 510, determining interest point marking information according to the position parameters of the terminal and at least one interest point;
As can be seen from the above description, the target annotation information is generated for subsequent display only when the target position is within the visible area of the terminal. When the target position is located outside the visible area of the terminal, the navigation information contains only the navigation direction. Preferably, in order to provide more useful information to the user, the navigation information may further include interest point annotation information for labeling the location of at least one interest point other than the target position on the real-time street view image. The server determines the interest point annotation information according to the position parameter of the terminal and at least one interest point. Specifically, this may include the following sub-steps:
firstly, determining a visible area of the terminal on a map according to the position parameter of the terminal;
the visible area can be determined in the map by latitude and longitude information and orientation information of the terminal, and the specific determination process is not repeated.
Secondly, querying at least one interest point located in the visible area of the terminal, wherein the at least one interest point does not include the interest point corresponding to the target position;
thirdly, generating at least one piece of interest point annotation information according to the queried interest points.
The process of generating the interest point annotation information is basically the same as the process of generating the target annotation information, and may comprise the following sub-steps:
1) acquiring attribute information, longitude and latitude information and height information of the interest points;
2) generating second display content according to the attribute information of the interest points;
3) determining the horizontal display position of the second display content on the real-time street view image according to the position identified by the longitude and latitude information of the interest point in the visible area;
4) determining the vertical display position of the second display content on the real-time street view image according to the height information of the interest point;
5) and taking the second display content, the horizontal display position and the vertical display position as the interest point annotation information.
For example, in the example shown in fig. 4E, if there is a POI "book building" in the visible area 32 of the terminal, "book building" can be used as the second display content, and the horizontal display position x and the vertical display position y can be determined based on the latitude and longitude information and the height information of the POI "book building".
Like the target annotation information, when the position parameter of the terminal includes the motion information and/or the rotation angle information, steps 511 and 512 are preferably included to update the horizontal display position and/or the vertical display position in the interest point annotation information. The method comprises the following specific steps:
step 511, determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal; updating a horizontal display position in the interest point annotation information according to the moving speed of the terminal in the horizontal direction;
step 512, determining a rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal; and updating the vertical display position in the interest point annotation information according to the rotation angle of the terminal in the vertical direction.
For the detailed process, refer to steps 508 to 509; it is not described again here for brevity. It should be noted that, although the generation processes of the target annotation information and the interest point annotation information are described as two parts in this embodiment, in a specific implementation the two generation processes may be performed in parallel and may share some steps, for example: determining the visible area of the terminal on the map according to the position parameter of the terminal, determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal, and determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal. The execution order and the specific implementation manner of the two generation processes are not particularly limited here.
Step 513, the server sends navigation information to the terminal;
The server may transmit the navigation information to the terminal. Each time the server receives the position parameter of the terminal, it may determine the navigation information according to that position parameter and then send the navigation information to the terminal. If the terminal reports the position parameter only once, the server sends the navigation information only once; if the terminal reports the position parameter once every predetermined time interval, the server correspondingly sends the navigation information multiple times.
Step 514, the terminal displays the navigation information on the real-time street view image displayed by the terminal in an overlapping manner.
The terminal can receive the navigation information sent by the server; when displaying the real-time street view image, the terminal can add a transparent layer above the real-time street view image and then display the navigation information in that transparent layer. Specifically, the terminal normally displays the navigation direction in the navigation information at a position below the middle of the real-time street view image, as indicated by the arrow 35 in fig. 4F.
If the navigation information includes the target labeling information, the terminal further displays the first display content 36 in the target labeling information at a first designated position in the real-time street view image, wherein the first designated position is determined according to the horizontal display position and the vertical display position in the target labeling information.
If the navigation information further includes at least one piece of interest point annotation information, the terminal further displays the second display content 37 of each piece of interest point annotation information at a second designated position in the real-time street view image, where the second designated position is determined according to the horizontal display position and the vertical display position in that interest point annotation information. Fig. 4F illustrates only the case where the navigation information includes one piece of interest point annotation information; in practice there may be two or more.
In particular, if the display positions of the target annotation information and some interest point annotation information overlap, the annotation whose subject is farther away may be placed on a lower layer or hidden, according to the distance between the target position and the terminal and the distance between the interest point and the terminal. Likewise, if the display positions of two pieces of interest point annotation information overlap, the interest point annotation information of the farther interest point may be placed on a lower layer or hidden, according to the distances between the interest points and the terminal. In addition, the display style of the target annotation information may differ from that of the interest point annotation information so that the target annotation information is more striking and prominent; for example, the interest point annotation information may be displayed in a thin green text box while the target annotation information is displayed in a bold red text box.
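One simple way to realize this precedence rule is to keep nearer annotations in front and optionally drop a farther annotation whose label box overlaps a nearer one; a minimal sketch with illustrative data structures only:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    text: str
    x: float            # center of the label box on screen
    y: float
    width: float
    height: float
    distance_m: float   # distance between the labeled place and the terminal
    is_target: bool     # target annotation vs. ordinary point-of-interest annotation

def boxes_overlap(a, b):
    return (abs(a.x - b.x) < (a.width + b.width) / 2 and
            abs(a.y - b.y) < (a.height + b.height) / 2)

def layout(annotations, hide_overlapped=False):
    """Return annotations in draw order (farthest first), optionally hiding overlapped ones."""
    kept = []
    for ann in sorted(annotations, key=lambda a: a.distance_m):   # nearest first
        # A farther annotation overlapping an already-kept nearer one is hidden when
        # hide_overlapped is set; otherwise it is simply drawn underneath.
        if hide_overlapped and any(boxes_overlap(ann, near) for near in kept):
            continue
        kept.append(ann)
    return list(reversed(kept))   # draw farthest first so nearer labels end up on top
```

The is_target flag is kept only so the renderer can pick a more prominent style (for example the bold red text box mentioned above) for the target annotation.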
In a more preferred embodiment, the server may also provide a navigation voice to the terminal for use in conjunction with the navigation information described above.
In summary, in the navigation method provided by this embodiment, the navigation information is displayed superimposed on the real-time street view image displayed by the terminal; this solves the problem that the existing navigation method can only indicate a general position area and cannot help the user correctly find the target position when the user is unfamiliar with the surrounding environment; because the navigation information is combined with the real-time street view image, the target position guided by the navigation information is a specific position in the real-time street view image, so that the user can accurately find the target position.
In this embodiment, the display position of the target annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the target annotation information changes correspondingly as the terminal moves or rotates. In other words, even if the terminal moves or rotates, the target annotation information still accurately indicates the location of the target position, so that the user can still accurately find the target position with the help of the target annotation information in scenarios such as walking or riding in a car.
This embodiment also adds at least one piece of interest point annotation information to the navigation information, so that the navigation information provides more useful information and the user obtains more useful information during real-time street view navigation. Meanwhile, the display position of the interest point annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the interest point annotation information changes correspondingly as the terminal moves or rotates; the user can therefore still accurately find the interest point with the help of the interest point annotation information in scenarios such as walking or riding in a car. In other words, the interest point annotation information is effectively combined with the scene in the real-time street view image.
In this embodiment, the main calculation is performed by the server; the terminal only needs to report the position parameter of the target position and the position parameter of the terminal to the server and then receives the navigation information for display. Because the terminal does not need to download map data in the whole process, the requirement on the computing performance of the terminal is low compared with the existing 2D navigation method and the virtual 4D navigation method, and the data traffic required for communication between the terminal and the server is very small, while the overall navigation effect is superior to that of the existing 2D navigation method and the virtual 4D navigation method. When the number of terminals using the real-time street view navigation function is very large, the demand on the service carrying capacity of the whole mobile communication network or the Internet is also greatly reduced.
Example four
Referring to fig. 6, a block diagram of a navigation device according to a fourth embodiment of the present invention is shown. The navigation device may be implemented as all or part of the terminal by software, hardware or a combination of both. The navigation device may include: a first obtaining module 610, a second obtaining module 620, an information determining module 630 and an information displaying module 640;
a first obtaining module 610, configured to obtain a location parameter of a target location;
a second obtaining module 620, configured to obtain a location parameter of the terminal;
an information determining module 630, configured to determine navigation information according to the location parameter of the target location and the location parameter of the terminal;
an information display module 640, configured to display the navigation information determined by the information determination module 630 in an overlapping manner on the real-time street view image displayed by the terminal.
In a more preferred embodiment, the navigation information includes navigation directions and/or destination label information, and the information determining module includes: a direction determination submodule and/or a destination determination submodule;
the direction determining submodule is used for determining a navigation direction according to the position parameter of the target position and the position parameter of the terminal, and the navigation direction is used for indicating the direction facing the target position in the real-time street view image;
and the destination determining submodule is used for determining destination marking information according to the position parameter of the target position and the position parameter of the terminal, and the destination marking information is used for identifying the location of the target position in the real-time street view image.
In a more preferred embodiment, the location parameter of the target location includes latitude and longitude information of the target location, the location information of the terminal includes latitude and longitude information and orientation information of the terminal, and the direction determining sub-module includes:
a trajectory calculation unit and a direction generation unit;
the track calculation unit is used for calculating a navigation track on a map according to the longitude and latitude information of the target position and the longitude and latitude information of the terminal;
and the direction generating unit is used for generating the navigation direction according to the orientation information of the terminal and the navigation track.
In a more preferred embodiment, the location parameter of the target location includes latitude and longitude information of the target location, the location information of the terminal includes latitude and longitude information and orientation information of the terminal, and the destination determining sub-module includes:
an area determination unit, a position detection unit and an information generation unit;
the area determining unit is used for determining a visible area of the terminal on the map according to the longitude and latitude information and the orientation information of the terminal;
the position detection unit is used for detecting whether the target position is positioned in the visible area according to the longitude and latitude information of the target position;
and the information generating unit is used for generating the target labeling information if the detection result shows that the target position is located in the visible area.
In a more preferred embodiment, the position parameter of the target position further includes attribute information and altitude information of the target position, and the information generating unit includes:
the system comprises a content generation subunit, a horizontal generation subunit, a vertical generation subunit and a target labeling subunit;
the content generating subunit is configured to generate first display content according to the attribute information of the target location;
the horizontal generating subunit is configured to determine, according to the identified position of the latitude and longitude information of the target position in the visible area, a horizontal display position of the first display content on the real-time street view image;
the vertical generation subunit is configured to determine, according to the height information of the target position, a vertical display position of the first display content on the real-time street view image;
and the target labeling subunit is configured to use the first display content, the horizontal display position, and the vertical display position as the target labeling information.
In a more preferred embodiment, the location parameter of the terminal further includes motion information of the terminal, and the destination determining sub-module further includes:
a first horizontal velocity determining unit and a first horizontal position updating unit;
the first horizontal speed determining unit is used for determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal;
and the first horizontal position updating unit is used for updating the horizontal display position in the target marking information according to the moving speed of the terminal in the horizontal direction.
In a more preferred embodiment, the location parameter of the terminal further includes rotation angle information of the terminal, and the destination determining sub-module further includes:
a first vertical angle determination unit and a first vertical position update unit;
the first vertical angle determining unit is used for determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal;
the first vertical position updating unit is used for updating the vertical display position in the target labeling information according to the rotation angle of the terminal in the vertical direction.
In a more preferred embodiment, the terminal may further include:
the system comprises a region determining module, an interest point inquiring module, an information generating module and an interest point displaying module;
the area determining module is used for determining a visible area of the terminal on a map according to the position parameter of the terminal;
the interest point query module is used for querying at least one interest point located in a visible area of the terminal;
the information generation module is used for generating at least one interest point annotation information according to the inquired interest points;
and the interest point display module is used for displaying the interest point annotation information on the real-time street view image in an overlapping manner.
In a more preferred embodiment, the information generating module includes: the system comprises an information acquisition unit, a content generation unit, a horizontal generation unit, a vertical generation unit and a target labeling unit;
the information acquisition unit is used for acquiring the attribute information, the longitude and latitude information and the height information of the interest points;
the content generating unit is used for generating second display content according to the attribute information of the interest points;
the horizontal generating unit is used for determining the horizontal display position of the second display content on the real-time street view image according to the position identified by the longitude and latitude information of the interest point in the visible area;
the vertical generation unit is used for determining the vertical display position of the second display content on the real-time street view image according to the height information of the interest point;
the target labeling unit is configured to use the second display content, the horizontal display position, and the vertical display position as the interest point annotation information.
In a more preferred embodiment, the location parameter of the terminal further includes motion information of the terminal, and the information generating module further includes:
a second horizontal velocity determining unit and a second horizontal position updating unit;
the second horizontal speed determining unit is used for determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal;
and the second horizontal position updating unit is used for updating the horizontal display position in the interest point mark information according to the moving speed of the terminal in the horizontal direction.
In a more preferred embodiment, the position parameter of the terminal further includes rotation angle information of the terminal, and the information generating module further includes:
a second vertical angle determination unit and a second vertical position update unit;
the second vertical angle determining unit is used for determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal;
and the second vertical position updating unit is used for updating the vertical display position in the interest point annotation information according to the rotation angle of the terminal in the vertical direction.
In a more preferred embodiment, the first obtaining module includes:
the device comprises a first receiving unit, an interest point inquiring unit and a parameter determining unit;
the first receiving unit is used for receiving attribute information of a target position input through characters or voice;
the interest point query unit is used for querying interest points corresponding to the target position according to the attribute information of the target position;
and the parameter determining unit is used for taking the searched longitude and latitude information in the interest point corresponding to the target position as the position parameter of the target position.
In a more preferred embodiment, the second obtaining module includes:
the device comprises a longitude and latitude acquisition unit, an orientation acquisition unit, an acceleration acquisition unit and an angle acquisition unit;
the longitude and latitude acquisition unit is used for acquiring longitude and latitude information through a Global Positioning System (GPS) receiver in the terminal and taking the longitude and latitude information as a part of the position parameters;
the orientation acquisition unit is used for acquiring orientation information through an electronic compass sensor in the terminal, and taking the orientation information as another part of the position parameters;
the acceleration acquisition unit is used for acquiring the acceleration information through a three-axis acceleration sensor in the terminal if the position parameters further comprise acceleration information;
and the angle acquisition unit is used for acquiring the rotation angle information through a gyroscope sensor in the terminal if the position parameters further comprise rotation angle information.
In a more preferred embodiment, the terminal further includes: the map request module and the map receiving module;
the map request module is used for requesting map data of the map from a server;
the map receiving module is used for receiving the map data of the map fed back by the server.
In summary, the navigation device provided in the embodiment of the present invention displays the navigation information superimposed on the real-time street view image displayed by the terminal; this solves the problem that the existing navigation method can only indicate a general position area and cannot help the user correctly find the target position when the user is unfamiliar with the surrounding environment; because the navigation information is combined with the real-time street view image, the target position guided by the navigation information is a specific position in the real-time street view image, so that the user can accurately find the target position.
In this embodiment, the display position of the target annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the target annotation information changes correspondingly as the terminal moves or rotates. In other words, even if the terminal moves or rotates, the target annotation information still accurately indicates the location of the target position, so that the user can still accurately find the target position with the help of the target annotation information in scenarios such as walking or riding in a car.
This embodiment also adds at least one piece of interest point annotation information to the navigation information, so that the navigation information provides more useful information and the user obtains more useful information during real-time street view navigation. Meanwhile, the display position of the interest point annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the interest point annotation information changes correspondingly as the terminal moves or rotates; the user can therefore still accurately find the interest point with the help of the interest point annotation information in scenarios such as walking or riding in a car. In other words, the interest point annotation information is effectively combined with the scene in the real-time street view image.
Example five
Referring to fig. 7, a block diagram of a navigation device according to a fifth embodiment of the present invention is shown. The navigation device may be implemented in software, hardware, or a combination of both as all or part of a server, which may be a server in a navigation system. The navigation device may include: a first obtaining module 610, a second obtaining module 620, an information determining module 630 and an information displaying module 640;
a first obtaining module 610, configured to obtain a location parameter of a target location;
a second obtaining module 620, configured to obtain a location parameter of the terminal;
an information determining module 630, configured to determine navigation information according to the location parameter of the target location and the location parameter of the terminal;
an information display module 640, configured to display the navigation information determined by the information determination module 630 in an overlapping manner on the real-time street view image displayed by the terminal.
In a more preferred embodiment, the navigation information includes navigation directions and/or destination label information, and the information determining module includes: a direction determination submodule and/or a destination determination submodule;
the direction determining submodule is used for determining a navigation direction according to the position parameter of the target position and the position parameter of the terminal, and the navigation direction is used for indicating the direction facing the target position in the real-time street view image;
and the destination determining submodule is used for determining destination marking information according to the position parameter of the target position and the position parameter of the terminal, and the destination marking information is used for identifying the location of the target position in the real-time street view image.
In a more preferred embodiment, the location parameter of the target location includes latitude and longitude information of the target location, the location information of the terminal includes latitude and longitude information and orientation information of the terminal, and the direction determining sub-module includes:
a trajectory calculation unit and a direction generation unit;
the track calculation unit is used for calculating a navigation track on a map according to the longitude and latitude information of the target position and the longitude and latitude information of the terminal;
and the direction generating unit is used for generating the navigation direction according to the orientation information of the terminal and the navigation track.
In a more preferred embodiment, the location parameter of the target location includes latitude and longitude information of the target location, the location information of the terminal includes latitude and longitude information and orientation information of the terminal, and the destination determining sub-module includes:
an area determination unit, a position detection unit and an information generation unit;
the area determining unit is used for determining a visible area of the terminal on the map according to the longitude and latitude information and the orientation information of the terminal;
the position detection unit is used for detecting whether the target position is positioned in the visible area according to the longitude and latitude information of the target position;
and the information generating unit is used for generating the target labeling information if the detection result shows that the target position is located in the visible area.
In a more preferred embodiment, the position parameter of the target position further includes attribute information and altitude information of the target position, and the information generating unit includes:
the system comprises a content generation subunit, a horizontal generation subunit, a vertical generation subunit and a target labeling subunit;
the content generating subunit is configured to generate first display content according to the attribute information of the target location;
the horizontal generating subunit is configured to determine, according to the identified position of the latitude and longitude information of the target position in the visible area, a horizontal display position of the first display content on the real-time street view image;
the vertical generation subunit is configured to determine, according to the height information of the target position, a vertical display position of the first display content on the real-time street view image;
and the target labeling subunit is configured to use the first display content, the horizontal display position, and the vertical display position as the target labeling information.
In a more preferred embodiment, the location parameter of the terminal further includes motion information of the terminal, and the destination determining sub-module further includes:
a first horizontal velocity determining unit and a first horizontal position updating unit;
the first horizontal speed determining unit is used for determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal;
and the first horizontal position updating unit is used for updating the horizontal display position in the target marking information according to the moving speed of the terminal in the horizontal direction.
In a more preferred embodiment, the location parameter of the terminal further includes rotation angle information of the terminal, and the destination determining sub-module further includes:
a first vertical angle determination unit and a first vertical position update unit;
the first vertical angle determining unit is used for determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal;
the first vertical position updating unit is used for updating the vertical display position in the target labeling information according to the rotation angle of the terminal in the vertical direction.
In a more preferred embodiment, the server may further include:
the system comprises a region determining module, an interest point inquiring module, an information generating module and an interest point displaying module;
the area determining module is used for determining a visible area of the terminal on a map according to the position parameter of the terminal;
the interest point query module is used for querying at least one interest point located in a visible area of the terminal;
the information generation module is used for generating at least one interest point annotation information according to the inquired interest points;
and the interest point display module is used for displaying the interest point annotation information on the real-time street view image in an overlapping manner.
In a more preferred embodiment, the information generating module includes: the system comprises an information acquisition unit, a content generation unit, a horizontal generation unit, a vertical generation unit and a target labeling unit;
the information acquisition unit is used for acquiring the attribute information, the longitude and latitude information and the height information of the interest points;
the content generating unit is used for generating second display content according to the attribute information of the interest points;
the horizontal generating unit is used for determining the horizontal display position of the second display content on the real-time street view image according to the position identified by the longitude and latitude information of the interest point in the visible area;
the vertical generation unit is used for determining the vertical display position of the second display content on the real-time street view image according to the height information of the interest point;
the target labeling unit is configured to use the second display content, the horizontal display position, and the vertical display position as the interest point annotation information.
In a more preferred embodiment, the location parameter of the terminal further includes motion information of the terminal, and the information generating module further includes:
a second horizontal velocity determining unit and a second horizontal position updating unit;
the second horizontal speed determining unit is used for determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal;
and the second horizontal position updating unit is used for updating the horizontal display position in the interest point mark information according to the moving speed of the terminal in the horizontal direction.
In a more preferred embodiment, the position parameter of the terminal further includes rotation angle information of the terminal, and the information generating module further includes:
a second vertical angle determination unit and a second vertical position update unit;
the second vertical angle determining unit is used for determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal;
and the second vertical position updating unit is used for updating the vertical display position in the interest point annotation information according to the rotation angle of the terminal in the vertical direction.
In a more preferred embodiment, the first obtaining module includes:
the device comprises a second receiving unit, an interest point inquiring unit and a parameter determining unit;
the second receiving unit is used for receiving attribute information of a target position sent by a terminal, wherein the attribute information of the target position is obtained by receiving character input or voice input by the terminal;
the interest point query unit is used for querying interest points corresponding to the target position according to the attribute information of the target position;
and the parameter determining unit is used for taking the searched longitude and latitude information in the interest point corresponding to the target position as the position parameter of the target position.
In a more preferred embodiment, the second obtaining module is specifically configured to receive a location parameter sent by a terminal, where the location parameter includes:
latitude and longitude information and orientation information; or,
latitude and longitude information, orientation information and acceleration information; or,
latitude and longitude information, orientation information and rotation angle information; or,
latitude and longitude information, orientation information, acceleration information and rotation angle information;
the latitude and longitude information is information collected by the terminal through a Global Positioning System (GPS) receiver; the orientation information is information acquired by the terminal through an electronic compass sensor; the acceleration information is information acquired by the terminal through a three-axis acceleration sensor; the rotation angle information is information acquired by the terminal through a gyroscope sensor.
In a more preferred embodiment, the information display module is specifically configured to send the navigation information to the terminal, so that the terminal displays the navigation information in an overlapping manner on the displayed real-time street view image after receiving the navigation information.
In summary, the navigation device provided in this embodiment displays the navigation information superimposed on the real-time street view image displayed by the terminal; this solves the problem that the existing navigation method can only indicate a general position area and cannot help the user correctly find the target position when the user is unfamiliar with the surrounding environment; because the navigation information is combined with the real-time street view image, the target position guided by the navigation information is a specific position in the real-time street view image, so that the user can accurately find the target position.
In this embodiment, the display position of the target annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the target annotation information changes correspondingly as the terminal moves or rotates. In other words, even if the terminal moves or rotates, the target annotation information still accurately indicates the location of the target position, so that the user can still accurately find the target position with the help of the target annotation information in scenarios such as walking or riding in a car.
This embodiment also adds at least one piece of interest point annotation information to the navigation information, so that the navigation information provides more useful information and the user obtains more useful information during real-time street view navigation. Meanwhile, the display position of the interest point annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the interest point annotation information changes correspondingly as the terminal moves or rotates; the user can therefore still accurately find the interest point with the help of the interest point annotation information in scenarios such as walking or riding in a car. In other words, the interest point annotation information is effectively combined with the scene in the real-time street view image.
In this embodiment, the main calculation is performed by the server; the terminal only needs to report the position parameter of the target position and the position parameter of the terminal to the server and then receives the navigation information for display. Because the terminal does not need to download map data in the whole process, the requirement on the computing performance of the terminal is low compared with the existing 2D navigation method and the virtual 4D navigation method, and the data traffic required for communication between the terminal and the server is very small, while the overall navigation effect is superior to that of the existing 2D navigation method and the virtual 4D navigation method. When the number of terminals using the real-time street view navigation function is very large, the demand on the service carrying capacity of the whole mobile communication network or the Internet is also greatly reduced.
Example six
Referring to fig. 8, a block diagram of a navigation system according to a sixth embodiment of the present invention is shown; the navigation system may include a terminal 810 and a server 820. The server may be the server provided in embodiment five and more preferred embodiments provided based on embodiment five.
In summary, the navigation system provided in this embodiment displays the navigation information superimposed on the real-time street view image displayed by the terminal; this solves the problem that the existing navigation method can only indicate a general position area and cannot help the user correctly find the target position when the user is unfamiliar with the surrounding environment; because the navigation information is combined with the real-time street view image, the target position guided by the navigation information is a specific position in the real-time street view image, so that the user can accurately find the target position.
In this embodiment, the display position of the target annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the target annotation information changes correspondingly as the terminal moves or rotates. In other words, even if the terminal moves or rotates, the target annotation information still accurately indicates the location of the target position, so that the user can still accurately find the target position with the help of the target annotation information in scenarios such as walking or riding in a car.
This embodiment also adds at least one piece of interest point annotation information to the navigation information, so that the navigation information provides more useful information and the user obtains more useful information during real-time street view navigation. Meanwhile, the display position of the interest point annotation information in the navigation information is updated according to the motion information and the rotation angle information of the terminal, so that the display position of the interest point annotation information changes correspondingly as the terminal moves or rotates; the user can therefore still accurately find the interest point with the help of the interest point annotation information in scenarios such as walking or riding in a car. In other words, the interest point annotation information is effectively combined with the scene in the real-time street view image.
In this embodiment, the main calculation is performed by the server; the terminal only needs to report the position parameter of the target position and the position parameter of the terminal to the server and then receives the navigation information for display. Because the terminal does not need to download map data in the whole process, the requirement on the computing performance of the terminal is low compared with the existing 2D navigation method and the virtual 4D navigation method, and the data traffic required for communication between the terminal and the server is very small, while the overall navigation effect is superior to that of the existing 2D navigation method and the virtual 4D navigation method. When the number of terminals using the real-time street view navigation function is very large, the demand on the service carrying capacity of the whole mobile communication network or the Internet is also greatly reduced.
It should be noted that: in the navigation method provided by the above embodiment, when real-time street view navigation is performed, only the division of the functional modules is exemplified, and in practical application, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the navigation device and the navigation method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments in detail and are not described herein again.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (39)

1. A method of navigation, the method comprising:
acquiring a position parameter of a target position;
acquiring a position parameter of a terminal;
determining navigation information according to the position parameters of the target position and the position parameters of the terminal;
and displaying the navigation information in an overlapping manner on the real-time street view image displayed by the terminal.
2. The navigation method according to claim 1, wherein the navigation information includes navigation directions and/or destination label information, and the determining the navigation information according to the location parameter of the target location and the location parameter of the terminal includes:
determining a navigation direction according to the position parameter of the target position and the position parameter of the terminal, wherein the navigation direction is used for indicating the direction towards the target position in the real-time street view image; and/or,
and determining target marking information according to the position parameter of the target position and the position parameter of the terminal, wherein the target marking information is used for identifying the location of the target position in the real-time street view image.
3. The navigation method according to claim 2, wherein the location parameter of the target location includes latitude and longitude information of the target location, the location information of the terminal includes latitude and longitude information and orientation information of the terminal, and the determining the navigation direction according to the location parameter of the target location and the location parameter of the terminal includes:
calculating a navigation track on a map according to the longitude and latitude information of the target position and the longitude and latitude information of the terminal;
and generating the navigation direction according to the orientation information of the terminal and the navigation track.
4. The navigation method according to claim 2, wherein the location parameter of the target location includes latitude and longitude information of the target location, the location information of the terminal includes latitude and longitude information and orientation information of the terminal, and the determining the destination label information according to the location parameter of the target location and the location parameter of the terminal includes:
determining a visible area of the terminal on the map according to the longitude and latitude information and the orientation information of the terminal;
detecting, according to the longitude and latitude information of the target position, whether the target position is located in the visible area;
and if the detection result is that the target position is located in the visible area, generating the target annotation information.
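A minimal sketch of the visibility test described in claim 4, assuming the visible area is modelled as a viewing sector centred on the terminal; the opening angle, the radius and all names are invented for illustration and are not stated in the application.

    import math

    EARTH_RADIUS_M = 6371000.0

    def bearing_deg(lat1, lon1, lat2, lon2):
        # Initial bearing from point 1 to point 2, degrees clockwise from north.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

    def distance_m(lat1, lon1, lat2, lon2):
        # Haversine distance in metres.
        dphi = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dlon / 2) ** 2)
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def in_visible_area(t_lat, t_lon, heading_deg, p_lat, p_lon, fov_deg=60.0, max_range_m=500.0):
        # A point is visible if its bearing lies within the field of view and it is close enough.
        offset = (bearing_deg(t_lat, t_lon, p_lat, p_lon) - heading_deg + 540.0) % 360.0 - 180.0
        return abs(offset) <= fov_deg / 2.0 and distance_m(t_lat, t_lon, p_lat, p_lon) <= max_range_m

    # Terminal facing east; a point roughly 100 m to the east falls inside the sector.
    print(in_visible_area(22.5400, 114.0500, 90.0, 22.5400, 114.0510))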
5. The navigation method according to claim 4, wherein the position parameter of the target position further comprises attribute information and height information of the target position, and the generating the target annotation information comprises:
generating first display content according to the attribute information of the target position;
determining the horizontal display position of the first display content on the real-time street view image according to the position identified by the longitude and latitude information of the target position in the visible area;
determining the vertical display position of the first display content on the real-time street view image according to the height information of the target position;
and taking the first display content, the horizontal display position and the vertical display position as the target annotation information.
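One way to picture the mapping in claim 5 is a pinhole-style projection: the horizontal screen coordinate follows the target's bearing offset within the field of view, and the vertical coordinate follows the elevation angle implied by the target's height. This is only a sketch under those assumptions; the field-of-view values, the screen size and the helper name are hypothetical.

    import math

    def annotation_screen_position(relative_bearing_deg, distance_m, target_height_m,
                                   camera_height_m=1.6, h_fov_deg=60.0, v_fov_deg=40.0,
                                   screen_w=1080, screen_h=1920):
        # Horizontal: screen centre when the target is dead ahead, edges at +/- half the FOV.
        x = screen_w * (0.5 + relative_bearing_deg / h_fov_deg)
        # Vertical: elevation angle of the target's top relative to the camera height.
        elevation_deg = math.degrees(math.atan2(target_height_m - camera_height_m, distance_m))
        y = screen_h * (0.5 - elevation_deg / v_fov_deg)
        return int(round(x)), int(round(y))

    # First display content plus its two display positions together form the annotation.
    annotation = {"text": "Coffee Shop, 120 m",
                  "xy": annotation_screen_position(10.0, 120.0, 8.0)}
    print(annotation)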
6. The navigation method according to claim 5, wherein the position parameter of the terminal further comprises motion information of the terminal, and after the determining the horizontal display position of the first display content on the real-time street view image according to the position of the target position within the visible area, the method further comprises:
determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal;
and updating the horizontal display position in the target annotation information according to the moving speed of the terminal in the horizontal direction.
7. The navigation method according to claim 5, wherein the position parameter of the terminal further comprises rotation angle information of the terminal, and after the determining the vertical display position of the first display content on the real-time street view image according to the height information of the target position, the method further comprises:
determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal;
and updating the vertical display position in the target annotation information according to the rotation angle of the terminal in the vertical direction.
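Claims 6 and 7 describe keeping the overlay in place between full recomputations. A small sketch of such an update step follows; the pixel-per-metre and pixel-per-degree factors are hypothetical tuning constants, and treating the gyroscope reading as a pitch rate is an assumption of this illustration.

    def update_annotation(x, y, horizontal_speed_mps, pitch_rate_dps, dt_s,
                          px_per_metre=4.0, px_per_degree=12.0):
        # Shift the label against the terminal's horizontal motion (claim 6) and
        # against its rotation in the vertical direction (claim 7).
        x_new = x - horizontal_speed_mps * dt_s * px_per_metre
        y_new = y + pitch_rate_dps * dt_s * px_per_degree
        return x_new, y_new

    # 0.1 s later: walking at 1.2 m/s while tilting the terminal up slightly.
    print(update_annotation(720, 813, horizontal_speed_mps=1.2, pitch_rate_dps=-5.0, dt_s=0.1))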
8. The navigation method of claim 1, wherein the navigation information further comprises interest point annotation information, and the method further comprises:
determining interest point annotation information according to the position parameter of the terminal and at least one interest point, wherein the interest point annotation information is used for identifying, on the real-time street view image, the location of the at least one interest point other than the target position.
9. The navigation method according to claim 8, wherein the determining the interest point annotation information according to the position parameter of the terminal and the at least one interest point comprises:
determining a visible area of the terminal on a map according to the position parameter of the terminal;
querying at least one interest point located in the visible area of the terminal, wherein the at least one interest point does not include the interest point corresponding to the target position;
and generating at least one piece of interest point annotation information according to the queried interest point.
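A compact sketch of the query step in claim 9, with invented data and helper names; the visibility predicate stands in for whatever visible-area model an implementation actually uses.

    def query_visible_pois(pois, is_visible, target_poi_id):
        # pois: dicts with 'id', 'name', 'lat', 'lon'; is_visible: (lat, lon) -> bool.
        return [p for p in pois
                if p["id"] != target_poi_id and is_visible(p["lat"], p["lon"])]

    pois = [
        {"id": 1, "name": "Museum", "lat": 22.5410, "lon": 114.0560},
        {"id": 2, "name": "Cafe",   "lat": 22.5405, "lon": 114.0520},
        {"id": 3, "name": "Bank",   "lat": 22.5300, "lon": 114.0100},
    ]
    # Pretend the visible area is "longitude greater than 114.05"; POI 1 is the target itself.
    print(query_visible_pois(pois, is_visible=lambda lat, lon: lon > 114.05, target_poi_id=1))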
10. The navigation method according to claim 9, wherein the generating at least one piece of interest point annotation information according to the queried interest point comprises:
acquiring attribute information, longitude and latitude information and height information of the interest point;
generating second display content according to the attribute information of the interest point;
determining the horizontal display position of the second display content on the real-time street view image according to the position identified by the longitude and latitude information of the interest point in the visible area;
determining the vertical display position of the second display content on the real-time street view image according to the height information of the interest point;
and taking the second display content, the horizontal display position and the vertical display position as the interest point annotation information.
11. The navigation method according to claim 10, wherein the position parameter of the terminal further comprises motion information of the terminal, and after the determining the horizontal display position of the second display content on the real-time street view image according to the position identified by the longitude and latitude information of the interest point in the visible area, the navigation method further comprises:
determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal;
and updating the horizontal display position in the interest point annotation information according to the moving speed of the terminal in the horizontal direction.
12. The navigation method according to claim 10, wherein the position parameter of the terminal further comprises rotation angle information of the terminal, and after the determining the vertical display position of the second display content on the real-time street view image according to the height information of the interest point, the navigation method further comprises:
determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal;
and updating the vertical display position in the interest point annotation information according to the rotation angle of the terminal in the vertical direction.
13. The navigation method according to any one of claims 1 to 12, wherein the acquiring the position parameter of the target position comprises:
receiving attribute information of the target position entered by text input or voice input;
querying an interest point corresponding to the target position according to the attribute information of the target position;
and using the longitude and latitude information of the queried interest point corresponding to the target position as the position parameter of the target position.
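The lookup in claim 13 amounts to resolving a user-supplied name to the stored coordinates of a point of interest. A toy sketch with a hypothetical in-memory index follows; a real system would query a POI database or geocoding service instead.

    POI_INDEX = {
        "central station": {"lat": 22.5329, "lon": 114.1175, "height_m": 25.0},
        "city museum":     {"lat": 22.5501, "lon": 114.0632, "height_m": 18.0},
    }

    def resolve_target(attribute_text):
        # Normalise the typed or speech-recognised name and look up its coordinates.
        poi = POI_INDEX.get(attribute_text.strip().lower())
        if poi is None:
            raise LookupError(f"no point of interest matches {attribute_text!r}")
        return poi["lat"], poi["lon"]

    print(resolve_target("City Museum"))  # -> (22.5501, 114.0632)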
14. The navigation method according to any one of claims 1 to 12, wherein the acquiring the position parameter of the terminal comprises:
acquiring longitude and latitude information through a Global Positioning System (GPS) receiver in the terminal, and using the longitude and latitude information as a part of the position parameter;
acquiring orientation information through an electronic compass sensor in the terminal, and using the orientation information as another part of the position parameter;
if the position parameter further comprises motion information, acquiring the motion information through a three-axis acceleration sensor in the terminal;
and if the position parameter further comprises rotation angle information, acquiring the rotation angle information through a gyroscope sensor in the terminal.
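The position parameter assembled in claim 14 can be thought of as a record with two mandatory parts and two optional parts. A sketch of such a container follows; the field names and types are assumptions, and the sensor readings themselves would come from the platform's sensor APIs.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class PositionParameter:
        latitude: float                                                 # from the GPS receiver
        longitude: float                                                # from the GPS receiver
        heading_deg: float                                              # from the electronic compass sensor
        acceleration: Optional[Tuple[float, float, float]] = None       # three-axis acceleration sensor
        rotation_rate_dps: Optional[Tuple[float, float, float]] = None  # gyroscope sensor

    param = PositionParameter(22.5405, 114.0498, heading_deg=87.5,
                              acceleration=(0.0, 0.1, 9.8))
    print(param)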
15. The navigation method according to any one of claims 3 to 7 and 9 to 12, wherein before determining the navigation information according to the position parameter of the target position and the position parameter of the terminal, the method further comprises:
requesting the map from a server;
and receiving and storing the map fed back by the server.
16. The navigation method according to any one of claims 1 to 12, wherein the acquiring the position parameter of the target position comprises:
receiving attribute information of the target position sent by a terminal, wherein the attribute information of the target position is obtained by the terminal through text input or voice input;
querying an interest point corresponding to the target position according to the attribute information of the target position;
and using the longitude and latitude information of the queried interest point corresponding to the target position as the position parameter of the target position.
17. The navigation method according to any one of claims 1 to 12, wherein the acquiring the position parameter of the terminal comprises:
receiving a position parameter sent by a terminal, wherein the position parameter comprises:
latitude and longitude information and orientation information; or,
latitude and longitude information, orientation information and motion information; or,
latitude and longitude information, orientation information and rotation angle information; or,
latitude and longitude information, orientation information, motion information and rotation angle information;
the latitude and longitude information is information collected by the terminal through a Global Positioning System (GPS) receiver; the orientation information is information acquired by the terminal through an electronic compass sensor; the motion information is information acquired by the terminal through a three-axis acceleration sensor; the rotation angle information is information acquired by the terminal through a gyroscope sensor.
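On the receiving side, claim 17 allows four field combinations. A minimal validation sketch under an assumed message layout follows; the field names are hypothetical and are not a format defined by the application.

    REQUIRED = {"latitude", "longitude", "heading_deg"}
    OPTIONAL = {"motion", "rotation_angle"}

    def parse_position_parameter(message: dict) -> dict:
        # Longitude/latitude and orientation must be present; motion and rotation angle may be.
        missing = REQUIRED - message.keys()
        if missing:
            raise ValueError(f"missing mandatory fields: {sorted(missing)}")
        unknown = message.keys() - REQUIRED - OPTIONAL
        if unknown:
            raise ValueError(f"unexpected fields: {sorted(unknown)}")
        return message

    print(parse_position_parameter({"latitude": 22.54, "longitude": 114.05,
                                    "heading_deg": 90.0, "motion": [0.0, 0.1, 9.8]}))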
18. The navigation method according to any one of claims 1 to 12, wherein the superimposing the navigation information on the real-time street view image displayed by the terminal comprises:
sending the navigation information to the terminal, so that the terminal superimposes the navigation information on the displayed real-time street view image after receiving the navigation information.
19. A navigation device, characterized in that the device comprises:
the first obtaining module is used for acquiring a position parameter of a target position;
the second obtaining module is used for acquiring a position parameter of a terminal;
the information determining module is used for determining navigation information according to the position parameter of the target position and the position parameter of the terminal;
and the information display module is used for superimposing the navigation information determined by the information determining module on a real-time street view image displayed by the terminal.
20. The navigation device according to claim 19, wherein the navigation information comprises a navigation direction and/or target annotation information, and the information determining module comprises: a direction determining submodule and/or a target determining submodule;
the direction determining submodule is used for determining a navigation direction according to the position parameter of the target position and the position parameter of the terminal, wherein the navigation direction is used for indicating the direction towards the target position in the real-time street view image;
and the target determining submodule is used for determining target annotation information according to the position parameter of the target position and the position parameter of the terminal, wherein the target annotation information is used for identifying the location of the target position in the real-time street view image.
21. The navigation device according to claim 20, wherein the position parameter of the target position comprises longitude and latitude information of the target position, the position parameter of the terminal comprises longitude and latitude information and orientation information of the terminal, and the direction determining submodule comprises:
a track calculation unit and a direction generation unit;
the track calculation unit is used for calculating a navigation track on a map according to the longitude and latitude information of the target position and the longitude and latitude information of the terminal;
and the direction generation unit is used for generating the navigation direction according to the orientation information of the terminal and the navigation track.
22. The navigation device according to claim 20, wherein the position parameter of the target position comprises longitude and latitude information of the target position, the position parameter of the terminal comprises longitude and latitude information and orientation information of the terminal, and the target determining submodule comprises:
a first area determining unit, a target position detecting unit and a first information generating unit;
the first area determining unit is used for determining a visible area of the terminal on the map according to the longitude and latitude information and the orientation information of the terminal;
the target position detecting unit is used for detecting, according to the longitude and latitude information of the target position, whether the target position is located in the visible area;
and the first information generating unit is used for generating the target annotation information if the detection result is that the target position is located in the visible area.
23. The navigation device according to claim 22, wherein the position parameter of the target position further comprises attribute information and height information of the target position, and the first information generating unit comprises:
a first content generating subunit, a first horizontal generating subunit, a first vertical generating subunit and a target annotation subunit;
the first content generating subunit is used for generating first display content according to the attribute information of the target position;
the first horizontal generating subunit is used for determining the horizontal display position of the first display content on the real-time street view image according to the position identified by the longitude and latitude information of the target position in the visible area;
the first vertical generating subunit is used for determining the vertical display position of the first display content on the real-time street view image according to the height information of the target position;
and the target annotation subunit is used for taking the first display content, the horizontal display position and the vertical display position as the target annotation information.
24. The navigation device according to claim 23, wherein the position parameter of the terminal further comprises motion information of the terminal, and the target determining submodule further comprises:
a first horizontal speed determining unit and a first horizontal position updating unit;
the first horizontal speed determining unit is used for determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal;
and the first horizontal position updating unit is used for updating the horizontal display position in the target annotation information according to the moving speed of the terminal in the horizontal direction.
25. The navigation device according to claim 23, wherein the position parameter of the terminal further comprises rotation angle information of the terminal, and the target determining submodule further comprises:
a first vertical angle determining unit and a first vertical position updating unit;
the first vertical angle determining unit is used for determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal;
and the first vertical position updating unit is used for updating the vertical display position in the target annotation information according to the rotation angle of the terminal in the vertical direction.
26. The navigation device according to claim 19, wherein the navigation information further comprises interest point annotation information, and the information determining module further comprises: an interest point determining submodule;
the interest point determining submodule is used for determining interest point annotation information according to the position parameter of the terminal and at least one interest point, wherein the interest point annotation information is used for identifying, on the real-time street view image, the location of the at least one interest point other than the target position.
27. The navigation device according to claim 26, wherein the interest point determining submodule comprises:
a second area determining unit, an interest point query unit and a second information generating unit;
the second area determining unit is used for determining a visible area of the terminal on a map according to the position parameter of the terminal;
the interest point query unit is used for querying at least one interest point located in the visible area of the terminal, wherein the at least one interest point does not include the interest point corresponding to the target position;
and the second information generating unit is used for generating at least one piece of interest point annotation information according to the queried interest point.
28. The navigation device according to claim 27, wherein the second information generating unit comprises: a second information acquiring unit, a second content generating unit, a second horizontal generating unit, a second vertical generating unit and an interest point annotation unit;
the second information acquiring unit is used for acquiring the attribute information, the longitude and latitude information and the height information of the interest point;
the second content generating unit is used for generating second display content according to the attribute information of the interest point;
the second horizontal generating unit is used for determining the horizontal display position of the second display content on the real-time street view image according to the position identified by the longitude and latitude information of the interest point in the visible area;
the second vertical generating unit is used for determining the vertical display position of the second display content on the real-time street view image according to the height information of the interest point;
and the interest point annotation unit is used for taking the second display content, the horizontal display position and the vertical display position as the interest point annotation information.
29. The navigation device according to claim 28, wherein the position parameter of the terminal further comprises motion information of the terminal, and the interest point determining submodule further comprises:
a second horizontal speed determining unit and a second horizontal position updating unit;
the second horizontal speed determining unit is used for determining the moving speed of the terminal in the horizontal direction according to the motion information of the terminal;
and the second horizontal position updating unit is used for updating the horizontal display position in the interest point annotation information according to the moving speed of the terminal in the horizontal direction.
30. The navigation device according to claim 28, wherein the position parameter of the terminal further comprises rotation angle information of the terminal, and the interest point determining submodule further comprises:
a second vertical angle determining unit and a second vertical position updating unit;
the second vertical angle determining unit is used for determining the rotation angle of the terminal in the vertical direction according to the rotation angle information of the terminal;
and the second vertical position updating unit is used for updating the vertical display position in the interest point annotation information according to the rotation angle of the terminal in the vertical direction.
31. The navigation device according to any one of claims 19 to 30, wherein the first obtaining module comprises:
a first receiving unit, an interest point query unit and a parameter determining unit;
the first receiving unit is used for receiving attribute information of the target position entered by text input or voice input;
the interest point query unit is used for querying an interest point corresponding to the target position according to the attribute information of the target position;
and the parameter determining unit is used for using the longitude and latitude information of the queried interest point corresponding to the target position as the position parameter of the target position.
32. The navigation device according to any one of claims 19 to 30, wherein the second obtaining module comprises:
a longitude and latitude acquisition unit, an orientation acquisition unit, an acceleration acquisition unit and an angle acquisition unit;
the longitude and latitude acquisition unit is used for acquiring longitude and latitude information through a Global Positioning System (GPS) receiver in the terminal, and using the longitude and latitude information as a part of the position parameter;
the orientation acquisition unit is used for acquiring orientation information through an electronic compass sensor in the terminal, and using the orientation information as another part of the position parameter;
the acceleration acquisition unit is used for acquiring the motion information through a three-axis acceleration sensor in the terminal if the position parameter further comprises motion information;
and the angle acquisition unit is used for acquiring the rotation angle information through a gyroscope sensor in the terminal if the position parameter further comprises rotation angle information.
33. The navigation device according to any one of claims 20 to 25, 27 to 32, further comprising:
a map request module and a map receiving module;
the map request module is used for requesting the map from a server;
and the map receiving module is used for receiving and storing the map fed back by the server.
34. The navigation device according to any one of claims 19 to 30, wherein the first obtaining module comprises:
a second receiving unit, an interest point query unit and a parameter determining unit;
the second receiving unit is used for receiving attribute information of the target position sent by a terminal, wherein the attribute information of the target position is obtained by the terminal through text input or voice input;
the interest point query unit is used for querying an interest point corresponding to the target position according to the attribute information of the target position;
and the parameter determining unit is used for using the longitude and latitude information of the queried interest point corresponding to the target position as the position parameter of the target position.
35. The navigation device according to any one of claims 19 to 30, wherein the second obtaining module is specifically configured to receive a position parameter sent by a terminal, and the position parameter comprises:
latitude and longitude information and orientation information; or,
latitude and longitude information, orientation information and motion information; or,
latitude and longitude information, orientation information and rotation angle information; or,
latitude and longitude information, orientation information, motion information and rotation angle information;
the latitude and longitude information is information collected by the terminal through a Global Positioning System (GPS) receiver; the orientation information is information acquired by the terminal through an electronic compass sensor; the motion information is information acquired by the terminal through a three-axis acceleration sensor; the rotation angle information is information acquired by the terminal through a gyroscope sensor.
36. The navigation device according to any one of claims 19 to 30, wherein the information display module is specifically configured to send the navigation information to the terminal, so that the terminal superimposes the navigation information on the displayed real-time street view image after receiving the navigation information.
37. A terminal, characterized in that it comprises a navigation device according to any one of claims 19 to 33.
38. A server, characterized in that the server comprises a navigation device according to any one of claims 19 to 30, 34 to 36.
39. A navigation system, characterized in that the navigation system comprises a terminal and a server,
the server comprising a navigation device as claimed in any one of claims 19 to 30, 34 to 36.
CN201310157571.1A 2013-04-28 2013-04-28 Navigation method, device, terminal, server and system Pending CN104121910A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201310157571.1A CN104121910A (en) 2013-04-28 2013-04-28 Navigation method, device, terminal, server and system
CN201710262213.5A CN106969774A (en) 2013-04-28 2013-04-28 Air navigation aid and device, terminal, server and system
TW102144934A TWI494542B (en) 2013-04-28 2013-12-06 Navigation method, device, terminal, server and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310157571.1A CN104121910A (en) 2013-04-28 2013-04-28 Navigation method, device, terminal, server and system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201710262213.5A Division CN106969774A (en) 2013-04-28 2013-04-28 Air navigation aid and device, terminal, server and system

Publications (1)

Publication Number Publication Date
CN104121910A true CN104121910A (en) 2014-10-29

Family

ID=51767422

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201710262213.5A Pending CN106969774A (en) 2013-04-28 2013-04-28 Air navigation aid and device, terminal, server and system
CN201310157571.1A Pending CN104121910A (en) 2013-04-28 2013-04-28 Navigation method, device, terminal, server and system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201710262213.5A Pending CN106969774A (en) 2013-04-28 2013-04-28 Air navigation aid and device, terminal, server and system

Country Status (2)

Country Link
CN (2) CN106969774A (en)
TW (1) TWI494542B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105871952A (en) * 2015-01-20 2016-08-17 阿里巴巴集团控股有限公司 Method and device for information processing
CN111213118B (en) * 2017-10-09 2023-11-03 深圳传音通讯有限公司 Position identification method and terminal
CN111750872B (en) * 2020-06-17 2021-04-13 北京嘀嘀无限科技发展有限公司 Information interaction method and device, electronic equipment and computer readable storage medium
US20230044871A1 (en) * 2020-12-29 2023-02-09 Google Llc Search Results With Result-Relevant Highlighting
CN112764865A (en) * 2021-01-25 2021-05-07 维沃移动通信有限公司 Display method and device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003006680A (en) * 2001-06-20 2003-01-10 Zenrin Co Ltd Method for generating three-dimensional electronic map data
KR100489890B1 (en) * 2002-11-22 2005-05-17 한국전자통신연구원 Apparatus and Method to Provide Stereo Video or/and Detailed Information of Geographic Objects
JP3819873B2 (en) * 2003-05-28 2006-09-13 三洋電機株式会社 3D image display apparatus and program
JP2006105640A (en) * 2004-10-01 2006-04-20 Hitachi Ltd Navigation device
JP3961545B2 (en) * 2005-11-29 2007-08-22 株式会社コナミデジタルエンタテインメント Object selection device, object selection method, and program
US20100029293A1 (en) * 2007-05-10 2010-02-04 Sony Ericsson Mobile Communications Ab Navigation system using camera
TW201033586A (en) * 2009-03-12 2010-09-16 Compal Communications Inc Navigation device with real-time image incorporating navigating information and method thereof
TWI408340B (en) * 2009-07-27 2013-09-11 Htc Corp Mehtod for displaying navigation route, navigation apparatus and computer program product
KR101181967B1 (en) * 2010-12-29 2012-09-11 심광호 3D street view system using identification information.
CN102334099B (en) * 2011-08-09 2013-08-28 华为技术有限公司 Method and device of parameter configuration under bs framework
CN102829788A (en) * 2012-08-27 2012-12-19 北京百度网讯科技有限公司 Live action navigation method and live action navigation device

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7353110B2 (en) * 2004-02-13 2008-04-01 Dvs Korea Co., Ltd. Car navigation device using forward real video and control method thereof
US20060155466A1 (en) * 2005-01-12 2006-07-13 Sanyo Electric Co., Ltd. Mobile terminal with navigation function
CN102338639A (en) * 2010-07-26 2012-02-01 联想(北京)有限公司 Information processing device and information processing method
CN102121831A (en) * 2010-12-01 2011-07-13 北京腾瑞万里科技有限公司 Real-time street view navigation method and device
CN102759360A (en) * 2011-04-28 2012-10-31 昆达电脑科技(昆山)有限公司 Navigation device combining driving video record and navigation information
CN102322866A (en) * 2011-07-04 2012-01-18 深圳市子栋科技有限公司 Navigation method and system based on natural speech recognition
CN102519478A (en) * 2011-11-16 2012-06-27 深圳市凯立德科技股份有限公司 Streetscape destination guiding method and device
CN102706355A (en) * 2012-05-18 2012-10-03 北京腾瑞万里科技有限公司 Navigation method and mobile terminal
CN102889892A (en) * 2012-09-13 2013-01-23 东莞宇龙通信科技有限公司 Live-action navigation method and navigation terminal
CN102879000A (en) * 2012-09-20 2013-01-16 华为终端有限公司 Navigation terminal, navigation method and remote navigation service system

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105571606A (en) * 2014-11-04 2016-05-11 沃尔沃汽车公司 Methods and systems for enabling improved positioning of a vehicle
CN104457790A (en) * 2014-11-27 2015-03-25 百度在线网络技术(北京)有限公司 Method for evaluating inductive effect of navigation product, testing device and construction method for testing device
CN104457790B (en) * 2014-11-27 2017-07-25 百度在线网络技术(北京)有限公司 Evaluate and test method, test device and its construction method of the inducing effect of navigation product
CN106102004A (en) * 2016-06-07 2016-11-09 珠海市魅族科技有限公司 A kind of method showing objective and mobile terminal
CN107733954A (en) * 2016-08-12 2018-02-23 北京嘀嘀无限科技发展有限公司 Method and device for pushed information
CN106643780A (en) * 2016-11-17 2017-05-10 百度在线网络技术(北京)有限公司 Navigation information representation method and device
CN108632570A (en) * 2017-03-15 2018-10-09 珀斯特传媒有限公司 Image providing method and server
WO2018227380A1 (en) * 2017-06-13 2018-12-20 深圳市伊特利网络科技有限公司 Location-based method and system for recommending relaxation venue
CN107677289A (en) * 2017-09-30 2018-02-09 百度在线网络技术(北京)有限公司 Information processing method, device and motor vehicle
WO2019119358A1 (en) * 2017-12-21 2019-06-27 Bayerische Motoren Werke Aktiengesellschaft Method, device and system for displaying augmented reality poi information
CN111512120A (en) * 2017-12-21 2020-08-07 宝马股份公司 Method, device and system for displaying augmented reality POI information
CN110345954A (en) * 2018-04-03 2019-10-18 奥迪股份公司 Navigation system and method
CN109977189A (en) * 2019-03-31 2019-07-05 联想(北京)有限公司 Display methods, device and electronic equipment
CN112422886A (en) * 2019-08-22 2021-02-26 杭州海康威视数字技术股份有限公司 Visual domain three-dimensional control display system

Also Published As

Publication number Publication date
TWI494542B (en) 2015-08-01
CN106969774A (en) 2017-07-21
TW201441582A (en) 2014-11-01

Similar Documents

Publication Publication Date Title
CN104121910A (en) Navigation method, device, terminal, server and system
US11692842B2 (en) Augmented reality maps
US11990108B2 (en) Method and apparatus for rendering items in a user interface
CN110375755B (en) Solution for highly customized interactive mobile map
US8954275B2 (en) Schematic maps
KR101962394B1 (en) Prominence-based generation and rendering of map features
US8769442B2 (en) System and method for allocating digital graffiti objects and canvasses
JP2019109252A (en) Systems and methods for using visual landmarks in initial navigation
US9087412B2 (en) Method and apparatus for grouping and de-overlapping items in a user interface
US8872767B2 (en) System and method for converting gestures into digital graffiti
US20120194547A1 (en) Method and apparatus for generating a perspective display
US20140301645A1 (en) Method and apparatus for mapping a point of interest based on user-captured images
CN110019580A (en) Map-indication method, device, storage medium and terminal
WO2013055980A1 (en) Method, system, and computer program product for obtaining images to enhance imagery coverage
TWI694298B (en) Information display method, device and terminal
KR102108488B1 (en) Contextual Map View
CN105378433A (en) Method and apparatus for self-adaptively visualizing location based digital information
US20120293550A1 (en) Localization device and localization method with the assistance of augmented reality
JP5780417B2 (en) In-vehicle system
US9354076B2 (en) Guiding server, guiding method and recording medium recording guiding program
WO2018093619A1 (en) Systems and methods for dynamically providing scale information on a digital map
CN109073406B (en) Processing map-related user input to detect route requests
US20120253666A1 (en) Movement guidance display system, movement guidance display method, and computer program
CN110704567A (en) Method and apparatus for outputting information
JP5430363B2 (en) MAP INFORMATION DISPLAY DEVICE, MAP INFORMATION DISPLAY METHOD, AND PROGRAM

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20141029