
CN116558542B - Guide arrow generation method, device, equipment and product - Google Patents

Guide arrow generation method, device, equipment and product

Info

Publication number
CN116558542B
Authority
CN
China
Prior art keywords
point
points
boundary
arrow
shape
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310350364.1A
Other languages
Chinese (zh)
Other versions
CN116558542A (en)
Inventor
刘坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Alibaba China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba China Co Ltd filed Critical Alibaba China Co Ltd
Priority to CN202310350364.1A priority Critical patent/CN116558542B/en
Publication of CN116558542A publication Critical patent/CN116558542A/en
Application granted granted Critical
Publication of CN116558542B publication Critical patent/CN116558542B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3632Guidance using simplified or iconic instructions, e.g. using arrows

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract


The disclosed embodiments relate to a method, apparatus, device, and product for generating a guide arrow. The method obtains the shape points included in a navigation path and the style file of a stereoscopic guide arrow, and generates the boundary points of the top surface of the stereoscopic guide arrow from those shape points and the arrow width value recorded in the style file. For each boundary point located in front of the anchor point, the viewing angle of the point is corrected toward the camera viewing direction according to the distance between the point and the anchor point and the current camera viewing angle. The boundary points of the side surfaces are then generated from the corrected top-surface boundary points and the arrow height value recorded in the style file, and the stereoscopic guide arrow is obtained from the top-surface and side-surface boundary points. The disclosed embodiments ensure that the top surface of the stereoscopic guide arrow is always clearly presented to the user, solving the problem in the related art that the direction of the stereoscopic guide arrow cannot be seen at a low viewing angle.

Description

Method, device, equipment and product for generating guide arrow
Technical Field
The embodiment of the disclosure relates to the technical field of map rendering, in particular to a method, a device, equipment and a product for generating a guide arrow.
Background
Existing navigation guidance generally uses two modes. The first is voice guidance: guidance information (such as "turn left at the intersection ahead") is broadcast to the navigated object by voice to guide it along the navigation route. The second is picture guidance: a navigation route and a stereoscopic guide arrow are drawn on the navigation interface as shown in fig. 1, and the navigated object is guided along the navigation route by the direction indicated by the stereoscopic guide arrow. Navigation guidance scenarios mainly include intersection guidance, lane-change guidance, and the like.
As navigation evolves from road-level to lane-level navigation, the inventors found that because the existing stereoscopic guide arrow is static, its model, once constructed, cannot adjust its pointing direction according to the viewing direction. In a lane-level navigation scene, if the arrow is viewed from its side at a low viewing angle, the pointing part of the stereoscopic guide arrow can be blocked by other parts of the arrow. The navigated object then cannot see where the arrow points and may miss the junction or even take a wrong road, degrading the user experience. How to provide a clear and unambiguous stereoscopic guide arrow that helps the navigated object accurately determine the driving action is therefore a technical problem to be solved by those skilled in the art.
Disclosure of Invention
In order to solve the technical problems, the embodiments of the present disclosure provide a method, an apparatus, a device, and a product for generating a guiding arrow.
A first aspect of the disclosed embodiments provides a method for generating a guide arrow. The method includes: obtaining the shape points included in a navigation path and the style file of the stereoscopic guide arrow; generating the boundary points of the top surface of the stereoscopic guide arrow based on the shape points included in the navigation path and the arrow width value recorded in the style file of the stereoscopic guide arrow; for each boundary point located in front of the anchor point, correcting the viewing angle of the boundary point toward the camera viewing direction according to the distance between the boundary point and the anchor point and the current camera viewing angle; generating the boundary points of the side surfaces of the stereoscopic guide arrow based on the corrected top-surface boundary points and the arrow height value recorded in the style file, where the included angle between each side surface and the top surface equals a preset included angle; and obtaining the stereoscopic guide arrow based on the boundary points of the top surface and the boundary points of the side surfaces.
A second aspect of an embodiment of the present disclosure provides a generating device of a guidance arrow, including:
The first acquisition module is used for acquiring the shape points included in the navigation path and the style file of the stereoscopic guide arrow;
The first generation module is used for generating boundary points of the top surface of the stereoscopic guiding arrow based on the shape points included in the navigation path and the arrow width values recorded in the stereoscopic guiding arrow style file;
The correction module is used for correcting, for each boundary point located in front of the anchor point, the viewing angle of the boundary point toward the camera viewing direction according to the distance between the boundary point and the anchor point and the current camera viewing angle;
The second generation module is used for generating boundary points of side faces of the stereoscopic guiding arrow based on the corrected boundary points of the top face and the arrow height values recorded by the style file, and the included angle between the side faces and the top face is equal to a preset included angle;
And the third generation module is used for obtaining the stereoscopic guiding arrow based on the boundary point of the top surface and the boundary point of the side surface.
A third aspect of the embodiments of the present disclosure provides a terminal device, which includes a memory and a processor, where the memory is a nonvolatile memory, and the memory stores a computer program, where the computer program is executed by the processor, and may implement the method described in the first aspect.
A fourth aspect of the disclosed embodiments provides a computer program product stored on a storage medium, the storage medium being a non-volatile storage medium, which when executed, enables the method of the first aspect described above to be carried out.
A fifth aspect of the embodiments of the present disclosure provides a computer-readable storage medium, which is a non-volatile storage medium, in which a computer program is stored, which, when executed, can implement the method according to the first aspect.
Compared with the prior art, the technical scheme provided by the embodiment of the disclosure has the following advantages:
According to the embodiments of the present disclosure, the shape points included in the navigation path and the style file of the stereoscopic guide arrow are acquired, and the boundary points of the top surface of the stereoscopic guide arrow are generated based on those shape points and the arrow width value recorded in the style file. The viewing angle of each boundary point located in front of the anchor point is corrected toward the camera viewing direction according to the distance between the boundary point and the anchor point and the current camera viewing angle, and the boundary points of the side surfaces are generated based on the corrected top-surface boundary points and the arrow height value recorded in the style file, so that the stereoscopic guide arrow is obtained from the top-surface and side-surface boundary points. By correcting the viewing angle of the boundary points in front of the anchor point toward the camera viewing direction, the twisted top-surface boundary points face the camera, ensuring that the top surface of the stereoscopic guide arrow is always clearly presented to the user. This helps the user quickly focus on driving actions such as turning, making a U-turn, and changing lanes, and solves the problem in the related art that the stereoscopic guide arrow cannot be seen from the side at a low viewing angle.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the solutions in the prior art, the drawings that are required for the description of the embodiments or the prior art will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic illustration of a navigation interface provided by the related art;
Fig. 2 is a schematic diagram of an application scenario provided in an embodiment of the present disclosure;
FIG. 3 is a flowchart of a method for generating a guide arrow provided by an embodiment of the present disclosure;
Fig. 4 is a schematic diagram of a method for generating a boundary point of a top surface of a stereoscopic guiding arrow according to an embodiment of the disclosure;
FIG. 5 is a schematic diagram of a method for generating an endoskeleton point provided by an embodiment of the present disclosure;
FIG. 6 is a schematic illustration of an endoskeleton smoothing method;
FIG. 7 is a schematic diagram of a method of generating exoskeleton points based on the endoskeleton points in FIG. 5;
FIG. 8 is a schematic diagram of a method of generating a surface with the points of the endoskeleton, the points of the exoskeleton and the points of the shape in FIG. 7;
FIG. 9 is a schematic top view of a stereoscopic guide arrow provided by an embodiment of the present disclosure;
FIG. 10 is a schematic view of boundary points on the top surface of a stereoscopic guide arrow provided by an embodiment of the present disclosure;
FIG. 11 is a generation effect diagram of a stereoscopic guide arrow provided by an embodiment of the present disclosure;
FIG. 12 is a flow chart of a texture mapping method provided by an embodiment of the present disclosure;
fig. 13 is a schematic structural view of a generating device of a guide arrow provided in an embodiment of the present disclosure;
fig. 14 is a schematic structural diagram of a terminal device in an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, a further description of aspects of the present disclosure will be provided below. It should be noted that, without conflict, the embodiments of the present disclosure and features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced otherwise than as described herein, and it is apparent that the embodiments in the specification are only some, rather than all, of the embodiments of the present disclosure.
In order to facilitate understanding of the technical solutions of the embodiments of the present disclosure, first, some nouns related to the embodiments of the present disclosure are explained.
A shape point can be understood as a positioning point contained in the navigation path, including information such as the longitude, latitude, and altitude of the point.
The anchor point is the position from which a driving action such as going straight or making a U-turn is led out; the driving action is indicated by a stereoscopic guide arrow, i.e., the guide arrow is led out from the anchor point. The anchor point can generally be understood as a fork point of the road.
Texture mapping: in the embodiments of the present disclosure, applying the texture of a texture image onto the stereoscopic guide arrow.
Subdivision refers to the process of constructing the skeleton points and faces of a model following the trend of the shape points, and further dividing the faces into renderable geometric primitives.
Fig. 2 is a schematic diagram of an application scenario provided in an embodiment of the present disclosure. The terminal device in fig. 2 may be a device with navigation and processing capabilities, such as a mobile phone, an in-vehicle terminal, or a tablet computer. The server is a server that provides navigation services. In the scenario shown in fig. 2, the terminal device may send a navigation request to the server through a preset communication protocol, where the navigation request includes, but is not limited to, information such as a start position and an end position. After receiving the navigation request, the server plans routes according to the start position, end position, and other information, and sends the information of one or more planned routes to the terminal device; the user then selects a navigation route from the planned routes. The navigation route includes, but is not limited to, shape points, anchor point positions, the navigation guidance action led out from each anchor point position, and the style file of the stereoscopic guide arrow corresponding to that navigation guidance action. The style file may include information such as the style, width value, and height value of the stereoscopic guide arrow. The style files can also be delivered to the terminal device in advance for storage and need not be delivered along with the navigation route.
After obtaining the navigation route, the terminal device moves along it and, according to its positioning position and the navigation path ahead (the navigation path is the part of the navigation route on a certain section of road), determines the upcoming anchor point position and the navigation guidance action led out from it. It then obtains, according to that guidance action, the style file of the corresponding stereoscopic guide arrow and the shape points included in the navigation path. Taking each shape point as a base point, the boundary points of the top surface of the stereoscopic guide arrow are split on the left and right sides of the shape point according to the arrow width value recorded in the style file. Further, for each boundary point located in front of the anchor point, its viewing angle is corrected toward the camera viewing direction according to the distance between the boundary point and the anchor point and the current camera viewing angle, so that the twisted boundary points can be seen from the camera's viewing angle. The boundary points of the side surfaces of the stereoscopic guide arrow are then generated from the corrected top-surface boundary points and the arrow height value recorded in the style file, and the stereoscopic guide arrow is obtained from the top-surface and side-surface boundary points.
According to the embodiment of the disclosure, the visual angle of the boundary point of the top surface of the stereoscopic guide arrow is corrected to the visual angle direction of the camera, so that the top surface of the stereoscopic guide arrow always faces the visual angle direction, the direction of the stereoscopic guide arrow is ensured to be always seen, a user is helped to quickly focus on the direction guided by the stereoscopic guide arrow, and the user experience is improved.
In order to better understand the technical solutions of the embodiments of the present disclosure, the technical solutions of the embodiments of the present disclosure are described below in conjunction with exemplary embodiments.
By way of example, fig. 3 is a flowchart of a method for generating a guidance arrow according to an embodiment of the present disclosure. The method may be exemplarily performed by the terminal device in fig. 2. As shown in fig. 3, in some exemplary implementations, the method provided by the disclosed embodiments may include steps 301-305.
Step 301, a style file of a shape point and a stereoscopic guiding arrow included in a navigation path is obtained.
The navigation path in the embodiments of the present disclosure may be understood as a path of the navigation route in the front road. A shape point may be understood as a location point on a navigation path. Illustratively, in embodiments of the present disclosure, the shape point includes an anchor point, which may be understood as the bifurcation point of the road.
The stereoscopic guide arrow can be understood as a three-dimensional mark used to indicate a driving action, such as a turn arrow or a lane-change arrow. The style file of the stereoscopic guide arrow includes style information of the guide arrow, such as the pointing direction of the arrow, the arrow width value, the arrow height value, and the arrow length.
In some embodiments, the guidance action on the road ahead may be determined according to the current positioning position and the navigation path, and the style file of the stereoscopic guide arrow corresponding to that guidance action is then acquired. Referring to fig. 2, in one implementation of the embodiments of the present disclosure, the terminal device transmits information such as the start position and end position to the server. The server plans routes according to this information and feeds the planned route information back to the terminal device. The planned route information includes the shape points for navigation guidance on each road section, the navigation guidance action corresponding to each anchor point in the planned route, and the style file of the stereoscopic guide arrow corresponding to each navigation guidance action. After the user selects a navigation route from the planned routes, the terminal device guides the user to travel along it. When the distance between the terminal device and an anchor point is less than or equal to a preset distance, generation of the stereoscopic guide arrow can be started: the terminal device determines the guidance action on the road ahead (such as a left turn) according to the current positioning position and the navigation path, and then acquires, according to that guidance action, the style file of the corresponding stereoscopic guide arrow and the shape points included in the navigation path ahead from the navigation path data.
It should be noted that the distance between the terminal device and the anchor point changes continuously, and as it changes, the style of the stereoscopic guide arrow may change accordingly; for example, the portion of the arrow corresponding to the road already traveled disappears. According to the embodiments of the present disclosure, the style file of the stereoscopic guide arrow corresponding to the real-time distance can be obtained in real time according to the distance between the terminal device and the anchor point, and the corresponding stereoscopic guide arrow can be generated in real time. This ensures that the stereoscopic guide arrow changes in real time as the terminal device moves, so that it always serves its pointing function, guaranteeing the pointing accuracy of the arrow and improving its guidance effect.
For example, in some embodiments, after the shape points are obtained, the shape points may be preprocessed. Preprocessing includes, but is not limited to, thinning and smooth fitting. For example, after the shape points are obtained, they may be traversed to determine the distances between adjacent shape points; if the distance between a shape point and the previous and/or next shape point is less than a preset distance, the shape point is removed. This prevents the normal vectors between shape points from intersecting inside the stereoscopic guide arrow, which would distort the model when shape points are too close together. For another example, in another embodiment, the shape points may be traversed to determine the connecting lines of adjacent shape points and the included angle between two adjacent connecting lines (hereinafter referred to as the second included angle); if the second included angle between two adjacent connecting lines is smaller than a second preset threshold, the shape points on the two connecting lines are determined to be collinear. For collinear shape points, the first and last shape points can be retained and the shape points between them deleted, reducing the data volume and improving computational efficiency. For another example, if the second included angle between two adjacent connecting lines is greater than a third preset threshold, where the third preset threshold is greater than the second preset threshold (that is, at a position with a large direction change, such as a turn), at least one shape point can be inserted by an interpolation method such as Catmull-Rom, improving the smoothness of the shape-point polyline at such positions.
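A minimal sketch of the thinning and collinearity steps just described, in Python with 2-D points. The function name, the thresholds, and the exact removal criteria are illustrative assumptions, not taken from the patent:

```python
import math

def preprocess_shape_points(points, min_dist=1.0, collinear_deg=2.0):
    """Thin a shape-point polyline: drop points too close to the
    previous kept point, then drop interior points of near-collinear
    runs. Thresholds are illustrative, not from the patent."""
    # 1. Thin out points closer than min_dist to the previous kept point.
    thinned = [points[0]]
    for p in points[1:]:
        if math.dist(p, thinned[-1]) >= min_dist:
            thinned.append(p)
    # 2. Remove interior points where consecutive segments are nearly
    #    collinear (turn angle below collinear_deg), keeping endpoints.
    kept = [thinned[0]]
    for i in range(1, len(thinned) - 1):
        ax, ay = thinned[i][0] - kept[-1][0], thinned[i][1] - kept[-1][1]
        bx, by = thinned[i + 1][0] - thinned[i][0], thinned[i + 1][1] - thinned[i][1]
        turn = abs(math.atan2(ax * by - ay * bx, ax * bx + ay * by))
        if math.degrees(turn) >= collinear_deg:
            kept.append(thinned[i])
    kept.append(thinned[-1])
    return kept
```

A straight run of points collapses to its endpoints, while the point at a 90-degree bend survives. The Catmull-Rom densification step at sharp turns would be a third pass, omitted here for brevity.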
Step 302, generating boundary points of the top surface of the stereoscopic guiding arrow based on the shape points included in the navigation path and the arrow width values recorded in the style file of the stereoscopic guiding arrow.
Fig. 4 is a schematic diagram illustrating a method for generating a boundary point of a top surface of a stereoscopic guiding arrow according to an embodiment of the disclosure. As shown in fig. 4, in an exemplary implementation of the embodiments of the present disclosure, a method of generating a boundary point of a top surface of a guide arrow may include steps 401 to 402.
In step 401, endoskeleton points on the top surface of the stereoscopic guide arrow are split on the left and right sides of each shape point based on the shape points included in the navigation path and the arrow width value recorded in the style file of the stereoscopic guide arrow.
In step 402, at least the endoskeleton points are taken as boundary points of the top surface.
For example, fig. 5 is a schematic diagram of a method for generating endoskeleton points according to an embodiment of the disclosure. As shown in fig. 5, G1 and G2 are two adjacent shape points, where G2 is the shape point ahead of G1 in the direction of travel. Connecting G1 and G2 produces a vector pointing from G1 to G2 (represented by arrow L1 in fig. 5). Taking G1 as a base point, L1 can be rotated clockwise and counterclockwise by 90 degrees to obtain the vectors L11 and L12, respectively. Then, according to the arrow width value, an endoskeleton point G11 is inserted at a position one half of the arrow width value from G1 in the direction of vector L11, and an endoskeleton point G12 is inserted at a position one half of the arrow width value from G1 in the direction of vector L12, yielding the endoskeleton points on the left and right sides of G1. In the same way, the endoskeleton points on the left and right sides of each shape point can be obtained.
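The fig. 5 construction can be sketched as follows in Python, assuming 2-D coordinates. The function name is illustrative; the variable names mirror the labels in the figure:

```python
import math

def endoskeleton_points(g1, g2, arrow_width):
    """Split left/right endoskeleton points around shape point g1,
    following the fig. 5 construction (names are illustrative)."""
    dx, dy = g2[0] - g1[0], g2[1] - g1[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length          # unit vector L1 from G1 to G2
    half = arrow_width / 2.0
    # Rotate L1 by 90 degrees clockwise (L11) and counterclockwise (L12)
    # about G1, then step half the arrow width along each.
    g11 = (g1[0] + uy * half, g1[1] - ux * half)   # along L11
    g12 = (g1[0] - uy * half, g1[1] + ux * half)   # along L12
    return g11, g12
```

Applying this with each shape point as G1 and its successor as G2 yields the two endoskeleton rows that bound the arrow's top surface.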
The boundary point of the top surface of the three-dimensional guide arrow can be obtained rapidly and accurately through a subdivision method.
For example, in some scenes with large direction changes, such as turns and U-turns, to improve the smoothness of the stereoscopic guide arrow at the turn, some implementations of the disclosed embodiments may include a step of smoothing the endoskeleton of the stereoscopic guide arrow at the turn. The endoskeleton refers to the curve obtained by connecting the endoskeleton points. For example, in one embodiment, after the endoskeleton points on the top surface of the stereoscopic guide arrow are split, the connecting lines of adjacent shape points included in the navigation path can first be determined, along with the normal vector perpendicular to each connecting line, where each normal vector takes a shape point as its origin. The first included angle between the two normal vectors at the shape point shared by two adjacent connecting lines is then determined, where the two normal vectors point toward endoskeleton points on the same side. If the first included angle is greater than a first preset threshold, the opening direction of the first included angle is determined as the chamfer direction, and at least one endoskeleton point is inserted in the chamfer direction to smooth the endoskeleton on the side the two normal vectors point to. For example, fig. 6 is a schematic diagram of an endoskeleton smoothing method. As shown in fig. 6, M1, M2, and M3 are three consecutive shape points, and N1, N2, N3, N4, and N5 are endoskeleton points, where N1, N3, and N5 are the endoskeleton points obtained with M1, M2, and M3 as base points using the method shown in fig. 5. X1 is the normal vector of the line connecting M1 and M2, X2 is the normal vector of the line connecting M2 and M3, and angle O, the included angle between X1 and X2, is the first included angle. As shown in fig. 6, when the first included angle is greater than the first preset threshold, the opening direction of angle O is determined as the chamfer direction, and the endoskeleton points N2 and N4 are inserted in the chamfer direction, so that the curve obtained by connecting N1, N2, N3, N4, and N5 is smooth. Of course, fig. 6 is merely illustrative and not limiting.
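One plausible way to realize the chamfer insertion of fig. 6 is to interpolate extra endoskeleton points along the arc around the shared shape point. The arc-based scheme below is an assumption; the patent does not specify the interpolation formula:

```python
import math

def chamfer_points(center, p_start, p_end, n_insert):
    """Insert n_insert points along the arc from p_start to p_end
    around the shared shape point `center`, smoothing the outer
    corner of the endoskeleton (illustrative stand-in for fig. 6)."""
    a0 = math.atan2(p_start[1] - center[1], p_start[0] - center[0])
    a1 = math.atan2(p_end[1] - center[1], p_end[0] - center[0])
    r = math.hypot(p_start[0] - center[0], p_start[1] - center[1])
    # Walk the shorter angular span between the two boundary points.
    da = (a1 - a0 + math.pi) % (2 * math.pi) - math.pi
    return [(center[0] + r * math.cos(a0 + da * i / (n_insert + 1)),
             center[1] + r * math.sin(a0 + da * i / (n_insert + 1)))
            for i in range(1, n_insert + 1)]
```

In the fig. 6 terms, `center` plays the role of M2, `p_start` and `p_end` of N1 and N5, and the returned points of N2 through N4.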
For example, in some embodiments, after obtaining the endoskeleton points by any of the methods described above, the method may further include a step of splitting the exoskeleton points of the stereoscopic guide arrow in the direction away from the shape point, taking each endoskeleton point as a base point.
By way of example, fig. 7 is a schematic diagram of a method of generating exoskeleton points based on the endoskeleton points in fig. 5. Taking the endoskeleton point G11 as a base point, the exoskeleton point G111 can be obtained by interpolation at a position a preset distance from G11 in the direction of vector L11. Similarly, taking the endoskeleton point G12 as a base point, the exoskeleton point G112 can be obtained by interpolation at a position a preset distance from G12 in the direction of vector L12. In the same way, each endoskeleton point can be used as a base point to generate a corresponding exoskeleton point.
When exoskeleton points are generated from the endoskeleton points, both the endoskeleton points and the exoskeleton points can be taken as boundary points of the top surface of the stereoscopic guide arrow. In this case, the method of generating the top surface can refer to fig. 8. By way of example, fig. 8 is a schematic diagram of a method of generating a surface with the endoskeleton points, the exoskeleton points, and the shape points in fig. 7. In fig. 8, G21 and G22 are the endoskeleton points generated based on G2, and G211 and G221 are the exoskeleton points generated based on G21 and G22, respectively. As shown in fig. 8, G21, G2, G11, and G1 can be connected to form a quadrilateral; G22, G2, G12, and G1 can be connected to form a quadrilateral; G211, G21, G111, and G11 can be connected to form a quadrilateral; and G221, G22, G112, and G12 can be connected to form a quadrilateral. Each quadrilateral can then be triangulated to obtain the surface formed by G1, G2, G11, G12, G111, G112, G21, G22, G211, and G221. Continuing in this way, the endoskeleton points obtained with each shape point as a base point and the exoskeleton points obtained with each endoskeleton point as a base point can be connected to obtain the top surface of the stereoscopic guide arrow, such as the one shown in fig. 9. Of course, figs. 8 and 9 are merely examples and are not intended to be limiting.
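The quadrilateral-splitting step of fig. 8 amounts to triangulating the strip between two parallel rows of boundary points. A sketch with index-based vertices follows; the index scheme and winding order are illustrative assumptions:

```python
def triangulate_strip(row_a, row_b):
    """Split the quadrilaterals between two boundary-point rows into
    triangles, as in the fig. 8 step. row_a and row_b hold vertex
    indices of equal length; returns a list of index triples."""
    tris = []
    for i in range(len(row_a) - 1):
        # Quad (row_a[i], row_a[i+1], row_b[i+1], row_b[i]) -> two triangles.
        tris.append((row_a[i], row_a[i + 1], row_b[i]))
        tris.append((row_a[i + 1], row_b[i + 1], row_b[i]))
    return tris
```

Running this once for the shape-point/endoskeleton strip and once for the endoskeleton/exoskeleton strip on each side yields the renderable primitives of the top surface.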
According to the embodiments of the present disclosure, the shape points are used as base points to generate the endoskeleton points according to the width value of the stereoscopic guide arrow; the endoskeleton points are then used as base points to generate the exoskeleton points; and the top surface is generated based on the exoskeleton points, the endoskeleton points and the shape points, which can improve the accuracy of the generated top surface. In addition, since only one endoskeleton point and one exoskeleton point extend from each side of each shape point, the number of endoskeleton points and exoskeleton points can be reduced, saving storage space and computing resources and improving the generation efficiency of the top surface of the guide arrow.
It should be noted that the above method for generating the endoskeleton points and the exoskeleton points is only exemplary and not the only method. In other embodiments, a shape point may be taken as a base point, a preset number of points may be inserted at a preset step length on the left and right sides of the shape point as the endoskeleton points and/or the exoskeleton points, and the top surface may then be generated based on the inserted points and the shape points.
Step 303, for a boundary point located in front of the anchor point, correcting the view angle of the boundary point toward the camera view direction according to the distance between the boundary point and the anchor point and the current camera view angle.
The camera view angle can be understood as the included angle between the line connecting the camera and the anchor point and the vertical direction.
In some embodiments, a torsion angle for correcting a boundary point in front of the anchor point toward the camera view direction may first be generated according to the distance between the boundary point and the anchor point and the current camera view angle; the boundary point may then be twisted based on the torsion angle, so that the view angle of the boundary point in front of the anchor point on the top surface of the stereoscopic guide arrow faces the camera view direction.
In the embodiments of the present disclosure, there are various methods for generating the torsion angle of a boundary point on the top surface of the stereoscopic guide arrow; an exemplary method is described below. By way of example, FIG. 10 is a schematic illustration of boundary points on the top surface of a stereoscopic guide arrow provided by an embodiment of the present disclosure. As shown in FIG. 10, Z is the anchor point, and M11, M12, M13, ..., M1N are N shape points. M21, M22, M23, ..., M2N are the N endoskeleton points on the left side of the shape points, and M31, M32, M33, ..., M3S are the S endoskeleton points on the right side of the shape points, where the values of S and N may be the same or different. M41, M42, M43, ..., M4N are the N exoskeleton points on the left side of the shape points, and M51, M52, M53, ..., M5S are the S exoskeleton points on the right side of the shape points. For example, for the N endoskeleton points on the left side, the distance from the anchor point Z to M21 may be calculated first, then the distance between M21 and M22, the distance between M22 and M23, and so on, until the distances between all adjacent endoskeleton points have been calculated. The distance from the anchor point Z to M21 and the distances between all adjacent endoskeleton points on the left side are then summed to obtain the maximum progressive distance of the endoskeleton points on the left side, that is, the progressive distance of M2N. The distance from the anchor point Z to M21 may be taken as the progressive distance of M21; the sum of the distance from the anchor point Z to M21 and the distance from M21 to M22 may be taken as the progressive distance of M22 relative to the anchor point Z; and so on, the progressive distances of the other endoskeleton points on the left side relative to the anchor point can be obtained.
Similarly, the maximum progressive distance of the endoskeleton points on the right side of the shape points and the progressive distances of those endoskeleton points relative to the anchor point Z, the maximum progressive distance of the exoskeleton points on the left side of the shape points and the progressive distances of those exoskeleton points relative to the anchor point Z, and the maximum progressive distance of the exoskeleton points on the right side of the shape points and the progressive distances of those exoskeleton points relative to the anchor point Z may all be calculated.
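The progressive-distance bookkeeping described above can be sketched in a few lines of Python. This is a hypothetical helper (not from the patent), assuming 2D points; the cumulative sum along one side's chain of skeleton points directly yields each point's progressive distance, and the last entry is that side's maximum progressive distance:

```python
import math

def progressive_distances(anchor, points):
    """Cumulative distance from the anchor to each point along one side's
    chain of skeleton points; dists[-1] is that side's maximum
    progressive distance."""
    dists, total, prev = [], 0.0, anchor
    for p in points:
        total += math.dist(prev, p)  # distance to the previous point in the chain
        dists.append(total)
        prev = p
    return dists

# anchor point Z at the origin; three skeleton points spaced one unit apart
d = progressive_distances((0.0, 0.0), [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0)])
# d == [1.0, 2.0, 3.0]; the maximum progressive distance is d[-1] == 3.0
```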
Further, for any endoskeleton point or exoskeleton point, the ratio of its progressive distance to the maximum progressive distance on its side may be used as a weighting coefficient to weight the angle of the current camera view (for example, the weighting coefficient may be multiplied by the angle of the current view), so as to obtain the torsion angle of that endoskeleton point or exoskeleton point. Alternatively, the ratio of the progressive distance of the endoskeleton point or exoskeleton point to the maximum progressive distance among the boundary points on the top surface (the endoskeleton points or exoskeleton points on either side) may be used as the weighting coefficient to weight the angle of the camera view, so as to obtain the torsion angle. That is, in one embodiment, for any boundary point in front of the anchor point, the progressive distance of the boundary point relative to the anchor point can be determined according to the position of the anchor point, and the ratio of the progressive distance of the boundary point to the maximum progressive distance of the boundary points on the top surface is used as a weighting coefficient to weight the angle of the camera view, so as to obtain the torsion angle of the boundary point.
In some embodiments, after the torsion angle of each boundary point is obtained, the view angle of each boundary point may be twisted toward the camera view direction based on its torsion angle using a quaternion, so that the view angle of the twisted boundary point faces the camera view direction. For the quaternion-based torsion method, reference may be made to the related art, which is not described herein.
According to the embodiments of the present disclosure, the quaternion-based torsion method can improve the efficiency of twisting the vertices without performing a coordinate transformation on each vertex.
It should be noted that, in the above method, the farther a boundary point (such as an endoskeleton point or exoskeleton point) is from the anchor point, the larger its torsion angle, so that a linear torsion effect is presented; the torsion of the stereoscopic guide arrow is thus more linear and smooth, improving its display effect.
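The twist step can be sketched as follows. This is a minimal, hypothetical Python illustration (all names are invented, and the axis/angle conventions are assumptions, not from the patent): the torsion angle is the camera view angle weighted by the normalized progressive distance, and a quaternion sandwich product rotates a point without building a rotation matrix.

```python
import math

def twist_angle(progressive, max_progressive, camera_angle_deg):
    """Torsion angle: camera view angle weighted by the normalized
    progressive distance, so farther points twist more (linear effect)."""
    return (progressive / max_progressive) * camera_angle_deg

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_rotate(point, axis, angle_deg):
    """Rotate a 3D point about a unit axis by angle_deg via q * p * q^-1."""
    half = math.radians(angle_deg) / 2.0
    s = math.sin(half)
    q = (math.cos(half), axis[0]*s, axis[1]*s, axis[2]*s)
    qc = (q[0], -q[1], -q[2], -q[3])  # conjugate = inverse for unit quaternions
    _, x, y, z = quat_mul(quat_mul(q, (0.0, *point)), qc)
    return (x, y, z)

# a boundary point halfway along the arrow, camera pitched 60 degrees
angle = twist_angle(1.0, 2.0, 60.0)                       # 30.0 degrees
p = quat_rotate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 90.0)   # ~(0, 1, 0)
```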
Step 304, generating boundary points of the side surfaces of the stereoscopic guide arrow based on the corrected boundary points of the top surface and the arrow height value recorded in the style file, wherein the included angle between the side surfaces of the stereoscopic guide arrow and the top surface is equal to a preset included angle.
The predetermined angle may be, for example, 90 degrees, or may be any other angle.
In some embodiments, the side boundary points may be generated as follows: for each boundary point on the top surface of the stereoscopic guide arrow, a normal vector pointing in the direction opposite to the camera view is generated, and a boundary point of the bottom surface of the stereoscopic guide arrow is then generated on that normal vector, where the distance from the bottom-surface boundary point to the corresponding top-surface boundary point is equal to the arrow height recorded in the style file. Since a boundary point of the bottom surface is in fact also a boundary point of a side surface, the bottom-surface boundary points may be regarded as the side-surface boundary points.
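A minimal sketch of this embodiment, under the simplifying assumption that the reverse view direction is one shared unit vector for every top-surface point (function and variable names are hypothetical):

```python
def bottom_boundary_points(top_points, view_dir, arrow_height):
    """Offset each top-surface boundary point by arrow_height along the
    direction opposite to the camera view, giving the matching
    bottom-surface / side-surface boundary point.
    view_dir is a unit vector pointing from the arrow toward the camera."""
    return [tuple(c - d * arrow_height for c, d in zip(p, view_dir))
            for p in top_points]

# camera looking straight down the +z axis; height 2 pushes points to z = -2
bottom = bottom_boundary_points([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
                                (0.0, 0.0, 1.0), 2.0)
```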
In other embodiments, the exoskeleton points on the top surface may be traversed; when an exoskeleton point is reached, any triangle is selected from the triangles formed by that exoskeleton point and the endoskeleton points as a target triangle, the direction of the normal vector on the side of the target triangle facing away from the top surface is taken as the target direction, and the side boundary points are then obtained by interpolation in the target direction according to the height value of the stereoscopic guide arrow.
Step 305, obtaining the stereoscopic guide arrow based on the boundary points of the top surface and the boundary points of the side surfaces.
For example, FIG. 11 is a diagram of the generation effect of a stereoscopic guide arrow provided in an embodiment of the present disclosure. In the method of this embodiment, the shape points included in the navigation path and the style file of the stereoscopic guide arrow are obtained; the boundary points of the top surface of the stereoscopic guide arrow are generated based on the shape points included in the navigation path and the arrow width value recorded in the style file; for the boundary points located in front of the anchor point, the view angles of those boundary points are corrected toward the camera view direction according to the distances between the boundary points and the anchor point and the current camera view angle; the boundary points of the side surfaces of the stereoscopic guide arrow are generated based on the corrected boundary points of the top surface and the arrow height value recorded in the style file; and the stereoscopic guide arrow shown in FIG. 11 is obtained based on the boundary points of the top surface and the boundary points of the side surfaces. According to the embodiments of the present disclosure, the view angles of the boundary points in front of the anchor point are corrected toward the camera view direction, so that the view angles of the twisted boundary points on the top surface face the camera view direction. This ensures that the top surface of the stereoscopic guide arrow is always clearly presented to the user, helps the user quickly focus on driving actions such as turning, making a U-turn and changing lanes, and solves the problem in the related art that the stereoscopic guide arrow cannot be seen clearly from the side at a low view angle.
By way of example, FIG. 12 is a flowchart of a texture mapping method provided by an embodiment of the present disclosure. As shown in FIG. 12, after a stereoscopic guide arrow is generated by the method of the embodiment of FIG. 3, the textures of a texture image may be mapped onto the guide arrow as follows.
Step 1201, for each coordinate point on the stereoscopic guide arrow, determining the ratio of the progressive distance of the coordinate point relative to the anchor point to the maximum progressive distance of the coordinate points on the stereoscopic guide arrow.
For each coordinate point on the stereoscopic guide arrow, the progressive distance may be calculated with reference to the method for determining the progressive distances of the endoskeleton points and exoskeleton points in the above embodiments, which is not described herein.
Step 1202, weighting the height of the texture image by taking the ratio as a weighting coefficient, so as to obtain the ordinate of the texture corresponding to the coordinate point.
For example, assuming that the ratio of the progressive distance of the coordinate point to the maximum progressive distance is Q and the height of the texture image is H, the product of Q and H may be taken as the ordinate of the texture corresponding to the coordinate point on the texture image.
Step 1203, obtaining the abscissa of the texture corresponding to the coordinate point from a preset texture mapping configuration table.
In the embodiments of the present disclosure, a texture mapping configuration table may be preconfigured to record, for each coordinate point, the abscissa of the corresponding texture on the texture image. The abscissa of the texture corresponding to a coordinate point can then be obtained directly from the texture mapping configuration table.
Step 1204, mapping the texture onto the coordinate point according to the ordinate and the abscissa.
After the abscissa and ordinate of the texture on the texture image are determined, the texture corresponding to that abscissa and ordinate is mapped onto the coordinate point of the guide arrow, thereby improving the display effect of the stereoscopic guide arrow.
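The texture-coordinate computation of steps 1201 to 1204 can be sketched as follows. This is a hypothetical Python illustration (the table keys and values are invented, not from the patent): the ordinate is the normalized progressive distance times the texture height, and the abscissa is a table lookup.

```python
def texture_coords(progressive, max_progressive, tex_height, u_table, point_id):
    """Step 1202: ordinate v = Q * H, where Q is the normalized progressive
    distance and H is the texture-image height.
    Step 1203: abscissa u is looked up in a preconfigured mapping table."""
    v = (progressive / max_progressive) * tex_height
    u = u_table[point_id]
    return (u, v)

# hypothetical preconfigured texture-mapping table
u_table = {"left_edge": 0.0, "center": 0.5, "right_edge": 1.0}
uv = texture_coords(3.0, 6.0, 128.0, u_table, "center")  # (0.5, 64.0)
```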
In some implementations of the embodiments of the present disclosure, after the textures corresponding to the abscissas and ordinates are mapped onto the coordinate points of the stereoscopic guide arrow, the textures on those coordinate points may also be changed at preset time intervals. For example, in a possible implementation, each time the preset time interval elapses, the values of the abscissa and ordinate of the texture corresponding to a coordinate point may be changed by a preset amplitude, for example, a first value is added to the abscissa and a second value is added to the ordinate, and the texture corresponding to the changed abscissa and ordinate is then remapped onto the coordinate point of the stereoscopic guide arrow. This creates a flowing-light (streamer) effect on the stereoscopic guide arrow, improves its recognizability, and avoids the problem of a weak focusing-assistance function caused by the stereoscopic guide arrow being insufficiently prominent.
In other embodiments, the stereoscopic guide arrow may be pointed to by an additional indication arrow to help the user quickly focus on it and improve its recognizability. For example, in one example, the moving speed of the indication arrow, its closest distance to the terminal device and its closest distance to the anchor point may be preset. Initially, the indication arrow is generated along the guide path at the closest distance to the terminal device; the position of the indication arrow at each moment is then determined according to its preset moving speed, and the indication arrow is generated at that position. After the indication arrow has moved to the closest distance to the anchor point, it is regenerated at the closest distance to the terminal device according to the positioning position of the terminal device at the next moment, thereby achieving the effect of the indication arrow flowing cyclically along the guide path, helping the user quickly focus on the stereoscopic guide arrow and improving its guiding effect.
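The cyclic motion of the indication arrow can be sketched as a simple wrap-around of its offset along the guide path. This is a hypothetical illustration (names and units are invented, not from the patent):

```python
def indicator_offset(t, speed, start_offset, anchor_offset):
    """Distance of the indication arrow along the guide path at time t:
    it advances at the preset speed and wraps back to the offset closest
    to the terminal device once it reaches the offset closest to the
    anchor point, producing the cyclic flowing effect."""
    span = anchor_offset - start_offset
    return start_offset + (speed * t) % span

# speed 2 units/s over a 10-unit span: the arrow wraps every 5 s,
# so at t = 6 s it sits 2 units past the start offset
pos = indicator_offset(6.0, 2.0, 0.0, 10.0)
```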
Fig. 13 is a schematic structural diagram of a generating device of a guiding arrow according to an embodiment of the present disclosure. The apparatus may be understood as a terminal device or a part of functional modules in a terminal device in the above embodiments. As shown in fig. 13, the generating apparatus 1300 includes:
a first obtaining module 1301, configured to obtain a shape point and a style file of a stereoscopic guiding arrow included in a navigation path;
a first generating module 1302, configured to generate a boundary point of a top surface of the stereoscopic guiding arrow based on a shape point included in the navigation path and an arrow width value recorded in the style file of the stereoscopic guiding arrow;
The correction module 1303 is configured to correct, for a boundary point located in front of the anchor point, a view angle of the boundary point located in front of the anchor point to be directed toward the camera view angle direction according to a distance between the boundary point and the anchor point and a current camera view angle;
A second generating module 1304, configured to generate, based on the modified boundary point of the top surface and the arrow height value recorded by the style file, a boundary point of a side surface of the stereoscopic guiding arrow, where an included angle between the side surface and the top surface is equal to a preset included angle;
And a third generating module 1305, configured to obtain the stereoscopic guiding arrow based on the boundary point of the top surface and the boundary point of the side surface.
In one embodiment, the first generation module 1302 is configured to:
And splitting, based on the shape points included in the navigation path and the arrow width value recorded in the style file of the stereoscopic guide arrow, endoskeleton points of the top surface of the stereoscopic guide arrow on the left and right sides of the shape points, and taking at least the endoskeleton points as boundary points of the top surface.
In one embodiment, the first generation module 1302 may also be configured to:
And taking the endoskeleton points as base points, splitting exoskeleton points of the stereoscopic guide arrow in directions away from the endoskeleton points, and taking the endoskeleton points and the exoskeleton points as boundary points of the top surface.
In one embodiment, the generating apparatus 1300 may further include a first smoothing module configured to:
Determining the lines connecting two adjacent shape points among the shape points included in the navigation path;
determining normal vectors perpendicular to the lines connecting the two adjacent shape points, wherein the normal vectors take the shape points as their starting points;
determining a first included angle between two normal vectors at the common shape point of two adjacent connecting lines, wherein the two normal vectors point toward endoskeleton points on the same side;
determining the opening direction of the first included angle as a chamfering direction in response to the first included angle being greater than a first preset threshold;
and inserting at least one endoskeleton point in the chamfering direction to smooth the endoskeleton on the side pointed to by the two normal vectors, wherein the endoskeleton refers to the curve obtained by connecting the endoskeleton points.
In one embodiment, the correction module 1303 is configured to:
generating a torsion angle for correcting a boundary point in front of the anchor point toward the camera view direction according to the distance between the boundary point and the anchor point and the current camera view angle, wherein the anchor point is the shape point, among the shape points included in the navigation path, from which the guide arrow is led out;
and twisting the boundary point located in front of the anchor point based on the torsion angle, so that the view angle of the boundary point on the top surface faces the camera view direction.
In one embodiment, the correction module 1303 is configured to:
for any boundary point in front of the anchor point, determining the progressive distance of the boundary point relative to the anchor point according to the position of the anchor point;
and weighting the angle of the camera view by taking the ratio of the progressive distance of the boundary point to the maximum progressive distance of the boundary points on the top surface as a weighting coefficient, so as to obtain the torsion angle of the boundary point.
In one embodiment, the generating apparatus 1300 may further include a fourth generating module for:
Generating, for each boundary point on the top surface, a normal vector of the boundary point pointing in the direction opposite to the camera view;
and generating boundary points of the bottom surface of the stereoscopic guide arrow on the normal vectors, wherein the distance from a boundary point of the bottom surface to the corresponding boundary point of the top surface is equal to the height of the stereoscopic guide arrow recorded in the style file.
In one embodiment, the first obtaining module 1301 is configured to:
Determining a guiding action of the road ahead according to the current positioning position and the navigation path, and acquiring the style file of the stereoscopic guide arrow corresponding to the guiding action.
In one embodiment, the generating apparatus 1300 may further include a data cleansing module for:
Traversing the shape points, and determining the distance between adjacent shape points;
for any shape point, deleting the shape point in response to the distance between the shape point and the previous shape point or the next shape point being smaller than a preset distance;
and/or
Traversing the shape points, and determining connecting lines of adjacent shape points and a second included angle between two adjacent connecting lines;
determining that the shape points on the two adjacent connecting lines are collinear in response to the second included angle being smaller than a second preset threshold value;
And retaining the first shape point and the last shape point among the multiple collinear shape points, and deleting the shape points between the first shape point and the last shape point.
In one embodiment, the generating apparatus 1300 may further include a mapping module configured to:
for a coordinate point on the stereoscopic guide arrow, determining the ratio of the progressive distance of the coordinate point relative to the anchor point to the maximum progressive distance of the coordinate points on the stereoscopic guide arrow;
Weighting the heights of the texture images by taking the proportion as a weighting coefficient to obtain the ordinate of the textures corresponding to the coordinate points;
acquiring the abscissa of the texture corresponding to the coordinate point from a preset texture mapping configuration table;
and mapping the texture onto the coordinate point according to the ordinate and the abscissa.
In one embodiment, the mapping module of the generating apparatus 1300 is further configured to:
change, in response to the preset time interval being reached, the values of the abscissa and the ordinate according to the preset change amplitude, and map the texture corresponding to the changed abscissa and ordinate onto the coordinate point.
The device provided by the embodiment of the present disclosure may be directed to the method of any one of the above method embodiments, and the implementation manner and the beneficial effects of the method are similar, and are not described herein again.
The embodiment of the disclosure also provides a terminal device, which includes a memory and a processor, where the memory is a nonvolatile memory, and the memory stores a computer program, and when the computer program is executed by the processor, the method of any one of the above embodiments of the method can be implemented, and the execution mode and the beneficial effects are similar, and are not repeated herein.
Fig. 14 is a schematic structural diagram of a terminal device in an embodiment of the disclosure. Referring now in particular to fig. 14, a schematic diagram of a configuration of a terminal device 1400 suitable for use in implementing embodiments of the present disclosure is shown. Terminal device 1400 in embodiments of the present disclosure may include, but is not limited to, devices with navigation and processing capabilities such as mobile phones, PADs (tablet computers), in-vehicle units, and the like. The terminal device shown in fig. 14 is only one example, and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 14, the terminal apparatus 1400 may include a processing device (e.g., a central processor, a graphics processor, etc.) 1401, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1402 or a program loaded from a storage device 1408 into a Random Access Memory (RAM) 1403. In the RAM 1403, various programs and data necessary for the operation of the terminal device 1400 are also stored. The processing device 1401, the ROM 1402, and the RAM 1403 are connected to each other through a bus 1404. An input/output (I/O) interface 1405 is also connected to the bus 1404.
In general, the following devices may be connected to the I/O interface 1405: input devices 1406 such as a touch screen, a touch panel, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, and the like; output devices 1407 including a liquid crystal display (LCD), a speaker, a vibrator, and the like; storage devices 1408 including a magnetic tape, a hard disk, and the like; and communication devices 1409. The communication devices 1409 may allow the terminal device 1400 to communicate wirelessly or by wire with other devices to exchange data. While fig. 14 illustrates a terminal device 1400 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided; more or fewer means may alternatively be implemented or provided.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 1409, or installed from the storage means 1408, or installed from the ROM 1402. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 1401.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of a computer-readable storage medium may include, but are not limited to, an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to electrical wiring, fiber optic cable, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the terminal device or may exist alone without being incorporated in the terminal device.
The computer readable medium carries one or more programs which, when executed by a processing device, cause the processing device to: acquire the shape points included in a navigation path and a style file of a stereoscopic guide arrow; generate boundary points of the top surface of the stereoscopic guide arrow based on the shape points included in the navigation path and the arrow width value recorded in the style file of the stereoscopic guide arrow; for a boundary point located in front of the anchor point, correct the view angle of the boundary point toward the camera view direction according to the distance between the boundary point and the anchor point and the current camera view angle; generate boundary points of the side surfaces of the stereoscopic guide arrow based on the corrected boundary points of the top surface and the arrow height value recorded in the style file, wherein the included angle between the side surfaces and the top surface is equal to a preset included angle; and obtain the stereoscopic guide arrow based on the boundary points of the top surface and the boundary points of the side surfaces.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object oriented programming languages such as Java, Smalltalk and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software or by means of hardware. In some cases, the name of a unit does not constitute a limitation on the unit itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The embodiments of the present disclosure further provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program may implement the method of any one of the embodiments of FIGS. 2 to 9. The implementation and beneficial effects are similar to those described above and are not repeated here.
Embodiments of the present disclosure further provide a computer program product stored in a storage medium; when run, the program product may implement the method of any one of the embodiments of FIGS. 2 to 9. The implementation and beneficial effects are similar and are not repeated here.
It should be noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing describes merely specific embodiments of the disclosure, enabling those skilled in the art to understand or practice the disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (15)

1. A method for generating a guide arrow, comprising: obtaining shape points included in a navigation path and a style file of a three-dimensional (3D) guide arrow; generating boundary points of a top surface of the 3D guide arrow based on the shape points included in the navigation path and an arrow width value recorded in the style file of the 3D guide arrow, wherein the shape points are positioning points included in the navigation path; for a boundary point located in front of an anchor point, correcting the perspective of the boundary point located in front of the anchor point to face the camera perspective direction according to the distance between the boundary point and the anchor point and the current camera perspective, wherein the anchor point refers to the shape point, among the shape points included in the navigation path, from which the guide arrow is drawn; generating boundary points of a side surface of the 3D guide arrow based on the corrected boundary points of the top surface and an arrow height value recorded in the style file, wherein the angle between the side surface and the top surface is equal to a preset angle; and obtaining the 3D guide arrow based on the boundary points of the top surface and the boundary points of the side surface.

2. The method according to claim 1, wherein generating the boundary points of the top surface of the 3D guide arrow based on the shape points included in the navigation path and the arrow width value recorded in the style file of the 3D guide arrow comprises: based on the shape points included in the navigation path and the arrow width value recorded in the style file of the 3D guide arrow, segmenting out inner skeleton points of the top surface of the 3D guide arrow on the left and right sides of the shape points; and using at least the inner skeleton points as the boundary points of the top surface.

3. The method according to claim 2, wherein, after segmenting out the inner skeleton points of the top surface of the 3D guide arrow on the left and right sides of the shape points, the method further comprises: taking the inner skeleton points as base points, segmenting out outer skeleton points of the 3D guide arrow in directions away from the inner skeleton points; and wherein using at least the inner skeleton points as the boundary points of the top surface comprises: using the inner skeleton points and the outer skeleton points as the boundary points of the top surface.

4. The method according to claim 2, wherein, after segmenting out the inner skeleton points of the top surface of the 3D guide arrow on the left and right sides of the shape points, the method further comprises: determining a line connecting two adjacent shape points among the shape points included in the navigation path; determining a normal vector perpendicular to the line connecting the two adjacent shape points, the normal vector having a shape point as its foot; determining a first angle between two normal vectors at a shape point shared by two adjacent connecting lines, the two normal vectors pointing to inner skeleton points on the same side; in response to the first angle being greater than a first preset threshold, determining the opening direction of the first angle as a chamfer direction; and inserting at least one inner skeleton point in the chamfer direction to smooth the inner skeleton on the side pointed to by the two normal vectors, wherein the inner skeleton refers to the curve obtained by connecting the inner skeleton points.

5. The method according to claim 3, wherein, after segmenting out the inner skeleton points of the top surface of the 3D guide arrow on the left and right sides of the shape points, the method further comprises: determining a line connecting two adjacent shape points among the shape points included in the navigation path; determining a normal vector perpendicular to the line connecting the two adjacent shape points, the normal vector having a shape point as its foot; determining a first angle between two normal vectors at a shape point shared by two adjacent connecting lines, the two normal vectors pointing to inner skeleton points on the same side; in response to the first angle being greater than a first preset threshold, determining the opening direction of the first angle as a chamfer direction; and inserting at least one inner skeleton point in the chamfer direction to smooth the inner skeleton on the side pointed to by the two normal vectors, wherein the inner skeleton refers to the curve obtained by connecting the inner skeleton points.

6. The method according to claim 1, wherein, for a boundary point located in front of the anchor point, correcting the perspective of the boundary point located in front of the anchor point to face the camera perspective direction according to the distance between the boundary point and the anchor point and the current camera perspective comprises: generating, according to the distance between the boundary point and the anchor point and the current camera perspective, a twist angle for correcting the boundary point in front of the anchor point to face the camera perspective direction; and twisting the boundary point located in front of the anchor point based on the twist angle, so that the perspective of the boundary point on the top surface located in front of the anchor point faces the camera perspective direction.

7. The method according to claim 6, wherein generating, according to the distance between the boundary point and the anchor point and the current camera perspective, the twist angle for correcting the boundary point in front of the anchor point to face the camera perspective direction comprises: for any boundary point in front of the anchor point, determining, according to the position of the anchor point, a progressive distance of the boundary point relative to the anchor point; and weighting the angle of the camera perspective using, as a weighting coefficient, the ratio of the progressive distance of the boundary point to the maximum progressive distance of the boundary points on the top surface, to obtain the twist angle of the boundary point.

8. The method according to claim 1, further comprising: based on each boundary point on the top surface, generating a normal vector of the boundary point pointing in the direction opposite to the camera perspective; and generating, on the normal vector, a boundary point of the bottom surface of the 3D guide arrow, wherein the distance from the boundary point of the bottom surface to the corresponding boundary point of the top surface is equal to the height of the 3D guide arrow recorded in the style file.

9. The method according to any one of claims 1 to 8, wherein obtaining the shape points included in the navigation path and the style file of the 3D guide arrow comprises: determining a guidance action for the road ahead according to the current positioning position and the navigation path; and obtaining, according to the guidance action, the style file of the 3D guide arrow corresponding to the guidance action.

10. The method according to any one of claims 1 to 8, wherein, before generating the boundary points of the top surface of the 3D guide arrow based on the shape points included in the navigation path and the arrow width value recorded in the style file of the 3D guide arrow, the method further comprises: traversing the shape points and determining the distances between adjacent shape points; for any shape point, in response to the distance between the shape point and the previous or next shape point being less than a preset distance, deleting the shape point; and/or traversing the shape points, determining the connecting lines between adjacent shape points and a second angle between two adjacent connecting lines; in response to the second angle being less than a second preset threshold, determining that the shape points on the two adjacent connecting lines are collinear; and retaining the first and last of the collinear shape points while deleting the shape points between them.

11. The method according to any one of claims 1 to 6, wherein, after obtaining the 3D guide arrow based on the boundary points of the top surface and the boundary points of the side surface, the method further comprises: for a coordinate point on the 3D guide arrow, determining the ratio of the progressive distance of the coordinate point relative to the anchor point to the maximum progressive distance of the coordinate points on the 3D guide arrow; weighting the height of a texture image using the ratio as a weighting coefficient, to obtain the vertical coordinate of the texture corresponding to the coordinate point; obtaining the horizontal coordinate of the texture corresponding to the coordinate point from a preset texture mapping configuration table; and mapping the texture onto the coordinate point according to the vertical coordinate and the horizontal coordinate.

12. The method according to claim 11, further comprising: in response to reaching a preset time interval, changing the values of the horizontal coordinate and the vertical coordinate by a preset amount; and mapping the texture corresponding to the changed horizontal and vertical coordinates onto the coordinate point.

13. A device for generating a guide arrow, comprising: a first acquisition module configured to obtain shape points included in a navigation path and a style file of a 3D guide arrow; a first generation module configured to generate boundary points of a top surface of the 3D guide arrow based on the shape points included in the navigation path and an arrow width value recorded in the style file of the 3D guide arrow, wherein the shape points are positioning points included in the navigation path; a correction module configured to correct, for a boundary point located in front of an anchor point, the perspective of the boundary point located in front of the anchor point to face the camera perspective direction according to the distance between the boundary point and the anchor point and the current camera perspective, wherein the anchor point refers to the shape point, among the shape points included in the navigation path, from which the guide arrow is drawn; a second generation module configured to generate boundary points of a side surface of the 3D guide arrow based on the corrected boundary points of the top surface and an arrow height value recorded in the style file, wherein the angle between the side surface and the top surface is equal to a preset angle; and a third generation module configured to obtain the 3D guide arrow based on the boundary points of the top surface and the boundary points of the side surface.

14. A terminal device, comprising a memory and a processor, wherein a computer program is stored in the memory, and when the computer program is executed by the processor, the method according to any one of claims 1 to 12 is implemented.

15. A computer program product, wherein the program product is stored in a storage medium, and when the program product is run, the method according to any one of claims 1 to 12 is executed.
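Claims 6 and 7 specify the twist correction arithmetically: each boundary point ahead of the anchor is rotated by the camera angle weighted by the ratio of that point's progressive (cumulative) distance from the anchor to the maximum progressive distance on the top surface. The following Python sketch illustrates only that weighting; the function names, the 2D point representation, and the degree-based angle convention are illustrative assumptions, not taken from the patent.

```python
import math

def progressive_distances(points, anchor_index):
    """Cumulative path length of each shape point, measured from the anchor.
    Points at or behind the anchor get a distance of 0."""
    dists = [0.0] * len(points)
    for i in range(anchor_index + 1, len(points)):
        (x0, y0), (x1, y1) = points[i - 1], points[i]
        dists[i] = dists[i - 1] + math.hypot(x1 - x0, y1 - y0)
    return dists

def twist_angles(points, anchor_index, camera_angle_deg):
    """Weight the camera angle by each point's progressive distance over the
    maximum progressive distance (cf. claim 7), so the arrow tip twists fully
    toward the camera while points near the anchor stay almost untouched."""
    dists = progressive_distances(points, anchor_index)
    max_d = max(dists)
    if max_d == 0.0:  # degenerate path: no points ahead of the anchor
        return [0.0] * len(points)
    return [camera_angle_deg * d / max_d for d in dists]

# A straight four-point path with the anchor at its first point and a
# 30-degree camera tilt: the twist ramps linearly up to the full 30 degrees.
path = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
angles = twist_angles(path, anchor_index=0, camera_angle_deg=30.0)
print(angles)  # [0.0, 10.0, 20.0, 30.0]
```

With this weighting the correction grows linearly along the arrow, which avoids a visible seam at the anchor: the geometry there stays aligned with the road while the tip faces the camera.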
CN202310350364.1A 2023-03-28 2023-03-28 Guide arrow generation method, device, equipment and product Active CN116558542B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310350364.1A CN116558542B (en) 2023-03-28 2023-03-28 Guide arrow generation method, device, equipment and product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310350364.1A CN116558542B (en) 2023-03-28 2023-03-28 Guide arrow generation method, device, equipment and product

Publications (2)

Publication Number Publication Date
CN116558542A CN116558542A (en) 2023-08-08
CN116558542B true CN116558542B (en) 2025-09-05

Family

ID=87495508

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310350364.1A Active CN116558542B (en) 2023-03-28 2023-03-28 Guide arrow generation method, device, equipment and product

Country Status (1)

Country Link
CN (1) CN116558542B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108139223A (en) * 2015-09-30 2018-06-08 日产自动车株式会社 Display apparatus
CN109858374A (en) * 2018-12-31 2019-06-07 武汉中海庭数据技术有限公司 Arrow class graticule extraction method and device in high-precision cartography

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007030345A1 (en) * 2007-02-28 2008-09-04 Navigon Ag Navigation device and method for the graphic output of navigation instructions
CN115342827A (en) * 2022-07-22 2022-11-15 阿里巴巴(中国)有限公司 Method, device and equipment for displaying direction guide arrow in navigation map

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108139223A (en) * 2015-09-30 2018-06-08 日产自动车株式会社 Display apparatus
CN109858374A (en) * 2018-12-31 2019-06-07 武汉中海庭数据技术有限公司 Arrow class graticule extraction method and device in high-precision cartography

Also Published As

Publication number Publication date
CN116558542A (en) 2023-08-08

Similar Documents

Publication Publication Date Title
AU2011332885B2 (en) Guided navigation through geo-located panoramas
CN108362295B (en) Vehicle route guidance device and method
EP2643821B1 (en) Path planning for street level navigation in a three-dimensional environment, and applications thereof
CN103562681B (en) Produce the method for the database for guider, export the method and guider of three-dimensional map
CN112789609A (en) Map updating method and device, movable platform and storage medium
EP3321889A1 (en) Device and method for generating and displaying 3d map
CN106980633B (en) Indoor map data generation method and device
US8532924B2 (en) Method and apparatus for displaying three-dimensional terrain and route guidance
KR20050037669A (en) Method for displaying three dimensional map
CN113570664B (en) Augmented reality navigation display method and device, electronic equipment and computer medium
US20260024260A1 (en) Movement trajectory playback method and apparatus, electronic device, and storage medium
CN115683152A (en) Vehicle navigation guiding method and device based on coordinate transformation and electronic equipment
KR102222102B1 (en) An augment reality navigation system and method of route guidance of an augment reality navigation system
CN103578141A (en) Method and device for achieving augmented reality based on three-dimensional map system
US11549820B2 (en) Method and apparatus for generating navigation route and storage medium
US20210201522A1 (en) System and method of selecting a complementary image from a plurality of images for 3d geometry extraction
KR100889470B1 (en) 3D path generation method and apparatus
CN115937479A (en) Navigation guide surface processing method, device, electronic device and computer program product
US20190128692A1 (en) Navigation system and navigation program
CN116558542B (en) Guide arrow generation method, device, equipment and product
CN113901343B (en) Method and device for determining visual angle of target object and electronic equipment
WO2023088127A1 (en) Indoor navigation method, server, apparatus and terminal
JP4533191B2 (en) 3D map display device and 3D map display program
JP6091676B2 (en) 3D map display system
CN114518120A (en) Navigation guidance method, road shape data generation method, device, device, and medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant