Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, a further description of aspects of the present disclosure will be provided below. It should be noted that, without conflict, the embodiments of the present disclosure and features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced otherwise than as described herein, and it is apparent that the embodiments in the specification are only some, rather than all, of the embodiments of the present disclosure.
In order to facilitate understanding of the technical solutions of the embodiments of the present disclosure, some terms related to the embodiments of the present disclosure are first explained.
A shape point can be understood as a positioning point contained in the navigation path, and comprises information such as the longitude, latitude, and altitude of the positioning point.
An anchor point is a position from which driving actions such as going straight or making a U-turn are led out. The driving actions are guided by a stereoscopic guiding arrow, that is, the guiding arrow is led out from the anchor point. An anchor point can generally be understood as a fork point of a road.
Texture mapping refers to, in the embodiments of the present disclosure, applying a texture on a texture image to the stereoscopic guiding arrow.
Subdivision refers to the process of constructing the skeleton points and faces of a model according to the trend of the shape points, and further dividing the faces into renderable geometric primitives.
Fig. 2 is a schematic diagram of an application scenario provided in an embodiment of the present disclosure. The terminal device in fig. 2 may be understood as a device with navigation functions and processing capabilities, such as a mobile phone, an in-vehicle terminal, or a tablet computer. The server is a server that can provide navigation services. In the scenario shown in fig. 2, the terminal device may send a navigation request to the server through a preset communication protocol, where the navigation request includes, but is not limited to, information such as a start position and an end position. After receiving the navigation request, the server performs route planning according to the start position, the end position, and other information, and sends information of one or more planned routes to the terminal device, from which the user selects a navigation route. The navigation route includes, but is not limited to, shape points, anchor point positions, navigation guiding actions led out from the anchor point positions, and style files of the stereoscopic guiding arrows corresponding to the navigation guiding actions. A style file may include information such as the style, width value, and height value of the stereoscopic guiding arrow. The style files can also be issued to the terminal device in advance for storage, rather than being issued along with the navigation route.
After the terminal device obtains the navigation route, it moves along the navigation route and, according to its positioning position and the navigation path ahead (the navigation path being the portion of the navigation route on a certain road section), determines the anchor point position to be passed ahead and the navigation guiding action led out from that anchor point position, and further obtains, according to the navigation guiding action, the style file of the corresponding stereoscopic guiding arrow and the shape points included in the navigation path. Then, taking the shape points as base points, boundary points of the top surface of the stereoscopic guiding arrow are split on the left and right sides of the shape points according to the arrow width value recorded in the style file of the stereoscopic guiding arrow. Further, for the boundary points located in front of the anchor point, the view angle of those boundary points is corrected toward the view angle direction of the camera according to the distance between each boundary point and the anchor point and the current view angle of the camera, so that the twisted boundary points can be seen from the view angle of the camera. Boundary points of the side faces of the stereoscopic guiding arrow are then generated based on the corrected boundary points of the top surface and the arrow height value recorded in the style file, and the stereoscopic guiding arrow is obtained based on the boundary points of the top surface and the boundary points of the side faces.
According to the embodiments of the present disclosure, the view angle of the boundary points of the top surface of the stereoscopic guiding arrow is corrected toward the view angle direction of the camera, so that the top surface of the stereoscopic guiding arrow always faces the view angle direction. This ensures that the direction of the stereoscopic guiding arrow can always be seen, helps a user quickly focus on the direction indicated by the stereoscopic guiding arrow, and improves the user experience.
In order to better understand the technical solutions of the embodiments of the present disclosure, the technical solutions of the embodiments of the present disclosure are described below in conjunction with exemplary embodiments.
By way of example, fig. 3 is a flowchart of a method for generating a guidance arrow according to an embodiment of the present disclosure. The method may be exemplarily performed by the terminal device in fig. 2. As shown in fig. 3, in some exemplary implementations, the method provided by the disclosed embodiments may include steps 301-305.
Step 301, shape points included in a navigation path and a style file of a stereoscopic guiding arrow are obtained.
The navigation path in the embodiments of the present disclosure may be understood as the path of the navigation route on the road ahead. A shape point may be understood as a location point on the navigation path. Illustratively, in embodiments of the present disclosure, the shape points include an anchor point, which may be understood as the fork point of the road.
A stereoscopic guiding arrow can be understood as a three-dimensional mark for indicating a driving action, such as a turning arrow or a lane-changing arrow. The style file of the stereoscopic guiding arrow includes style information of the guiding arrow, such as the pointing direction of the arrow, the arrow width value, the arrow height value, and the arrow length.
In some embodiments, a guiding action on the road ahead may be determined according to the current positioning position and the navigation path, and then the style file of the stereoscopic guiding arrow corresponding to the guiding action is acquired according to the guiding action. Referring to fig. 2, in one implementation of the embodiments of the present disclosure, the terminal device transmits information such as a start position and an end position to the server. The server plans a route according to the start position, the end position, and other information, and feeds back the information of the planned route to the terminal device. The information of the planned route includes the shape points of each road section, the navigation guiding action corresponding to each anchor point in the planned route, and the style files of the stereoscopic guiding arrows corresponding to the navigation guiding actions. After the user selects a navigation route from the planned routes, the terminal device guides the user to travel according to the navigation route. When the distance between the terminal device and an anchor point is smaller than or equal to a preset distance, the generation of the stereoscopic guiding arrow can be started. The terminal device determines the guiding action on the road ahead (such as a left turn) according to the current positioning position and the navigation path, and then acquires, from the data of the navigation path and according to the guiding action, the shape points included in the navigation path ahead and the style file of the stereoscopic guiding arrow.
It should be noted that the distance between the terminal device and the anchor point changes continuously, and as it changes, the style of the stereoscopic guiding arrow may change accordingly; for example, the portion of the stereoscopic guiding arrow corresponding to the road already travelled disappears. According to the embodiments of the present disclosure, the style file of the stereoscopic guiding arrow corresponding to the real-time distance between the terminal device and the anchor point can be obtained in real time, and the corresponding stereoscopic guiding arrow can be generated in real time. This ensures that the stereoscopic guiding arrow changes with the movement of the terminal device, always serves its pointing role, and maintains pointing accuracy, thereby improving the guiding effect of the stereoscopic guiding arrow.
For example, in some embodiments, after the shape points are obtained, a process of preprocessing the shape points may be further included. The preprocessing includes, but is not limited to, thinning and smooth fitting. For example, after the shape points are obtained, they may be traversed to determine the distance between adjacent shape points; if the distance between a certain shape point and the previous and/or next shape point is less than a preset distance, that shape point may be deleted. This prevents the normal vectors between shape points from intersecting inside the stereoscopic guiding arrow when shape points are too close together, which would distort the model. For another example, in another embodiment, the shape points may be traversed to determine the connecting lines of adjacent shape points and the included angle between two adjacent connecting lines (hereinafter referred to as the second included angle); if the second included angle between two adjacent connecting lines is smaller than a second preset threshold, the shape points on the two connecting lines are determined to be collinear. For collinear shape points, the first and last shape points can be retained and the shape points between them deleted, so as to reduce the data volume and improve the calculation efficiency. For another example, if the second included angle between two adjacent connecting lines is greater than a third preset threshold, where the third preset threshold is greater than the second preset threshold, that is, at a position with a large direction change such as a turn, at least one shape point can be inserted by an interpolation method such as Catmull-Rom, so as to improve the smoothness of the connecting lines of the shape points at such positions.
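By way of illustration only, the thinning described above can be sketched as follows. This is a minimal 2D Python sketch with hypothetical function names and threshold values, not the disclosed implementation:

```python
import math

def thin_shape_points(points, min_dist=0.5, collinear_deg=2.0):
    """Thin a polyline of (x, y) shape points: drop points closer than
    min_dist to the previously kept point, then drop interior points
    whose turn angle is below collinear_deg (nearly collinear), keeping
    the first and last points."""
    if len(points) < 3:
        return list(points)
    # Pass 1 (thinning by distance): remove points too close together.
    kept = [points[0]]
    for p in points[1:]:
        if math.dist(kept[-1], p) >= min_dist:
            kept.append(p)
    if len(kept) < 3:
        return kept
    # Pass 2 (thinning by collinearity): single sweep over triples.
    out = [kept[0]]
    for prev, cur, nxt in zip(kept, kept[1:], kept[2:]):
        v1 = (cur[0] - prev[0], cur[1] - prev[1])
        v2 = (nxt[0] - cur[0], nxt[1] - cur[1])
        turn = math.degrees(math.atan2(v1[0] * v2[1] - v1[1] * v2[0],
                                       v1[0] * v2[0] + v1[1] * v2[1]))
        if abs(turn) >= collinear_deg:   # keep only real direction changes
            out.append(cur)
    out.append(kept[-1])
    return out
```

A point whose turn angle exceeded the third preset threshold would additionally trigger Catmull-Rom insertion, which is omitted from this sketch.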
Step 302, generating boundary points of the top surface of the stereoscopic guiding arrow based on the shape points included in the navigation path and the arrow width values recorded in the style file of the stereoscopic guiding arrow.
Fig. 4 is a schematic diagram illustrating a method for generating a boundary point of a top surface of a stereoscopic guiding arrow according to an embodiment of the disclosure. As shown in fig. 4, in an exemplary implementation of the embodiments of the present disclosure, a method of generating a boundary point of a top surface of a guide arrow may include steps 401 to 402.
In step 401, endoskeleton points on the top surface of the stereoscopic guiding arrow are split on the left and right sides of the shape points based on the shape points included in the navigation path and the arrow width value recorded in the style file of the stereoscopic guiding arrow.
In step 402, at least the endoskeleton points are taken as boundary points of the top surface.
For example, fig. 5 is a schematic diagram of a method for generating endoskeleton points according to an embodiment of the disclosure. As shown in fig. 5, G1 and G2 are two adjacent shape points, where G2 is the shape point located ahead of G1 in the direction of travel. Connecting G1 and G2 generates a vector directed from G1 to G2 (represented by arrow L1 in fig. 5). The vectors L11 and L12 may then be obtained by rotating L1 clockwise and counterclockwise by 90 degrees, respectively, with G1 as the base point. Then, according to the arrow width value, an endoskeleton point G11 is inserted at the position one half of the arrow width value away from G1 in the direction of the vector L11, and an endoskeleton point G12 is inserted at the position one half of the arrow width value away from G1 in the direction of the vector L12, thereby obtaining the endoskeleton points on the left and right sides of G1. The endoskeleton points on the left and right sides of each shape point can be obtained in the same way.
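The subdivision of endoskeleton points described above can be sketched in Python as follows (2D coordinates only, with hypothetical names; the disclosed embodiments operate on positioning points carrying longitude, latitude, and altitude):

```python
import math

def endoskeleton_points(g1, g2, arrow_width):
    """Split the endoskeleton points on both sides of shape point g1.
    g1, g2: 2D shape points, with g2 ahead of g1 along the path.
    Returns (left, right): points offset from g1 by half the arrow
    width, perpendicular to the vector L1 from g1 to g2."""
    dx, dy = g2[0] - g1[0], g2[1] - g1[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length      # unit vector along L1
    half = arrow_width / 2.0
    # Rotating L1 by 90 degrees counterclockwise / clockwise gives the
    # two perpendicular directions (L12 and L11 in fig. 5).
    left = (g1[0] - uy * half, g1[1] + ux * half)
    right = (g1[0] + uy * half, g1[1] - ux * half)
    return left, right
```

Applying this to every consecutive pair of shape points yields the endoskeleton points on both sides of each shape point.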
The boundary point of the top surface of the three-dimensional guide arrow can be obtained rapidly and accurately through a subdivision method.
For example, in some scenes with large direction changes, such as turns and U-turns, in order to improve the smoothness of the stereoscopic guiding arrow at the turn, some implementations of the disclosed embodiments may include a step of smoothing the endoskeleton of the stereoscopic guiding arrow at the turn. The endoskeleton refers to the curve obtained by connecting the endoskeleton points. For example, in one embodiment, after the endoskeleton points on the top surface of the stereoscopic guiding arrow are split, the connecting lines of adjacent shape points included in the navigation path can first be determined, together with the normal vectors perpendicular to those connecting lines, where each normal vector takes a shape point as its base point. Then, a first included angle is determined between the two normal vectors at the common shape point of two adjacent connecting lines, where the two normal vectors point to endoskeleton points on the same side. If the first included angle is larger than a first preset threshold, the opening direction of the first included angle is determined as the chamfering direction, and at least one endoskeleton point is inserted in the chamfering direction so as to smooth the endoskeleton on the side pointed to by the two normal vectors. For example, fig. 6 is a schematic diagram of an endoskeleton smoothing method. As shown in fig. 6, M1, M2, and M3 are three consecutive shape points, and N1, N2, N3, N4, and N5 are endoskeleton points, where N1, N3, and N5 are the endoskeleton points obtained with M1, M2, and M3 as base points using the method shown in fig. 5. X1 is the normal vector of the connecting line of M1 and M2, X2 is the normal vector of the connecting line of M2 and M3, and the angle O between X1 and X2 is the first included angle. As shown in fig. 6, when the first included angle is larger than the first preset threshold, the opening direction of the angle O is determined as the chamfering direction, and the endoskeleton points N2 and N4 are inserted in the chamfering direction, so that the curve obtained by connecting N1, N2, N3, N4, and N5 is smooth. Of course, fig. 6 is merely illustrative and not limiting.
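The chamfering step can be sketched as follows, assuming 2D unit normals and hypothetical threshold and insertion counts (names are illustrative only):

```python
import math

def chamfer_points(m2, x1, x2, half_width, threshold_deg=15.0, inserts=2):
    """If the first included angle between the unit normals x1 and x2
    (the normals of the two segments meeting at shape point m2) exceeds
    threshold_deg, return chamfer points placed at half_width from m2,
    swept evenly across the opening of the angle; otherwise return []."""
    a1, a2 = math.atan2(x1[1], x1[0]), math.atan2(x2[1], x2[0])
    diff = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi  # signed angle O
    if abs(math.degrees(diff)) <= threshold_deg:
        return []
    out = []
    for k in range(1, inserts + 1):
        ang = a1 + diff * k / (inserts + 1)  # evenly spaced directions
        out.append((m2[0] + half_width * math.cos(ang),
                    m2[1] + half_width * math.sin(ang)))
    return out
```

With `inserts=2` this produces two points analogous to N2 and N4 in fig. 6.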
For example, in some embodiments, after obtaining the endoskeleton point based on any one of the methods described above, the method may further include a step of splitting the exoskeleton point of the stereoscopic guide arrow in a direction away from the endoskeleton point with the endoskeleton point as a base point.
By way of example, fig. 7 is a schematic diagram of a method of generating exoskeleton points based on the endoskeleton points in fig. 5. Taking the endoskeleton point G11 as an example, with G11 as the base point, the exoskeleton point G111 may be obtained by interpolation at a position a preset distance away from G11 in the direction of the vector L11. Similarly, with the endoskeleton point G12 as the base point, the exoskeleton point G112 may be obtained by interpolation at a position a preset distance away from G12 in the direction of the vector L12. In the same way, each endoskeleton point can be used as a base point to generate a corresponding exoskeleton point.
In the case of generating exoskeleton points from the endoskeleton points, the endoskeleton points and the exoskeleton points may both be taken as boundary points of the top surface of the stereoscopic guiding arrow. In this case, the method of generating the top surface of the stereoscopic guiding arrow may refer to fig. 8. By way of example, fig. 8 is a schematic diagram of a method of generating a surface from the endoskeleton points, the exoskeleton points, and the shape points in fig. 7. In fig. 8, G21 and G22 are the endoskeleton points generated based on G2, and G211 and G221 are the exoskeleton points generated based on G21 and G22, respectively. As shown in fig. 8, G21, G2, G11, and G1 may be connected to form a quadrilateral; G22, G2, G12, and G1 may be connected to form a quadrilateral; G211, G21, G111, and G11 may be connected to form a quadrilateral; and G221, G22, G112, and G12 may be connected to form a quadrilateral. Each quadrilateral can then be triangulated to obtain the surface formed by G1, G2, G11, G12, G111, G112, G21, G22, G211, and G221. Analogously, by connecting the endoskeleton points obtained with each shape point as a base point and the exoskeleton points obtained with each endoskeleton point as a base point, the top surface of the stereoscopic guiding arrow can be obtained, such as the top surface shown in fig. 9. Of course, fig. 8 and fig. 9 are merely examples and are not intended to be limiting.
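The triangulation of the quadrilaterals into renderable primitives can be sketched as follows (illustrative only; in practice the vertices would be indices into a vertex buffer):

```python
def quads_to_triangles(quads):
    """Triangulate each quadrilateral (a, b, c, d), given in winding
    order, into two triangles sharing the diagonal a-c, yielding the
    renderable primitives of the top surface."""
    tris = []
    for a, b, c, d in quads:
        tris.append((a, b, c))
        tris.append((a, c, d))
    return tris
```

For the quadrilateral (G21, G2, G11, G1) this yields the triangles (G21, G2, G11) and (G21, G11, G1).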
According to the embodiments of the present disclosure, the shape points are used as base points, the endoskeleton points are generated according to the width value of the stereoscopic guiding arrow, the exoskeleton points are then generated with the endoskeleton points as base points, and the top surface is generated based on the exoskeleton points, the endoskeleton points, and the shape points. This can improve the accuracy of the generated top surface. Since only one endoskeleton point and one exoskeleton point extend from each side of each shape point, the number of endoskeleton and exoskeleton points is kept small, which saves storage space and computing resources and improves the generation efficiency of the top surface of the guiding arrow.
It should be noted that the method for generating the endoskeleton points and the exoskeleton points is only an exemplary method, and not the only method, in other embodiments, a shape point may be taken as a base point, a preset number of points are inserted as the endoskeleton points and/or the exoskeleton points at the left side and the right side of the shape point according to a preset step length, and then a top surface is generated based on the inserted points and the shape point.
Step 303, for the boundary point located in front of the anchor point, correcting the view angle of the boundary point located in front of the anchor point to the direction towards the view angle of the camera according to the distance between the boundary point and the anchor point and the current view angle of the camera.
The camera view angle can be understood as the included angle between the connecting line of the camera and the anchor point and the vertical direction.
In some embodiments, a torsion angle for correcting the boundary point in front of the anchor point to a direction towards the camera viewing angle may be generated according to the distance between the boundary point and the anchor point and the current camera viewing angle, and then the boundary point in front of the anchor point is twisted based on the torsion angle, so that the viewing angle of the boundary point in front of the anchor point on the top surface of the stereoscopic guiding arrow is towards the camera viewing angle direction.
In the embodiments of the present disclosure, there are various methods for generating the torsion angle of a boundary point on the top surface of the stereoscopic guiding arrow; an exemplary method is described below. By way of example, fig. 10 is a schematic illustration of boundary points on the top surface of a stereoscopic guiding arrow provided by an embodiment of the present disclosure. As shown in fig. 10, Z is the anchor point and M11, M12, M13, …, M1N are N shape points. M21, M22, M23, …, M2N are the N endoskeleton points on the left side of the shape points, and M31, M32, M33, …, M3S are the S endoskeleton points on the right side of the shape points, where the values of S and N can be the same or different. M41, M42, M43, …, M4N are the N exoskeleton points on the left side, and M51, M52, M53, …, M5S are the S exoskeleton points on the right side. For example, for the N endoskeleton points on the left side, the distance from the anchor point Z to M21 can be calculated first, then the distance between M21 and M22, the distance between M22 and M23, and so on, until the distances between all adjacent endoskeleton points are calculated. The distance between the anchor point Z and M21 and the distances between all adjacent endoskeleton points on the left side are then summed to obtain the maximum progressive distance of the endoskeleton points on the left side, namely the progressive distance of M2N. The distance from the anchor point Z to M21 may be taken as the progressive distance of M21; the sum of the distance from Z to M21 and the distance from M21 to M22 may be taken as the progressive distance of M22 relative to the anchor point Z; and so on, the progressive distances of the other endoskeleton points on the left side relative to the anchor point can be obtained.
Similarly, the maximum progressive distance of the endoskeleton points on the right side and their progressive distances relative to the anchor point Z, the maximum progressive distance of the exoskeleton points on the left side and their progressive distances relative to Z, and the maximum progressive distance of the exoskeleton points on the right side and their progressive distances relative to Z may be calculated.
Further, for any endoskeleton or exoskeleton point, the ratio of its progressive distance to the maximum progressive distance on its side may be used as a weighting coefficient to weight the angle of the current camera view angle (for example, the weighting coefficient may be multiplied by the angle of the current view angle) to obtain the torsion angle of that point. Alternatively, the ratio of the progressive distance of the endoskeleton or exoskeleton point to the maximum progressive distance among the boundary points of the top surface (the endoskeleton or exoskeleton points on either side) can be used as the weighting coefficient to weight the angle of the camera view angle, so as to obtain the torsion angle. That is, in one embodiment, for any boundary point in front of the anchor point, the progressive distance of the boundary point relative to the anchor point can be determined according to the position of the anchor point, and the ratio of that progressive distance to the maximum progressive distance among the boundary points of the top surface is used as a weighting coefficient to weight the angle of the camera view angle, so as to obtain the torsion angle of that boundary point.
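The progressive distances and the distance-weighted torsion angles described above can be sketched as follows (2D Python sketch with hypothetical names):

```python
import math

def progressive_distances(anchor, pts):
    """Cumulative polyline distance from the anchor through each
    skeleton point, in order of travel away from the anchor; the last
    entry is the maximum progressive distance on that side."""
    dists, total, prev = [], 0.0, anchor
    for p in pts:
        total += math.dist(prev, p)
        dists.append(total)
        prev = p
    return dists

def torsion_angles(anchor, pts, camera_angle_deg):
    """Weight the current camera view angle by each point's progressive
    distance divided by the maximum progressive distance."""
    dists = progressive_distances(anchor, pts)
    max_d = dists[-1]
    return [camera_angle_deg * d / max_d for d in dists]
```

The farthest point on a side thus receives the full camera angle as its torsion angle, and nearer points receive proportionally smaller angles.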
In some embodiments, after the torsion angle of each boundary point is obtained, the angle of view of the boundary point may be quaternion-twisted toward the camera view direction based on the torsion angle of each boundary point, such that the angle of view of the twisted boundary point is directed toward the camera view direction. The quaternion-based torsion method may be referred to in the related art and will not be described herein.
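For reference, a generic quaternion rotation of a vertex about an axis can be sketched as follows; the actual twist axis and angle would come from the torsion-angle computation, and here they are arbitrary illustrations:

```python
import math

def quat_from_axis_angle(axis, angle_deg):
    """Unit quaternion (w, x, y, z) for a rotation of angle_deg about
    the given unit axis."""
    half = math.radians(angle_deg) / 2.0
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

def rotate_point(q, p):
    """Rotate 3D point p by unit quaternion q, computing q * p * q'."""
    w, x, y, z = q
    px, py, pz = p
    # r = q * (0, p)
    rw = -x * px - y * py - z * pz
    rx = w * px + y * pz - z * py
    ry = w * py + z * px - x * pz
    rz = w * pz + x * py - y * px
    # vector part of r * conjugate(q)
    return (rx * w - rw * x - ry * z + rz * y,
            ry * w - rw * y - rz * x + rx * z,
            rz * w - rw * z - rx * y + ry * x)
```

For example, rotating the point (1, 0, 0) by 90 degrees about the z axis yields (0, 1, 0).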
According to the embodiment of the disclosure, the quaternion-based torsion method can improve the torsion efficiency of the vertex without carrying out coordinate transformation on the vertex.
It should be noted that, in the above method, the farther the boundary points (such as the inner skeleton points and the outer skeleton points) are from the anchor points, the larger the torsion angle is, so that a linear torsion effect is presented, the torsion of the stereoscopic guiding arrow is more linear and smooth, and the display effect of the stereoscopic guiding arrow is improved.
Step 304, generating boundary points of the side faces of the stereoscopic guiding arrow based on the corrected boundary points of the top surface and the arrow height value recorded in the style file, where the included angle between the side faces of the stereoscopic guiding arrow and the top surface is equal to a preset included angle.
The predetermined angle may be, for example, 90 degrees, or may be any other angle.
In some embodiments, the side boundary points may be generated as follows: based on each boundary point on the top surface of the stereoscopic guiding arrow, a normal vector of the boundary point is generated in the direction opposite to the camera view angle, and a boundary point of the bottom surface of the stereoscopic guiding arrow is then generated on that normal vector, where the distance from the boundary point of the bottom surface to the corresponding boundary point of the top surface is equal to the arrow height value recorded in the style file. Since the boundary points of the bottom surface are effectively the boundary points of the side faces, the boundary points of the bottom surface may be regarded as the boundary points of the side faces.
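This projection of top-surface boundary points to bottom/side boundary points can be sketched as follows (3D Python sketch, hypothetical names):

```python
def bottom_boundary_points(top_points, view_dir, height):
    """Project each top-surface boundary point (x, y, z) along the
    direction opposite the camera view by the arrow height value.
    view_dir is a unit 3D vector pointing toward the camera."""
    return [(x - view_dir[0] * height,
             y - view_dir[1] * height,
             z - view_dir[2] * height)
            for x, y, z in top_points]
```

Each bottom point sits exactly the arrow height below its top-surface counterpart along the chosen direction.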
For example, in other embodiments, the exoskeleton points on the top surface may be traversed; for each exoskeleton point, any triangle is selected from the triangles containing that exoskeleton point and an endoskeleton point as a target triangle, the direction of the normal vector of the target triangle pointing away from the top surface is taken as the target direction, and the boundary points of the side faces are then obtained by interpolation in the target direction according to the arrow height value of the stereoscopic guiding arrow.
Step 305, obtaining a stereoscopic guiding arrow based on the boundary points of the top surface and the boundary points of the side faces.
For example, fig. 11 is a diagram of the generation effect of a stereoscopic guiding arrow provided in an embodiment of the disclosure. The shape points included in the navigation path and the style file of the stereoscopic guiding arrow are obtained; boundary points of the top surface of the stereoscopic guiding arrow are generated based on the shape points and the arrow width value recorded in the style file; for the boundary points located in front of the anchor point, the view angle of those boundary points is corrected toward the view angle direction of the camera according to the distance between each boundary point and the anchor point and the current view angle of the camera; boundary points of the side faces are generated based on the corrected boundary points of the top surface and the arrow height value recorded in the style file; and the stereoscopic guiding arrow shown in fig. 11 is obtained based on the boundary points of the top surface and the boundary points of the side faces. According to the embodiments of the present disclosure, the view angle of the boundary points in front of the anchor point is corrected toward the view angle direction of the camera, so that the view angle of the twisted boundary points on the top surface faces the camera. This ensures that the top surface of the stereoscopic guiding arrow is always clearly presented to the user, helps the user quickly focus on driving actions such as turning, making a U-turn, and changing lanes, and solves the problem in the related art that the pointing of the stereoscopic guiding arrow cannot be seen at a low view angle, where only its side face is visible.
By way of example, fig. 12 is a flowchart of a texture mapping method provided by an embodiment of the present disclosure, and as shown in fig. 12, after generating a stereoscopic guide arrow based on the method in the embodiment of fig. 3, textures on a texture image may be mapped onto the guide arrow by the following method.
Step 1201, determining, for coordinate points on the stereoscopic guiding arrow, a proportion of a progressive distance of the coordinate point relative to the anchor point to a maximum progressive distance of the coordinate point on the stereoscopic guiding arrow.
For each coordinate point on the stereoscopic guiding arrow, the progressive distance can be calculated by the method for determining the progressive distances of the endoskeleton and exoskeleton points in the above embodiments, which is not repeated here.
Step 1202, weighting the height of the texture image with the proportion as a weighting coefficient to obtain the ordinate of the texture corresponding to the coordinate point.
For example, assuming that the ratio of the progressive distance of the coordinate point to the maximum progressive distance is Q and the height of the texture image is H, the result of multiplying Q by H may be taken as the ordinate of the texture corresponding to the coordinate point on the texture image.
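Steps 1201 and 1202, together with the table lookup of the abscissa, amount to the following (illustrative sketch; `u_table` stands in for the preset texture mapping configuration table):

```python
def texture_coords(progressive, max_progressive, tex_height, u_table, idx):
    """Ordinate: the progressive-distance ratio Q times the texture
    image height H. Abscissa: looked up by point index from the preset
    texture mapping configuration table (u_table here)."""
    v = (progressive / max_progressive) * tex_height
    u = u_table[idx]
    return u, v
```

For instance, a point halfway along the arrow (Q = 0.5) on a texture of height 128 receives the ordinate 64.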
Step 1203, obtaining the abscissa of the texture corresponding to the coordinate point from a preset texture mapping configuration table.
In the embodiment of the present disclosure, a texture mapping configuration table may be preconfigured with information of abscissa coordinates of textures corresponding to each coordinate point on a texture image. According to the embodiment of the disclosure, the abscissa of the texture corresponding to the coordinate point on the texture image can be directly obtained from the texture mapping configuration table.
Step 1204, mapping the texture onto the coordinate point according to the ordinate and the abscissa.
After determining the abscissa and the ordinate of the texture on the texture image, the texture corresponding to the abscissa and the ordinate on the texture image is mapped to the coordinate point of the guiding arrow, so that the display effect of the stereoscopic guiding arrow is improved.
In some implementations of the disclosed embodiments, after the textures corresponding to the abscissa and the ordinate are mapped onto the coordinate points of the stereoscopic guiding arrow, the textures on the coordinate points may also be transformed at preset time intervals. For example, in one possible implementation, each time the preset time interval elapses, the values of the abscissa and the ordinate of the texture corresponding to a coordinate point are changed according to a preset change amplitude, for example, a first value is added to the abscissa and a second value is added to the ordinate. The texture corresponding to the changed abscissa and ordinate is then remapped onto the coordinate point of the stereoscopic guiding arrow. This creates a streamer (flowing-light) effect on the stereoscopic guiding arrow, improves its recognizability, and avoids the weak auxiliary-focusing effect caused by an insufficiently prominent arrow.
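The periodic texture-coordinate shift described above can be sketched as follows; the function name, the wrap-around behavior, and the parameter names are assumptions for illustration:

```python
def advance_streamer(tex_coords, du, dv, tex_width, tex_height):
    """At each preset time interval, add a first value du to every abscissa
    and a second value dv to every ordinate, wrapping around the texture
    image so the pattern appears to flow along the arrow."""
    return [((u + du) % tex_width, (v + dv) % tex_height)
            for (u, v) in tex_coords]
```

Calling this once per interval and remapping the returned coordinates onto the arrow yields the streamer effect.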
In other embodiments, the stereoscopic guiding arrow may additionally be pointed to by an indication arrow, helping the user quickly focus on the stereoscopic guiding arrow and improving its recognizability. In one example, the moving speed of the indication arrow, its closest distance to the terminal device, and its closest distance to the anchor point may be preset. Initially, the indication arrow is generated on the guiding path at the closest distance to the terminal device. The position of the indication arrow at each moment is then determined according to the preset moving speed, and the indication arrow is rendered at that position. After the indication arrow reaches the closest distance to the anchor point, it is regenerated at the closest distance to the terminal device according to the positioning position of the terminal device at the next moment, so that the indication arrow appears to flow cyclically along the guiding path, helping the user quickly focus on the stereoscopic guiding arrow and improving its guiding effect.
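The cyclic movement of the indication arrow can be sketched as a position-along-path function; the function name and the representation of positions as distances along the guiding path are illustrative assumptions:

```python
def indication_arrow_distance(t, nearest_to_device, nearest_to_anchor, speed):
    """Distance of the indication arrow along the guiding path at time t.
    The arrow starts at the preset closest distance to the terminal device,
    advances at the preset speed, and restarts once it reaches the preset
    closest distance to the anchor point, yielding a cyclic flow."""
    span = nearest_to_anchor - nearest_to_device
    return nearest_to_device + (speed * t) % span
```

Evaluating this function at each frame time and rendering the indication arrow at the resulting path distance produces the looping animation.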
Fig. 13 is a schematic structural diagram of a generating device of a guiding arrow according to an embodiment of the present disclosure. The apparatus may be understood as a terminal device or a part of functional modules in a terminal device in the above embodiments. As shown in fig. 13, the generating apparatus 1300 includes:
a first obtaining module 1301, configured to obtain a shape point and a style file of a stereoscopic guiding arrow included in a navigation path;
a first generating module 1302, configured to generate a boundary point of a top surface of the stereoscopic guiding arrow based on a shape point included in the navigation path and an arrow width value recorded in the style file of the stereoscopic guiding arrow;
a correction module 1303, configured to correct, for a boundary point located in front of the anchor point, the view angle of the boundary point to face the camera view angle direction according to the distance between the boundary point and the anchor point and the current camera view angle;
a second generating module 1304, configured to generate, based on the corrected boundary points of the top surface and the arrow height value recorded in the style file, boundary points of a side surface of the stereoscopic guiding arrow, where an included angle between the side surface and the top surface is equal to a preset included angle;
and a third generating module 1305, configured to obtain the stereoscopic guiding arrow based on the boundary points of the top surface and the boundary points of the side surface.
In one embodiment, the first generation module 1302 is configured to:
dividing inner skeleton points of the top surface of the stereoscopic guiding arrow on the left and right sides of the shape points, based on the shape points included in the navigation path and the arrow width value recorded in the style file of the stereoscopic guiding arrow, and taking at least the inner skeleton points as boundary points of the top surface.
In one embodiment, the first generation module 1302 may also be configured to:
taking the inner skeleton points as base points, splitting outer skeleton points of the stereoscopic guiding arrow in directions away from the inner skeleton points, and taking the inner skeleton points and the outer skeleton points as boundary points of the top surface.
In one embodiment, the generating apparatus 1300 may further include a first smoothing module configured to:
determining the connecting line of every two adjacent shape points among the shape points included in the navigation path;
determining normal vectors perpendicular to the connecting lines of the adjacent shape points, wherein the normal vectors take the shape points as their starting points;
determining a first included angle between the two normal vectors on the shape point shared by two adjacent connecting lines, wherein the two normal vectors point to inner skeleton points on the same side;
determining the opening direction of the first included angle as a chamfering direction in response to the first included angle being greater than a first preset threshold;
inserting at least one inner skeleton point in the chamfering direction to smooth the inner skeleton on the side pointed to by the two normal vectors, wherein the inner skeleton refers to the curve obtained by connecting the inner skeleton points.
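The included-angle test that drives the chamfering decision can be sketched as follows; the function names and the 30° threshold are illustrative assumptions, not values from the disclosure:

```python
import math

def first_included_angle(p0, p1, p2):
    """Angle (degrees) between the two same-side normal vectors anchored at
    the shape point p1 shared by segments p0->p1 and p1->p2 (2D sketch)."""
    n1 = (-(p1[1] - p0[1]), p1[0] - p0[0])  # left-hand normal of p0->p1
    n2 = (-(p2[1] - p1[1]), p2[0] - p1[0])  # left-hand normal of p1->p2
    dot = n1[0] * n2[0] + n1[1] * n2[1]
    cross = n1[0] * n2[1] - n1[1] * n2[0]
    return math.degrees(abs(math.atan2(cross, dot)))

def needs_chamfer(p0, p1, p2, threshold_deg=30.0):
    # Extra inner skeleton points are only inserted at sufficiently sharp corners
    return first_included_angle(p0, p1, p2) > threshold_deg
```

On a straight run of shape points the angle is zero and no chamfer is inserted; at a right-angle turn the angle is 90° and the corner is smoothed.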
In one embodiment, the correction module 1303 is configured to:
generating a torsion angle for correcting a boundary point in front of an anchor point to a direction towards the camera viewing angle according to the distance between the boundary point and the anchor point and the current camera viewing angle, wherein the anchor point is a shape point for leading out a guiding arrow in shape points included in the navigation path;
And twisting the boundary point positioned in front of the anchor point based on the twisting angle so that the view angle of the boundary point positioned in front of the anchor point on the top surface faces the view angle direction of the camera.
In one embodiment, the correction module 1303 is configured to:
for any boundary point in front of the anchor point, determining the progressive distance of the boundary point relative to the anchor point according to the position of the anchor point;
and weighting the angle of the camera view by taking the ratio of the progressive distance of the boundary point to the maximum progressive distance of the boundary points on the top surface as a weighting coefficient, to obtain the torsion angle of the boundary point.
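The torsion-angle weighting can be sketched as follows; the function name is an assumption for illustration:

```python
def torsion_angle(progressive_distance, max_progressive_distance, camera_angle):
    """Torsion angle for a boundary point in front of the anchor point:
    the camera view angle weighted by the ratio of the point's progressive
    distance to the maximum progressive distance on the top surface."""
    return (progressive_distance / max_progressive_distance) * camera_angle
```

Points near the anchor point are thus twisted only slightly, while the farthest boundary point is twisted by the full camera view angle.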
In one embodiment, the generating apparatus 1300 may further include a fourth generating module for:
generating, for each boundary point on the top surface, a normal vector of the boundary point toward the direction opposite the camera view angle;
generating boundary points of the bottom surface of the stereoscopic guiding arrow on the normal vectors, wherein the distance from each boundary point of the bottom surface to the corresponding boundary point of the top surface is equal to the height of the stereoscopic guiding arrow recorded in the style file.
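The bottom-surface construction can be sketched as offsetting each top-surface boundary point opposite the camera view direction; the function and parameter names are illustrative assumptions:

```python
import math

def bottom_boundary_points(top_points, view_dir, arrow_height):
    """Offset each top-surface boundary point along the direction opposite
    the (normalized) camera view direction by the arrow height recorded
    in the style file, giving the bottom-surface boundary points."""
    vx, vy, vz = view_dir
    norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    ux, uy, uz = -vx / norm, -vy / norm, -vz / norm  # opposite the view direction
    return [(x + ux * arrow_height, y + uy * arrow_height, z + uz * arrow_height)
            for (x, y, z) in top_points]
```

With a camera looking along +z and an arrow height of 2, a top-surface point at the origin yields a bottom-surface point at (0, 0, −2).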
In one embodiment, the first obtaining module 1301 is configured to:
determining a guiding action of the road ahead according to the current positioning position and the navigation path, and acquiring the style file of the stereoscopic guiding arrow corresponding to the guiding action.
In one embodiment, the generating apparatus 1300 may further include a data cleansing module for:
Traversing the shape points, and determining the distance between adjacent shape points;
for any shape point, deleting the shape point in response to the distance between the shape point and the previous shape point or the next shape point being smaller than a preset distance;
and/or
Traversing the shape points, and determining connecting lines of adjacent shape points and a second included angle between two adjacent connecting lines;
determining that the shape points on the two adjacent connecting lines are collinear in response to the second included angle being smaller than a second preset threshold value;
and retaining the first and last of the collinear shape points while deleting the shape points between them.
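The two cleansing passes above can be sketched as follows; the function name, the distance threshold, and the collinearity threshold are illustrative assumptions:

```python
import math

def clean_shape_points(points, min_dist=1e-3, angle_thresh_deg=1.0):
    """Pass 1: drop shape points closer than min_dist to their predecessor.
    Pass 2: drop interior points of (nearly) collinear runs, keeping only
    the first and last point of each run."""
    kept = [points[0]]
    for p in points[1:]:
        if math.dist(kept[-1], p) >= min_dist:
            kept.append(p)
    out = [kept[0]]
    for prev, cur, nxt in zip(kept, kept[1:], kept[2:]):
        v1 = (cur[0] - prev[0], cur[1] - prev[1])
        v2 = (nxt[0] - cur[0], nxt[1] - cur[1])
        # Turning angle between the two adjacent connecting lines
        ang = math.atan2(v1[0] * v2[1] - v1[1] * v2[0],
                         v1[0] * v2[0] + v1[1] * v2[1])
        if math.degrees(abs(ang)) >= angle_thresh_deg:
            out.append(cur)
    out.append(kept[-1])
    return out
```

A duplicate point and a midpoint on a straight segment are both removed, while the endpoints and any genuine corner survive.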
In one embodiment, the generating apparatus 1300 may further include a mapping module configured to:
determining the proportion of the progressive distance of the coordinate point relative to the anchor point to the maximum progressive distance of the coordinate point on the stereoscopic guiding arrow aiming at the coordinate point on the stereoscopic guiding arrow;
Weighting the heights of the texture images by taking the proportion as a weighting coefficient to obtain the ordinate of the textures corresponding to the coordinate points;
acquiring the abscissa of the texture corresponding to the coordinate point from a preset texture mapping configuration table;
and mapping the texture onto the coordinate point according to the ordinate and the abscissa.
In one embodiment of the generating apparatus 1300, the mapping module is further configured to:
in response to reaching the preset time interval, changing the values of the abscissa and the ordinate according to the preset change amplitude, and mapping the texture corresponding to the changed abscissa and ordinate onto the coordinate point.
The apparatus provided by the embodiment of the present disclosure may perform the method of any one of the above method embodiments; the implementation manner and beneficial effects are similar and are not repeated here.
The embodiments of the present disclosure also provide a terminal device including a memory and a processor, where the memory is a nonvolatile memory storing a computer program. When the computer program is executed by the processor, the method of any one of the above method embodiments can be implemented; the implementation manner and beneficial effects are similar and are not repeated here.
Fig. 14 is a schematic structural diagram of a terminal device in an embodiment of the present disclosure. Referring now to fig. 14, a schematic diagram of a terminal device 1400 suitable for implementing embodiments of the present disclosure is shown. The terminal device 1400 in embodiments of the present disclosure may include, but is not limited to, devices with navigation and processing capabilities such as mobile phones, tablet computers, in-vehicle devices, and the like. The terminal device shown in fig. 14 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 14, the terminal apparatus 1400 may include a processing device (e.g., a central processor, a graphics processor, etc.) 1401, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1402 or a program loaded from a storage device 1408 into a Random Access Memory (RAM) 1403. In the RAM 1403, various programs and data necessary for the operation of the terminal device 1400 are also stored. The processing device 1401, the ROM 1402, and the RAM 1403 are connected to each other through a bus 1404. An input/output (I/O) interface 1405 is also connected to the bus 1404.
In general, the following devices may be connected to the I/O interface 1405: input devices 1406 including, for example, a touch screen, a touch panel, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, and the like; output devices 1407 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, and the like; storage devices 1408 including, for example, a magnetic tape, a hard disk, and the like; and communication devices 1409. The communication devices 1409 may allow the terminal device 1400 to communicate wirelessly or by wire with other devices to exchange data. While fig. 14 illustrates a terminal device 1400 having various devices, it should be understood that not all illustrated devices are required to be implemented or provided; more or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 1409, or installed from the storage means 1408, or installed from the ROM 1402. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 1401.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of a computer-readable storage medium may include, but are not limited to, an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to electrical wiring, fiber optic cable, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the terminal device or may exist alone without being incorporated in the terminal device.
The computer readable medium carries one or more programs, and when the one or more programs are executed by a processing device, the processing device is caused to: acquire shape points included in a navigation path and a style file of a stereoscopic guiding arrow; generate boundary points of a top surface of the stereoscopic guiding arrow based on the shape points included in the navigation path and an arrow width value recorded in the style file of the stereoscopic guiding arrow; for a boundary point located in front of an anchor point, correct the view angle of the boundary point to face the camera view angle direction according to the distance between the boundary point and the anchor point and the current camera view angle; generate boundary points of a side surface of the stereoscopic guiding arrow based on the corrected boundary points of the top surface and the arrow height value recorded in the style file, wherein an included angle between the side surface and the top surface is equal to a preset included angle; and obtain the stereoscopic guiding arrow based on the boundary points of the top surface and the boundary points of the side surface.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The embodiments of the present disclosure further provide a computer readable storage medium, where a computer program is stored, where the computer program, when executed by a processor, may implement the method of any one of the embodiments of fig. 2 to 9, and the implementation manner and beneficial effects are similar, and are not repeated herein.
Embodiments of the present disclosure further provide a computer program product stored in a storage medium. When executed, the program product may implement the method of any one of the embodiments of fig. 2 to 9; the implementation manner and beneficial effects are similar and are not repeated here.
It should be noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely a specific embodiment of the disclosure to enable one skilled in the art to understand or practice the disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown and described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.