Disclosure of Invention
The present specification aims to provide a data processing method and device for map region merging that are simple and fast and meet the technical requirements of map region merging.
In one aspect, an embodiment of the present specification provides a data processing method for map region merging, including:
drawing a map image of an area to be merged by using a line with preset transparency to generate an initial merged map image;
acquiring the transparency of pixel points in the initial merged map image;
and taking the pixel points with the same transparency as the preset transparency as target pixel points, and generating a target merged map image according to the target pixel points.
Further, in another embodiment of the method, the drawing a map image of an area to be merged by using a line with a preset transparency to generate an initial merged map image includes:
acquiring the area to be merged selected by the user in a first mapping area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
Further, in another embodiment of the method, the generating a merged target merged map image according to the target pixel point includes:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
Further, in another embodiment of the method, after removing the pixel points other than the target pixel points in the initial merged map image from the initial merged map image, the method further includes:
and setting the color of the positions of the pixel points except the target pixel point to be the same as the color of the pixel points in the area to be merged in the initial merged map image.
Further, in another embodiment of the method, the generating a merged target merged map image according to the target pixel point includes:
and extracting the coordinate information corresponding to the target pixel point, and generating the target merged map image according to the set of the coordinate information corresponding to the target pixel point.
Further, in another embodiment of the method, the obtaining the transparency of the pixel point in the initial merged map image includes:
according to the coordinate information of the area to be merged, traversing pixel points in the area to be merged in the initial merged map image, and obtaining the transparency corresponding to the pixel points in the area to be merged in the initial merged map image.
In another aspect, the present specification provides a data processing apparatus for map area merging, comprising:
the initial merged image drawing module is used for drawing the map image of the area to be merged by using a line with preset transparency to generate an initial merged map image;
a transparency obtaining module, configured to obtain a transparency of a pixel point in the initial merged map image;
and the target merged map generation module is used for taking the pixel points with the same transparency as the preset transparency as target pixel points and generating a target merged map image according to the target pixel points.
Further, in another embodiment of the apparatus, the initial merged image rendering module is specifically configured to:
acquiring the area to be merged selected by the user in a first mapping area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
Further, in another embodiment of the apparatus, the target merged map generating module is specifically configured to:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
Further, in another embodiment of the apparatus, the target merged map generating module is further configured to:
and setting the color of the positions of the pixel points except the target pixel point to be the same as the color of the pixel points in the area to be merged in the initial merged map image.
Further, in another embodiment of the apparatus, the target merged map generating module is specifically configured to:
and extracting the coordinate information corresponding to the target pixel point, and generating the target merged map image according to the set of the coordinate information corresponding to the target pixel point.
Further, in another embodiment of the apparatus, the transparency obtaining module is specifically configured to:
according to the coordinate information of the area to be merged, traversing pixel points in the area to be merged in the initial merged map image, and obtaining the transparency corresponding to the pixel points in the area to be merged in the initial merged map image.
In still another aspect, the present specification provides a computer storage medium, on which a computer program is stored, and when the computer program is executed, the data processing method for map area merging described above is implemented.
In yet another aspect, the present specification provides a data processing system for map region merging, including at least one processor and a memory for storing processor-executable instructions, where the processor executes the instructions to implement the above data processing method for map region merging.
The map area merging data processing method, device, and system provided in this specification can detect whether the transparency of the pixel points changes by a dyeing technique on the canvas, screen out the pixel points on overlapped boundaries and the pixel points on non-overlapped boundaries based on the difference between the transparency of the pixel points in the overlapped part and the transparency of the pixel points in the non-overlapped part when the map areas are merged, and further generate the merged map image based on the pixel points on non-overlapped boundaries. The method is simple and rapid, does not need complex data processing, and can accurately detect the boundary overlapping part, so that the merging of the map areas is more accurate and rapid, the merged map areas can be used as a whole for interaction, subsequent use is convenient, and the applicability is wide.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only a part of the embodiments of the present specification, and not all of the embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present specification without any inventive step should fall within the scope of protection of the present specification.
With the development of computer network technology, people can learn information such as geographical positions around the world through an electronic map. Usually, a map or an electronic map is divided into areas by province and country, but in some cases, some designated areas may need to be merged on the map, for example: the three northeast provinces in the Chinese map are combined into the northeast region, so that the user can conveniently view and understand the northeast region as a whole.
In the data processing method for merging map areas provided in the embodiments of the present description, a uniform transparency is set, so that when map areas are merged, the transparency of the overlapped portions changes, and the designated areas are merged based on this change of transparency in the map images. Merging the designated map areas based on the change of the transparency of the pixel points is simple and quick, requires no complex mathematical calculation, allows the merged map areas to be used as a whole for interaction, and has strong applicability.
In the embodiment of the present application, the data processing process for merging map areas may be performed on a client, for example: an electronic device such as a smart phone, a tablet computer, or a smart wearable device (smart watch, virtual reality glasses, virtual reality helmet, etc.). The method may specifically be performed at the browser end of the client, such as a PC browser, a mobile browser, or a server-side web container.
Specifically, fig. 1 is a schematic flowchart of a data processing method for merging map areas in an embodiment provided in this specification, and as shown in fig. 1, the data processing method for merging map areas in the embodiment provided in this specification includes:
S2, drawing the map image of the area to be merged by using a line with a preset transparency, and generating an initial merged map image.
The area to be merged may include a plurality of map areas, for example: the three provinces of Liaoning, Jilin, and Heilongjiang in northeast China can represent three areas to be merged, and the map images of the areas to be merged can be drawn in the same canvas (a canvas can represent a component or area used for drawing graphics) or another map drawing area by using lines with a preset transparency. For example: fig. 2 is a schematic diagram of an initial merged map image of the northeast region in an embodiment of this specification. As shown in fig. 2, the map images of Liaoning, Jilin, and Heilongjiang provinces can be drawn in the same canvas according to their relative positions based on the longitude and latitude information of the three northeast provinces, and the map images of the three provinces jointly form the initial merged map image of the northeast region; it can be seen that boundary coincidence may occur between adjacent areas to be merged in the initial merged map image. As shown in fig. 2, when drawing the map images of the areas to be merged, only the contour lines, i.e., the boundary lines, of the areas to be merged may be drawn, and the area within the boundary lines may represent the area to be merged.
When the map image of the area to be merged is drawn, in one embodiment of this specification it is drawn with lines of a preset transparency. The preset transparency can be selected according to actual needs; in general it can be set between 0 and 1, and in one embodiment of this specification the preset transparency is 0.5, which facilitates the subsequent detection of the pixel points.
In addition, when drawing each map area to be merged, different map areas to be merged may be drawn with lines of the same color, or with lines of different colors. For example: when the map areas of the three northeast provinces are used as the areas to be merged, the map images of Liaoning, Jilin, and Heilongjiang provinces can all be drawn in the same canvas with black lines (or lines of another color such as red or blue) with a transparency of 0.5, and the map images of the three provinces as a whole are used as the initial merged image of the northeast region. Alternatively, a black line with a transparency of 0.5 can be used to draw the map image of Liaoning province, a red line with a transparency of 0.5 to draw the map image of Jilin province, and a blue line with a transparency of 0.5 to draw the map image of Heilongjiang province, and the map images of the three provinces as a whole are used as the initial merged image of the northeast region. That is, in one embodiment of the present specification, when drawing the map images of the respective areas in the area to be merged, the different areas use lines of the same transparency, but the line color is not specifically limited.
When the map image of the area to be merged is drawn, GeoJSON data can be imported, and the map is then drawn on the canvas by a program. GeoJSON is a format for encoding various geographic data structures and an organization format for map data; a map can be drawn by parsing the data.
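The following is a minimal, illustrative TypeScript sketch of this drawing step, assuming the browser Canvas 2D API and simple GeoJSON Polygon features; the names drawRegions and toCanvasXY and the linear longitude/latitude mapping are illustrative assumptions rather than part of this specification:

// Minimal sketch: draw the boundary lines of the regions to be merged on a
// canvas with a preset global alpha (e.g. 0.5). Assumes simple GeoJSON
// Polygon features; the projection from longitude/latitude to canvas pixels
// is reduced to a linear mapping for illustration only.
type LngLat = [number, number];

interface PolygonFeature {
  geometry: { type: "Polygon"; coordinates: LngLat[][] };
}

const PRESET_ALPHA = 0.5; // preset transparency used for every boundary line

function toCanvasXY([lng, lat]: LngLat, width: number, height: number): [number, number] {
  // Illustrative equirectangular mapping; a real implementation would use the
  // projection of the map framework in use.
  const x = ((lng + 180) / 360) * width;
  const y = ((90 - lat) / 180) * height;
  return [x, y];
}

function drawRegions(ctx: CanvasRenderingContext2D, features: PolygonFeature[]): void {
  const { width, height } = ctx.canvas;
  ctx.globalAlpha = PRESET_ALPHA; // every line is drawn with the preset transparency
  ctx.strokeStyle = "#000";       // line colour is not limited; black is used here
  ctx.lineWidth = 2;
  for (const feature of features) {
    for (const ring of feature.geometry.coordinates) {
      ctx.beginPath();
      ring.forEach((pt, i) => {
        const [x, y] = toCanvasXY(pt, width, height);
        if (i === 0) {
          ctx.moveTo(x, y);
        } else {
          ctx.lineTo(x, y);
        }
      });
      ctx.closePath();
      // Each region is stroked separately, so where two regions share a
      // boundary the strokes composite and the alpha there rises above 0.5.
      ctx.stroke();
    }
  }
}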
S4, obtaining the transparency of the pixel points in the initial merged map image.
After the initial merged map image is generated, the transparency of each pixel point in the initial merged map image can be obtained. In an embodiment of the present description, when drawing the map image of the area to be merged, the longitude and latitude information of the area to be merged may be converted into coordinate information. According to the coordinate information of the area to be merged, each pixel point in the area to be merged in the initial merged map image can be traversed, that is, the canvas pixel points corresponding to each piece of original data (data points containing longitude and latitude information or coordinate information) in the area to be merged can be traversed, and the transparency corresponding to these pixel points is obtained. The transparency of each pixel point can be obtained based on a dyeing technique, and the specific method is not particularly limited in the embodiment of the present application.
The transparency change of each pixel point is obtained by traversing the pixel points inside the region to be merged. The method is simple, reduces the detection of pixel points outside the region to be merged, and improves the data processing speed.
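By way of illustration only, the per-pixel transparency can be read back through the Canvas getImageData interface; the following TypeScript sketch assumes that regionPixels already holds the canvas coordinates derived from the coordinate information of the region to be merged (whether the "dyeing technique" mentioned above corresponds exactly to this interface is an assumption made here):

// Minimal sketch: read back the alpha of each pixel inside the region to be
// merged via getImageData. regionPixels is an illustrative list of canvas
// coordinates derived from the region's coordinate information, so only the
// pixels of interest are inspected rather than the whole canvas.
function getPixelAlphas(
  ctx: CanvasRenderingContext2D,
  regionPixels: Array<{ x: number; y: number }>
): Array<{ x: number; y: number; alpha: number }> {
  const { width, height } = ctx.canvas;
  const { data } = ctx.getImageData(0, 0, width, height); // RGBA bytes, 4 per pixel
  return regionPixels.map(({ x, y }) => {
    const alphaByte = data[(y * width + x) * 4 + 3]; // 4th channel is alpha (0..255)
    return { x, y, alpha: alphaByte / 255 };          // normalise to 0..1
  });
}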
S6, taking the pixel points with the transparency equal to the preset transparency as target pixel points, and generating a target merged map image according to the target pixel points.
After the transparency of the pixel points in the initial merged map image is obtained, the transparency of the pixel points can be compared with the preset transparency of the lines used when drawing the map image of the area to be merged, and the pixel points with the same transparency as the preset transparency are used as target pixel points. For example: when the preset transparency used in drawing the map image of the area to be merged is 0.5, the pixel points with a transparency of 0.5 in the initial merged map image can be used as the target pixel points. The target merged map image can then be generated from the target pixel points, completing the merging of the map areas.
In an embodiment of the present specification, the coordinate information of the target pixel points may be extracted and exported, and the set of the coordinate information corresponding to the target pixel points is used to generate the target merged map image. For example: if the preset transparency used when drawing the map image of the area to be merged is 0.5, the pixel points with a transparency of 0.5 in the initial merged map image can be used as target pixel points, their coordinate information can be extracted and stored in a coordinate point set, and the coordinate point set formed by the coordinate information of all the target pixel points can be exported. The target merged map image is drawn according to the coordinate set of the target pixel points; it can be composed of the boundary images of the areas to be merged, and the generated boundary images do not include the overlapped parts of the boundaries of the areas to be merged.
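A minimal TypeScript sketch of this selection and export step follows; the tolerance EPSILON is an added assumption, because the canvas stores alpha as an 8-bit value, so a drawn transparency of 0.5 may be quantised to 127/255 or 128/255:

// Minimal sketch: keep only the pixels whose alpha matches the preset
// transparency (within one quantisation step) and export their coordinates
// as the merged boundary set.
const PRESET_ALPHA = 0.5;
const EPSILON = 1 / 255; // one 8-bit quantisation step

function collectTargetPixels(
  pixels: Array<{ x: number; y: number; alpha: number }>
): Array<{ x: number; y: number }> {
  return pixels
    .filter(p => Math.abs(p.alpha - PRESET_ALPHA) <= EPSILON) // non-overlapped boundary
    .map(({ x, y }) => ({ x, y }));
}

// Illustrative usage, reusing getPixelAlphas from the earlier sketch:
// const merged = collectTargetPixels(getPixelAlphas(ctx, regionPixels));
// console.log(JSON.stringify(merged)); // exported coordinate point set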
For example: in an embodiment of the present description, the map image of the area to be merged may be drawn in a canvas or map drawing area using a black line with a transparency of 0.5. The map image of the area to be merged may include the boundary image of the area to be merged, and these map images constitute the initial merged map image; reference may be made to the schematic diagram of the initial merged map image of the northeast region in fig. 2. The pixel points in the region to be merged in the initial merged map image are traversed: the transparency of the pixel points on the non-overlapped parts of the boundaries of the regions to be merged is 0.5, the transparency of the pixel points on the overlapped parts of the boundaries is usually greater than 0.5, and the transparency of the other pixel points is 0 because no image content is drawn in the other regions of the boundary image. The pixel points with a transparency of 0.5 can be used as target pixel points, that is, the pixel points of the non-overlapped parts of the boundary image. Together, the target pixel points represent the merged boundary image of the regions to be merged; this merged boundary image does not include the overlapped parts of the regions to be merged and can represent the overall boundary contour of the regions to be merged. The coordinate information of the target pixel points can be extracted, stored, and exported to generate the target merged map image. Fig. 3 is a schematic diagram of the target merged map image of the merged northeast region in an embodiment of this description. As shown in fig. 3, the boundary overlapping portions of the regions to be merged are removed from the target merged map image and only the non-overlapping boundary portions are retained, which visually represents the merging effect of the map regions and is convenient for the user to view.
Fig. 4(a)-4(b) are schematic diagrams illustrating transparency change detection in an embodiment of the present specification. As shown in fig. 4(a), two images with a transparency of 0.5 partially overlap, and it can be seen that the transparency value of the overlapped part in the middle is greater than that of the non-overlapped parts. Similarly, as shown in fig. 4(b), two borderless images with a transparency of 0.5 partially overlap, and again the value of the overlapped part in the middle is greater than that of the non-overlapped parts. In the embodiment of the present specification, by detecting the change of the transparency of the pixel points, it can be accurately and quickly determined which parts of the region to be merged overlap and which do not, so as to quickly and accurately generate the merged map image.
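The reason the overlapped parts show a higher value can be sketched with the standard source-over alpha compositing rule, assuming this is the compositing mode in effect when the boundary lines are drawn:

\alpha_{\mathrm{out}} = \alpha_{\mathrm{src}} + \alpha_{\mathrm{dst}}\,(1 - \alpha_{\mathrm{src}}) = 0.5 + 0.5 \times (1 - 0.5) = 0.75 > 0.5

Under this assumption, pixels where two boundary lines with the preset transparency of 0.5 overlap end up with a value of about 0.75, while pixels on a single, non-overlapped boundary keep exactly the preset value of 0.5, which is the difference that step S6 relies on.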
In addition, the embodiment of the present application may also name the target merged map image according to the geographic location of the area to be merged, for example: the merged three northeast provinces are named "northeast region" in fig. 3.
The map region merging data processing method provided by the present specification can detect whether the transparency of the pixel points changes through a canvas dyeing technique, screen out the pixel points on overlapped boundaries and the pixel points on non-overlapped boundaries based on the fact that the transparency of the pixel points in the overlapped part differs from that of the pixel points in the non-overlapped part when the map areas are merged, and further generate the merged map image based on the pixel points on non-overlapped boundaries. The method is simple and rapid, does not need complex data processing, and can accurately detect the boundary overlapping part, so that the merging of the map areas is more accurate and rapid, the merged map areas can be used as a whole for interaction, subsequent use is convenient, and the applicability is wide.
On the basis of the foregoing embodiment, in an embodiment of the present specification, the drawing a map image of an area to be merged by using a line with a preset transparency to generate an initial merged map image may include:
acquiring the area to be merged selected by the user in a first mapping area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
Specifically, when a user views a map in the first map drawing area (for example, in a canvas of a client) and some of the map areas need to be merged, for example the map areas of the three northeast provinces, the user may select the areas to be merged by clicking or other operations. For example: the user draws a complete map (such as a map of China) by importing GeoJSON data in the first map drawing area, and selects the areas to be merged by clicking Liaoning, Jilin, and Heilongjiang provinces in the drawn map of China. After the areas to be merged selected by the user are identified, the map image of the areas to be merged may be drawn in the second map drawing area (which may be a hidden canvas) using a line with the preset transparency, so as to generate the initial merged map image, as shown in the sketch below. The initial merged map image is generated based on the selection of the user, so the user can select the areas to be merged according to actual requirements; the method is simple and flexible, and the user experience is improved.
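A minimal TypeScript sketch of this two-canvas flow, assuming the drawRegions function and PolygonFeature type from the earlier drawing sketch; the hidden canvas created here stands in for the second map drawing area:

// Minimal sketch: on user selection in the visible (first) map drawing area,
// render the selected regions into a hidden (second) canvas with the preset
// transparency. "selected" is an illustrative list of the GeoJSON features
// the user clicked.
function buildInitialMergedImage(
  selected: PolygonFeature[],
  width: number,
  height: number
): CanvasRenderingContext2D {
  const hidden = document.createElement("canvas"); // second map drawing area, not attached to the DOM
  hidden.width = width;
  hidden.height = height;
  const ctx = hidden.getContext("2d");
  if (!ctx) throw new Error("2D context not available");
  drawRegions(ctx, selected); // the initial merged map image now lives in the hidden canvas
  return ctx;
}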
On the basis of the foregoing embodiment, in an embodiment of the present specification, the generating a merged target merged map image according to the target pixel point may include:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
Specifically, when the target pixel points are determined and the target merged map image is generated based on them, the non-target pixel points (i.e., pixel points other than the target pixel points) in the initial merged map image can be eliminated from the initial merged map image. At this time, only the target pixel points are left in the initial merged map image, and the remaining target pixel points together form the target merged map image. For example: in the above embodiment, the non-target pixel points are removed from the second map drawing area and the target pixel points are retained, so the image formed by the remaining target pixel points in the second map drawing area can represent the target merged map image.
The non-target pixel points whose transparency does not meet the requirement are removed from the initial merged map image, and the remaining target pixel points directly form the target merged map image.
After the non-target pixel points are eliminated, the color of the positions of the non-target pixel points can be set to be the same as the color of the area inside the boundary (i.e., the interior of the area to be merged) in the initial merged map image. In this way, the display effect of the merged map image is not harmed by the color at those positions differing from the color of the rest of the interior of the area to be merged. For example: if, when the map image of the area to be merged is drawn to generate the initial merged map image, the interior of the boundary of the area to be merged is filled with red pixel points, then after the non-target pixel points are eliminated, the color at their positions may become white or colorless and differ from the color of the other pixel points inside the boundary, which affects the display effect of the merged map image. If, after the non-target pixel points are removed, the color at their positions is set to red, consistent with the color of the interior of the area to be merged, the display effect of the merged map image is improved. If different colors are used to fill the interiors of the individual region boundaries in the area to be merged, i.e., multiple colors exist at non-boundary positions in the initial merged map image, the color of any pixel point adjacent to a removed non-target pixel point can be used as the color of that position.
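The elimination and recolouring of the non-target pixel points can be sketched as follows in TypeScript; fillColor is an illustrative RGBA value standing in for the interior colour of the area to be merged (red in the example above):

// Minimal sketch: erase the non-target pixels (e.g. the overlapped boundary
// segments) from the initial merged image and repaint their positions with
// the interior fill colour, so the removal does not leave visible gaps.
function eraseNonTargetPixels(
  ctx: CanvasRenderingContext2D,
  nonTargetPixels: Array<{ x: number; y: number }>,
  fillColor: [number, number, number, number] = [255, 0, 0, 255] // e.g. red interior
): void {
  const { width, height } = ctx.canvas;
  const image = ctx.getImageData(0, 0, width, height);
  for (const { x, y } of nonTargetPixels) {
    const i = (y * width + x) * 4;
    image.data[i] = fillColor[0];     // R
    image.data[i + 1] = fillColor[1]; // G
    image.data[i + 2] = fillColor[2]; // B
    image.data[i + 3] = fillColor[3]; // A
  }
  ctx.putImageData(image, 0, 0); // the remaining target pixels form the merged boundary
}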
In the data processing method for merging map areas provided by the present specification, a uniform transparency is set, so that when map areas are merged, the transparency of the overlapped part changes, and the designated areas are merged based on this change of transparency in the map images. Merging the designated map areas based on the change of the transparency of the pixel points is simple and quick, requires no complex mathematical calculation, allows the merged map areas to be used as a whole for interaction, and has strong applicability.
In the present specification, the embodiments of the method are described in a progressive manner; the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. For relevant details, reference is made to the description of the method embodiments.
Based on the map area merging data processing method, one or more embodiments of the present specification further provide a map area merging data processing apparatus. The apparatus may include systems (including distributed systems), software (applications), modules, components, servers, clients, etc. that use the methods described in the embodiments of the present specification in conjunction with any necessary apparatus to implement the hardware. Based on the same innovative conception, embodiments of the present specification provide an apparatus as described in the following embodiments. Since the implementation scheme of the apparatus for solving the problem is similar to that of the method, the specific implementation of the apparatus in the embodiment of the present specification may refer to the implementation of the foregoing method, and repeated details are not repeated. As used hereinafter, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Specifically, fig. 5 is a schematic block diagram of an embodiment of a data processing apparatus for merging map areas provided in this specification, and as shown in fig. 5, the data processing apparatus for merging map areas provided in this specification includes: an initial merged image drawing module 51, a transparency obtaining module 52, and a target merged map generating module 53, wherein:
the initial merged image drawing module 51 may be configured to draw a map image of an area to be merged using a line with a preset transparency, and generate an initial merged map image;
a transparency obtaining module 52, configured to obtain a transparency of a pixel point in the initial merged map image;
the target merged map generating module 53 may be configured to use the pixel points with the transparency equal to the preset transparency as target pixel points, and generate a merged target merged map image according to the target pixel points.
The data processing apparatus for map area merging provided in the embodiments of the present specification may detect whether the transparency of the pixel points changes through a canvas dyeing technique, screen out the pixel points on overlapped boundaries and the pixel points on non-overlapped boundaries based on the fact that the transparency of the pixel points in the overlapped portion differs from that of the pixel points in the non-overlapped portion when the map areas are merged, and further generate the merged map image based on the pixel points on non-overlapped boundaries. The apparatus is simple and rapid, does not need complex data processing, and can accurately detect the boundary overlapping part, so that the merging of the map areas is more accurate and rapid, the merged map areas can be used as a whole for interaction, subsequent use is convenient, and the applicability is wide.
On the basis of the foregoing embodiment, the initial merged image drawing module is specifically configured to:
acquiring the area to be merged selected by the user in a first mapping area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
On the basis of the foregoing embodiment, the target merged map generating module is specifically configured to:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
In the embodiment of the specification, the non-target pixel points whose transparency does not meet the requirement are removed from the initial merged map image, and the remaining target pixel points directly form the target merged map image.
On the basis of the foregoing embodiment, the target merged map generating module is further configured to:
and setting the color of the positions of the pixel points except the target pixel point to be the same as the color of the pixel points in the area to be merged in the initial merged map image.
In the embodiment of the specification, after the non-target pixel points are removed, the color at their positions is set to be consistent with the color of the area inside the boundary of the area to be merged (for example, red), so that the display effect of the merged map image is improved.
On the basis of the foregoing embodiment, the target merged map generating module is specifically configured to:
and extracting the coordinate information corresponding to the target pixel point, and generating the target merged map image according to the set of the coordinate information corresponding to the target pixel point.
According to the embodiment of the specification, the target merged map image is generated based on the set of coordinate information of the target pixel points. The method is fast, requires no complex data processing, and can accurately detect the boundary overlapping part, so that the merging of the map areas is more accurate and rapid, the merged map areas can be used as a whole for interaction, subsequent use is convenient, and the applicability is wide.
On the basis of the foregoing embodiment, the transparency obtaining module is specifically configured to:
according to the coordinate information of the area to be merged, traversing pixel points in the area to be merged in the initial merged map image, and obtaining the transparency corresponding to the pixel points in the area to be merged in the initial merged map image.
In the embodiment of the specification, the transparency change of each pixel point is obtained by traversing the pixel points inside the region to be merged. The method is simple, reduces the detection of pixel points outside the region to be merged, and improves the data processing speed.
It should be noted that the above-described apparatus may also include other embodiments according to the description of the method embodiment. The specific implementation manner may refer to the description of the related method embodiment, and is not described in detail herein.
In an embodiment of the present specification, there may also be provided a computer storage medium having a computer program stored thereon, where the computer program, when executed, implements the data processing method for map area merging in the above-mentioned embodiments, for example, the following method may be implemented:
drawing a map image of an area to be merged by using a line with preset transparency to generate an initial merged map image;
acquiring the transparency of pixel points in the initial merged map image;
and taking the pixel points with the same transparency as the preset transparency as target pixel points, and generating a target merged map image according to the target pixel points.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The method or apparatus provided by the present specification and described in the foregoing embodiments may implement service logic through a computer program and record the service logic on a storage medium, where the storage medium may be read and executed by a computer, so as to implement the effect of the solution described in the embodiments of the present specification.
The map region merging data processing method or apparatus provided in the embodiments of the present specification may be implemented in a computer by a processor executing corresponding program instructions, for example, implemented on a PC using the C++ language on a Windows operating system, implemented in a Linux system, implemented in an intelligent terminal using the Android or iOS programming languages, or implemented in processing logic based on a quantum computer, and the like. In an embodiment of a map region merging data processing system provided in this specification, fig. 6 is a schematic block diagram of an embodiment of the map region merging data processing system provided in this specification. As shown in fig. 6, the map region merging data processing system provided in this specification may include a processor 61 and a memory 62 for storing processor-executable instructions,
the processor 61 and the memory 62 communicate with each other via a bus 63;
the processor 61 is configured to call the program instructions in the memory 62 to execute the data processing method for map area merging provided in the above embodiments, including: drawing a map image of an area to be merged by using a line with preset transparency to generate an initial merged map image; acquiring the transparency of pixel points in the initial merged map image; and taking the pixel points with the same transparency as the preset transparency as target pixel points, and generating a target merged map image according to the target pixel points.
It should be noted that descriptions of the apparatus, the computer storage medium, and the system described above according to the related method embodiments may also include other embodiments, and specific implementations may refer to the descriptions of the method embodiments and are not described in detail herein.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the hardware + program class embodiment, since it is substantially similar to the method embodiment, the description is simple, and the relevant points can be referred to the partial description of the method embodiment.
The embodiments of this specification are not limited to what must be in compliance with industry communication standards, standard computer data processing and data storage rules, or the description of one or more embodiments of this specification. Certain industry standards, or implementations modified slightly from those described using custom modes or examples, may also achieve the same, equivalent, or similar, or other, contemplated implementations of the above-described examples. The embodiments using the modified or transformed data acquisition, storage, judgment, processing and the like can still fall within the scope of the alternative embodiments of the embodiments in this specification.
In the 1990s, improvements in a technology could be clearly distinguished as improvements in hardware (for example, improvements in circuit structures such as diodes, transistors, and switches) or improvements in software (improvements in method flow). However, as technology advances, many of today's method flow improvements can be regarded as direct improvements in hardware circuit structure. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement in a method flow cannot be realized by hardware entity modules. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a PLD by programming, without requiring the chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, nowadays, instead of manually making integrated circuit chips, this kind of programming is mostly implemented by "logic compiler" software, which is similar to the software compiler used in program development, while the original code to be compiled must be written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used at present. It will also be apparent to those skilled in the art that a hardware circuit implementing the logical method flow can easily be obtained simply by briefly programming the method flow into an integrated circuit using the above hardware description languages.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor and a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller; examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely as computer-readable program code, the same functionality can be implemented by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may thus be regarded as a hardware component, and the means included in it for performing various functions may also be regarded as structures within the hardware component. Or even the means for performing the functions may be regarded as both software modules for performing the method and structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a vehicle-mounted human-computer interaction device, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
Although one or more embodiments of the present description provide method operational steps as described in the embodiments or flowcharts, more or fewer operational steps may be included based on conventional or non-inventive approaches. The order of steps recited in the embodiments is merely one manner of performing the steps in a multitude of orders and does not represent the only order of execution. When an actual apparatus or end product executes, it may execute sequentially or in parallel (e.g., parallel processors or multi-threaded environments, or even distributed data processing environments) according to the method shown in the embodiment or the figures. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the presence of additional identical or equivalent elements in a process, method, article, or apparatus that comprises the recited elements is not excluded. The terms first, second, etc. are used to denote names, but not any particular order.
For convenience of description, the above devices are described as being divided into various modules by functions, and are described separately. Of course, when implementing one or more of the present description, the functions of each module may be implemented in one or more software and/or hardware, or a module implementing the same function may be implemented by a combination of multiple sub-modules or sub-units, etc. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, graphene storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.
As will be appreciated by one skilled in the art, one or more embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, one or more embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, one or more embodiments of the present description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
One or more embodiments of the present description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. One or more embodiments of the present specification can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment. In the description of the specification, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the specification. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
The above description is merely exemplary of one or more embodiments of the present disclosure and is not intended to limit the scope of one or more embodiments of the present disclosure. Various modifications and alterations to one or more embodiments described herein will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement or the like made within the spirit and principle of the present specification should be included in the scope of the claims.