
CN109192054B - Data processing method and device for map region merging - Google Patents


Info

Publication number
CN109192054B
Authority
CN
China
Prior art keywords
merged
map image
pixel points
area
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810839748.9A
Other languages
Chinese (zh)
Other versions
CN109192054A (en)
Inventor
董晓庆 (Dong Xiaoqing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Advanced New Technologies Co Ltd
Advantageous New Technologies Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN201810839748.9A (patent CN109192054B)
Publication of CN109192054A
Priority to TW108117080A (patent TWI698841B)
Priority to PCT/CN2019/091262 (WO2020019899A1)
Application granted
Publication of CN109192054B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 29/00: Maps; Plans; Charts; Diagrams, e.g. route diagrams
    • G09B 29/003: Maps
    • G09B 29/005: Map projections or methods associated specifically therewith

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

This specification provides a data processing method and device for map region merging. The method comprises the following steps: drawing the map images of the areas to be merged with lines of a preset transparency to generate an initial merged map image; acquiring the transparency of the pixel points in the initial merged map image; and taking the pixel points whose transparency equals the preset transparency as target pixel points, and generating the merged target map image from those target pixel points. With the embodiments in this specification, map areas are merged flexibly; the method is simple and fast, requires no complex data processing, and accurately detects overlapping boundary parts, so map areas are merged more accurately and quickly, the merged area can be interacted with as a whole, subsequent use is convenient, and applicability is wide.

Description

Data processing method and device for map region merging
Technical Field
The present disclosure relates to the field of map data processing technologies, and in particular, to a data processing method and apparatus for map region merging.
Background
With the development of computer technology, electronic maps have brought great convenience to people's lives. When using an electronic map, the need to merge multiple regions into one often arises, for example: combining China's three northeastern provinces into a single Northeast region, combining Zhejiang, Shanghai and Suzhou into an East China (Huadong) region, or combining several countries into a Middle East (Zhongdong) region.
In the prior art, when areas in a map are merged, the coincident boundary lines among the areas are generally computed mathematically; this data processing is complex, inflexible, and of poor applicability. A convenient and fast scheme for map region merging is therefore needed.
Disclosure of Invention
This specification aims to provide a data processing method and device for map region merging that are simple and fast and meet the technical requirements of map region merging.
In one aspect, an embodiment of the present specification provides a data processing method for map region merging, including:
drawing a map image of an area to be merged by using a line with preset transparency to generate an initial merged map image;
acquiring the transparency of pixel points in the initial merged map image;
and taking the pixel points with the same transparency as the preset transparency as target pixel points, and generating a merged target merged map image according to the target pixel points.
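The three claimed steps can be sketched end to end as follows. This is an illustrative simulation, not the patent's implementation: a 2-D alpha grid stands in for the canvas, the source-over compositing rule (out = src + dst * (1 - src)) that a canvas applies when a line is drawn over existing content is implemented directly, and all names are hypothetical.

```python
PRESET = 0.5  # the preset transparency used for every boundary line

def draw_segment(grid, pixels, alpha=PRESET):
    """Step 1: draw part of a region boundary with the preset transparency,
    compositing with the source-over rule out = src + dst * (1 - src)."""
    for x, y in pixels:
        grid[y][x] = alpha + grid[y][x] * (1 - alpha)

def merge(grid, preset=PRESET, eps=1e-6):
    """Steps 2 and 3: read every pixel's transparency and keep those that
    still equal the preset value, i.e. the non-overlapping boundary pixels."""
    return {(x, y)
            for y, row in enumerate(grid)
            for x, a in enumerate(row)
            if abs(a - preset) < eps}

grid = [[0.0] * 4]                      # a one-row stand-in for the canvas
draw_segment(grid, [(0, 0), (1, 0)])    # boundary of the first region
draw_segment(grid, [(1, 0), (2, 0)])    # second region; (1, 0) coincides
print(sorted(merge(grid)))              # [(0, 0), (2, 0)]: overlap removed
```

The pixel drawn twice ends up at alpha 0.5 + 0.5 * 0.5 = 0.75 and is excluded, which is exactly the filtering the third step describes.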
Further, in another embodiment of the method, the drawing a map image of an area to be merged by using a line with a preset transparency to generate an initial merged map image includes:
acquiring the areas to be merged selected by the user in a first map drawing area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
Further, in another embodiment of the method, the generating a merged target merged map image according to the target pixel point includes:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
Further, in another embodiment of the method, after removing the pixels other than the target pixel in the initial merged map image from the initial merged map image, the method includes:
and setting the color of the positions of the pixel points except the target pixel point to be the same as the color of the pixel points in the area to be merged in the initial merged map image.
Further, in another embodiment of the method, the generating a merged target merged map image according to the target pixel point includes:
and extracting the coordinate information corresponding to the target pixel point, and generating the target merged map image according to the set of the coordinate information corresponding to the target pixel point.
Further, in another embodiment of the method, the obtaining the transparency of the pixel point in the initial merged map image includes:
according to the coordinate information of the area to be merged, traversing pixel points in the area to be merged in the initial merged map image, and obtaining the transparency corresponding to the pixel points in the area to be merged in the initial merged map image.
In another aspect, the present specification provides a data processing apparatus for map area merging, comprising:
the initial merged image drawing module is used for drawing the map image of the area to be merged by using a line with preset transparency to generate an initial merged map image;
a transparency obtaining module, configured to obtain a transparency of a pixel point in the initial merged map image;
and the target merged map generation module is used for taking the pixel points with the same transparency as the preset transparency as target pixel points and generating a merged target merged map image according to the target pixel points.
Further, in another embodiment of the apparatus, the initial merged image rendering module is specifically configured to:
acquiring the areas to be merged selected by the user in a first map drawing area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
Further, in another embodiment of the apparatus, the target merged map generating module is specifically configured to:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
Further, in another embodiment of the apparatus, the target merged map generating module is further configured to:
and setting the color of the positions of the pixel points except the target pixel point to be the same as the color of the pixel points in the area to be merged in the initial merged map image.
Further, in another embodiment of the apparatus, the target merged map generating module is specifically configured to:
and extracting the coordinate information corresponding to the target pixel point, and generating the target merged map image according to the set of the coordinate information corresponding to the target pixel point.
Further, in another embodiment of the apparatus, the transparency obtaining module is specifically configured to:
according to the coordinate information of the area to be merged, traversing pixel points in the area to be merged in the initial merged map image, and obtaining the transparency corresponding to the pixel points in the area to be merged in the initial merged map image.
In still another aspect, this specification provides a computer storage medium on which a computer program is stored; when executed, the program implements the data processing method for map region merging described above.
In yet another aspect, the present specification provides a data processing system for map region merging, including at least one processor and a memory for storing processor-executable instructions, where the processor executes the instructions to implement the above data processing method for map region merging.
The map region merging data processing method, device, and system provided in this specification detect whether the transparency of pixel points has changed via a canvas coloring technique: because the transparency of pixel points on overlapping boundary parts differs from that on non-overlapping parts when map regions are drawn together, the overlapping and non-overlapping boundary pixel points can be separated, and the merged map image is then generated from the non-overlapping boundary pixel points. The method is simple and fast, requires no complex data processing, and accurately detects overlapping boundary parts, so map regions are merged more accurately and quickly; the merged region can be interacted with as a whole, subsequent use is convenient, and applicability is wide.
Drawings
In order to more clearly illustrate the embodiments of the present specification or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments described in the present specification, and that those skilled in the art can obtain other drawings from them without any creative effort.
FIG. 1 is a flow diagram of a data processing method for map region merging in one embodiment provided by the present specification;
FIG. 2 is a schematic illustration of an initial merged map image of the northeast region in one embodiment of the present description;
FIG. 3 is a schematic diagram of a merged map image of objects in the northeast region after merging in one embodiment of the present disclosure;
FIGS. 4(a) -4(b) are schematic diagrams of transparency change detection in one embodiment of the present disclosure;
FIG. 5 is a block diagram of an embodiment of a data processing apparatus for map region merging provided in the present specification;
FIG. 6 is a block diagram of an embodiment of a data processing system for map region merging provided in the present specification.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only a part of the embodiments of the present specification, and not all of the embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present specification without any inventive step should fall within the scope of protection of the present specification.
With the development of computer network technology, people can view geographic information from around the world through electronic maps. A map or electronic map is usually divided into regions by province and country, but in some cases certain designated areas may need to be merged on the map, for example merging the three northeastern provinces on a map of China into a single Northeast region, so that users can view and understand the Northeast as a whole.
In the data processing method for map region merging provided in the embodiments of this specification, a uniform transparency is set when drawing; when map regions are drawn together, the transparency of the overlapping parts changes, and the designated regions are merged based on this change in transparency in the map images. The method is simple and fast, requires no complex mathematical calculation, and the merged map region can be interacted with as a whole, giving it strong applicability.
In the embodiments of the present application, the data processing for map region merging may be performed on a client, for example electronic devices such as smartphones, tablet computers, and smart wearable devices (smart watches, virtual reality glasses, virtual reality helmets, etc.). It may specifically be performed in a browser environment of the client, such as a PC browser, a mobile browser, or a server-side web container.
Specifically, fig. 1 is a schematic flowchart of a data processing method for merging map areas in an embodiment provided in this specification, and as shown in fig. 1, the data processing method for merging map areas in the embodiment provided in this specification includes:
and S2, drawing the map image of the area to be merged by using a line with preset transparency, and generating an initial merged map image.
The areas to be merged may comprise multiple map regions. For example, the three northeastern provinces, Liaoning, Jilin and Heilongjiang, can be three areas to be merged, and their map images can be drawn with lines of a preset transparency in the same canvas (a canvas here denotes a component or region used for drawing graphics) or in another map drawing area. Fig. 2 is a schematic diagram of an initial merged map image of the Northeast region in an embodiment of this specification. As shown in fig. 2, the map images of Liaoning, Jilin and Heilongjiang can be drawn in the same canvas at their relative positions according to the longitude and latitude information of the three provinces, and together they constitute the initial merged map image of the Northeast region. It can be seen that boundary coincidence may occur between adjacent areas to be merged in the initial merged map image. Also as shown in fig. 2, when drawing the map images of the areas to be merged, only the contour lines, i.e. the boundary lines, of each area need be drawn; the region within the boundary lines represents the area to be merged.
In one embodiment of this specification, the map images of the areas to be merged are drawn with lines of a preset transparency. The preset transparency can be selected according to actual needs and is generally set between 0 and 1; in one embodiment it is 0.5, which facilitates the subsequent detection of pixel points.
In addition, when drawing the areas to be merged, the different areas may be drawn with lines of the same color or with lines of different colors. For example, when the three northeastern provinces are the areas to be merged, the map images of Liaoning, Jilin and Heilongjiang can all be drawn in the same canvas with black lines (or lines of another color, such as red or blue) at a transparency of 0.5, the three images together forming the initial merged image of the Northeast region. Alternatively, Liaoning can be drawn with a black line of transparency 0.5, Jilin with a red line of transparency 0.5, and Heilongjiang with a blue line of transparency 0.5, the three images again together forming the initial merged image. That is, in one embodiment of this specification, when drawing the map images of the individual areas to be merged, the different areas use lines of the same transparency, while the line color is not specifically limited.
When drawing the map images of the areas to be merged, GeoJSON data can be imported and the map then drawn on the canvas programmatically. GeoJSON is a format for encoding a variety of geographic data structures; it serves as an organizational format for map data, and a map can be drawn by parsing it.
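The drawing step in S2 can be sketched as follows. This is a hypothetical simulation rather than canvas code: two adjacent region outlines are stroked with the preset transparency 0.5 onto a blank alpha grid, applying the same source-over compositing (out = src + dst * (1 - src)) a canvas would. Where the two outlines share an edge, the alpha rises above 0.5.

```python
W, H = 12, 8

def blank_canvas(w=W, h=H):
    """Alpha channel only; 0.0 means nothing has been drawn there yet."""
    return [[0.0] * w for _ in range(h)]

def stroke_rect(canvas, x0, y0, x1, y1, alpha=0.5):
    """Composite a one-pixel rectangle outline; each stroke touches any
    given pixel at most once, as a single real canvas stroke does."""
    edge = {(x, y) for x in range(x0, x1 + 1) for y in (y0, y1)}
    edge |= {(x, y) for y in range(y0, y1 + 1) for x in (x0, x1)}
    for x, y in edge:
        canvas[y][x] = alpha + canvas[y][x] * (1 - alpha)

canvas = blank_canvas()
stroke_rect(canvas, 1, 1, 5, 6)    # first area to be merged
stroke_rect(canvas, 5, 1, 10, 6)   # second area; shares the x = 5 edge

print(canvas[3][1], canvas[3][5])  # 0.5 0.75: the shared edge stands out
```

The non-overlapping boundary pixels keep the preset transparency of 0.5, while the coincident edge, stroked once by each region, reaches 0.5 + 0.5 * 0.5 = 0.75, which is what the later detection steps rely on.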
And S4, obtaining the transparency of the pixel points in the initial merged map image.
After the initial merged map image is generated, the transparency of each pixel point in it can be obtained. In an embodiment of this specification, when the map images of the areas to be merged are drawn, the longitude and latitude information of the areas can be converted into coordinate information. According to this coordinate information, each pixel point within the areas to be merged in the initial merged map image can be traversed; that is, the canvas pixel points corresponding to each original data point (a data point carrying longitude and latitude or coordinate information) can be traversed, and the transparency corresponding to each such pixel point obtained. The transparency of each pixel point can be read via a coloring technique; the specific method is not limited in the embodiments of the present application.
Obtaining the transparency change of each pixel point by traversing only the pixel points within the areas to be merged is simple, reduces the detection of pixel points outside those areas, and improves the data processing speed.
And S6, taking the pixel points with the transparency equal to the preset transparency as target pixel points, and generating a merged target merged map image according to the target pixel points.
After the transparency of the pixel points in the initial merged map image is obtained, it can be compared with the preset transparency of the lines used to draw the areas to be merged, and the pixel points whose transparency equals the preset transparency taken as target pixel points. For example, when the preset transparency used in drawing is 0.5, the pixel points with a transparency of 0.5 in the initial merged map image can be taken as target pixel points. The merged target map image can then be generated from the target pixel points, completing the merging of the map areas.
In an embodiment of this specification, the coordinate information of the target pixel points may be extracted and exported, and the set of these coordinates used to generate the target merged map image. For example, if the preset transparency used when drawing the areas to be merged is 0.5, the pixel points with a transparency of 0.5 in the initial merged map image can be taken as target pixel points, their coordinate information extracted and stored in a coordinate point set, and the set of the coordinate information of all target pixel points exported. The target merged map image is then drawn from this coordinate set; it consists of the boundary images of the areas to be merged and excludes the overlapping portions of those boundaries.
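The coordinate-export step can be sketched as follows; the grid, names, and tolerance are illustrative assumptions, not the patent's code. The scan collects the coordinates whose transparency equals the preset 0.5, with a small tolerance to guard against 8-bit alpha quantisation on a real canvas.

```python
PRESET_ALPHA = 0.5

def target_coordinates(alpha_grid, preset=PRESET_ALPHA, eps=1 / 255):
    """Return the set of (x, y) coordinates of the target pixel points,
    i.e. pixels whose transparency matches the preset value."""
    return {(x, y)
            for y, row in enumerate(alpha_grid)
            for x, a in enumerate(row)
            if abs(a - preset) < eps}

grid = [
    [0.0, 0.5, 0.0],
    [0.5, 0.75, 0.5],   # the centre pixel lies on an overlapping boundary
    [0.0, 0.5, 0.0],
]
coords = target_coordinates(grid)
print(sorted(coords))   # [(0, 1), (1, 0), (1, 2), (2, 1)]
```

The overlapped centre pixel (alpha 0.75) and the undrawn corners (alpha 0) are both excluded, leaving exactly the non-overlapping boundary to export as the coordinate point set.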
For example, in an embodiment of this specification, the map images of the areas to be merged may be drawn in a canvas or map drawing area with a black line of transparency 0.5; these images may include the boundary images of the areas to be merged and together constitute the initial merged map image (see the schematic diagram of the initial merged map image of the Northeast region in fig. 2). Traversing the pixel points within the areas to be merged in the initial merged map image: the pixel points on non-overlapping parts of the boundaries have a transparency of 0.5, the pixel points on overlapping boundary parts usually have a transparency greater than 0.5, and the remaining pixel points inside the boundary images have a transparency of 0 because no image content was drawn there. The pixel points with a transparency of 0.5, i.e. the pixel points on the non-overlapping parts of the boundary images, can be taken as target pixel points. Combined together, they represent the merged boundary image of the areas to be merged; this merged boundary image no longer includes the overlapping portions and represents the overall boundary contour of the merged areas. The coordinate information of the target pixel points can be extracted, stored, and exported to generate the merged target map image. Fig. 3 is a schematic diagram of the target merged map image of the Northeast region after merging in an embodiment of this specification. As shown in fig. 3, the overlapping boundary portions of the areas to be merged may be removed from the merged target map image, retaining only the non-overlapping boundary, which visually conveys the merging effect and is convenient for the user to view.
Figs. 4(a) and 4(b) are schematic diagrams of transparency change detection in an embodiment of this specification. As shown in fig. 4(a), two images with a transparency of 0.5 partially overlap, and the transparency value of the overlapped middle part is greater than that of the non-overlapping parts (the overlap appears more opaque). Similarly, as shown in fig. 4(b), two borderless images with a transparency of 0.5 partially overlap, and again the transparency value of the overlapped middle part is greater than that of the non-overlapping parts. In the embodiments of this specification, by detecting the change in the transparency of the pixel points, the overlapping and non-overlapping parts of the areas to be merged can be identified accurately and quickly, so that the merged map image can be generated accurately and quickly.
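The effect shown in figs. 4(a) and 4(b) reduces to one formula, sketched here as an illustrative calculation (not canvas code): the source-over alpha compositing rule. A boundary pixel drawn once keeps the preset transparency, while a pixel where two region boundaries coincide is composited twice and becomes less transparent.

```python
def over(src_alpha, dst_alpha):
    """Source-over alpha compositing: out = src + dst * (1 - src)."""
    return src_alpha + dst_alpha * (1 - src_alpha)

once = over(0.5, 0.0)    # drawn a single time: stays at the preset 0.5
twice = over(0.5, once)  # drawn again by an adjacent region: becomes 0.75
print(once, twice)
```

Because 0.75 differs measurably from the preset 0.5, the overlap can be detected pixel by pixel without any geometric computation.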
In addition, in the embodiments of the present application, the merged target map image may be named after the geographic location of the merged areas; for example, the merged three northeastern provinces are labeled Northeast region in fig. 3.
The map region merging data processing method provided in this specification detects whether the transparency of pixel points has changed via a canvas coloring technique: because the transparency of pixel points on overlapping boundary parts differs from that on non-overlapping parts when map regions are drawn together, the overlapping and non-overlapping boundary pixel points can be separated, and the merged map image is then generated from the non-overlapping boundary pixel points. The method is simple and fast, requires no complex data processing, and accurately detects overlapping boundary parts, so map regions are merged more accurately and quickly; the merged region can be interacted with as a whole, subsequent use is convenient, and applicability is wide.
On the basis of the foregoing embodiment, in an embodiment of the present specification, the drawing a map image of an area to be merged by using a line with a preset transparency to generate an initial merged map image may include:
acquiring the areas to be merged selected by the user in a first map drawing area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
Specifically, when a user views a map in the first map drawing area (for example, a canvas on the client) and needs to merge some of its regions, such as the three northeastern provinces, the user may select the areas to be merged by clicking or other operations. For example, the user draws a complete map (such as a map of China) in the first map drawing area by importing GeoJSON data, and selects the areas to be merged by clicking Liaoning, Jilin and Heilongjiang in the drawn map. After the areas to be merged selected by the user are identified, their map images can be drawn in the second map drawing area (which may be a hidden canvas) with lines of the preset transparency, generating the initial merged map image. Because the initial merged map image is generated from the user's selection, users can merge whichever map regions they actually need; the method is simple and flexible and improves the user experience.
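The selection step can be sketched as follows; the feature names and structure are hypothetical, not taken from the patent. The user's clicks in the first drawing area reduce to a set of selected names, which filters the GeoJSON FeatureCollection; only the selected features would then be stroked into the hidden second drawing area.

```python
def pick_features(feature_collection, selected_names):
    """Keep only the GeoJSON features the user selected for merging."""
    return [f for f in feature_collection["features"]
            if f["properties"]["name"] in selected_names]

# A toy FeatureCollection standing in for imported map data; geometries
# are omitted since only the selection logic is being illustrated.
china = {
    "type": "FeatureCollection",
    "features": [
        {"properties": {"name": "Liaoning"}, "geometry": None},
        {"properties": {"name": "Jilin"}, "geometry": None},
        {"properties": {"name": "Zhejiang"}, "geometry": None},
        {"properties": {"name": "Heilongjiang"}, "geometry": None},
    ],
}

to_merge = pick_features(china, {"Liaoning", "Jilin", "Heilongjiang"})
print([f["properties"]["name"] for f in to_merge])
# ['Liaoning', 'Jilin', 'Heilongjiang']
```

Keeping the second drawing area hidden means the intermediate stroked image never flickers into view; only the final merged result is presented to the user.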
On the basis of the foregoing embodiment, in an embodiment of the present specification, the generating a merged target merged map image according to the target pixel point may include:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
Specifically, when the target pixel points are determined and the target merged map image is generated based on the target pixel points, non-target pixel points (i.e., pixel points other than the target pixel points) in the initial merged map image can be eliminated from the initial merged map image, at this time, only the target pixel points are left in the initial merged map image, and the left target pixel points can be merged to form the merged target merged map image. Such as: in the above embodiment, the non-target pixel points are removed from the second map drawing area, and the target pixel points are retained, so that the image formed by the remaining target pixel points in the second map drawing area can represent the merged target merged map image.
In this way, the non-target pixel points whose transparency does not meet the requirement are removed from the initial merged map image, and the remaining target pixel points directly form the merged target map image, which is simple and fast.
After the non-target pixel points are removed, the color at their positions can be set to the color of the pixel points in the interior of the boundary (i.e. the interior of the area to be merged) in the initial merged map image. This prevents the removed positions, whose color would otherwise differ from the surrounding interior of the merged area, from degrading the display of the merged map image. For example: if, when the initial merged map image is generated, the interior of each area to be merged is filled with red pixel points, then after the non-target pixel points are removed their positions may become white or colorless, differing from the other pixel points inside the boundary and affecting the display. Setting the removed positions to red, consistent with the interior of the area to be merged, improves the display of the merged map image. If different colors are used to fill the interiors of the different regions to be merged, i.e. multiple colors exist at non-boundary positions in the initial merged map image, the color of any pixel point adjacent to a removed non-target pixel point can be used as the color at that position.
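The recoloring step can be sketched as follows, under stated assumptions: color names stand in for RGBA bytes, and every name here is illustrative rather than the patent's implementation. Each removed (non-target) position takes the color of an adjacent remaining pixel, falling back to a uniform fill color when no neighbour is available.

```python
FILL = "red"  # assumed uniform interior fill colour

def repaint_removed(color_grid, removed, fill=FILL):
    """Overwrite removed positions with a neighbouring pixel's colour,
    or the uniform fill colour if every neighbour was also removed."""
    h, w = len(color_grid), len(color_grid[0])
    for x, y in removed:
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and (nx, ny) not in removed:
                color_grid[y][x] = color_grid[ny][nx]
                break
        else:
            color_grid[y][x] = fill

grid = [["red", "white", "red"],
        ["red", "red", "red"]]
repaint_removed(grid, {(1, 0)})   # (1, 0) held a removed boundary pixel
print(grid[0])                    # ['red', 'red', 'red']
```

Taking an adjacent pixel's color, as the text suggests for multi-colored interiors, makes the repainting work whether the regions share one fill color or use several.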
In the data processing method for map region merging provided in this specification, a uniform transparency is set so that, when map areas are drawn together, the transparency of the overlapping parts changes; the designated areas are merged based on this change in transparency in the map images. The method is simple and fast, requires no complex mathematical calculation, and the merged map region can be interacted with as a whole, giving it strong applicability.
In the present specification, the method embodiments are described in a progressive manner; the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. For related details, reference is made to the description of the method embodiments.
Based on the map area merging data processing method described above, one or more embodiments of the present specification further provide a map area merging data processing apparatus. The apparatus may include systems (including distributed systems), software (applications), modules, components, servers, clients, etc. that use the methods described in the embodiments of the present specification, in conjunction with the necessary hardware for implementation. Based on the same innovative conception, the embodiments of the present specification provide an apparatus as described in the following embodiments. Since the implementation scheme by which the apparatus solves the problem is similar to that of the method, the specific implementation of the apparatus in the embodiments of the present specification may refer to the implementation of the foregoing method, and repeated details are not described again. As used hereinafter, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Specifically, fig. 5 is a schematic block diagram of an embodiment of a data processing apparatus for merging map areas provided in this specification, and as shown in fig. 5, the data processing apparatus for merging map areas provided in this specification includes: an initial merged image drawing module 51, a transparency obtaining module 52, and a target merged map generating module 53, wherein:
the initial merged image drawing module 51 may be configured to draw a map image of an area to be merged using a line with a preset transparency, and generate an initial merged map image;
a transparency obtaining module 52, configured to obtain a transparency of a pixel point in the initial merged map image;
the target merged map generating module 53 may be configured to use the pixel points with the transparency equal to the preset transparency as target pixel points, and generate a merged target merged map image according to the target pixel points.
The data processing apparatus for map area merging provided in the embodiments of the present specification may detect whether the transparency of pixel points changes through a canvas dyeing technique: because, when map areas are merged, the transparency of pixel points in the overlapped portion differs from that of pixel points in the non-overlapped portion, the pixel points with overlapped boundaries can be separated from those with non-overlapped boundaries, and the merged map image can then be generated from the pixel points whose boundaries do not overlap. The method is simple and fast, needs no complex data processing, and can accurately detect the boundary-overlapping portion, so the merging of map areas is more accurate and rapid; the merged map area can be used as a whole for interaction, which is convenient for subsequent use, and the applicability is wide.
On the basis of the foregoing embodiment, the initial merged image drawing module is specifically configured to:
acquiring the area to be merged selected by the user in a first mapping area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
On the basis of the foregoing embodiment, the target merged map generating module is specifically configured to:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
In the embodiment of the specification, the non-target pixel points whose transparency does not meet the requirement are removed from the initial merged map image, and the remaining target pixel points directly form the merged target merged map image.
On the basis of the foregoing embodiment, the target merged map generating module is further configured to:
and setting the color of the positions of the pixel points except the target pixel point to be the same as the color of the pixel points in the area to be merged in the initial merged map image.
In the embodiment of the specification, after the non-target pixel points are removed, the color at their positions is set to red, consistent with the color of the boundary interior area of the area to be merged, thereby improving the display effect of the merged map image.
On the basis of the foregoing embodiment, the target merged map generating module is specifically configured to:
and extracting the coordinate information corresponding to the target pixel point, and generating the target merged map image according to the set of the coordinate information corresponding to the target pixel point.
According to the embodiment of the specification, the target merged map image is generated based on the set of coordinate information of the target pixel points. The method is fast, needs no complex data processing, and can accurately detect the boundary-overlapping portion, so the merging of map areas is more accurate and rapid; the merged map area can be used as a whole for interaction, which is convenient for subsequent use, and the applicability is wide.
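Collecting the coordinate set of the target pixel points might look like the following hypothetical Python helper (an assumption for illustration, with transparency values held in a dict keyed by coordinate):

```python
# Hypothetical sketch: gather the coordinates of all target pixel points
# (those whose transparency still equals the preset value). This coordinate
# set then describes the merged outline as a single whole.
def target_coordinate_set(alpha, preset=0.5, eps=1e-9):
    """alpha: {(x, y): transparency}; returns sorted target coordinates."""
    return sorted(p for p, a in alpha.items() if abs(a - preset) < eps)

# (1, 0) sits on an overlapped boundary (alpha composited to 0.75), so only
# the two outer-boundary pixels remain.
outline = target_coordinate_set({(0, 0): 0.5, (1, 0): 0.75, (2, 0): 0.5})
assert outline == [(0, 0), (2, 0)]
```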
On the basis of the foregoing embodiment, the transparency obtaining module is specifically configured to:
according to the coordinate information of the area to be merged, traversing pixel points in the area to be merged in the initial merged map image, and obtaining the transparency corresponding to the pixel points in the area to be merged in the initial merged map image.
In the embodiment of the specification, the transparency change of each pixel point is obtained by traversing the pixel points inside the region to be merged. The method is simple, reduces detection of pixel points outside the region to be merged, and improves the data processing speed.
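A traversal restricted to the region's coordinate range could be sketched as follows (hypothetical Python; a real canvas implementation would read back ImageData instead of a dict, and the bounding-box restriction is an assumed interpretation of "according to the coordinate information"):

```python
# Hypothetical sketch: use the to-be-merged region's coordinate info to limit
# the traversal to its bounding box, reading back each pixel's alpha instead
# of scanning every pixel of the full canvas.
def alphas_in_region(rgba, region_coords):
    """rgba: {(x, y): (r, g, b, a)}; region_coords: boundary coordinates."""
    xs = [x for x, _ in region_coords]
    ys = [y for _, y in region_coords]
    found = {}
    for x in range(min(xs), max(xs) + 1):
        for y in range(min(ys), max(ys) + 1):
            if (x, y) in rgba:
                found[(x, y)] = rgba[(x, y)][3]  # alpha channel
    return found

# Pixel (9, 9) lies outside the region's bounding box and is never examined.
canvas = {(0, 0): (255, 0, 0, 128), (5, 5): (255, 0, 0, 191), (9, 9): (0, 0, 0, 255)}
inside = alphas_in_region(canvas, [(0, 0), (5, 5)])
assert inside == {(0, 0): 128, (5, 5): 191}
```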
It should be noted that the above-described apparatus may also include other embodiments according to the description of the method embodiment. The specific implementation manner may refer to the description of the related method embodiment, and is not described in detail herein.
In an embodiment of the present specification, there may also be provided a computer storage medium having a computer program stored thereon, where the computer program, when executed, implements the map area merging data processing method in the above embodiments; for example, the following method may be implemented:
drawing a map image of an area to be merged by using a line with preset transparency to generate an initial merged map image;
acquiring the transparency of pixel points in the initial merged map image;
and taking the pixel points with the same transparency as the preset transparency as target pixel points, and generating a merged target merged map image according to the target pixel points.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The method or apparatus provided by the present specification and described in the foregoing embodiments may implement service logic through a computer program and record the service logic on a storage medium, where the storage medium may be read and executed by a computer, so as to implement the effect of the solution described in the embodiments of the present specification.
The map region merging data processing method or apparatus provided in the embodiments of the present specification may be implemented in a computer by a processor executing corresponding program instructions, for example, implemented on a PC in the C++ language under a Windows operating system, implemented under a Linux system, implemented on an intelligent terminal in the Android or iOS system programming languages, implemented in processing logic based on a quantum computer, and the like. In an embodiment of a map region merging data processing system provided in this specification, fig. 6 is a schematic block diagram of an embodiment of the map region merging data processing system, and as shown in fig. 6, the system may include a processor 61 and a memory 62 for storing processor-executable instructions,
the processor 61 and the memory 62 communicate with each other via a bus 63;
the processor 61 is configured to call the program instructions in the memory 62 to execute the method provided in the above embodiments of the map region merging data processing method, including: drawing a map image of an area to be merged by using a line with preset transparency to generate an initial merged map image; acquiring the transparency of pixel points in the initial merged map image; and taking the pixel points with the same transparency as the preset transparency as target pixel points, and generating a merged target merged map image according to the target pixel points.
It should be noted that descriptions of the apparatus, the computer storage medium, and the system described above according to the related method embodiments may also include other embodiments, and specific implementations may refer to the descriptions of the method embodiments and are not described in detail herein.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the hardware + program class embodiment, since it is substantially similar to the method embodiment, the description is simple, and the relevant points can be referred to the partial description of the method embodiment.
The embodiments of this specification are not limited to what must be in compliance with industry communication standards, standard computer data processing and data storage rules, or the description of one or more embodiments of this specification. Certain industry standards, or implementations modified slightly from those described using custom modes or examples, may also achieve the same, equivalent, or similar, or other, contemplated implementations of the above-described examples. The embodiments using the modified or transformed data acquisition, storage, judgment, processing and the like can still fall within the scope of the alternative embodiments of the embodiments in this specification.
In the 1990s, an improvement of a technology could be clearly distinguished as an improvement in hardware (for example, an improvement of a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement of a method flow). However, as technology develops, many of today's improvements of method flows can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic function is determined by a user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually fabricating an integrated circuit chip, this programming is now mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development and writing; the original code to be compiled must also be written in a specific programming language, called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that hardware circuitry that implements the logical method flows can be readily obtained by merely slightly programming the method flows into an integrated circuit using the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller; examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer-readable program code, the same functions can be implemented entirely by logically programming the method steps, so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for performing various functions may also be regarded as structures within the hardware component. Or even the means for performing various functions may be regarded both as software modules for implementing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a vehicle-mounted human-computer interaction device, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
Although one or more embodiments of the present description provide method operational steps as described in the embodiments or flowcharts, more or fewer operational steps may be included based on conventional or non-inventive approaches. The order of steps recited in the embodiments is merely one manner of performing the steps in a multitude of orders and does not represent the only order of execution. When an actual apparatus or end product executes, it may execute sequentially or in parallel (e.g., parallel processors or multi-threaded environments, or even distributed data processing environments) according to the method shown in the embodiment or the figures. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the presence of additional identical or equivalent elements in a process, method, article, or apparatus that comprises the recited elements is not excluded. The terms first, second, etc. are used to denote names, but not any particular order.
For convenience of description, the above devices are described as being divided into various modules by functions, and are described separately. Of course, when implementing one or more of the present description, the functions of each module may be implemented in one or more software and/or hardware, or a module implementing the same function may be implemented by a combination of multiple sub-modules or sub-units, etc. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage, graphene storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
As will be appreciated by one skilled in the art, one or more embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, one or more embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, one or more embodiments of the present description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
One or more embodiments of the present description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. One or more embodiments of the present specification can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment. In the description of the specification, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the specification. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
The above description is merely exemplary of one or more embodiments of the present disclosure and is not intended to limit the scope of one or more embodiments of the present disclosure. Various modifications and alterations to one or more embodiments described herein will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement or the like made within the spirit and principle of the present specification should be included in the scope of the claims.

Claims (14)

1. A data processing method for map region merging comprises the following steps:
drawing a map image of an area to be merged by using a line with preset transparency to generate an initial merged map image;
acquiring the transparency of pixel points in the initial merged map image;
and taking the pixel points with the same transparency as the preset transparency as target pixel points, and generating a merged target merged map image according to the target pixel points, wherein the target merged map image is a whole.
2. The method of claim 1, wherein the drawing the map image of the area to be merged by using the line with the preset transparency to generate an initial merged map image comprises:
acquiring the area to be merged selected by the user in a first mapping area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
3. The method of claim 1, wherein generating a merged target merged map image from the target pixel points comprises:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
4. The method of claim 3, wherein after removing pixels other than the target pixel from the initial merged map image, the method comprises:
and setting the color of the positions of the pixel points except the target pixel point to be the same as the color of the pixel points in the area to be merged in the initial merged map image.
5. The method of claim 1, wherein generating a merged target merged map image from the target pixel points comprises:
and extracting the coordinate information corresponding to the target pixel point, and generating the target merged map image according to the set of the coordinate information corresponding to the target pixel point.
6. The method of claim 1, wherein the obtaining the transparency of the pixel points in the initial merged map image comprises:
according to the coordinate information of the area to be merged, traversing pixel points in the area to be merged in the initial merged map image, and obtaining the transparency corresponding to the pixel points in the area to be merged in the initial merged map image.
7. A map region merged data processing apparatus, comprising:
the initial merged image drawing module is used for drawing the map image of the area to be merged by using a line with preset transparency to generate an initial merged map image;
a transparency obtaining module, configured to obtain a transparency of a pixel point in the initial merged map image;
and the target merged map generation module is used for taking the pixel points with the same transparency as the preset transparency as target pixel points and generating a merged target merged map image according to the target pixel points, wherein the target merged map image is a whole.
8. The apparatus of claim 7, the initial merged image rendering module to:
acquiring the area to be merged selected by the user in a first mapping area;
and according to the area to be merged selected by the user, drawing the map image of the area to be merged in the second map drawing area by using the line with the preset transparency, and generating the initial merged map image.
9. The apparatus of claim 7, wherein the target merged map generation module is specifically configured to:
removing pixel points except the target pixel points in the initial merged map image from the initial merged map image;
and taking a map image formed by the target pixel points in the initial merged map image as the target merged map image.
10. The apparatus of claim 9, the target merged map generation module further to:
and setting the color of the positions of the pixel points except the target pixel point to be the same as the color of the pixel points in the area to be merged in the initial merged map image.
11. The apparatus of claim 7, wherein the target merged map generation module is specifically configured to:
and extracting the coordinate information corresponding to the target pixel point, and generating the target merged map image according to the set of the coordinate information corresponding to the target pixel point.
12. The apparatus of claim 7, wherein the transparency obtaining module is specifically configured to:
according to the coordinate information of the area to be merged, traversing pixel points in the area to be merged in the initial merged map image, and obtaining the transparency corresponding to the pixel points in the area to be merged in the initial merged map image.
13. A computer storage medium having stored thereon a computer program which, when executed, implements the method of any of claims 1-6.
14. A map region consolidation data processing system comprising at least one processor and a memory for storing processor-executable instructions which, when executed by the processor, implement the method of any of claims 1-6.
CN201810839748.9A 2018-07-27 2018-07-27 Data processing method and device for map region merging Active CN109192054B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201810839748.9A CN109192054B (en) 2018-07-27 2018-07-27 Data processing method and device for map region merging
TW108117080A TWI698841B (en) 2018-07-27 2019-05-17 Data processing method and device for merging map areas
PCT/CN2019/091262 WO2020019899A1 (en) 2018-07-27 2019-06-14 Data processing method and apparatus for merging regions in map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810839748.9A CN109192054B (en) 2018-07-27 2018-07-27 Data processing method and device for map region merging

Publications (2)

Publication Number Publication Date
CN109192054A CN109192054A (en) 2019-01-11
CN109192054B true CN109192054B (en) 2020-04-28

Family

ID=64937165

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810839748.9A Active CN109192054B (en) 2018-07-27 2018-07-27 Data processing method and device for map region merging

Country Status (3)

Country Link
CN (1) CN109192054B (en)
TW (1) TWI698841B (en)
WO (1) WO2020019899A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108573653B (en) * 2017-03-13 2022-01-04 腾讯科技(深圳)有限公司 Electronic map generation method and device
CN109192054B (en) * 2018-07-27 2020-04-28 阿里巴巴集团控股有限公司 Data processing method and device for map region merging
CN109785355A (en) * 2019-01-25 2019-05-21 网易(杭州)网络有限公司 Region merging method and device, computer storage medium, electronic equipment
CN111489411B (en) * 2019-01-29 2023-06-20 北京百度网讯科技有限公司 Line drawing method and device, image processor, display card and vehicle
CN111612868B (en) * 2019-02-22 2025-04-25 北京奇虎科技有限公司 A map optimization method and device
CN110068344B (en) * 2019-04-08 2021-11-23 丰图科技(深圳)有限公司 Map data production method, map data production device, server, and storage medium
CN112019702B (en) * 2019-05-31 2023-08-25 北京嗨动视觉科技有限公司 Image processing method, device and video processor
CN112179361B (en) 2019-07-02 2022-12-06 华为技术有限公司 Method, device and storage medium for updating work map of mobile robot
CN111080732B (en) * 2019-11-12 2023-09-22 望海康信(北京)科技股份公司 Method and system for forming virtual map
CN111862204B (en) * 2019-12-18 2025-01-07 北京嘀嘀无限科技发展有限公司 Method and related device for extracting visual feature points of image
WO2021121306A1 (en) 2019-12-18 2021-06-24 北京嘀嘀无限科技发展有限公司 Visual location method and system
CN111127543B (en) * 2019-12-23 2024-04-05 北京金山安全软件有限公司 Image processing method, device, electronic equipment and storage medium
CN111881817B (en) * 2020-07-27 2024-09-24 北京三快在线科技有限公司 Method and device for extracting specific area, storage medium and electronic equipment
CN112269850B (en) * 2020-11-10 2024-05-03 中煤航测遥感集团有限公司 Geographic data processing method and device, electronic equipment and storage medium
CN112652063B (en) * 2020-11-20 2022-09-20 上海莉莉丝网络科技有限公司 Method and system for generating dynamic area boundary in game map and computer readable storage medium
CN112395380B (en) * 2020-11-20 2022-03-22 上海莉莉丝网络科技有限公司 Merging method, merging system and computer readable storage medium for dynamic area boundary in game map
CN115115739B (en) * 2021-03-17 2025-06-24 阿里巴巴创新公司 Data processing method, device and information display method
CN114140548A (en) * 2021-11-05 2022-03-04 深圳集智数字科技有限公司 Map-based drawing method and device

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1299220C (en) * 2004-05-13 2007-02-07 上海交通大学 Automatic splicing method for digital road map
US7911481B1 (en) * 2006-12-14 2011-03-22 Disney Enterprises, Inc. Method and apparatus of graphical object selection
TWI329825B (en) * 2007-04-23 2010-09-01 Network e-map graphic automatically generating system and method therefor
TWI480809B (en) * 2009-08-31 2015-04-11 Alibaba Group Holding Ltd Image feature extraction method and device
US8872848B1 (en) * 2010-09-29 2014-10-28 Google Inc. Rendering vector data as tiles
TWI479343B (en) * 2011-11-11 2015-04-01 Easymap Digital Technology Inc Theme map generating system and method thereof
US9043150B2 (en) * 2012-06-05 2015-05-26 Apple Inc. Routing applications for navigation
GB2499694B8 (en) * 2012-11-09 2017-06-07 Sony Computer Entertainment Europe Ltd System and method of image reconstruction
KR101459636B1 (en) * 2013-04-08 2014-11-07 현대엠엔소프트 주식회사 Method for displaying map of navigation apparatus and navigation apparatus
WO2015130365A2 (en) * 2013-12-04 2015-09-03 Urthecast Corp. Systems and methods for earth observation
CN103714540B (en) * 2013-12-21 2017-01-11 浙江传媒学院 SVM-based transparency estimation method in digital image matting processing
CN103761094A (en) * 2014-01-22 2014-04-30 上海诚明融鑫科技有限公司 Method for polygon combination in planar drawing
CN104077100B (en) * 2014-06-27 2017-04-12 广东威创视讯科技股份有限公司 Composite buffer area image display method and device
CN104715451B * 2015-03-11 2018-01-05 西安交通大学 Seamless image fusion method based on consistency optimization of color and transparency
CN104867170B (en) * 2015-06-02 2017-11-03 厦门卫星定位应用股份有限公司 Public bus network Density Distribution drawing drawing method and system
CN106128291A (en) * 2016-08-31 2016-11-16 武汉拓普伟域网络有限公司 A kind of method based on the self-defined map layer of electronic third-party mapping
CN107919012B (en) * 2016-10-09 2020-11-27 北京嘀嘀无限科技发展有限公司 Method and system for scheduling transport capacity
CN106530219B (en) * 2016-11-07 2020-03-24 青岛海信移动通信技术股份有限公司 Image splicing method and device
CN106557567A (en) * 2016-11-21 2017-04-05 中国农业银行股份有限公司 A kind of data processing method and system
CN107146201A (en) * 2017-05-08 2017-09-08 重庆邮电大学 An Image Stitching Method Based on Improved Image Fusion
CN109192054B (en) * 2018-07-27 2020-04-28 阿里巴巴集团控股有限公司 Data processing method and device for map region merging

Also Published As

Publication number Publication date
CN109192054A (en) 2019-01-11
TW202008328A (en) 2020-02-16
TWI698841B (en) 2020-07-11
WO2020019899A1 (en) 2020-01-30

Similar Documents

Publication Publication Date Title
CN109192054B (en) Data processing method and device for map region merging
CN107562467B (en) Page rendering method, device and equipment
CN109272454B (en) Coordinate system calibration method and device of augmented reality equipment
CN107274442B (en) Image identification method and device
CN111311709A (en) Method and device for generating high-precision map
CN110427215A (en) Program version difference display method and device applied to front-end development
CN110738722B (en) Thermodynamic diagram texture generation method, device and equipment
CN109978044B (en) Training data generation method and device, and model training method and device
CN105786417B (en) Dynamic display method, device and equipment for static images
CN110910334B (en) Instance segmentation method, image processing device and computer readable storage medium
CN107766703B (en) Watermark adding processing method and device and client
CN106484080A (en) Display method, device and equipment for a display interface
CN108280135B (en) Method and device for realizing visualization of data structure and electronic equipment
CN110806847A (en) Distributed multi-screen display method, device, equipment and system
CN117495894A (en) Image generation processing method and electronic equipment
CN109857964B (en) Thermodynamic diagram drawing method and device for page operation, storage medium and processor
CN112070830A (en) Point cloud image labeling method, device, equipment and storage medium
CN113360154B (en) Page construction method, device, equipment and readable medium
US9786073B2 (en) Geometric shape hierarchy determination to provide visualization context
CN107621951B (en) View level optimization method and device
CN116193050B (en) Image processing method, device, equipment and storage medium
US20240319967A1 (en) Script generation method and apparatus, device, and storage medium
CN110968513A (en) Recording method and device of test script
CN106570825B (en) A system and method for superimposing pictures based on transparent pictures
CA2931695C (en) Picture fusion method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200930

Address after: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman Islands

Patentee after: Innovative advanced technology Co.,Ltd.

Address before: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman Islands

Patentee before: Advanced innovation technology Co.,Ltd.

Effective date of registration: 20200930

Address after: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman Islands

Patentee after: Advanced innovation technology Co.,Ltd.

Address before: Grand Cayman, Cayman Islands

Patentee before: Alibaba Group Holding Ltd.