CN114053708B - Map generation method and device, computer equipment and computer readable storage medium - Google Patents
Map generation method and device, computer equipment and computer readable storage medium
- Publication number
- CN114053708B (granted publication; application CN202111342256.7A / CN202111342256A)
- Authority
- CN
- China
- Prior art keywords
- scene
- map
- processed
- game
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
- A63F13/5378—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Remote Sensing (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Optics & Photonics (AREA)
- Processing Or Creating Images (AREA)
Abstract
The application discloses a map generation method, a map generation device, computer equipment, and a computer-readable storage medium, and relates to the technical field of image processing. The scheme not only generates a map automatically, quickly, and accurately, but also frees developers from drawing the map manually, saving a large amount of manpower and material resources and improving map generation efficiency. The method comprises the following steps: determining a game scene to be processed, and generating a plurality of area maps of the game scene to be processed; constructing a plurality of height maps of the game scene to be processed, and intercepting the plurality of area maps based on the height maps to obtain an initial scene map of the game scene to be processed; generating a contour map of the game scene to be processed, and adjusting the color depth of a plurality of scene heights in the initial scene map according to the contour map to obtain an intermediate scene map; and acquiring scene details of the game scene to be processed, and adding the scene details to the intermediate scene map to obtain a target scene map of the game scene to be processed.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a map generation method and apparatus, a computer device, and a computer-readable storage medium.
Background
As living standards improve, people's quality of life and living environment are changing at a remarkable pace. While large-scale urban construction brings high-quality enjoyment, it also places great pressure on people's lives; under such pressure, people are prone to anxiety, poor concentration, and an inability to complete work normally, which has given rise to many games for relieving fatigue and providing entertainment. In recent years, image processing technology has developed rapidly, and scene-based games on the market are favored by a large number of players. Most scene-based games use a camera that follows the player's character, which provides a more immersive game experience, and the routes and enjoyment of the levels in the game are very important. Game developers therefore usually generate a map for the game scene during development, so that players can gain a comprehensive understanding of the scene.
In the related art, a game scene is 3D (three-dimensional). A developer can segment and capture the 3D game scene to obtain 3D scene screenshots, perform lasso selection on the different color blocks in each screenshot and fill them with different colors to convert the 3D screenshots into 2D (two-dimensional) images, and finally process the resulting 2D images again to generate the final scene map.
In carrying out the present application, the applicant has found that the related art has at least the following problems:
Both the capture and the filling of the scene screenshots are performed manually by developers. Manual operation can lead to capture-position deviations and inaccurate drawing positions, and a new map must be generated from scratch whenever the game scene is modified. As a result, the generated map is not highly accurate, a large amount of manpower and material resources are consumed, and map generation efficiency is low.
Disclosure of Invention
In view of this, the present application provides a map generation method, a map generation device, a computer device, and a computer readable storage medium, and mainly aims to solve the problems that the accuracy of a currently generated map is not high, a large amount of manpower and material resources are involved, and the map generation efficiency is low.
According to a first aspect of the present application, there is provided a map generation method, including:
determining a game scene to be processed, and generating a plurality of area maps of the game scene to be processed, wherein the area maps are generated according to a scene model, a ground surface material and a water area range in the game scene to be processed;
constructing a plurality of height maps of the game scene to be processed, and intercepting the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed;
generating a contour map of the game scene to be processed, and adjusting the color depth of a plurality of scene heights in the initial scene map according to the contour map to obtain an intermediate scene map;
and acquiring scene details of the game scene to be processed, and adding the scene details to the intermediate scene map to obtain a target scene map of the game scene to be processed.
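The four steps above can be sketched as a simple pipeline. The Python sketch below is purely illustrative: the function names, the toy `scene` structure, and the 10-unit height step are assumptions introduced here, not part of the disclosed implementation.

```python
# Illustrative sketch of the four-step pipeline (area maps -> height maps ->
# contour adjustment -> detail overlay). All names and data are assumptions.

def generate_area_maps(scene):
    # Step 101: one map per layer (buildings, terrain models, materials, water).
    return {layer: scene[layer] for layer in ("buildings", "terrain", "materials", "water")}

def build_height_maps(scene, step=10):
    # Step 102: one cross-section every `step` units, lowest to highest point.
    lo, hi = scene["min_height"], scene["max_height"]
    return list(range(lo, hi + 1, step))

def generate_map(scene):
    area_maps = generate_area_maps(scene)                 # step 101
    heights = build_height_maps(scene)                    # step 102
    initial = {"areas": area_maps, "heights": heights}
    intermediate = dict(initial, contour=len(heights))    # step 103 (stand-in)
    return dict(intermediate, details=scene["details"])   # step 104

toy_scene = {
    "buildings": ["house"], "terrain": ["hill"], "materials": ["grass"],
    "water": ["lake"], "min_height": 0, "max_height": 30, "details": ["door"],
}
target_map = generate_map(toy_scene)
```

The stand-in bodies only show how the intermediate results feed each other; the per-step sketches further below fill in the individual operations.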
Optionally, the generating a plurality of area maps of the game scene to be processed includes:
determining a plurality of scene models included in the game scene to be processed, and generating a human architectural diagram and a ground surface model diagram of the game scene to be processed according to the model attributes of the scene models;
inquiring a plurality of surface materials included in the game scene to be processed, generating a surface material map for each surface material in the plurality of surface materials, and obtaining a plurality of surface material maps;
determining the water area range in the scene to be processed, and generating a water area map of the game scene to be processed according to the water area depth of the water area range;
and setting the human architecture drawing, the land model drawing, the plurality of land surface material drawings and the water area drawing as the plurality of area drawings.
Optionally, the generating a human architecture diagram and a surface model diagram of the game scene to be processed according to the model attributes of the plurality of scene models includes:
reading model attributes of the scene models, extracting a first scene model of which the model attributes indicate a human building from the scene models, and outputting a human building diagram according to the outline of the first scene model, wherein the outline of the first scene model in the human building diagram is filled with a preset color;
and simultaneously or respectively determining other scene models except the first scene model in the plurality of scene models, extracting a second scene model with the size larger than or equal to a size threshold value from the other scene models, and outputting the earth surface model diagram according to the outer contour of the second scene model, wherein the outer contour of the second scene model in the earth surface model diagram is filled with the preset color.
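As a rough illustration of this split, the sketch below partitions toy model records into a man-made-building layer and a surface-model layer by attribute and size threshold. The record format and the threshold value of 50 are assumptions for illustration only.

```python
# Split scene models: buildings by attribute, surface models by size threshold.
# The records and the 50-unit threshold are illustrative assumptions.

SIZE_THRESHOLD = 50

models = [
    {"name": "tower",   "attribute": "building", "size": 120},
    {"name": "boulder", "attribute": "terrain",  "size": 80},
    {"name": "pebble",  "attribute": "terrain",  "size": 3},
]

# First scene models: model attribute indicates a man-made building.
building_layer = [m["name"] for m in models if m["attribute"] == "building"]

# Second scene models: everything else at or above the size threshold.
surface_layer = [
    m["name"] for m in models
    if m["attribute"] != "building" and m["size"] >= SIZE_THRESHOLD
]
```

Small models such as the pebble fall below the threshold and appear in neither layer, matching the idea of keeping only models large enough to matter on the map.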
Optionally, the querying a plurality of surface materials included in the game scene to be processed, generating a surface material map for each surface material in the plurality of surface materials, and obtaining a plurality of surface material maps includes:
for each surface material in the plurality of surface materials, determining a material coverage of the surface material, outputting the material coverage, and filling the material coverage with a preset color;
determining the joint of the ground surface material and other ground surface materials, and counting the material proportion of the ground surface material at the joint, wherein the other ground surface materials are the ground surface materials except the current ground surface material in the plurality of ground surface materials;
according to the material proportion, adjusting the depth of the preset color filled in the joint in the material coverage range to obtain a ground surface material map of the ground surface material;
and repeating the above process of generating a surface material map for each surface material, so as to obtain the plurality of surface material maps.
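The seam-blending idea — scaling the preset color's depth by the material's proportion at the junction — can be sketched as follows. The linear blend and the base depth of 200 are assumptions, since the text does not fix a concrete formula.

```python
# Sketch: at a seam between two surface materials, scale the fill color's
# depth by the current material's proportion there. Linear blend assumed.

BASE_FILL = 200  # preset fill depth for a fully covered pixel (assumption)

def seam_depth(material_ratio):
    """Depth of the preset color at a seam pixel, proportional to how much
    of that pixel the current material occupies (0.0-1.0)."""
    return round(BASE_FILL * material_ratio)

# One row of a material map: interior pixels fully filled, a 60/40 seam at the edge.
material_row = [BASE_FILL, BASE_FILL, seam_depth(0.6), seam_depth(0.4)]
```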
Optionally, the determining the water area range in the to-be-processed scene, and generating a water area map of the to-be-processed game scene according to the water area depth of the water area range includes:
determining all water areas included in the scene to be processed, identifying the water area coverage range of all the water areas, and outputting the water area coverage range;
filling the water area coverage area by adopting a preset color to obtain a water area coverage area map;
inquiring the water area depth of each water area in all the water areas, and extracting the deep water areas with the water area depth larger than a depth threshold value from all the water areas;
outputting the deepwater coverage of the deepwater area, and filling the deepwater coverage by adopting the preset color to obtain a deepwater map;
and taking the water area range map and the deepwater map as the water area map.
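The water-layer step above amounts to two masks over a depth field: one for all water pixels, one for pixels beyond the depth threshold. The depth grid and threshold value in this sketch are illustrative assumptions.

```python
# Sketch of the water-area step: fill every water pixel with a preset color,
# then mark pixels whose depth exceeds a threshold as deep water.

DEPTH_THRESHOLD = 5
FILL = 1  # preset color, represented as a binary mask value in this sketch

depths = [
    [0, 2, 8],
    [0, 3, 9],
]

# Water coverage map: any pixel with nonzero depth is water.
coverage = [[FILL if d > 0 else 0 for d in row] for row in depths]

# Deep-water map: only pixels beyond the depth threshold.
deep_water = [[FILL if d > DEPTH_THRESHOLD else 0 for d in row] for row in depths]
```

Together, `coverage` and `deep_water` play the roles of the water area range map and the deepwater map described above.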
Optionally, the constructing multiple height maps of the game scene to be processed includes:
determining a scene lowest point and a scene highest point of the game scene to be processed, wherein the scene highest point is the highest point of other scene models except a first scene model of a model attribute indicating a civil building in the game scene to be processed;
taking the lowest point of the scene as a reference, and carrying out cross section processing on the game scene to be processed every time a preset height difference is reached upwards until the highest point of the scene is reached to obtain a plurality of scene cross sections of the game scene to be processed;
and filling the plurality of scene cross sections by adopting preset colors to obtain the plurality of height maps.
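The cross-section construction can be sketched as slicing a height field at regular levels, each slice yielding one binary height map. The toy height field and the 10-unit step are assumptions.

```python
# Sketch of the cross-section step: from the scene's lowest point, slice the
# height field every STEP units up to the highest point; each slice level
# becomes one binary height map (filled = at or above that level).

STEP = 10

height_field = [
    [0, 12, 25],
    [5, 18, 30],
]

lowest = min(min(row) for row in height_field)
highest = max(max(row) for row in height_field)

height_maps = [
    [[1 if h >= level else 0 for h in row] for row in height_field]
    for level in range(lowest, highest + 1, STEP)
]
```

With a 0–30 height range this yields four slices, the topmost containing only the single highest pixel.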
Optionally, the intercepting the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed includes:
intercepting the highest area of each area map in the area maps according to the indication of the height maps to obtain a plurality of highest areas;
filling the plurality of highest areas by adopting a preset color, and outputting a scene graph comprising the filled highest areas;
respectively determining the area filling color corresponding to each of the area images to obtain a plurality of area filling colors;
for each region filling color in the plurality of region filling colors, determining a target region graph corresponding to the region filling color, and querying a target highest region of the target region graph in the scene graph;
filling the highest target area by using the area filling color in the scene graph;
and repeating the above color-filling process to fill each highest area in the plurality of highest areas in the scene graph, so as to obtain the initial scene map.
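The compositing loop above — paint each layer's topmost visible region with that layer's fill color — can be sketched as follows. The layer masks and the numeric color codes are illustrative assumptions.

```python
# Sketch of the compositing step: for each area layer, keep only the pixels
# visible in its highest region, then paint them with the layer's fill color.

AREA_COLORS = {"water": 3, "grass": 2}

# Binary masks of each layer's highest (visible-from-above) region.
highest_regions = {
    "water": [[1, 0], [0, 0]],
    "grass": [[0, 1], [1, 1]],
}

# Compose the initial scene map by filling each region with its color.
initial_map = [[0, 0], [0, 0]]
for layer, mask in highest_regions.items():
    for y, row in enumerate(mask):
        for x, inside in enumerate(row):
            if inside:
                initial_map[y][x] = AREA_COLORS[layer]
```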
Optionally, the generating a contour map of the game scene to be processed includes:
identifying cross section contour lines of the scene cross section filled with preset colors in each of the height maps to obtain a plurality of cross section contour lines of the height maps;
and superposing the plurality of cross-section contour lines, and filling the area between every two cross-section contour lines in the plurality of cross-section contour lines with colors to generate the contour map.
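One simple way to realize "overlay the contours and shade between adjacent lines" is to give each pixel a band index — the number of contour levels at or below it — and use that index as the color depth. The banding scheme and toy height field are assumptions.

```python
# Sketch of the contour step: shade the band between each pair of adjacent
# contour lines, so higher terrain gets a deeper color.

STEP = 10
height_field = [
    [0, 12, 25],
    [5, 18, 30],
]

# Each pixel's band index = number of contour levels at or below it,
# used directly as that band's color depth in this sketch.
contour_map = [[h // STEP for h in row] for row in height_field]
```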
Optionally, the obtaining of the scene details of the game scene to be processed and adding the scene details to the intermediate scene map to obtain the target scene map of the game scene to be processed includes:
determining preset pixels, and performing scene interception on the game scene to be processed right above the game scene to be processed according to the preset pixels to obtain a scene aerial view of the game scene to be processed;
identifying the scene aerial view, and extracting model details of a plurality of scene models on the scene aerial view as the scene details;
and determining model areas corresponding to the scene models in the intermediate scene map, and drawing the scene details in the model areas to obtain the target scene map.
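The detail-overlay step can be sketched as cropping each model's region from an overhead capture and pasting those pixels into the intermediate map at the same position. The pixel grids and region coordinates are illustrative assumptions.

```python
# Sketch of the detail step: copy a model's region from the overhead
# (bird's-eye) capture onto the intermediate scene map.

bird_view = [
    [7, 7, 0],
    [7, 7, 0],
]
intermediate_map = [
    [1, 1, 1],
    [1, 1, 1],
]

# (x0, y0, x1, y1) region of one scene model in both images (assumption).
model_region = (0, 0, 2, 2)

x0, y0, x1, y1 = model_region
target_map = [row[:] for row in intermediate_map]  # keep the intermediate map intact
for y in range(y0, y1):
    for x in range(x0, x1):
        target_map[y][x] = bird_view[y][x]  # draw the model's detail pixels
```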
According to a second aspect of the present application, there is provided a map generating apparatus, the apparatus including:
the generating module is used for determining a game scene to be processed and generating a plurality of area maps of the game scene to be processed, wherein the area maps are generated according to a scene model, a ground surface material and a water area range in the game scene to be processed;
the intercepting module is used for constructing a plurality of height maps of the game scene to be processed, and intercepting the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed;
the adjusting module is used for generating a contour map of the game scene to be processed, and adjusting the color depth of a plurality of scene heights in the initial scene map according to the contour map to obtain an intermediate scene map;
and the adding module is used for acquiring scene details of the game scene to be processed, and adding the scene details to the intermediate scene map to obtain a target scene map of the game scene to be processed.
Optionally, the generating module is configured to determine a plurality of scene models included in the game scene to be processed, and generate a human architectural diagram and a surface model diagram of the game scene to be processed according to model attributes of the plurality of scene models; inquiring a plurality of ground surface materials included in the game scene to be processed, and generating a ground surface material diagram for each ground surface material in the plurality of ground surface materials to obtain a plurality of ground surface material diagrams; determining the water area range in the scene to be processed, and generating a water area map of the game scene to be processed according to the water area depth of the water area range; and setting the human architecture drawing, the land model drawing, the plurality of land surface material drawings and the water area drawing as the plurality of area drawings.
Optionally, the generating module is configured to read model attributes of the multiple scene models, extract a first scene model of a human building indicated by the model attributes from the multiple scene models, and output the human building diagram according to an outer contour of the first scene model, where the outer contour of the first scene model in the human building diagram is filled with a preset color; and simultaneously or respectively determining other scene models except the first scene model in the plurality of scene models, extracting a second scene model with the size larger than or equal to a size threshold from the other scene models, and outputting the earth surface model diagram according to the outer contour of the second scene model, wherein the outer contour of the second scene model in the earth surface model diagram is filled with the preset color.
Optionally, the generating module is configured to determine, for each of the plurality of surface materials, a material coverage of the surface material, output the material coverage, and fill the material coverage with a preset color; determining the joint of the ground surface material and other ground surface materials, and counting the material proportion of the ground surface material at the joint, wherein the other ground surface materials are the ground surface materials except the current ground surface material in the plurality of ground surface materials; adjusting the depth of the preset color filled in the joint in the material coverage range according to the material proportion to obtain a ground surface material map of the ground surface material; and repeating the process of generating the land surface material map, and generating a land surface material map for each land surface material to obtain the plurality of land surface material maps.
Optionally, the generating module is configured to determine all water areas included in the scene to be processed, identify a water area coverage range of all the water areas, and output the water area coverage range; filling the water area coverage range by adopting a preset color to obtain a water area range diagram; inquiring the water area depth of each water area in all the water areas, and extracting the deep water areas with the water area depth larger than a depth threshold value from all the water areas; outputting the deepwater coverage range of the deepwater area, and filling the deepwater coverage range by adopting the preset color to obtain a deepwater map; and taking the water area range map and the deepwater map as the water area map.
Optionally, the intercepting module is configured to determine a scene lowest point and a scene highest point of the game scene to be processed, where the scene highest point is a highest point of other scene models in the game scene to be processed except for the first scene model of the civil building indicated by the model attribute; taking the lowest point of the scene as a reference, and carrying out cross section processing on the game scene to be processed every time a preset height difference is reached upwards until the highest point of the scene is reached to obtain a plurality of scene cross sections of the game scene to be processed; and filling the plurality of scene cross sections by adopting a preset color to obtain the plurality of height maps.
Optionally, the intercepting module is configured to intercept a highest area of each of the plurality of area maps according to the indication of the plurality of height maps to obtain a plurality of highest areas; fill the plurality of highest areas with a preset color and output a scene graph comprising the filled highest areas; respectively determine the area filling color corresponding to each area map in the plurality of area maps to obtain a plurality of area filling colors; for each area filling color in the plurality of area filling colors, determine a target area map corresponding to the area filling color and query the target highest area of the target area map in the scene graph; fill the target highest area with the area filling color in the scene graph; and repeat the above color-filling process to fill each highest area in the plurality of highest areas in the scene graph, so as to obtain the initial scene map.
Optionally, the adjusting module is configured to identify a cross-section contour line of a scene cross section filled with a preset color in each of the plurality of height maps, so as to obtain a plurality of cross-section contour lines of the plurality of height maps; and superposing the plurality of cross-section contour lines, and filling the area between every two cross-section contour lines in the plurality of cross-section contour lines with colors to generate the contour map.
Optionally, the adding module is configured to determine preset pixels, and perform scene interception on the game scene to be processed right above the game scene to be processed according to the preset pixels to obtain a scene overhead view of the game scene to be processed; identifying the scene aerial view, and extracting model details of a plurality of scene models on the scene aerial view as the scene details; and determining model areas corresponding to the scene models in the intermediate scene map, and drawing the scene details in the model areas to obtain the target scene map.
According to a third aspect of the present application, there is provided a computer device comprising a memory storing a computer program and a processor implementing the steps of the method of any of the first aspects when the computer program is executed.
According to a fourth aspect of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of any of the first aspects described above.
By means of the above technical scheme, the present application provides a map generation method and apparatus, computer equipment, and a computer-readable storage medium. A plurality of area maps are generated according to the scene models, surface materials, and water area range in the game scene to be processed; a plurality of height maps of the scene are constructed, and the area maps are intercepted based on the height maps to obtain an initial scene map; a contour map of the scene is generated, and the color depth of the plurality of scene heights in the initial scene map is adjusted according to the contour map to obtain an intermediate scene map; scene details of the scene are then acquired and added to the intermediate scene map to obtain a target scene map. In this technical scheme, each area map is generated from the terrain and buildings of the game scene to be processed, the regions to be displayed in each area map are directly intercepted and filled with color, the contour map is added to increase the brightness variation of the map, and building details are drawn onto the map to form the final map. The map can thus be generated automatically, quickly, and accurately, developers do not need to draw it manually, a large amount of manpower and material resources are saved, and map generation efficiency is improved.
The foregoing description is only an overview of the technical solutions of the present application, and the present application can be implemented according to the content of the description in order to make the technical means of the present application more clearly understood, and the following detailed description of the present application is given in order to make the above and other objects, features, and advantages of the present application more clearly understandable.
Drawings
Various additional advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 is a schematic flow chart illustrating a map generation method provided by an embodiment of the present application;
FIG. 2A is a schematic flow chart illustrating a map generation method provided by an embodiment of the present application;
FIG. 2B is a schematic diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2C is a diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2D is a schematic diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2E is a schematic diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2F is a schematic diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2G is a schematic diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2H is a diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2I is a schematic diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2J is a diagram illustrating a map generation method provided by an embodiment of the present application;
FIG. 2K is a diagram illustrating a map generation method provided by an embodiment of the present application;
fig. 3 shows a schematic structural diagram of a map generating apparatus provided in an embodiment of the present application;
fig. 4 shows a schematic device structure diagram of a computer apparatus according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
An embodiment of the present application provides a map generation method, as shown in fig. 1, the method includes:
101. determining a game scene to be processed, and generating a plurality of area maps of the game scene to be processed, wherein the area maps are generated according to a scene model, a ground surface material and a water area range in the game scene to be processed.
102. And constructing a plurality of height maps of the game scene to be processed, and intercepting the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed.
103. And generating a contour map of the game scene to be processed, and adjusting the color depth of a plurality of scene heights in the initial scene map according to the contour map to obtain an intermediate scene map.
104. And acquiring scene details of the game scene to be processed, and adding the scene details to the intermediate scene map to obtain a target scene map of the game scene to be processed.
According to the method provided by this embodiment of the application, a plurality of area maps are generated according to the scene models, surface materials, and water area range in the game scene to be processed; a plurality of height maps of the scene are constructed, and the area maps are intercepted based on the height maps to obtain an initial scene map; a contour map of the scene is generated, and the color depth of the plurality of scene heights in the initial scene map is adjusted according to the contour map to obtain an intermediate scene map; scene details of the scene are then acquired and added to the intermediate scene map to obtain a target scene map. In this technical scheme, each area map is generated from the terrain and buildings of the game scene to be processed, the regions to be displayed in each area map are directly intercepted and filled with color, the contour map is added to increase the lightness variation of the map, and building details are drawn onto the map to form the final map. The map can thus be generated automatically, quickly, and accurately, developers do not need to draw it manually, a large amount of manpower and material resources are saved, and map generation efficiency is improved.
An embodiment of the present application provides a map generation method, as shown in fig. 2A, the method includes:
201. determining a game scene to be processed, and generating a plurality of area maps of the game scene to be processed.
In recent years, the game scenes of scene-based games have grown larger and larger, and it is usually difficult for a player to perceive the entire game scene while playing. Game developers therefore generate a scene map in overhead form for the game scene, so that the player can determine the character's position and direction of travel in the game. When generating a scene map, developers usually adopt one of two methods. The first is to draw the map manually with game-scene screenshots as a reference. This can be understood as 2D processing of the 3D scene screenshots: the screenshots are processed step by step, different color blocks are lasso-selected and filled with different colors, and the complete image is finally processed again into the final map. The second is to use the scene directly as a map after simple processing: the 3D in-game scene is captured from overhead, and the overhead 2D image is lightly processed and used directly as the in-game map.
However, the applicant has realized that when the first method is used to generate the scene map, manual operations such as cutting out regions by hand and refilling them introduce positional deviations, so the drawn positions are inaccurate; the cutting process also consumes a great deal of labor time, making the cost excessive; and whenever the scene is modified, the map must be cut out and produced again. In summary, the first method has three defects: the map is produced inaccurately, labor time is consumed, and the map must be remade after any scene modification. When the second method is used, the in-game screenshot is closer to the real game scene; such a quasi-3D realistic image leaves the player distracted by lighting and shadow, is inconvenient to read, fails to express the in-game terrain elements, and has weak indicative value. In summary, the second method likewise has three defects: the generated map is inconvenient for the player to read as a whole, has poor readability, and has a poor appearance.
Therefore, the present application provides a map generation method: a plurality of area maps is generated from the scene models, surface materials, and water areas in a game scene to be processed; a plurality of height maps of the scene is constructed; the area maps are intercepted based on the height maps to obtain an initial scene map; a contour map of the scene is generated, and the color depth at the various scene heights in the initial scene map is adjusted according to the contour map to obtain an intermediate scene map; finally, scene details of the scene are acquired and added to the intermediate scene map to obtain the target scene map of the game scene to be processed. In this technical scheme, each area map is generated from the terrain and buildings of the game scene to be processed, the regions to be displayed in each area map are intercepted directly and filled with color, the contour map is superimposed to add lightness variation, and building details are drawn onto the map to form the final map. The map can therefore be generated automatically, quickly, and accurately without manual drawing by developers, saving a great deal of manpower and material resources and improving map-generation efficiency.
The technical scheme provided by the application can be implemented by a map generation tool that separately intercepts all surfaces, models, and heights of the game, replacing the manual cutting of scene color blocks. A developer inputs the game scene for which a map is required into the tool; the tool then determines the game scene to be processed and generates a plurality of area maps from the scene models, surface materials, and water areas within it. The plurality of area maps in fact comprises a human architecture map, a surface model map, a plurality of surface material maps, and a water area map; their generation is described below:
1. The human architecture map and the surface model map.
In an optional embodiment, to generate the human architecture map, the plurality of scene models included in the game scene to be processed is first determined, and the human architecture map and the surface model map of the game scene are generated according to the model attributes of those scene models. Specifically, the model attributes of the scene models are read, first scene models whose attributes indicate a man-made building are extracted, and the human architecture map is output according to the outer contours of the first scene models, the outer contours being filled with a preset color. Simultaneously or separately, the remaining scene models other than the first scene models are determined, second scene models whose size is greater than or equal to a size threshold are extracted from them, and the surface model map is output according to the outer contours of the second scene models, those outer contours likewise being filled with the preset color.
In effect, the above process outputs the human architecture map from all the man-made buildings in the game. Because man-made buildings are uniformly placed in the game's building folder, in another alternative embodiment the man-made buildings of the whole map can be output by outer contour to obtain the human architecture map. The preset color may be black, so the output human architecture map consists of black areas and transparent areas, as shown in fig. 2B.
When outputting the surface model map, small clutter in the game scene, such as small stones, twigs on the ground, and small mushrooms, is considered: such clutter merely decorates the scene and has little influence on the final scene map, so it is omitted here, and the surface model map is output directly from all the large surface models in the game. Further, since the man-made buildings reside in the dedicated building folder, in another alternative embodiment the map generation tool excludes the models in that folder, i.e., all the man-made buildings, and the remainder are the other surface models. The tool can be configured with a size threshold; surface models smaller than the threshold, such as small stones and small herbaceous plants, are excluded and not output. For the remaining surface models whose size is greater than or equal to the threshold, the tool outputs the surface model map according to the model outer contours. As shown in the surface model map of fig. 2C, the surface models are black (i.e., the preset color) and the rest is transparent. During creation of the surface model map, surface material or water may lie higher than part of a model; such a covered model portion may be an invalid area, is not displayed on the surface model map, and is output as transparent.
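The classification above can be sketched as follows. This is a minimal illustrative sketch, not the patent's actual implementation: the `SceneModel` fields, the `"building"` folder name, and the value of `SIZE_THRESHOLD` are all assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class SceneModel:
    name: str
    folder: str   # models under the "building" folder are man-made buildings
    size: float   # footprint size compared against the threshold

SIZE_THRESHOLD = 1.0  # assumed cutoff below which a model counts as small clutter

def split_layers(models):
    """Return (human-architecture-layer models, surface-model-layer models).

    Man-made buildings go to the first layer regardless of size; other
    models go to the second layer only if they reach the size threshold;
    small clutter is dropped entirely."""
    buildings = [m for m in models if m.folder == "building"]
    surface = [m for m in models
               if m.folder != "building" and m.size >= SIZE_THRESHOLD]
    return buildings, surface

models = [
    SceneModel("tower", "building", 8.0),
    SceneModel("boulder", "terrain", 3.5),
    SceneModel("pebble", "terrain", 0.2),   # below threshold -> ignored
]
buildings, surface = split_layers(models)
```

Each returned list would then be rendered by outer contour, filled with the preset color, onto its own transparent layer.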
2. The plurality of surface material maps.
One surface material map is generated for each surface material in the game scene to be processed; a scene generally uses about 10 to 20 materials, so 10 to 20 surface material maps are usually generated. In an optional embodiment, the map generation tool queries the plurality of surface materials included in the scene and generates one surface material map per material, obtaining the plurality of surface material maps. The generation process is described below, taking any one of the surface materials as an example:
For each of the surface materials, the map generation tool first determines the coverage of that material, outputs the coverage, and fills it with a preset color.
The tool then determines the junctions where this surface material meets other surface materials, counts the proportion of this material at each junction, and adjusts the depth of the preset color filled at the junction within the coverage according to that proportion, obtaining the surface material map of this material. Here, the other surface materials are the surface materials other than the current one; that is, at a material boundary, a correspondingly semi-transparent output is produced according to the proportion of the surface material.
By repeating the above process, a surface material map is generated for every surface material, yielding the plurality of surface material maps. Note that, in an alternative embodiment, the map generation tool may output the surface material maps from the in-game terrain (the ground surface excluding models): the part covered by the current material is the effective area and is black, the ineffective area is transparent, and the materials are output one by one. In addition, when generating a surface material map, if a model or water body built on the surface is higher than the surface, the surface material beneath it is covered and invisible. Referring to fig. 2D, which shows the surface material map generated when the material is grass, the black portion is the grass coverage and the translucent portion is the junction between grass and other surface materials.
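The per-material pass can be sketched as an alpha grid: a minimal illustration assuming each cell of a coverage grid stores material proportions (the grid shape, material names, and proportion values are invented for the example, not taken from the patent).

```python
def material_layer(grid, material):
    """grid[y][x] maps material name -> coverage fraction at that cell.

    Returns an alpha grid for one surface material:
    1.0      -> solid preset colour (cell fully covered by the material),
    (0, 1)   -> semi-transparent junction cell, depth set by proportion,
    0.0      -> transparent (material absent here)."""
    return [[cell.get(material, 0.0) for cell in row] for row in grid]

grid = [
    [{"grass": 1.0}, {"grass": 0.6, "sand": 0.4}],   # right cell: junction
    [{"sand": 1.0},  {"grass": 0.3, "sand": 0.7}],
]
alpha = material_layer(grid, "grass")
```

Running the same function once per material name would yield the full set of 10 to 20 layers described above.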
3. The water area map.
To generate the water area map, the map generation tool determines the water areas in the scene to be processed and generates the water area map of the game scene according to the depth of those water areas, as follows:
In an optional embodiment, all water areas included in the scene to be processed are first determined, their coverage is identified and output, and the coverage is filled with a preset color to obtain a water range map. Concretely, the map generation tool can generate the water range map from all water in the game, including waterfalls, rivers, lakes, oceans, and every other water-bearing part, with the water portions black and the rest transparent. Note that some in-game water may lie below the ground surface, i.e., an underground river; such water cannot actually be observed when looking down on the scene, so it is an invalid area and is not displayed on the water range map. The water range map shown in fig. 2E may be output, where the black portion is the water coverage.
Then, because water has depth and deep water should be reflected separately on the final map so that deep and shallow water differ in color, the map generation tool queries the depth of each water area, extracts the deep-water areas whose depth exceeds a depth threshold, outputs the deep-water coverage, and fills it with the preset color to obtain a deep-water map; the water range map and the deep-water map together serve as the water area map. Taking the water range map of fig. 2E as an example, the deep-water map of fig. 2F may be generated, showing separately the water whose depth reaches the threshold.
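The two water layers can be sketched as follows, under illustrative assumptions: the depth grid, the `visible` flags marking underground rivers, and the value of `DEPTH_THRESHOLD` are all invented for the example.

```python
DEPTH_THRESHOLD = 2.0  # assumed boundary between shallow and deep water

def water_layers(depths, visible):
    """depths[y][x]: water depth (0 = no water); visible[y][x]: False for
    water that cannot be seen from above (e.g. an underground river),
    which is an invalid area and stays transparent on both layers.

    Returns (water range map, deep-water map) as boolean grids; True
    cells would be filled with the preset colour."""
    range_map = [[d > 0 and v for d, v in zip(dr, vr)]
                 for dr, vr in zip(depths, visible)]
    deep_map = [[d > DEPTH_THRESHOLD and v for d, v in zip(dr, vr)]
                for dr, vr in zip(depths, visible)]
    return range_map, deep_map

depths  = [[0.0, 1.5], [3.0, 4.0]]
visible = [[True, True], [True, False]]   # bottom-right: underground river
rng, deep = water_layers(depths, visible)
```

Note the deep-water layer is a strict subset of the range layer, matching figs. 2E and 2F.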
By performing the above three steps, the human architecture map, the surface model map, the plurality of surface material maps, and the water area map are generated for the game scene to be processed and used as the plurality of area maps, so that the final target scene map can subsequently be generated by intercepting them.
202. Constructing a plurality of height maps of the game scene to be processed.
In this embodiment of the application, in order to intercept from the plurality of area maps the parts that can be displayed in the final target scene map, a plurality of height maps of the game scene to be processed must be constructed so that each area map can subsequently be intercepted according to them. The height maps are generated as follows:
In an alternative embodiment, the map generation tool may determine the scene lowest point and the scene highest point of the game scene to be processed. The scene highest point is the highest point among the scene models other than the first scene models whose model attributes indicate a man-made building; that is, man-made buildings are not sectioned, while the terrain, large stones in the scene, and other such models are.
Then, starting from the scene lowest point and moving upward by a preset height difference each time until the scene highest point is reached, the map generation tool takes cross sections of the game scene to be processed, obtaining a plurality of scene cross sections.
Finally, the map generation tool fills the plurality of scene cross sections with a preset color to obtain the plurality of height maps.
In fact, when a cross section is taken, everything cut by the section except man-made buildings is an effective area; sectioning stops once a section no longer cuts any effective area, and the effective areas are filled with the preset color (i.e., black) to obtain a height map. In practice, one of the generated height maps may look like fig. 2G.
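The slicing pass can be sketched on a small heightfield. This is an illustrative sketch only: the terrain values and step size are assumptions, and man-made building heights are presumed already excluded from the input.

```python
def height_slices(heightfield, step):
    """Cut the terrain every `step` units, from its lowest point up to its
    highest point; each slice marks the cells whose height reaches the
    cut (the effective area, filled black in the real tool)."""
    lo = min(min(row) for row in heightfield)
    hi = max(max(row) for row in heightfield)
    slices, h = [], lo
    while h <= hi:
        slices.append([[v >= h for v in row] for row in heightfield])
        h += step
    return slices

terrain = [[0.0, 2.0], [4.0, 6.0]]   # building heights assumed excluded
slices = height_slices(terrain, step=2.0)
```

Each boolean grid corresponds to one height map like fig. 2G; the loop stops once a cut would lie above every cell, matching "sectioning stops once no effective area is cut".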
203. Intercepting the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed.
In this embodiment of the application, after the plurality of height maps has been generated, the map generation tool intercepts the plurality of area maps based on them, determines for each area map the parts that can be displayed, and outputs those parts as effective areas to form the initial scene map of the game scene to be processed.
In an alternative embodiment, the process of generating the initial scene map is as follows:
First, the map generation tool determines, according to the indications of the plurality of height maps, the highest area of each area map, obtaining a plurality of highest areas. These highest areas are then filled with a preset color, and a scene graph comprising the filled highest areas is output. Next, the area fill color corresponding to each area map is determined, giving a plurality of area fill colors. For each area fill color, the target area map corresponding to it is determined, the target highest area of that area map is looked up in the scene graph, and that area is filled with the area fill color in the scene graph. By repeating this fill process until every highest area in the scene graph has been filled, the initial scene map is obtained.
In effect, according to the heights indicated by the height maps, the highest parts across all the area maps are selected as the effective areas to output and display, and each output effective part is color-filled with the fill color of its area map; for example, the effective area output from the water area map is filled with blue, and the effective area output from the grass surface material map is filled with green. This yields an initial scene map in which different areas are distinguished by color, which is then refined in the subsequent steps.
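The "highest layer wins" rule can be sketched per cell. The layer names, heights, and the blue/green/black palette below are illustrative assumptions (the blue-water and green-grass choices echo the example above, the rest is invented).

```python
def initial_scene_map(layers, colours):
    """layers: name -> grid of heights (None where the layer is absent).

    For each cell, the layer whose content sits highest is the effective
    one and contributes its area fill colour."""
    grid = next(iter(layers.values()))
    h, w = len(grid), len(grid[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            candidates = [(g[y][x], name) for name, g in layers.items()
                          if g[y][x] is not None]
            if candidates:
                out[y][x] = colours[max(candidates)[1]]
    return out

layers = {
    "water":    [[1.0, None], [None, None]],
    "grass":    [[0.5, 2.0],  [3.0, 1.0]],
    "building": [[None, None], [5.0, None]],
}
colours = {"water": "blue", "grass": "green", "building": "black"}
scene = initial_scene_map(layers, colours)
```

Water above grass shows as blue, a building above grass as black, matching the "highest part is the effective area" description.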
204. Generating a contour map of the game scene to be processed.
In this embodiment of the application, areas of different heights should appear with different lightness in the final target scene map; adjusting the lightness of each area according to its height makes the generated map realistic and accurate. The map generation tool therefore generates a contour map of the game scene to be processed and adjusts the lightness of the different heights according to it.
In an optional embodiment, to generate the contour map, the map generation tool identifies in each height map the cross-section contour line of the scene cross section filled with the preset color, obtaining a plurality of cross-section contour lines; it then superimposes them and color-fills the region between every two adjacent contour lines to generate the contour map. In other words, in each height map the outline of the effective area is taken as the contour line of that height; the contour lines are superimposed, and different colors are filled between different contour lines to form the contour map. Its specific form may be as shown in fig. 2H, where the curves are the contour lines and different colors fill the regions between them.
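Superimposing the slices amounts to counting, for each cell, how many slices it survives; that count is its contour band, and each band gets its own fill. The slice data and hex palette below are illustrative assumptions.

```python
def contour_map(slices, palette):
    """slices: boolean height-slice grids (lowest cut first).

    A cell's band index = number of slices in which it is effective, i.e.
    how many contour lines it lies above; each band between two adjacent
    contour lines is filled with its own colour."""
    h, w = len(slices[0]), len(slices[0][0])
    return [[palette[sum(s[y][x] for s in slices) % len(palette)]
             for x in range(w)] for y in range(h)]

slices = [
    [[True, True], [True, True]],     # lowest cut: everything effective
    [[False, True], [True, True]],
    [[False, False], [False, True]],
]
palette = ["#204060", "#406080", "#6080a0", "#80a0c0"]  # assumed band colours
bands = contour_map(slices, palette)
```

The boundaries between differently coloured cells are exactly the superimposed contour lines of fig. 2H.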
205. Adjusting, according to the contour map, the color depth at a plurality of scene heights in the initial scene map to obtain an intermediate scene map.
In this embodiment of the application, after the contour map of the game scene has been generated, the map generation tool uses it to adjust the color depth at the various scene heights in the initial scene map, increasing the lightness (shade) variation of the map and showing the difference between areas of different heights, so that the player can later tell from the map where the terrain is high and where it is low.
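One plausible form of this lightness pass is to blend each area colour toward white in proportion to its contour band. The RGB value and the 0.4 blend factor are illustrative assumptions, not values from the patent.

```python
def shade(rgb, band, max_band):
    """Lighten a fill colour in proportion to its height band, so higher
    terrain reads brighter on the intermediate scene map."""
    t = band / max_band if max_band else 0.0
    # blend at most 40% of the way toward white (assumed factor)
    return tuple(round(c + (255 - c) * 0.4 * t) for c in rgb)

grass = (60, 140, 60)                      # assumed grass fill colour
low, high = shade(grass, 0, 3), shade(grass, 3, 3)
```

The lowest band keeps the base colour while the highest band is noticeably lighter, giving the shade variation the step describes.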
206. Acquiring scene details of the game scene to be processed, and adding the scene details to the intermediate scene map to obtain a target scene map of the game scene to be processed.
In this embodiment of the application, some scene models, such as the man-made buildings in the game scene, may bear special patterns or be fitted with special weapons and the like that are emblematic of those buildings. The map generation tool therefore acquires the scene details of the game scene to be processed and adds them to the intermediate scene map to obtain the target scene map, so that the player can find these landmark buildings directly on the map and quickly locate where they are or where they want to go.
In an alternative embodiment, the process of determining and adding scene details is as follows:
First, the map generation tool determines a preset pixel size and captures the game scene from directly above at that pixel size, obtaining a scene overhead view. The overhead view is captured from the real game scene by the tool, and the pixel size of the capture is configurable so as to balance clarity and efficiency. Note that when capturing the overhead view, the shadows, lighting, and perspective in the scene must be ignored. In practice, the overhead view shown in fig. 2I may be obtained; it is in color, matching the actual appearance of the game scene, though the color is not shown in fig. 2I.
The map generation tool then analyzes the scene overhead view, extracts the model details of the scene models on it as the scene details, determines the model areas corresponding to those scene models in the intermediate scene map, and draws the scene details into those model areas to obtain the target scene map. In practice, the target scene map shown in fig. 2J may be obtained; it too is in color, matching the actual appearance of the game scene, though the color is not shown in fig. 2J.
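The detail pass can be sketched as copying overhead-capture patches into the matching model regions of the intermediate map. The region coordinates and cell values below are invented for the example; real model-region detection is outside this sketch.

```python
def add_details(scene, overhead, regions):
    """Copy the overhead-capture cells of each model region into the
    intermediate scene map; regions are (y0, x0, y1, x1), end-exclusive,
    and assumed to be in the same coordinate space as both grids."""
    out = [row[:] for row in scene]
    for y0, x0, y1, x1 in regions:
        for y in range(y0, y1):
            for x in range(x0, x1):
                out[y][x] = overhead[y][x]
    return out

scene    = [["green", "green"], ["green", "green"]]   # intermediate map
overhead = [["roof", "roof"], ["grass", "grass"]]     # overhead capture
final = add_details(scene, overhead, [(0, 0, 1, 2)])  # one model region
```

Only the cells inside a model region change, so the colour-filled map keeps its readability while landmark detail appears where buildings stand.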
In this way, by performing steps 201 to 206, an accurate target scene map is generated automatically for the game scene to be processed and applied in the game to give the player a reference. Note that figs. 2B to 2J illustrate the specific style of each drawing using the game scene of fig. 2I as an example; in practice, the area maps, height maps, and contour map take whatever style the real game scene has, and are not limited to matching the drawings shown in figs. 2B to 2J.
In summary, the specific process of the map generation method provided by the application is as follows:
referring to fig. 2K, a game scene to be processed is obtained, a plurality of scene models included in the game scene to be processed are determined, and whether the model attribute of the scene model is a cultural building is determined. Generating a human building graph for a scene model of a human building according to the model attributes; and for the scene model with the model attribute not being the civil building, judging whether the size of the scene model is larger than or equal to a size threshold value, generating a ground surface model map according to the scene model with the size larger than or equal to the size threshold value, and determining the scene model with the size smaller than the size threshold value as small sundries without outputting. Then, a plurality of ground surface materials included in the game scene to be processed are inquired, a ground surface material map is generated for each ground surface material in the ground surface materials, and a plurality of ground surface material maps are obtained. And then, determining all water areas included in the scene to be processed, identifying the water area coverage range of all the water areas, outputting the water area coverage range, and filling the water area coverage range by adopting a preset color to obtain a water area range diagram. Meanwhile, the water area depth of each water area in all the water areas is inquired, the deep water area with the water area depth larger than the depth threshold value is extracted from all the water areas, the deep water coverage range of the deep water area is output, the deep water coverage range is filled with preset colors, a deep water map is obtained, and the shallow water area with the water area depth lower than the depth threshold value is ignored. 
The map generation tool then generates the plurality of height maps of the game scene, intercepts the human architecture map, the surface model map, the surface material maps, the water range map, and the deep-water map according to them, outputs the highest parts of these maps as the effective areas to display, and distinguishes the effective areas with different colors to obtain the initial scene map. Next, the tool generates the contour map from the height maps, adjusts the color depth at the scene heights in the initial scene map according to it to obtain the intermediate scene map, and finally extracts the scene details from the scene overhead view of the game scene and adds them to the intermediate scene map to obtain the final target scene map.
In the method provided by this embodiment of the application, a plurality of area maps is generated from the scene models, surface materials, and water areas in a game scene to be processed; a plurality of height maps of the scene is constructed; the area maps are intercepted based on the height maps to obtain an initial scene map; a contour map of the scene is generated, and the color depth at the various scene heights in the initial scene map is adjusted according to the contour map to obtain an intermediate scene map; finally, scene details of the scene are acquired and added to the intermediate scene map to obtain the target scene map. In this technical scheme, each area map is generated from the terrain and buildings of the game scene to be processed, the regions to be displayed in each area map are intercepted directly and filled with color, the contour map is superimposed to add lightness variation, and building details are drawn onto the map to form the final map. The map can therefore be generated automatically, quickly, and accurately without manual drawing by developers, saving a great deal of manpower and material resources and improving map-generation efficiency.
Further, as a specific implementation of the method shown in fig. 1, an embodiment of the present application provides a map generating apparatus; as shown in fig. 3, the apparatus includes a generation module 301, an interception module 302, an adjustment module 303, and an addition module 304.
The generating module 301 is configured to determine a game scene to be processed, and generate a plurality of area maps of the game scene to be processed, where the area maps are generated according to a scene model, a ground surface material, and a water area range in the game scene to be processed;
the interception module 302 is configured to construct a plurality of height maps of the game scene to be processed, and intercept the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed;
the adjusting module 303 is configured to generate a contour map of the game scene to be processed, and adjust the color shades of a plurality of scene heights in the initial scene map according to the contour map to obtain an intermediate scene map;
the adding module 304 is configured to obtain scene details of the game scene to be processed, add the scene details to the intermediate scene map, and obtain a target scene map of the game scene to be processed.
In a specific application scenario, the generating module 301 is configured to determine a plurality of scene models included in the game scene to be processed, and generate the human architecture map and the surface model map of the game scene according to the model attributes of the plurality of scene models; query a plurality of surface materials included in the game scene, and generate one surface material map for each surface material to obtain a plurality of surface material maps; determine the water areas in the scene to be processed, and generate the water area map of the game scene according to the depth of the water areas; and take the human architecture map, the surface model map, the plurality of surface material maps, and the water area map as the plurality of area maps.
In a specific application scenario, the generating module 301 is configured to read model attributes of the plurality of scene models, extract a first scene model of a human building indicated by the model attributes from the plurality of scene models, and output the human building map according to an outer contour of the first scene model, where the outer contour of the first scene model in the human building map is filled with a preset color; and simultaneously or respectively determining other scene models except the first scene model in the plurality of scene models, extracting a second scene model with the size larger than or equal to a size threshold from the other scene models, and outputting the earth surface model diagram according to the outer contour of the second scene model, wherein the outer contour of the second scene model in the earth surface model diagram is filled with the preset color.
In a specific application scenario, the generating module 301 is configured to determine, for each of the plurality of surface materials, a material coverage of the surface material, output the material coverage, and fill the material coverage with a preset color; determining a junction of the earth surface material and other earth surface materials, and counting the material proportion of the earth surface material at the junction, wherein the other earth surface materials are earth surface materials except the current earth surface material in the earth surface materials; according to the material proportion, adjusting the depth of the preset color filled in the joint in the material coverage range to obtain a ground surface material map of the ground surface material; and repeating the process of generating the land surface material map, and generating a land surface material map for each land surface material to obtain the plurality of land surface material maps.
In a specific application scenario, the generating module 301 is configured to determine all water areas included in the scene to be processed, identify a water area coverage range of all the water areas, and output the water area coverage range; filling the water area coverage range by adopting a preset color to obtain a water area range diagram; inquiring the water area depth of each water area in all the water areas, and extracting the deep water areas with the water area depth larger than a depth threshold value from all the water areas; outputting the deepwater coverage range of the deepwater area, and filling the deepwater coverage range by adopting the preset color to obtain a deepwater map; and taking the water area range map and the deepwater map as the water area map.
In a specific application scenario, the interception module 302 is configured to determine the scene lowest point and the scene highest point of the game scene to be processed, where the scene highest point is the highest point of the scene models in the game scene other than the first scene models whose model attributes indicate a man-made building; take cross sections of the game scene, starting from the scene lowest point and moving upward by a preset height difference each time until the scene highest point is reached, to obtain a plurality of scene cross sections; and fill the plurality of scene cross sections with a preset color to obtain the plurality of height maps.
In a specific application scenario, the intercepting module 302 is configured to: intercept, as indicated by the height maps, the highest area of each of the area maps to obtain a plurality of highest areas; fill the plurality of highest areas with a preset color and output a scene graph comprising the filled highest areas; determine the area filling color corresponding to each of the area maps to obtain a plurality of area filling colors; for each area filling color, determine the target area map corresponding to that color and query the target highest area of the target area map in the scene graph; fill the target highest area with the area filling color in the scene graph; and repeat this color-filling process until each of the plurality of highest areas in the scene graph is filled, to obtain the initial scene map.
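The per-color filling loop amounts to painting each intercepted area into one canvas. A hedged sketch, assuming the highest areas are already available as boolean masks and colors are simple integer labels (both assumptions, since the patent works with rendered images):

```python
import numpy as np

def compose_scene_map(region_masks, region_colors, background=0):
    """Paint each intercepted highest-area mask into a single scene
    map with its area filling color; masks painted later overwrite
    earlier ones where they overlap."""
    scene = np.full(region_masks[0].shape, background, dtype=int)
    for mask, color in zip(region_masks, region_colors):
        scene[mask] = color
    return scene

terrain_mask = np.array([[True, True], [True, False]])
water_mask = np.array([[False, True], [False, False]])
scene = compose_scene_map([terrain_mask, water_mask],
                          region_colors=[1, 2])
```

The painting order acts as a simple priority scheme: here the water layer overwrites the terrain layer in the overlapping cell.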
In a specific application scenario, the adjusting module 303 is configured to: identify, in each of the height maps, the contour line of the scene cross section filled with the preset color, to obtain a plurality of cross-section contour lines of the height maps; and superpose the plurality of cross-section contour lines and color-fill the area between every two adjacent contour lines to generate the contour map.
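Filling the area between adjacent contour lines is equivalent to giving every cell a band index: the highest cross section that still covers it. A minimal sketch under that interpretation (an assumption; the patent describes the operation on rendered contour images):

```python
import numpy as np

def contour_bands(height_masks):
    """Band index per cell: the index of the highest cross-section
    mask covering the cell. Coloring each band with one flat color
    fills the area between every two adjacent contour lines."""
    bands = np.zeros(height_masks[0].shape, dtype=int)
    for i, mask in enumerate(height_masks):
        bands[mask] = i  # later (higher) sections overwrite lower ones
    return bands

terrain = np.array([[0., 1., 2.],
                    [1., 3., 4.],
                    [2., 4., 5.]])
masks = [terrain >= h for h in (0.0, 2.0, 4.0)]
bands = contour_bands(masks)
```

Mapping band indices to progressively darker shades then gives exactly the lightness variation by height that the initial scene map is adjusted with.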
In a specific application scenario, the adding module 304 is configured to: determine a preset pixel resolution and capture the game scene to be processed from directly above at that resolution to obtain a scene aerial view of the game scene to be processed; identify the scene aerial view and extract the model details of the plurality of scene models on the scene aerial view as the scene details; and determine the model areas corresponding to the scene models in the intermediate scene map and draw the scene details into those model areas to obtain the target scene map.
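The final drawing step can be sketched as patch placement, assuming each extracted detail comes with the position of its model area (the `(row, col, array)` placement format is purely illustrative):

```python
import numpy as np

def add_scene_details(scene_map, detail_patches):
    """Draw extracted model details into their model areas; each
    patch is a (row, col, array) triple placing the detail on a copy
    of the intermediate scene map."""
    out = scene_map.copy()
    for row, col, patch in detail_patches:
        h, w = patch.shape
        out[row:row + h, col:col + w] = patch
    return out

scene = np.zeros((4, 4), dtype=int)
roof = np.array([[7, 7], [7, 7]])  # hypothetical extracted building detail
target = add_scene_details(scene, [(1, 1, roof)])
```

Working on a copy keeps the intermediate scene map intact, so the detail pass can be re-run whenever the aerial view changes.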
The device provided by the embodiment of the application generates a plurality of area maps from the scene models, surface materials, and water area range in a game scene to be processed; constructs a plurality of height maps of the scene and intercepts the area maps based on them to obtain an initial scene map; generates a contour map of the scene and uses it to adjust the color depth of the scene heights in the initial scene map, yielding an intermediate scene map; and finally obtains the scene details of the scene and adds them to the intermediate scene map to obtain the target scene map. In this technical scheme, each area map is generated from the landforms and buildings of the game scene to be processed, the regions that need to be displayed are intercepted directly from each area map for color filling, the contour map is superimposed to add lightness variation, and the building details are drawn onto the map to form the final map. The map can thus be generated automatically, quickly, and accurately without manual drawing by developers, saving a large amount of manpower and material resources and improving map generation efficiency.
It should be noted that other corresponding descriptions of the functional units related to the map generating apparatus provided in the embodiment of the present application may refer to the corresponding descriptions in fig. 1 and fig. 2A to fig. 2K, and are not described again here.
In an exemplary embodiment, referring to fig. 4, a computer device is further provided, the computer device includes a bus, a processor, a memory, a communication interface, an input/output interface, and a display device, wherein the functional units can communicate with each other through the bus. The memory stores computer programs, and the processor executes the programs stored in the memory to perform the map generation method in the above embodiment.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the map generation method.
Through the description of the above embodiments, those skilled in the art can clearly understand that the present application can be implemented by hardware, or by software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (such as a CD-ROM, a USB flash drive, or a removable hard disk) and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the implementation scenarios of the present application.
Those skilled in the art will appreciate that the figures are merely schematic representations of one preferred implementation scenario and that the blocks or flow diagrams in the figures are not necessarily required to practice the present application.
Those skilled in the art will appreciate that the modules in the devices in the implementation scenario may be distributed in the devices in the implementation scenario according to the description of the implementation scenario, or may be located in one or more devices different from the present implementation scenario with corresponding changes. The modules of the implementation scenario may be combined into one module, or may be further split into a plurality of sub-modules.
The serial numbers of the above embodiments of the present application are for description only and do not imply any order of preference among the implementation scenarios.
The above disclosure is only a few specific implementation scenarios of the present application, but the present application is not limited thereto, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present application.
Claims (12)
1. A map generation method, comprising:
determining a game scene to be processed, and generating a plurality of area maps of the game scene to be processed, wherein the area maps are generated according to a scene model, a ground surface material and a water area range in the game scene to be processed;
constructing a plurality of height maps of the game scene to be processed, and intercepting the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed, wherein the plurality of height maps are obtained by performing cross section processing and color filling processing on the game scene to be processed every time a preset height difference is reached by taking the scene lowest point as a reference after the scene lowest point and the scene highest point of the game scene to be processed are determined;
generating a contour map of the game scene to be processed, and adjusting the color depth of a plurality of scene heights in the initial scene map according to the contour map to obtain an intermediate scene map, wherein the contour map is obtained by superposing a plurality of cross section contour lines extracted from the plurality of height maps;
and acquiring scene details of the game scene to be processed, and adding the scene details to the intermediate scene map to obtain a target scene map of the game scene to be processed.
2. The method of claim 1, wherein generating the plurality of area maps of the game scene to be processed comprises:
determining a plurality of scene models included in the game scene to be processed, and generating a man-made building map and a surface model map of the game scene to be processed according to the model attributes of the scene models;
querying a plurality of surface materials included in the game scene to be processed, and generating a surface material map for each of the plurality of surface materials to obtain a plurality of surface material maps;
determining the water area range in the game scene to be processed, and generating a water area map of the game scene to be processed according to the water area depth of the water area range;
and taking the man-made building map, the surface model map, the plurality of surface material maps, and the water area map as the plurality of area maps.
3. The method of claim 2, wherein generating the man-made building map and the surface model map of the game scene to be processed according to the model attributes of the scene models comprises:
reading the model attributes of the scene models, extracting from the scene models a first scene model whose model attributes indicate a man-made building, and outputting the man-made building map according to the outer contour of the first scene model, wherein the outer contour of the first scene model in the man-made building map is filled with a preset color;
and determining the other scene models, other than the first scene model, among the plurality of scene models, extracting from them a second scene model whose size is greater than or equal to a size threshold, and outputting the surface model map according to the outer contour of the second scene model, wherein the outer contour of the second scene model in the surface model map is filled with the preset color.
4. The method of claim 2, wherein querying the plurality of surface materials included in the game scene to be processed and generating a surface material map for each of the plurality of surface materials to obtain the plurality of surface material maps comprises:
for each of the plurality of surface materials, determining the material coverage of the surface material, outputting the material coverage, and filling the material coverage with a preset color;
determining the junctions between the surface material and the other surface materials, and counting the material proportion of the surface material at each junction, wherein the other surface materials are the surface materials other than the current surface material;
adjusting the depth of the preset color filled at the junctions within the material coverage according to the material proportion, to obtain a surface material map of the surface material;
and repeating the process of generating the surface material map for each surface material to obtain the plurality of surface material maps.
5. The method of claim 2, wherein determining the water area range in the game scene to be processed and generating the water area map of the game scene to be processed according to the water depth of the water area range comprises:
determining all water areas included in the game scene to be processed, identifying the water area coverage of all the water areas, and outputting the water area coverage;
filling the water area coverage with a preset color to obtain a water area range map;
querying the water depth of each of the water areas, and extracting from them the deep-water areas whose water depth is greater than a depth threshold;
outputting the deep-water coverage of the deep-water areas, and filling the deep-water coverage with the preset color to obtain a deep-water map;
and taking the water area range map and the deep-water map as the water area map.
6. The method of claim 1, wherein the constructing the plurality of height maps of the game scene to be processed comprises:
determining the scene lowest point and the scene highest point of the game scene to be processed, wherein the scene highest point is the highest point of the scene models in the game scene to be processed other than a first scene model whose model attributes indicate a man-made building;
starting from the scene lowest point and moving upward, carrying out cross-section processing on the game scene to be processed each time a preset height difference is reached, until the scene highest point is reached, to obtain a plurality of scene cross sections of the game scene to be processed;
and filling the plurality of scene cross sections by adopting preset colors to obtain the plurality of height maps.
7. The method of claim 1, wherein the intercepting the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed comprises:
according to the indication of the height maps, intercepting the highest area of each area map in the area maps to obtain a plurality of highest areas;
filling the plurality of highest areas by adopting a preset color, and outputting a scene graph comprising the filled highest areas;
respectively determining the area filling color corresponding to each area map in the area maps to obtain a plurality of area filling colors;
for each area filling color in the plurality of area filling colors, determining the target area map corresponding to that area filling color, and querying the target highest area of the target area map in the scene graph;
filling the target highest area with the area filling color in the scene graph;
and repeating this color-filling process until each of the plurality of highest areas in the scene graph is filled, to obtain the initial scene map.
8. The method of claim 1, wherein generating the contour map of the game scene to be processed comprises:
identifying, in each of the plurality of height maps, the contour line of the scene cross section filled with the preset color, to obtain a plurality of cross-section contour lines of the plurality of height maps;
and superposing the plurality of cross-section contour lines, and color-filling the area between every two adjacent cross-section contour lines to generate the contour map.
9. The method of claim 1, wherein the obtaining scene details of the game scene to be processed and adding the scene details to the intermediate scene map to obtain a target scene map of the game scene to be processed comprises:
determining a preset pixel resolution, and capturing the game scene to be processed from directly above according to the preset pixel resolution to obtain a scene aerial view of the game scene to be processed;
identifying the scene aerial view, and extracting model details of a plurality of scene models on the scene aerial view as the scene details;
and determining model areas corresponding to the scene models in the intermediate scene map, and drawing the scene details in the model areas to obtain the target scene map.
10. A map generation apparatus, comprising:
the generating module is used for determining a game scene to be processed and generating a plurality of area maps of the game scene to be processed, wherein the area maps are generated according to a scene model, a ground surface material and a water area range in the game scene to be processed;
the intercepting module is used for constructing a plurality of height maps of the game scene to be processed, intercepting the plurality of area maps based on the plurality of height maps to obtain an initial scene map of the game scene to be processed, wherein the plurality of height maps are obtained by performing cross section processing and color filling processing on the game scene to be processed every time a preset height difference is reached upwards by taking the scene lowest point as a reference after determining the scene lowest point and the scene highest point of the game scene to be processed;
the adjusting module is used for generating a contour map of the game scene to be processed, and adjusting the color depth of a plurality of scene heights in the initial scene map according to the contour map to obtain an intermediate scene map, wherein the contour map is obtained by superposing a plurality of cross section contour lines extracted from the height maps;
and the adding module is used for acquiring scene details of the game scene to be processed, and adding the scene details to the intermediate scene map to obtain a target scene map of the game scene to be processed.
11. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 9.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 9.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202111342256.7A CN114053708B (en) | 2021-11-12 | 2021-11-12 | Map generation method and device, computer equipment and computer readable storage medium |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN114053708A CN114053708A (en) | 2022-02-18 |
| CN114053708B true CN114053708B (en) | 2022-12-16 |
Family
ID=80271626
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202111342256.7A Active CN114053708B (en) | 2021-11-12 | 2021-11-12 | Map generation method and device, computer equipment and computer readable storage medium |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN114053708B (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN119925937B (en) * | 2025-01-07 | 2026-01-27 | 网易(杭州)网络有限公司 | Map data processing methods, devices, electronic equipment, and storage media |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120069051A1 (en) * | 2008-09-11 | 2012-03-22 | Netanel Hagbi | Method and System for Compositing an Augmented Reality Scene |
| CN109771951A (en) * | 2019-02-13 | 2019-05-21 | 网易(杭州)网络有限公司 | Method, apparatus, storage medium and the electronic equipment that map generates |
| CN110262865A (en) * | 2019-06-14 | 2019-09-20 | 网易(杭州)网络有限公司 | Construct method and device, the computer storage medium, electronic equipment of scene of game |
| CN111111172A (en) * | 2019-12-02 | 2020-05-08 | 网易(杭州)网络有限公司 | Method and device for processing ground surface of game scene, processor and electronic device |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109701273B (en) * | 2019-01-16 | 2022-04-19 | 腾讯科技(北京)有限公司 | Game data processing method and device, electronic equipment and readable storage medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12299817B2 (en) | Virtual scenario generation method and apparatus, computer device and storage medium | |
| KR101165534B1 (en) | Geospatial modeling system providing simulated tree trunks and branches for groups of tree crown vegetation points and related methods | |
| US12456254B2 (en) | Terrain generation and population system | |
| CN111737790B (en) | Method and equipment for constructing simulated city model | |
| CN101488226A (en) | Tree measurement and reconstruction method based on single three-dimensional laser scanning | |
| CN112102492A (en) | Game resource manufacturing method and device, storage medium and terminal | |
| US20230162432A1 (en) | System and method for translating a 2d image to a 3d image | |
| CN114053708B (en) | Map generation method and device, computer equipment and computer readable storage medium | |
| CN109064556A (en) | A kind of landforms High Precision Simulation modeling towards ISR | |
| Zhang et al. | Linking image-based metrics to 3D model-based metrics for assessment of visual landscape quality | |
| CN114202626A (en) | Model replacement method and storage medium for visual building | |
| Wang et al. | A Web3D forest geo-visualization and user interface evaluation | |
| CN113689515A (en) | Map rendering system, method and medium | |
| KR102534464B1 (en) | System for generating virtual golf course having the geometry of real golf courses, and method for providing putting pracice using the system | |
| CN118673088A (en) | Thermal map generation method, device, electronic device, storage medium and program product | |
| Lovett et al. | GIS-based landscape visualization—The state of the art | |
| Van Bilsen et al. | 3D visibility analysis in virtual worlds: the case of supervisor | |
| CN116843843B (en) | March road line three-dimensional scene simulation method | |
| CN115888103B (en) | Game display control method, device, computer equipment and medium | |
| CN115018998B (en) | Battlefield terrain generation method and device for visualizing network attack and defense situation | |
| Sang | Application of UAV-based 3D modeling and visualization technology in urban planning | |
| CN116351066B (en) | Method and device for rendering weather objects in games, storage medium, and electronic device | |
| Han | Research on the Application of Virtual Reality Technology in the Integrated Design of Architectural Landscape | |
| Cassettari | More mapping, less cartography: tackling the challenge | |
| Begay | Assessment of the Impact of Forest Management Activities on Forest Aesthetics Using Photogrammetric Point Clouds |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |
| GR01 | Patent grant | | |