CN111462292B - Layered rendering method, medium, device and apparatus - Google Patents
Layered rendering method, medium, device and apparatus
- Publication number
- CN111462292B (application CN202010202049.0A)
- Authority
- CN
- China
- Prior art keywords
- rendering
- layer
- information
- results
- layered
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
- G06T15/205—Image-based rendering
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Image Generation (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses a layered rendering method, medium, device and apparatus. The method comprises the following steps: obtaining rendering layer information, wherein the rendering layer information comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer; rendering the data source corresponding to the rendering layer according to the rendering information to generate a first rendering result; traversing all layers, judging from the type information whether adjacent layers are of the same type and, if so, superimposing the first rendering results of the adjacent same-type layers to generate a second rendering result; and superimposing all the first rendering results and all the second rendering results to complete the layered rendering. The method greatly reduces the access and understanding cost borne by the calling layer during image rendering, while improving decoupling and extensibility.
Description
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a layered rendering method, a computer-readable storage medium, a computer device, and a layered rendering apparatus.
Background
In the related art, rendering of materials selected by a user, such as the superposition of special effects, is performed by encapsulating each effect as a module. Under this approach the assembly logic must be controlled by the calling layer, and the calling layer often differs from platform to platform. A large amount of work is therefore duplicated, and whenever an effect is more than a simple filter the calling layer has to assemble all of the logic itself, forcing it to repeatedly modify old code. This lowers development efficiency and harms both the decoupling of the code structure and the extensibility of its functions.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art described above. Therefore, one objective of the present invention is to provide a layered rendering method, which can greatly reduce the access and understanding cost of the calling layer in the image rendering process and improve decoupling and extensibility.
A second object of the invention is to propose a computer-readable storage medium.
A third object of the invention is to propose a computer device.
A fourth object of the invention is to propose a layered rendering apparatus.
In order to achieve the above objects, an embodiment of a first aspect of the present invention provides a layered rendering method, comprising the following steps: obtaining rendering layer information, wherein the rendering layer information comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer; rendering the data source corresponding to the rendering layer according to the rendering information to generate a first rendering result; traversing all layers, judging from the type information whether adjacent layers are of the same type and, if so, superimposing the first rendering results corresponding to the adjacent same-type layers to generate a second rendering result; and superimposing all the first rendering results and all the second rendering results to complete the layered rendering.
According to the layered rendering method of the embodiment of the invention, rendering layer information is first obtained, wherein the rendering layer information comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer; the data source corresponding to the rendering layer is then rendered according to the rendering information to generate a first rendering result; all layers are then traversed, the type information is used to judge whether adjacent layers are of the same type and, if so, the first rendering results corresponding to the adjacent same-type layers are superimposed to generate a second rendering result; finally, all the first rendering results and all the second rendering results are superimposed to complete the layered rendering. The access and understanding cost of the calling layer in the image rendering process is thereby greatly reduced, while decoupling and extensibility are improved.
In addition, the layered rendering method provided according to the above embodiment of the present invention may further have the following additional technical features:
Optionally, after obtaining the rendering layer information, the method further includes: obtaining adjustment layer information, wherein the adjustment layer information comprises adjustment information, type information and ordering information corresponding to an adjustment layer.
Optionally, when traversing all layers, whether the currently traversed layer is an adjustment layer is further judged from the type information. If it is, the first rendering results and second rendering results corresponding to the rendering layers ordered before the adjustment layer are superimposed according to the ordering information to generate a third rendering result, so that the layered rendering is completed from all the first rendering results, all the second rendering results and all the third rendering results.
Optionally, after obtaining the rendering layer information, the method further includes: obtaining mask layer information, wherein the mask layer information comprises mask region information and the effective-layer information corresponding to the mask region.
Optionally, when traversing all layers, whether the currently traversed layer is a mask layer is judged from the type information; if it is, the corresponding effective layers are masked according to the mask region information.
Optionally, after obtaining the rendering layer information, the method further includes: obtaining background layer information, wherein after all the first rendering results and all the second rendering results are superimposed, a canvas background is added according to the background layer information.
In order to achieve the above object, a second aspect of the present invention provides a computer-readable storage medium, on which a layered rendering program is stored, and when executed by a processor, the layered rendering program implements the above layered rendering method.
According to the computer-readable storage medium of the embodiment of the invention, a layered rendering program is stored thereon, so that the processor implements the layered rendering method described above when executing the program, thereby greatly reducing the access and understanding cost of the calling layer in the image rendering process while improving decoupling and extensibility.
In order to achieve the above object, an embodiment of a third aspect of the present invention provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the layered rendering method described above when executing the program.
According to the computer device of the embodiment of the invention, the memory stores the layered rendering program, so that the processor implements the layered rendering method when executing it, thereby greatly reducing the access and understanding cost of the calling layer in the image rendering process while improving decoupling and extensibility.
In order to achieve the above object, an embodiment of a fourth aspect of the present invention provides a layered rendering apparatus, including: an acquisition module, configured to obtain rendering layer information, wherein the rendering layer information comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer; a first rendering module, configured to render the data source corresponding to the rendering layer according to the rendering information to generate a first rendering result; a grouping module, configured to traverse all layers, judge from the type information whether adjacent layers are of the same type and, if so, superimpose the first rendering results corresponding to the adjacent same-type layers to generate a second rendering result; and a second rendering module, configured to superimpose all the first rendering results and all the second rendering results to complete the layered rendering.
According to the layered rendering apparatus of the embodiment of the invention, the acquisition module obtains the rendering layer information, which comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer; the first rendering module renders the data source corresponding to the rendering layer according to the rendering information to generate a first rendering result; the grouping module traverses all layers, judges from the type information whether adjacent layers are of the same type and, if so, superimposes the first rendering results corresponding to the adjacent same-type layers to generate a second rendering result; and the second rendering module superimposes all the first rendering results and all the second rendering results to complete the layered rendering. The access and understanding cost of the calling layer in the image rendering process is thereby greatly reduced, while decoupling and extensibility are improved.
In addition, the layered rendering apparatus provided according to the above embodiment of the present invention may further have the following additional technical features:
Optionally, after obtaining the rendering layer information, adjustment layer information is further obtained, wherein the adjustment layer information comprises adjustment information, type information and ordering information corresponding to an adjustment layer.
Drawings
Fig. 1 is a schematic flowchart of a layered rendering method according to an embodiment of the present invention;
fig. 2 is a block diagram of a layered rendering apparatus according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or to elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary, are intended to explain the invention, and are not to be construed as limiting the invention.
According to the layered rendering method of the embodiment of the invention, rendering layer information is first obtained, wherein the rendering layer information comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer; the data source corresponding to the rendering layer is then rendered according to the rendering information to generate a first rendering result; all layers are then traversed, the type information is used to judge whether adjacent layers are of the same type and, if so, the first rendering results corresponding to the adjacent same-type layers are superimposed to generate a second rendering result; finally, all the first rendering results and all the second rendering results are superimposed to complete the layered rendering. The access and understanding cost of the calling layer in the image rendering process is thereby greatly reduced, while decoupling and extensibility are improved.
In order to better understand the above technical solutions, exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
Fig. 1 is a schematic flowchart of a layered rendering method according to an embodiment of the present invention; as shown in fig. 1, the layered rendering method includes the following steps:
s101, obtaining rendering layer information, wherein the rendering layer information comprises a data source, rendering information, type information and sequencing information corresponding to a rendering layer.
That is, rendering layer information corresponding to a layer to be rendered is obtained.
It should be noted that the data source represents the raw data of each kind of material to be processed, for example a frame of image data or a piece of text, and the subsequent rendering processing is based on the data source. A data source can be reused by several different layers, which saves memory, and because the layers hold a reference to it, updating a single data source affects the output of every layer that uses it.
The rendering information may include various kinds of data, for example the 2D transformation, 3D transformation, cropping, scaling and layout information that the user applies to the data source; the kinds of rendering information are not limited here.
The type information may indicate one of several layer types, for example a 2D rendering layer, a 3D rendering layer or a scaling rendering layer; the possible types are not limited here.
The ordering information can be obtained in several ways: the layers can be ordered naturally by the order in which they were created; an ordering code entered by the user for each layer can be used to generate the ordering information; or the user's drag operations on the layers can determine the ordering information for each layer. A minimal sketch of one way such layer descriptions might be represented is given below.
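The following sketch illustrates one possible in-memory representation of the rendering layer information described above. The class, enum and field names (RenderLayer, LayerType, data_source, render_info, layer_type, order) are illustrative assumptions made for this description only, not identifiers defined by the patent.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Any, Dict

class LayerType(Enum):
    """Illustrative layer types; the description mentions 2D, 3D and scaling rendering layers,
    plus adjustment, mask and background layers."""
    RENDER_2D = auto()
    RENDER_3D = auto()
    SCALE = auto()
    ADJUSTMENT = auto()
    MASK = auto()
    BACKGROUND = auto()

@dataclass
class RenderLayer:
    """One entry of rendering layer information: data source, rendering information,
    type information and ordering information."""
    data_source: Any                                            # raw material, e.g. a frame of image data or a piece of text
    render_info: Dict[str, Any] = field(default_factory=dict)   # e.g. 2D/3D transform, crop, scale, layout parameters
    layer_type: LayerType = LayerType.RENDER_2D
    order: int = 0                                              # ordering information; lower values are composited first
```

Because data_source is held by reference, the same source object can be shared by several RenderLayer instances, so updating it once influences the output of every layer that uses it, as noted above.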
S102, rendering the data source corresponding to the rendering layer according to the rendering information to generate a first rendering result.
S103, traversing all layers, judging from the type information whether adjacent layers are of the same type and, if so, superimposing the first rendering results corresponding to the adjacent same-type layers to generate a second rendering result.
As an example, suppose there are five rendering layers: A (2D rendering), B (2D rendering), C (3D rendering), D (2D rendering) and E (3D rendering). All layers are traversed in layer order, and the type information of each layer is used to judge whether the currently traversed layer and its neighbour belong to the same type. From the type information of the five rendering layers it can be seen that adjacent layers A and B are of the same type, so the two layers are treated as one layer group and their rendering results are superimposed; the superimposed result then serves as the result of that layer group.
It will be understood that different types of rendering layer are rendered in different ways; for example, a 2D-transform rendering layer is rendered differently from a 3D-transform rendering layer, since the concept of depth exists in a 3D transform but not in a 2D transform. Grouping rendering layers by type therefore further improves the rendering efficiency of the image.
S104, superimposing all the first rendering results and all the second rendering results to complete the layered rendering.
That is, after all layers have been traversed, all the first rendering results and all the second rendering results are superimposed to complete the layered rendering. A compact sketch of this traversal is given below.
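The sketch below walks through steps S102–S104 under simplifying assumptions: render_one() and overlay() are hypothetical stand-ins (here they only build descriptive strings) for the real rendering and compositing operations, and layered_render() shows the grouping of adjacent same-type layers; none of this is the patented implementation itself.

```python
def render_one(layer: RenderLayer) -> str:
    """Hypothetical stand-in for step S102: apply the rendering information to the data source."""
    return f"render({layer.data_source!r}, {layer.layer_type.name})"

def overlay(results) -> str:
    """Hypothetical stand-in for superimposing a sequence of rendering results in order."""
    return "overlay[" + " + ".join(results) + "]"

def layered_render(layers) -> str:
    layers = sorted(layers, key=lambda l: l.order)      # respect the ordering information
    firsts = [render_one(l) for l in layers]            # S102: first rendering results

    outputs = []                                        # first and second rendering results, in layer order
    i = 0
    while i < len(layers):
        j = i
        # S103: extend the group while the next layer has the same type as the current one
        while j + 1 < len(layers) and layers[j + 1].layer_type == layers[i].layer_type:
            j += 1
        if j > i:
            outputs.append(overlay(firsts[i:j + 1]))    # second rendering result for the same-type group
        else:
            outputs.append(firsts[i])                   # ungrouped first rendering result
        i = j + 1

    return overlay(outputs)                             # S104: superimpose everything

# Example corresponding to layers A, B (2D) and C (3D) from the worked example above:
demo = [
    RenderLayer("A", layer_type=LayerType.RENDER_2D, order=0),
    RenderLayer("B", layer_type=LayerType.RENDER_2D, order=1),
    RenderLayer("C", layer_type=LayerType.RENDER_3D, order=2),
]
print(layered_render(demo))
```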
In some embodiments, in order to improve the operability of the layered rendering method provided by the embodiment of the present invention and further raise development efficiency, adjustment layer information is also obtained after the rendering layer information is obtained, wherein the adjustment layer information comprises adjustment information, type information and ordering information corresponding to an adjustment layer.
In the process of traversing all layers, the type information is also used to judge whether the currently traversed layer is an adjustment layer. If it is, the first rendering results and second rendering results corresponding to the rendering layers ordered before the adjustment layer are superimposed according to the ordering information to generate a third rendering result, so that the layered rendering is completed from all the first rendering results, all the second rendering results and all the third rendering results.
As an example, suppose there are five layers: A (2D rendering), B (2D rendering), C (3D rendering), D (adjustment layer) and E (3D rendering). All layers are traversed in layer order. When layer B is reached, layers A and B are grouped as a same-type layer group according to the type information and their rendering results are superimposed. The traversal then continues; when layer D is reached and the type information shows that the currently traversed layer is an adjustment layer, the rendering result of the layer group (a second rendering result) and the rendering result of layer C (a first rendering result) are superimposed according to the ordering information to generate a third rendering result. Finally, the third rendering result is superimposed with the first rendering result of layer E to complete the layered rendering. A sketch of how an adjustment layer might be folded into the traversal follows.
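Continuing the sketch above (and reusing RenderLayer and overlay() from it), an adjustment layer could be handled inside the traversal by collapsing everything composited so far into a single third rendering result. apply_adjustment() is an assumed helper standing in for whatever adjustment (for example a global colour correction) the layer's adjustment information describes.

```python
def apply_adjustment(result: str, adjust_info) -> str:
    """Hypothetical stand-in for applying an adjustment layer's adjustment information
    to an already-composited result."""
    return f"adjust({result}, {adjust_info!r})"

def handle_adjustment_layer(outputs: list, adj_layer: RenderLayer) -> None:
    """When the traversal reaches an adjustment layer, superimpose every first and second
    rendering result accumulated before it and replace them with one third rendering result."""
    third = apply_adjustment(overlay(outputs), adj_layer.render_info)
    outputs.clear()
    outputs.append(third)   # later layers are then composited on top of the third rendering result
```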
In some embodiments, after the rendering layer information is obtained, the layered rendering method of the embodiment of the present invention further includes: obtaining mask layer information, wherein the mask layer information comprises mask region information and the effective-layer information corresponding to the mask region.
When traversing all layers, the type information is used to judge whether the currently traversed layer is a mask layer; if it is, the corresponding effective layers are masked according to the mask region information.
That is, a mask layer may take effect on a single layer or on several layers; in the process of traversing all layers, when a mask layer is reached, mask-region processing is applied to the corresponding layers according to the effective-layer information, as sketched below.
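In the same spirit as the earlier sketches, a mask layer could carry a mask region and the identifiers of the layers it applies to; apply_mask() and the keyed results_by_layer dictionary below are assumptions made for illustration, not structures defined by the patent.

```python
def apply_mask(result: str, mask_region) -> str:
    """Hypothetical stand-in for restricting a rendering result to the given mask region."""
    return f"mask({result}, {mask_region!r})"

def handle_mask_layer(results_by_layer: dict, mask_region, effective_layer_ids) -> None:
    """When the traversal reaches a mask layer, mask the rendering result of every layer
    listed as an effective layer for that mask region."""
    for layer_id in effective_layer_ids:
        results_by_layer[layer_id] = apply_mask(results_by_layer[layer_id], mask_region)
```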
In some embodiments, after the rendering layer information is obtained, the method further includes: obtaining background layer information, wherein after all the first rendering results and all the second rendering results have been superimposed, a canvas background is added according to the background layer information.
That is, when the traversed layer is a background layer it is not processed during the traversal; the background is added only after all rendering results have been superimposed, for instance as in the brief sketch below.
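A minimal sketch of that final step, assuming the same string-based stand-ins as above; add_canvas_background() and background_info are illustrative names only.

```python
def add_canvas_background(composited: str, background_info) -> str:
    """Hypothetical stand-in: once every first and second rendering result has been
    superimposed, place the composite on the canvas background described by the
    background layer information."""
    return f"on_background({background_info!r}, {composited})"
```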
In summary, according to the layered rendering method of the embodiment of the present invention, rendering layer information is first obtained, wherein the rendering layer information comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer; the data source corresponding to the rendering layer is then rendered according to the rendering information to generate a first rendering result; all layers are then traversed, the type information is used to judge whether adjacent layers are of the same type and, if so, the first rendering results corresponding to the adjacent same-type layers are superimposed to generate a second rendering result; finally, all the first rendering results and all the second rendering results are superimposed to complete the layered rendering. The access and understanding cost of the calling layer in the image rendering process is thereby greatly reduced, while decoupling and extensibility are improved.
In order to implement the foregoing embodiments, an embodiment of the present invention provides a computer-readable storage medium, on which a layered rendering program is stored, and the layered rendering program, when executed by a processor, implements the layered rendering method as described above.
According to the computer-readable storage medium of the embodiment of the invention, a layered rendering program is stored thereon, so that the processor implements the layered rendering method described above when executing the program, thereby greatly reducing the access and understanding cost of the calling layer in the image rendering process while improving decoupling and extensibility.
In order to implement the foregoing embodiments, an embodiment of the present invention provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the method for layered rendering as described above.
According to the computer device of the embodiment of the invention, the memory stores the layered rendering program, so that the processor implements the layered rendering method when executing it, thereby greatly reducing the access and understanding cost of the calling layer in the image rendering process while improving decoupling and extensibility.
In order to implement the foregoing embodiments, an embodiment of the present invention provides a layered rendering apparatus; as shown in fig. 2, the layered rendering apparatus includes: an acquisition module 10, a first rendering module 20, a grouping module 30 and a second rendering module 40.
The obtaining module 10 is configured to obtain rendering layer information, where the rendering layer information includes a data source, rendering information, type information, and ordering information corresponding to a rendering layer;
the first rendering module 20 is configured to render the data source corresponding to the rendered layer according to the rendering information to generate a first rendering result;
the grouping module 30 is configured to traverse all layers, determine whether adjacent layers are of the same type according to the type information, and superimpose first rendering results corresponding to adjacent layers of the same type when the determination result is yes, so as to generate a second rendering result;
the second rendering module 40 is configured to overlay all the first rendering results and all the second rendering results to complete the layered rendering.
In some embodiments, after the rendering layer information is obtained, adjustment layer information is further obtained, wherein the adjustment layer information comprises adjustment information, type information and ordering information corresponding to an adjustment layer.
It should be noted that the above description about the layered rendering method in fig. 1 is also applicable to the layered rendering apparatus, and is not repeated herein.
In summary, according to the layered rendering apparatus of the embodiment of the present invention, the acquisition module obtains rendering layer information, which comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer; the first rendering module renders the data source corresponding to the rendering layer according to the rendering information to generate a first rendering result; the grouping module traverses all layers, judges from the type information whether adjacent layers are of the same type and, if so, superimposes the first rendering results corresponding to the adjacent same-type layers to generate a second rendering result; and the second rendering module superimposes all the first rendering results and all the second rendering results to complete the layered rendering. The access and understanding cost of the calling layer in the image rendering process is thereby greatly reduced, while decoupling and extensibility are improved.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third and so on does not indicate any ordering; these words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
In the description of the present invention, it is to be understood that the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or as implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, be fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In the present invention, unless otherwise expressly stated or limited, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or in indirect contact through an intermediate medium. Likewise, a first feature "on", "over" or "above" a second feature may be directly or obliquely above the second feature, or may simply indicate that the first feature is at a higher level than the second feature; a first feature "under", "below" or "beneath" a second feature may be directly or obliquely below the second feature, or may simply indicate that the first feature is at a lower level than the second feature.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above should not be understood to necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (7)
1. A layered rendering method, characterized by comprising the following steps:
obtaining rendering layer information, wherein the rendering layer information comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer;
rendering the data source corresponding to the rendering layer according to the rendering information to generate a first rendering result;
traversing all layers, judging from the type information whether adjacent layers are of the same type and, if so, superimposing the first rendering results corresponding to the adjacent same-type layers to generate a second rendering result;
superimposing all the first rendering results and all the second rendering results to complete the layered rendering;
after the rendering layer information is obtained, the method further includes: obtaining adjustment layer information, wherein the adjustment layer information comprises adjustment information, type information and ordering information corresponding to an adjustment layer;
and when traversing all layers, judging from the type information whether the currently traversed layer is an adjustment layer and, if it is, superimposing the first rendering results and second rendering results corresponding to the rendering layers ordered before the adjustment layer according to the ordering information to generate a third rendering result, so that the layered rendering is completed from all the first rendering results, all the second rendering results and all the third rendering results.
2. The layered rendering method of claim 1, further comprising, after obtaining the rendering layer information: obtaining mask layer information, wherein the mask layer information comprises mask region information and the effective-layer information corresponding to the mask region.
3. The layered rendering method according to claim 2, wherein when traversing all layers, whether the currently traversed layer is a mask layer is further judged from the type information, and if the currently traversed layer is a mask layer, the corresponding effective layers are masked according to the mask region information.
4. The layered rendering method of claim 1, further comprising, after obtaining the rendering layer information: obtaining background layer information, wherein after all the first rendering results and all the second rendering results are superimposed, a canvas background is added according to the background layer information.
5. A computer-readable storage medium, on which a layered rendering program is stored, which when executed by a processor implements the layered rendering method according to any one of claims 1-4.
6. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the layered rendering method of any one of claims 1-4 when executing the program.
7. A layered rendering apparatus, comprising:
an acquisition module, used for obtaining rendering layer information, wherein the rendering layer information comprises a data source, rendering information, type information and ordering information corresponding to a rendering layer;
a first rendering module, used for rendering the data source corresponding to the rendering layer according to the rendering information to generate a first rendering result;
a grouping module, used for traversing all layers, judging from the type information whether adjacent layers are of the same type and, if so, superimposing the first rendering results corresponding to the adjacent same-type layers to generate a second rendering result;
a second rendering module, used for superimposing all the first rendering results and all the second rendering results to complete the layered rendering;
wherein after the rendering layer information is obtained, adjustment layer information is further obtained, the adjustment layer information comprising adjustment information, type information and ordering information corresponding to an adjustment layer;
and when traversing all layers, whether the currently traversed layer is an adjustment layer is judged from the type information and, if it is, the first rendering results and second rendering results corresponding to the rendering layers ordered before the adjustment layer are superimposed according to the ordering information to generate a third rendering result, so that the layered rendering is completed from all the first rendering results, all the second rendering results and all the third rendering results.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010202049.0A CN111462292B (en) | 2020-03-20 | 2020-03-20 | Layered rendering method, medium, device and apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010202049.0A CN111462292B (en) | 2020-03-20 | 2020-03-20 | Layered rendering method, medium, device and apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111462292A CN111462292A (en) | 2020-07-28 |
CN111462292B true CN111462292B (en) | 2023-02-24 |
Family
ID=71682935
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010202049.0A Active CN111462292B (en) | 2020-03-20 | 2020-03-20 | Layering rendering method, medium, equipment and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111462292B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112862940B (en) * | 2021-03-10 | 2024-11-19 | 广州南方卫星导航仪器有限公司 | Map rendering method, device, equipment and storage medium |
CN113992981B (en) * | 2021-10-21 | 2024-03-15 | 稿定(厦门)科技有限公司 | Video image processing method and device |
CN114064172A (en) * | 2021-11-08 | 2022-02-18 | 北京沃东天骏信息技术有限公司 | Data rendering method and device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120013938A1 (en) * | 2010-07-14 | 2012-01-19 | Hiroshi Nogawa | Image processing device, hardware accelerator, and image processing method |
US20140333657A1 (en) * | 2013-05-10 | 2014-11-13 | Rightware Oy | Method of and system for rendering an image |
WO2018036526A1 (en) * | 2016-08-24 | 2018-03-01 | 北京小米移动软件有限公司 | Display method and device |
CN109308173B (en) * | 2017-07-26 | 2021-10-15 | 腾讯科技(深圳)有限公司 | Display method and device, display terminal and computer storage medium |
CN108648249B (en) * | 2018-05-09 | 2022-03-29 | 歌尔科技有限公司 | Image rendering method and device and intelligent wearable device |
CN110288689B (en) * | 2019-06-20 | 2020-09-01 | 北京三快在线科技有限公司 | Method and device for rendering electronic map |
- 2020-03-20: application CN202010202049.0A filed in China; granted as patent CN111462292B (status: active)
Also Published As
Publication number | Publication date |
---|---|
CN111462292A (en) | 2020-07-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111462292B (en) | Layered rendering method, medium, device and apparatus | |
Gajer et al. | Grip: Graph drawing with intelligent placement | |
US8933937B2 (en) | Visualizing a layered graph using edge bundling | |
JP7111152B2 (en) | Manufacturing support system and method | |
TWI398158B (en) | Method for generating the depth of a stereo image | |
WO2009099703A2 (en) | Efficient geometric tessellation and displacement | |
CN113160068A (en) | Point cloud completion method and system based on image | |
CN112669429A (en) | Image distortion rendering method and device | |
CN106985395A (en) | The increasing material manufacturing method and device of feature based | |
EP3231168A1 (en) | Determining halftone schemes for 3d printing | |
CN106874955A (en) | A kind of 3D shape sorting technique based on depth convolutional neural networks | |
CN112734887A (en) | Face mixing-deformation generation method and device based on deep learning | |
CN105678831A (en) | Image rendering method and apparatus | |
CN112598802B (en) | Thermodynamic diagram generation method and system based on crowdsourcing data | |
KR101954589B1 (en) | Method and equipment for generating a numerical representation of a three-dimensional object, said numerical representation being suited to be used for making said three-dimensional object through stereolithography | |
CN111062902B (en) | Image deformation method, medium, device and apparatus | |
CN110874824B (en) | Image restoration method and device | |
CN112700526B (en) | Concave-convex material image rendering method and device | |
KR101086228B1 (en) | 3D model cross section simulation method and device | |
CN111461960B (en) | Multi-layer matrix transformation method and device | |
CN111354082B (en) | Method and device for generating surface map, electronic equipment and storage medium | |
CN107507276A (en) | The 3-dimensional digital rock core storage method that slabbed core for any direction is shown | |
CN113205579A (en) | Three-dimensional reconstruction method, device, equipment and storage medium | |
CN113628102B (en) | Entity model blanking method and device, electronic equipment and storage medium | |
CN119313565A (en) | Image generation method and system based on gradual super resolution and region differentiation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||