CN111068314A - Unity-based NGUI resource rendering processing method and device - Google Patents
- Publication number
- CN111068314A (application number CN201911243490.7A)
- Authority
- CN
- China
- Prior art keywords
- rendering
- result
- texture
- user interface
- special effect
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A63F13/52 — Controlling the output signals based on the game progress involving aspects of the displayed game scene (A—Human necessities; A63F—Video games)
- A63F13/60 — Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- G06T11/001 — Texturing; Colouring; Generation of texture or colour (G—Physics; G06T—Image data processing or generation, in general; 2D image generation)
- G06T11/40 — Filling a planar surface by adding surface attributes, e.g. colour or texture
Abstract
The application provides a Unity-based NGUI resource rendering processing method and device. The method comprises the following steps: acquiring NGUI resources, wherein the NGUI resources comprise a user interface map and a special effect map; rendering the user interface map into a first rendering texture to obtain a user interface result, wherein the first rendering texture is identified as a linear texture and is identified as gamma corrected; rendering the special effect map into a second rendering texture to obtain a special effect result, wherein the second rendering texture is identified as a linear texture and is identified as gamma corrected; merging the user interface result and the special effect result to obtain a merged result; and rendering the merged result into a third rendering texture to obtain a rendering result, which is displayed in a display area.
Description
Technical Field
The present application relates to the field of internet technologies, and in particular, to a Unity-based NGUI resource rendering method and apparatus, a computing device, and a computer-readable storage medium.
Background
With the continuing development of internet technology, the electronic game industry has made great progress, and game pictures have become increasingly refined.
At present, physically based rendering in linear color space is increasingly demanded in game development, because it makes pictures more refined and vivid and improves image quality. However, existing NGUI resources number in the tens of thousands and were produced in nonlinear (gamma) space; regenerating them in linear space would consume a great amount of manpower and material resources and would waste resources.
How to render in linear space without changing the existing NGUI resources has therefore become a problem to be solved by those skilled in the art.
Disclosure of Invention
In view of this, embodiments of the present application provide a Unity-based NGUI resource rendering method and apparatus, a computing device, and a computer-readable storage medium, so as to solve technical defects in the prior art.
According to a first aspect of the embodiments of the present application, there is provided a Unity-based NGUI resource rendering processing method, where a color space in Unity is a nonlinear space, the method including:
acquiring NGUI resources, wherein the NGUI resources comprise a user interface map and a special effect map;
rendering the user interface map into a first rendered texture, obtaining a user interface result, wherein the first rendered texture is identified as a linear texture and is identified as gamma corrected;
rendering the special effect map into a second rendering texture to obtain a special effect result, wherein the second rendering texture is identified as a linear texture and is identified as gamma corrected;
merging the user interface result and the special effect result to obtain a merged result;
and rendering the merged result into a third rendering texture to obtain a rendering result and displaying the rendering result in a display area.
Optionally, rendering the user interface map into a first rendering texture to obtain a user interface result, including:
extracting the color of the user interface vertex in the user interface map;
rendering the vertex color of the user interface to a first rendering texture through linear conversion to obtain a color rendering texture of the user interface;
and obtaining a user interface result according to the user interface color rendering texture and the user interface map.
Optionally, rendering the special effect map into a second rendering texture to obtain a special effect result includes:
extracting the color of a special effect vertex in the special effect map;
rendering the special effect vertex color to a second rendering texture through linear conversion to obtain a special effect color rendering texture;
and obtaining a special effect result according to the special effect color rendering texture and the special effect map.
Optionally, rendering the merged result into a third rendering texture, obtaining a rendering result, and displaying the rendering result in a display area, including:
rendering the merged result into a third rendering texture in an adding or mixing mode to obtain a linear rendering result;
and displaying the linear rendering result in a display area.
Optionally, displaying the linear rendering result in a display area, including:
and displaying the linear rendering result in a display area through nonlinear correction.
Optionally, the displaying the linear rendering result in a display area through nonlinear correction includes:
and displaying the linear rendering result in a display area through gamma correction.
According to a second aspect of the embodiments of the present application, there is provided a Unity-based NGUI resource rendering processing apparatus, including:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is configured to acquire NGUI resources, and the NGUI resources comprise user interface maps and special effect maps;
a first rendering module configured to render the user interface map into a first rendered texture, obtaining a user interface result, wherein the first rendered texture is identified as a linear texture and is identified as gamma corrected;
a second rendering module configured to render the special effect map into a second rendered texture, obtaining a special effect result, wherein the second rendered texture is identified as a linear texture and is identified as gamma corrected;
a merging module configured to merge the user interface result and the special effect result to obtain a merged result;
and the rendering display module is configured to render the merged result into a third rendering texture, obtain a rendering result and display the rendering result in a display area.
Optionally, the first rendering module is further configured to: extract a user interface vertex color in the user interface map; render the user interface vertex color to a first rendering texture through linear conversion to obtain a user interface color rendering texture; and obtain a user interface result according to the user interface color rendering texture and the user interface map.
Optionally, the second rendering module is further configured to: extract a special effect vertex color in the special effect map; render the special effect vertex color to a second rendering texture through linear conversion to obtain a special effect color rendering texture; and obtain a special effect result according to the special effect color rendering texture and the special effect map.
Optionally, the rendering and displaying module is further configured to render the merged result into a third rendering texture in an adding or mixing manner, so as to obtain a linear rendering result; and displaying the linear rendering result in a display area.
Optionally, the rendering and displaying module is further configured to display the linear rendering result in a display area through nonlinear correction.
Optionally, the rendering display module is further configured to display the linear rendering result in a display area through gamma correction.
According to a third aspect of embodiments of the present application, there is provided a computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the Unity-based NGUI resource rendering processing method when executing the instructions.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the Unity-based NGUI resource rendering processing method.
In the embodiments of the application, NGUI resources comprising a user interface map and a special effect map are acquired; the user interface map is rendered into a first rendering texture to obtain a user interface result, and the special effect map is rendered into a second rendering texture to obtain a special effect result, wherein both rendering textures are identified as linear textures and identified as gamma corrected; the user interface result and the special effect result are merged, and the merged result is rendered into a third rendering texture to obtain a rendering result displayed in a display area. By rendering the nonlinear user interface map and special effect map into linear rendering textures and merging those textures into a further linear rendering texture, the colors of the maps are converted into linear space during rendering and then directly merged and rendered; because linear calculation is independent of order, rendering time is saved and rendering efficiency is improved.
Drawings
FIG. 1 is a block diagram of a computing device provided by an embodiment of the present application;
FIG. 2 is a flowchart of a Unity-based NGUI resource rendering processing method provided in an embodiment of the present application;
FIG. 3 is a flowchart of a Unity-based NGUI resource rendering processing method according to another embodiment of the present application;
fig. 4 is a schematic structural diagram of a Unity-based NGUI resource rendering processing apparatus according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the application can be implemented in many ways other than those described herein, and those skilled in the art can make similar modifications without departing from its spirit; the application is therefore not limited to the specific implementations disclosed below.
The terminology used in the one or more embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the one or more embodiments of the present application. As used in one or more embodiments of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in one or more embodiments of the present application refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It will be understood that, although the terms first, second, etc. may be used herein in one or more embodiments of the present application to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, a first aspect may be termed a second aspect, and, similarly, a second aspect may be termed a first aspect, without departing from the scope of one or more embodiments of the present application. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
First, the terms involved in one or more embodiments of the present application are explained.
Unity: the game development engine is a powerful cross-platform game development engine, the global accumulated downloading amount exceeds 5 hundred million times, the game development tool is a multi-platform and comprehensive game development tool, the game development tool is one of the most excellent 3D engines at present, the Unity3D engine can enable game developers to easily create interactive contents such as 3D video games, real-time 3D animations and the like, and the game development tool is widely applied to the development fields of hand games, network games, single machines, new VR games and the like at present.
NGUI plugin: NGUI is a Unity plugin written in C# that strictly follows the KISS principle and provides a powerful UI system and event notification framework. Its code is concise, with most classes under 200 lines, so programmers can easily extend the functionality of NGUI or leverage its existing functionality; this also means higher performance and a lower learning curve. It is fully integrated into the Inspector panel: what is seen in the scene view is what is obtained in the game view. It is component-based and modular: making an interface control do something only requires attaching the corresponding behavior, with no coding needed. iOS, Android, and Flash are fully supported.
Linear space: the numerical intensity is proportional to the perceived intensity in linear space, and colors can be correctly added and multiplied in linear space.
Nonlinear space: the numerical intensity is not proportional to the perceived intensity in the non-linear space and needs to be gamma corrected.
Render Texture: a texture used to store intermediate results during the image rendering process.
Gamma correction: adjusting values along a gamma curve to match the mid-tone response of a monitor, thereby compensating for the color display differences between output devices so that an image looks the same on different monitors.
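To make the linear/nonlinear distinction above concrete, the following sketch uses the common power-law approximation with gamma = 2.2; the exact sRGB transfer function is piecewise, so this simplified form and the helper names are assumptions for illustration only:

```python
GAMMA = 2.2

def to_linear(c: float) -> float:
    # Decode a gamma-encoded channel value in [0, 1] into linear space.
    return c ** GAMMA

def to_gamma(c: float) -> float:
    # Encode a linear channel value in [0, 1] back into gamma space.
    return c ** (1.0 / GAMMA)

# A gamma-encoded value of 0.5 corresponds to far less than half the
# physical light intensity:
half_gray_linear = to_linear(0.5)

# Averaging two channel values directly in gamma space (naive) differs
# from averaging in linear space and re-encoding (correct):
a, b = 0.2, 0.9
naive_avg = (a + b) / 2
linear_avg = to_gamma((to_linear(a) + to_linear(b)) / 2)
```

Averaging in linear space yields a brighter, physically consistent mid-value than the naive gamma-space average, which is why colors can be correctly added and multiplied only in linear space.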
In the present application, a Unity-based NGUI resource rendering processing method and apparatus, a computing device, and a computer-readable storage medium are provided, which are described in detail in the following embodiments one by one.
FIG. 1 shows a block diagram of a computing device 100 according to an embodiment of the present application. The components of the computing device 100 include, but are not limited to, memory 110 and processor 120. The processor 120 is coupled to the memory 110 via a bus 130 and a database 150 is used to store data.
Computing device 100 also includes access device 140, which enables computing device 100 to communicate via one or more networks 160. Examples of such networks include the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the internet. Access device 140 may include one or more of any type of network interface (e.g., a Network Interface Card (NIC)), whether wired or wireless, such as an IEEE 802.11 Wireless Local Area Network (WLAN) interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth.
In one embodiment of the present application, the above-mentioned components of the computing device 100 and other components not shown in fig. 1 may also be connected to each other, for example, by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 1 is for purposes of example only and is not limiting as to the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 100 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), a mobile phone (e.g., smartphone), a wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
The processor 120 can execute the steps of the Unity-based NGUI resource rendering processing method shown in FIG. 2. FIG. 2 is a flowchart illustrating a Unity-based NGUI resource rendering processing method according to an embodiment of the present application, comprising steps 202 to 210, where the color space in Unity is a nonlinear space.
Step 202: and acquiring NGUI resources, wherein the NGUI resources comprise user interface maps and special effect maps.
The NGUI resources are resource materials under Unity, including user interface maps and special effect maps produced by artists. In a medium or large project they number in the thousands to tens of thousands, and remaking them would consume a great amount of manpower and material resources and waste resources.
Linear rendering is independent of rendering order, so the time spent in the rendering process can be arranged flexibly, improving rendering efficiency.
Step 204: rendering the user interface map into a first rendered texture, obtaining a user interface result, wherein the first rendered texture is identified as a linear texture and identified as gamma corrected.
The rendering texture is set as a linear texture, and the input texture is identified as gamma corrected. In this mode, when the hardware samples the input map texture, it automatically converts it into linear space: gamma correction is applied before the user interface map is rendered to the first rendering texture, and during rendering, the colors of the user interface map and of the first rendering texture are first converted into linear space and then rendered and blended.
The first rendering texture stores the user interface map during image rendering, and is set to gamma-correct the input map. That is, the user interface map is first converted into linear space, then rendered into the first rendering texture, and gamma correction is then applied to obtain a nonlinear user interface result.
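A minimal single-channel sketch of this step, assuming a simple power-law gamma of 2.2 and hypothetical helper names (not Unity API calls), is:

```python
GAMMA = 2.2

def to_linear(c):
    return c ** GAMMA

def to_gamma(c):
    return c ** (1.0 / GAMMA)

def blend_into_first_rt(rt_linear, ui_sample_gamma, alpha):
    # The input map is identified as gamma corrected, so the sampled
    # value is converted to linear space before blending, mimicking the
    # hardware's sRGB read; the render texture itself stays linear.
    src = to_linear(ui_sample_gamma)
    return src * alpha + rt_linear * (1.0 - alpha)

# Writing the blended value out with gamma correction yields the
# nonlinear user interface result described in this step:
ui_result = to_gamma(blend_into_first_rt(0.0, 0.5, 1.0))
```

With full opacity over an empty render texture, the round trip reproduces the original map value, confirming that linearising on read and gamma-correcting on write preserves the authored nonlinear resource.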
In the embodiment provided by the application, the user interface map A is rendered into the first rendering texture, and a user interface result X is obtained.
Step 206: rendering the special effect map into a second rendering texture to obtain a special effect result, wherein the second rendering texture is identified as a linear texture and is identified as gamma corrected.
And the second rendering texture is used for storing the special effect map in the image rendering process, and meanwhile, the second rendering texture is set to carry out gamma correction on the input special effect map, namely the special effect map is converted into a linear space firstly, then the special effect map is rendered into the second rendering texture for rendering, and then the gamma correction is carried out to obtain a nonlinear special effect result.
In the embodiment provided by the application, the special effect map B is rendered into the second rendering texture to obtain a special effect result Y.
Step 208: and combining the user interface result and the special effect result to obtain a combined result.
Gamma conversion is performed on the nonlinear user interface result to obtain a linear user interface result, and on the nonlinear special effect result to obtain a linear special effect result; the various illumination calculations are then performed in linear space to obtain a linear merged result.
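The merge described above can be sketched for one channel as follows; the power-law gamma and the function names are illustrative assumptions, not the patent's actual implementation:

```python
GAMMA = 2.2

def to_linear(c):
    return c ** GAMMA

def merge_results(ui_result_gamma, fx_result_gamma):
    # Both nonlinear results are decoded into linear space and combined
    # additively there; the merged value is kept linear so that
    # subsequent illumination calculations remain correct.
    merged = to_linear(ui_result_gamma) + to_linear(fx_result_gamma)
    return min(merged, 1.0)

merged_linear = merge_results(0.5, 0.5)
```

Note that the merged value stays in linear space; gamma correction is deferred until the final output stage of step 210.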
Step 210: and rendering the merged result into a third rendering texture to obtain a rendering result and displaying the rendering result in a display area.
The third rendering texture stores the merged result during image rendering. Similarly, it is set to gamma-correct the input: the linear merged result is rendered into the third rendering texture, the various illumination calculations are performed in linear space to obtain a linear rendering result, and gamma correction is applied to the linear rendering result to obtain the final rendering result, which is displayed in the display area.
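Putting steps 202 to 210 together for a single channel, a hedged end-to-end sketch (again assuming power-law gamma 2.2 and hypothetical names) might look like:

```python
GAMMA = 2.2

def to_linear(c):
    return c ** GAMMA

def to_gamma(c):
    return c ** (1.0 / GAMMA)

def render_and_display(ui_map_gamma, fx_map_gamma):
    # Linearise both gamma-encoded maps, merge them in the (linear)
    # third rendering texture, then apply gamma correction once on
    # output for the nonlinear display area.
    merged_linear = min(to_linear(ui_map_gamma) + to_linear(fx_map_gamma), 1.0)
    return to_gamma(merged_linear)

displayed = render_and_display(0.5, 0.5)
```

Because gamma correction happens exactly once, at output, all intermediate arithmetic remains linear and therefore order-independent.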
The display area may be a terminal screen, such as that of a mobile phone, computer, or television, or a projection onto an external display device.
Alternatively, step 210 may be implemented by steps S2101 to S2102 described below.
S2101, rendering the merged result into a third rendering texture in an adding or mixing mode to obtain a linear rendering result.
In the process of rendering the merged result to the third rendering texture, a linear rendering result can be obtained by adding or mixing.
And S2102, displaying the linear rendering result in a display area.
Optionally, the linear rendering result is displayed in a display area through nonlinear correction.
Because the rendering process is linear, it is independent of rendering order, and the completion time of each rendering result is not fixed; each result can therefore be composited as soon as it is ready, without waiting for the others, which saves rendering time. However, because the display area is nonlinear, the rendering result needs nonlinear correction before display; after nonlinear correction, the linear rendering result can be displayed normally in the display area.
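The order-independence argument can be illustrated with a small sketch: additive compositing of linear-space layers is commutative, so partial results may be merged in whatever order they complete (the values and names here are illustrative):

```python
def composite_linear(layers):
    # Additive compositing in linear space; because addition is
    # commutative, layers can be merged in whatever order they finish
    # rendering, with a single clamp applied at the end.
    return min(sum(layers), 1.0)

# Values chosen to be exactly representable in binary floating point:
ui, fx, overlay = 0.125, 0.25, 0.375
first_order = composite_linear([ui, fx, overlay])
second_order = composite_linear([overlay, ui, fx])
```

Either arrival order produces the same composited value, which is why no scheduling constraint between the user interface and special effect passes is required.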
Optionally, the linear rendering result is displayed in a display area through gamma correction.
The non-linear correction may be gamma correction, that is, the linear rendering result is gamma-corrected and then displayed in the display area.
According to the Unity-based NGUI resource rendering processing method provided in this embodiment, nonlinear NGUI resources are rendered into linear rendering textures, so that a more realistic expression effect can be obtained in linear space. During rendering, the user interface map and the special effect map are each rendered, through gamma correction, into a rendering texture to obtain a user interface result and a special effect result; these are then rendered into another rendering texture, the various illumination calculations are performed in linear space, and gamma correction is applied at final output. The rendering thus achieves the expected effect while saving the manpower and material resources that regenerating linear NGUI resources would require, saving resources and improving rendering efficiency.
Fig. 3 illustrates a Unity-based NGUI resource rendering processing method according to an embodiment of the present application, which includes steps 302 to 318.
Step 302: and acquiring NGUI resources, wherein the NGUI resources comprise user interface maps and special effect maps.
Step 304 and step 310 are performed after step 302, respectively.
Step 302 is consistent with the method of step 202, and for the specific explanation of step 302, refer to the details of step 202 in the foregoing embodiment, which are not repeated herein.
Step 304: and extracting the color of the user interface vertex in the user interface map.
The user interface map comprises a plurality of user interface vertexes, wherein each user interface vertex comprises a vertex color corresponding to the vertex, and the color of each user interface vertex in the user interface map is extracted.
Step 306: and rendering the user interface vertex color to a first rendering texture through linear conversion to obtain the user interface color rendering texture, wherein the first rendering texture is identified as a linear texture and is identified as gamma correction.
And converting the vertex color of the user interface into the vertex color of the linear user interface through gamma correction in a linear color space, rendering the vertex color of the linear user interface into a first rendering texture, and performing various illumination calculations in the linear space to obtain a user interface color rendering texture.
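A one-channel sketch of blending a vertex color with a texture sample in linear space, versus doing so naively in gamma space, could look like this (the power-law gamma and all names are assumptions for illustration):

```python
GAMMA = 2.2

def to_linear(c):
    return c ** GAMMA

def lerp(a, b, t):
    return a + (b - a) * t

def blend_vertex_colour(tex_gamma, vertex_gamma, t):
    # The vertex color and the texture sample are each converted to
    # linear space and blended there; the result stays linear for the
    # illumination calculations that follow.
    return lerp(to_linear(tex_gamma), to_linear(vertex_gamma), t)

linear_blend = blend_vertex_colour(0.9, 0.2, 0.5)
# Blending directly in gamma space and decoding afterwards gives a
# different, dimmer intensity:
naive_blend = to_linear(lerp(0.9, 0.2, 0.5))
```

The gap between the two values is the visual error the method avoids by performing the vertex-color conversion before rendering into the first rendering texture.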
Step 308: and rendering the texture and the user interface map according to the user interface color to obtain a user interface result.
And combining the user interface color rendering texture and the user interface mapping to obtain a user interface result in a mixing mode, wherein the user interface result is nonlinear.
Step 310: and extracting the color of the special effect vertex in the special effect map.
The special effect map comprises a plurality of special effect vertexes, wherein each special effect vertex comprises a vertex color corresponding to the vertex, and each special effect vertex color in the special effect map is extracted.
Step 312: and rendering the special effect vertex color to a second rendering texture through linear conversion to obtain a special effect color rendering texture, wherein the second rendering texture is identified as a linear texture and is identified as gamma correction.
And converting the special effect vertex color into a linear special effect vertex color through gamma correction in a linear space, rendering the linear special effect vertex color into a second rendering texture, and performing various illumination calculations in the linear space to obtain a special effect color rendering texture.
Step 314: and rendering the texture and the special effect map according to the special effect color to obtain a special effect result.
Combining the special effect color rendering texture with the special effect chartlet, and obtaining a special effect result in a mixing mode, wherein the special effect result is nonlinear.
Step 316: and combining the user interface result and the special effect result to obtain a combined result.
Step 318: and rendering the merged result into a third rendering texture to obtain a rendering result and displaying the rendering result in a display area.
The steps 316 to 318 are the same as the above-mentioned steps 208 to 210, and for the specific explanation of the steps 316 to 318, refer to the details of the steps 208 to 210 in the foregoing embodiment, which will not be repeated herein.
According to the Unity-based NGUI resource rendering processing method provided in this embodiment, nonlinear NGUI resources are rendered into linear rendering textures. During rendering, the user interface map and the special effect map are each rendered, through gamma correction, into a rendering texture to obtain a user interface result and a special effect result; these are then rendered into another rendering texture, the various illumination calculations are performed in linear space, and gamma correction is applied at final output. The rendering thus achieves the expected effect while saving the manpower and material resources that regenerating linear NGUI resources would require, saving resources and improving rendering efficiency.
Corresponding to the above method embodiment, the present application further provides an embodiment of a Unity-based NGUI resource rendering processing apparatus, and fig. 4 shows a schematic structural diagram of the Unity-based NGUI resource rendering processing apparatus according to an embodiment of the present application. As shown in fig. 4, the apparatus includes:
an obtaining module 402 configured to obtain an NGUI resource, wherein the NGUI resource includes a user interface map and a special effect map;
a first rendering module 404 configured to render the user interface map into a first rendered texture, obtaining a user interface result, wherein the first rendered texture is identified as a linear texture and is identified as gamma corrected;
a second rendering module 406 configured to render the special effect map into a second rendered texture, obtaining a special effect result, wherein the second rendered texture is identified as a linear texture and is identified as gamma corrected;
a merging module 408 configured to merge the user interface result and the special effect result to obtain a merged result;
and a rendering display module 410 configured to render the merged result into a third rendering texture, obtain a rendering result and display the rendering result in a display area.
Optionally, the first rendering module 404 is further configured to extract a user interface vertex color in the user interface map; render the user interface vertex color into the first rendering texture through linear conversion to obtain a user interface color rendering texture; and obtain a user interface result according to the user interface color rendering texture and the user interface map.
Optionally, the second rendering module 406 is further configured to extract a special effect vertex color in the special effect map; render the special effect vertex color into the second rendering texture through linear conversion to obtain a special effect color rendering texture; and obtain a special effect result according to the special effect color rendering texture and the special effect map.
Optionally, the rendering display module 410 is further configured to render the merged result into a third rendering texture by adding or mixing, so as to obtain a linear rendering result, and to display the linear rendering result in a display area.
Optionally, the rendering display module 410 is further configured to display the linear rendering result in the display area through nonlinear correction.
Optionally, the rendering display module 410 is further configured to display the linear rendering result in the display area through gamma correction.
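The merge by "adding or mixing" and the subsequent gamma correction for display can be sketched as two standard blend operators over linear single-channel values (all names and the example alpha of 0.5 are hypothetical, not from the patent):

```python
def additive_blend(dst, src):
    """'Adding': sum of the two linear results, clamped to [0, 1]."""
    return min(dst + src, 1.0)

def alpha_blend(dst, src, src_alpha):
    """'Mixing': classic source-over interpolation by the source alpha."""
    return src * src_alpha + dst * (1.0 - src_alpha)

def display(linear_result, gamma=2.2):
    """Nonlinear (gamma) correction applied when showing the linear result."""
    return linear_result ** (1.0 / gamma)

# Both inputs are assumed to already be linear-space results.
ui_result, fx_result = 0.25, 0.5
added = additive_blend(ui_result, fx_result)    # 0.75
mixed = alpha_blend(ui_result, fx_result, 0.5)  # 0.375
shown = display(added)
```

Either operator keeps the merged result in linear space, so the single `display` encode at the end is the only nonlinear step, matching the one-time gamma correction the module performs at output.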
According to the Unity-based NGUI resource rendering processing device provided by this embodiment of the application, nonlinear NGUI resources are rendered into linear rendering textures: during rendering, the user interface map and the special effect map are each rendered into a rendering texture through gamma correction to obtain a user interface result and a special effect result, and the two results are then rendered into another rendering texture. The various illumination calculations are performed in linear space, and gamma correction is applied at final output, so that the rendering achieves the expected effect. This saves the manpower and material resources otherwise required for regenerating linear NGUI resources, saves resources, and improves rendering efficiency.
An embodiment of the present application further provides a computing device, which includes a memory, a processor, and computer instructions stored in the memory and executable on the processor, where the processor executes the instructions to implement the steps of the Unity-based NGUI resource rendering processing method.
An embodiment of the present application also provides a computer readable storage medium storing computer instructions, which when executed by a processor, implement the steps of the Unity-based NGUI resource rendering processing method as described above.
The above is an illustrative scheme of a computer-readable storage medium of the present embodiment. It should be noted that the technical solution of the storage medium and the technical solution of the above-mentioned Unity-based NGUI resource rendering processing method belong to the same concept, and details that are not described in detail in the technical solution of the storage medium can be referred to the description of the technical solution of the above-mentioned Unity-based NGUI resource rendering processing method.
The foregoing description of specific embodiments of the present application has been presented. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The computer instructions comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
It should be noted that, for simplicity of description, the above method embodiments are described as a series of action combinations, but those skilled in the art will appreciate that the present application is not limited by the order of actions described, as some steps may be performed in other orders or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the actions and modules involved are not necessarily required by the present application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are intended only to aid in the explanation of the application. Alternative embodiments are not exhaustive and do not limit the invention to the precise embodiments described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and its practical applications, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and their full scope and equivalents.
Claims (14)
1. A Unity-based NGUI resource rendering processing method, characterized in that a color space in Unity is a nonlinear space, the method comprising the following steps:
acquiring NGUI resources, wherein the NGUI resources comprise a user interface map and a special effect map;
rendering the user interface map into a first rendered texture, obtaining a user interface result, wherein the first rendered texture is identified as a linear texture and is identified as gamma corrected;
rendering the special effect map into a second rendering texture to obtain a special effect result, wherein the second rendering texture is identified as a linear texture and is identified as gamma corrected;
combining the user interface result and the special effect result to obtain a combined result;
and rendering the merged result into a third rendering texture to obtain a rendering result and displaying the rendering result in a display area.
2. The Unity-based NGUI resource rendering processing method of claim 1, wherein rendering the user interface map into a first rendering texture, obtaining a user interface result, comprises:
extracting the color of the user interface vertex in the user interface map;
rendering the vertex color of the user interface to a first rendering texture through linear conversion to obtain a color rendering texture of the user interface;
and obtaining a user interface result according to the user interface color rendering texture and the user interface map.
3. The Unity-based NGUI resource rendering processing method of claim 1, wherein rendering the special effect map into a second rendering texture to obtain a special effect result comprises:
extracting the color of a special effect vertex in the special effect map;
rendering the special effect vertex color to a second rendering texture through linear conversion to obtain a special effect color rendering texture;
and obtaining a special effect result according to the special effect color rendering texture and the special effect map.
4. The Unity-based NGUI resource rendering processing method of claim 1, wherein rendering the merged result into a third rendering texture, obtaining a rendering result and displaying the rendering result on a display area comprises:
rendering the merged result into a third rendering texture in an adding or mixing mode to obtain a linear rendering result;
and displaying the linear rendering result in a display area.
5. The Unity-based NGUI resource rendering processing method of claim 4, wherein displaying the linear rendering result in a display area comprises:
and displaying the linear rendering result in a display area through nonlinear correction.
6. The Unity-based NGUI resource rendering processing method of claim 5, wherein displaying the linear rendering result in a display area through non-linear correction comprises:
and displaying the linear rendering result in a display area through gamma correction.
7. A Unity-based NGUI resource rendering processing apparatus, characterized in that a color space in Unity is a nonlinear space, the apparatus comprising:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is configured to acquire NGUI resources, and the NGUI resources comprise user interface maps and special effect maps;
a first rendering module configured to render the user interface map into a first rendered texture, obtaining a user interface result, wherein the first rendered texture is identified as a linear texture and is identified as gamma corrected;
a second rendering module configured to render the special effect map into a second rendered texture, obtaining a special effect result, wherein the second rendered texture is identified as a linear texture and is identified as gamma corrected;
a merging module configured to merge the user interface result and the special effect result to obtain a merged result;
and the rendering display module is configured to render the merged result into a third rendering texture, obtain a rendering result and display the rendering result in a display area.
8. The Unity-based NGUI resource rendering processing apparatus of claim 7, wherein,
the first rendering module is further configured to extract a user interface vertex color in the user interface map; render the user interface vertex color into the first rendering texture through linear conversion to obtain a user interface color rendering texture; and obtain a user interface result according to the user interface color rendering texture and the user interface map.
9. The Unity-based NGUI resource rendering processing apparatus of claim 7, wherein,
the second rendering module is further configured to extract a special effect vertex color in the special effect map; render the special effect vertex color into the second rendering texture through linear conversion to obtain a special effect color rendering texture; and obtain a special effect result according to the special effect color rendering texture and the special effect map.
10. The Unity-based NGUI resource rendering processing apparatus of claim 7, wherein,
the rendering display module is further configured to render the merged result into a third rendering texture in an adding or mixing manner, so as to obtain a linear rendering result; and displaying the linear rendering result in a display area.
11. The Unity-based NGUI resource rendering processing apparatus of claim 10, wherein,
the rendering display module is further configured to display the linear rendering result in a display area through nonlinear correction.
12. The Unity-based NGUI resource rendering processing apparatus of claim 11, wherein,
the rendering display module is further configured to display the linear rendering result in a display area through gamma correction.
13. A computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the processor implements the steps of the method of any one of claims 1-6 when executing the instructions.
14. A computer-readable storage medium storing computer instructions, which when executed by a processor, perform the steps of the method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911243490.7A CN111068314B (en) | 2019-12-06 | 2019-12-06 | NGUI resource rendering processing method and device based on Unity |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911243490.7A CN111068314B (en) | 2019-12-06 | 2019-12-06 | NGUI resource rendering processing method and device based on Unity |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111068314A true CN111068314A (en) | 2020-04-28 |
CN111068314B CN111068314B (en) | 2023-09-05 |
Family
ID=70313101
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911243490.7A Active CN111068314B (en) | 2019-12-06 | 2019-12-06 | NGUI resource rendering processing method and device based on Unity |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111068314B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111617470A (en) * | 2020-06-04 | 2020-09-04 | 珠海金山网络游戏科技有限公司 | Rendering method and device for interface special effect |
CN114191815A (en) * | 2021-11-09 | 2022-03-18 | 网易(杭州)网络有限公司 | Display control method and device in a game |
CN114307143A (en) * | 2021-12-31 | 2022-04-12 | 上海完美时空软件有限公司 | Image processing method and device, storage medium and computer equipment |
CN114882164A (en) * | 2022-05-18 | 2022-08-09 | 上海完美时空软件有限公司 | Game image processing method and device, storage medium and computer equipment |
CN118608674A (en) * | 2024-08-06 | 2024-09-06 | 深圳易帆互动科技有限公司 | A method and system for mixing UI interface background and three-dimensional objects of Unity application |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6016151A (en) * | 1997-09-12 | 2000-01-18 | Neomagic Corp. | 3D triangle rendering by texture hardware and color software using simultaneous triangle-walking and interpolation for parallel operation |
US20050206645A1 (en) * | 2004-03-22 | 2005-09-22 | Hancock William R | Graphics processor with gamma translation |
US20150348315A1 (en) * | 2014-05-30 | 2015-12-03 | Apple Inc. | Dynamic Lighting Effects For Textures Without Normal Maps |
US20160078637A1 (en) * | 2014-09-12 | 2016-03-17 | Samsung Electronics Co., Ltd. | Method and apparatus for rendering |
CN106056658A (en) * | 2016-05-23 | 2016-10-26 | 珠海金山网络游戏科技有限公司 | A virtual object rendering method and device |
CN108921775A (en) * | 2018-05-09 | 2018-11-30 | 苏州蜗牛数字科技股份有限公司 | A method of the software based on Unity linearly renders |
CN109961498A (en) * | 2019-03-28 | 2019-07-02 | 腾讯科技(深圳)有限公司 | Image rendering method, device, terminal and storage medium |
CN110189274A (en) * | 2019-05-28 | 2019-08-30 | 北京字节跳动网络技术有限公司 | Image processing method, device and computer readable storage medium |
- 2019-12-06: CN201911243490.7A granted as patent CN111068314B (status: Active)
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6016151A (en) * | 1997-09-12 | 2000-01-18 | Neomagic Corp. | 3D triangle rendering by texture hardware and color software using simultaneous triangle-walking and interpolation for parallel operation |
US20050206645A1 (en) * | 2004-03-22 | 2005-09-22 | Hancock William R | Graphics processor with gamma translation |
US20150348315A1 (en) * | 2014-05-30 | 2015-12-03 | Apple Inc. | Dynamic Lighting Effects For Textures Without Normal Maps |
US20160078637A1 (en) * | 2014-09-12 | 2016-03-17 | Samsung Electronics Co., Ltd. | Method and apparatus for rendering |
CN106056658A (en) * | 2016-05-23 | 2016-10-26 | 珠海金山网络游戏科技有限公司 | A virtual object rendering method and device |
CN108921775A (en) * | 2018-05-09 | 2018-11-30 | 苏州蜗牛数字科技股份有限公司 | A method of the software based on Unity linearly renders |
CN109961498A (en) * | 2019-03-28 | 2019-07-02 | 腾讯科技(深圳)有限公司 | Image rendering method, device, terminal and storage medium |
CN110189274A (en) * | 2019-05-28 | 2019-08-30 | 北京字节跳动网络技术有限公司 | Image processing method, device and computer readable storage medium |
Non-Patent Citations (3)
Title |
---|
LEILV: ""Unity性能优化-图形渲染优化"", pages 1 - 13 * |
LUISA Z UWA: "Unity Gamma校正转为线性空间", pages 1 - 6 * |
TESNADO: "NGUI渲染机制——从顶点和UV说起", pages 1 - 10 * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111617470A (en) * | 2020-06-04 | 2020-09-04 | 珠海金山网络游戏科技有限公司 | Rendering method and device for interface special effect |
CN111617470B (en) * | 2020-06-04 | 2023-09-26 | 珠海金山数字网络科技有限公司 | Interface special effect rendering method and device |
CN114191815A (en) * | 2021-11-09 | 2022-03-18 | 网易(杭州)网络有限公司 | Display control method and device in a game |
CN114307143A (en) * | 2021-12-31 | 2022-04-12 | 上海完美时空软件有限公司 | Image processing method and device, storage medium and computer equipment |
CN114882164A (en) * | 2022-05-18 | 2022-08-09 | 上海完美时空软件有限公司 | Game image processing method and device, storage medium and computer equipment |
CN118608674A (en) * | 2024-08-06 | 2024-09-06 | 深圳易帆互动科技有限公司 | A method and system for mixing UI interface background and three-dimensional objects of Unity application |
CN118608674B (en) * | 2024-08-06 | 2024-11-29 | 深圳易帆互动科技有限公司 | A method and system for mixing UI interface background and three-dimensional objects of Unity application |
Also Published As
Publication number | Publication date |
---|---|
CN111068314B (en) | 2023-09-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111068314B (en) | NGUI resource rendering processing method and device based on Unity | |
CN108600781B (en) | Video cover generation method and server | |
CN109389661B (en) | Animation file conversion method and device | |
CN113421312B (en) | Coloring method and device for black-and-white video, storage medium and terminal | |
CN110570506A (en) | Map resource management method and device, computing equipment and storage medium | |
US11481927B2 (en) | Method and apparatus for determining text color | |
CN110975284A (en) | Unity-based NGUI resource rendering processing method and device | |
CN112907700A (en) | Color filling method and device | |
CN110784739A (en) | Video synthesis method and device based on AE | |
CN110853121B (en) | Cross-platform data processing method and device based on AE | |
CN106327415A (en) | Image processing method and device | |
CN112714357A (en) | Video playing method, video playing device, electronic equipment and storage medium | |
CN112991497B (en) | Method, device, storage medium and terminal for coloring black-and-white cartoon video | |
CN110990104B (en) | Texture rendering method and device based on Unity3D | |
CN111930461B (en) | Mobile terminal APP full page graying method and device based on Android | |
CN112991412B (en) | Liquid crystal instrument sequence frame animation performance optimization method and liquid crystal instrument | |
CN104021579A (en) | Method and device for changing colors of image | |
CN114307143B (en) | Image processing method and device, storage medium, and computer equipment | |
CN117891546A (en) | Interface display method and device | |
CN111062638A (en) | Project resource processing method and device | |
CN114307144B (en) | Image processing method and device, storage medium, and computer equipment | |
CN110555799A (en) | Method and apparatus for processing video | |
CN111080763A (en) | Method and device for merging maps | |
CN115063333A (en) | Image processing method, apparatus, electronic device, and computer-readable storage medium | |
CN114968572A (en) | Method and device for determining occupied memory during picture loading |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
CB02 | Change of applicant information |
Address after: 519000 Room 102, 202, 302 and 402, No. 325, Qiandao Ring Road, Tangjiawan Town, high tech Zone, Zhuhai City, Guangdong Province, Room 102 and 202, No. 327 and Room 302, No. 329
Applicant after: Zhuhai Jinshan Digital Network Technology Co.,Ltd.
Address before: 519000 Room 102, 202, 302 and 402, No. 325, Qiandao Ring Road, Tangjiawan Town, high tech Zone, Zhuhai City, Guangdong Province, Room 102 and 202, No. 327 and Room 302, No. 329
Applicant before: ZHUHAI KINGSOFT ONLINE GAME TECHNOLOGY Co.,Ltd.
GR01 | Patent grant | ||
GR01 | Patent grant |