CN113836705A - Method, device, storage medium and electronic device for processing illumination data - Google Patents
- Publication number
- CN113836705A (application number CN202111040489.1A)
- Authority
- CN
- China
- Prior art keywords
- color
- illumination
- target model
- light source
- virtual light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
- G06F2119/00—Details relating to the type or aim of the analysis or the optimisation
- G06F2119/02—Reliability analysis or reliability optimisation; Failure analysis, e.g. worst case scenario performance, failure mode and effects analysis [FMEA]
Abstract
The invention discloses a method and an apparatus for processing illumination data, a storage medium, and an electronic apparatus. The method comprises the following steps: acquiring a first position of a target model to be processed; determining a second position of a virtual light source based on the first position; simulating illumination of the target model based on the virtual light source at the second position to obtain a simulation result; and adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model. The method and apparatus achieve the technical effect of improving the efficiency of processing illumination data.
Description
Technical Field
The present invention relates to the field of computers, and in particular, to a method and an apparatus for processing illumination data, a storage medium, and an electronic apparatus.
Background
Currently, in illumination data processing, a model may be subjected to an illumination baking (Bake) process, which may include generating an illumination map (lightmap). However, adjusting the illumination map properly usually requires substantial lighting experience. An inexperienced user generating an illumination map often leaves the dark parts of the model pitch-black; even when the scene contains many lights, the model may still fail to be lit.
Therefore, because the effect of illumination data processing is difficult to predict, the user often has to debug repeatedly, which creates the technical problem of low efficiency in processing illumination data.
No effective solution has yet been proposed for the technical problem of low efficiency in processing illumination data in the prior art.
Disclosure of Invention
The invention mainly aims to provide a method and a device for processing illumination data, a storage medium and an electronic device, so as to at least solve the technical problem of low efficiency of processing the illumination data.
In order to achieve the above object, according to an aspect of the present invention, there is provided a method of processing illumination data. The method can comprise the following steps: acquiring a first position of a target model to be processed; determining a second position of the virtual light source based on the first position; simulating the illumination of the target model based on the virtual light source at a second position to obtain a simulation result; and adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model.
Optionally, simulating illumination of the target model based on the virtual light source to obtain a simulation result, including: determining a grayscale image based on the target model and the virtual light source; coloring the gray level image to obtain a color image; the color image is determined as a simulation result.
Optionally, the target model comprises a plurality of vertices, and determining a grayscale image based on the target model and the virtual light source comprises: determining a first distance based on the second position of the virtual light source and the position of each vertex to obtain a plurality of first distances, wherein the plurality of first distances correspond to the plurality of vertices one to one; the plurality of first distances are represented as a grayscale image.
Optionally, the method further comprises: acquiring the radius of a virtual light source; determining a first distance based on the second position of the virtual light source and the position of each vertex, comprising: based on the second location, the location of each vertex, and the radius, a first distance is determined.
Optionally, coloring the grayscale image to obtain a color image includes: converting the first distance into a second distance and a third distance based on a first parameter; determining a first color based on the second distance and a second color based on the third distance; and determining the color image based on the first color and the second color, wherein the first color is the color of a first region of the color image and the second color is the color of a second region of the color image.
Optionally, the method further comprises: adjusting the color intensity of the color image from the first color intensity to the second color intensity; determining a color image as a simulation result, comprising: a color image of the second color intensity is determined as a simulation result.
Optionally, the method further comprises: obtaining an original color map of a target model; adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model, comprising: adjusting the original color map based on the simulation result to obtain an adjustment result; and superposing the adjustment result and the illumination map to obtain target illumination data.
Optionally, adjusting the original color map based on the simulation result to obtain an adjustment result, including: and adjusting the original color map based on the second parameter and the simulation result to obtain an adjustment result.
Optionally, superimposing the adjustment result and the illumination map to obtain the target illumination data includes: obtaining the product of a third parameter, the adjustment result, first channel data of the illumination map, and second channel data of the illumination map; and determining the product as color data in the target illumination data.
Optionally, obtaining a first position of the target model to be processed includes: obtaining the axis position of a target model; the axial center position is determined as a first position.
Optionally, determining the second position of the virtual light source based on the first position comprises: a target offset is obtained, and a second position is determined based on the first position and the target offset.
In order to achieve the above object, according to another aspect of the present invention, there is also provided an illumination data processing apparatus. The apparatus may include: an acquisition unit configured to acquire a first position of a target model to be processed; a determination unit configured to determine a second position of the virtual light source based on the first position; a simulation unit configured to simulate the illumination of the target model based on the virtual light source at the second position to obtain a simulation result; and an adjustment unit configured to adjust the illumination map of the target model based on the simulation result to obtain the target illumination data of the target model.
To achieve the above object, according to another aspect of the present invention, there is provided a computer-readable storage medium. The computer readable storage medium stores a computer program, wherein when the computer program is executed by a processor, the apparatus where the computer readable storage medium is located is controlled to execute the method for processing the illumination data according to the embodiment of the present invention.
In order to achieve the above object, according to another aspect of the present invention, an electronic apparatus is provided. The electronic apparatus comprises a memory and a processor, wherein the memory stores a computer program, and the processor is configured to run the computer program to execute the method for processing illumination data according to the embodiment of the present invention.
In this embodiment, a first position of a target model to be processed is obtained; a second position of a virtual light source is determined based on the first position; illumination of the target model is simulated based on the virtual light source at the second position to obtain a simulation result; and the illumination map of the target model is adjusted based on the simulation result to obtain target illumination data of the target model. That is, in this embodiment, the virtual light source is placed according to the position of the target model, the illumination of the target model is simulated using the virtual light source, and the illumination map of the target model is then adjusted to obtain the illumination data of the target model. This achieves the effect of supplementing light to the target model with the virtual light source and reduces the cases in which the dark parts of the target model are pitch-black, thereby solving the technical problem of low efficiency in processing illumination data and achieving the technical effect of improving the processing efficiency of illumination data.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a block diagram of a hardware structure of a mobile terminal of a method for processing illumination data according to an embodiment of the present invention;
fig. 2 is a flowchart of a method of processing illumination data according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a light scene bake according to the related art;
FIG. 4 is a schematic diagram of direct illumination only, according to the related art;
FIG. 5 is a schematic diagram of direct illumination and indirect illumination, according to the related art;
fig. 6 is a schematic diagram of the effect before baking and the effect after baking, according to the related art;
FIG. 7 is a comparative schematic of illumination data processing according to embodiments of the present invention;
FIG. 8 is a comparative schematic of another illumination data processing according to embodiments of the present invention;
FIG. 9 is a graphical representation of the results of processing illumination data in accordance with an embodiment of the present invention;
FIG. 10 is a schematic illustration of a gray scale plot in accordance with embodiments of the present invention;
FIG. 11 is a schematic illustration of another gray scale plot in accordance with an embodiment of the present invention;
FIG. 12 is a schematic diagram of a graph of converting gray to color according to an embodiment of the present invention;
FIG. 13 is a schematic illustration of a comparison of a gray image and a color image in accordance with an embodiment of the present invention;
FIG. 14 is a comparative schematic of another illumination data processing according to embodiments of the present invention;
fig. 15 is a schematic diagram of an illumination data processing apparatus according to an embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without any creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be used. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The method provided by the embodiment of the application can be executed in a mobile terminal, a computer terminal or a similar operation device. Taking the example of being operated on a mobile terminal, fig. 1 is a hardware structure block diagram of the mobile terminal of a method for processing illumination data according to an embodiment of the present invention. As shown in fig. 1, the mobile terminal may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, and optionally may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration, and does not limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 can be used for storing computer programs, for example, software programs and modules of application software, such as a computer program corresponding to a data processing method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer programs stored in the memory 104, so as to implement the above-mentioned method. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
In this embodiment, a method for processing lighting data running on the mobile terminal is provided, and fig. 2 is a flowchart of a method for processing lighting data according to an embodiment of the present invention. As shown in fig. 2, the method may include the steps of:
step S202, a first position of a target model to be processed is obtained.
In the technical solution provided by step S202 above, the target model to be processed may be a scene model, for example, a scene model in a game scene, which may include, but is not limited to, models of the landscape class. The embodiment may determine the first position of the target model. Optionally, this embodiment obtains the axis center (pivot) position of the target model, which may be represented by frag.
Step S204, a second position of the virtual light source is determined based on the first position.
In the technical solution provided by step S204 of the present invention, after the first position of the target model to be processed is obtained, the second position of the virtual light source may be determined based on the first position.
In this embodiment, a virtual light source may be added in the shader. The virtual light source may be a fake point light source, which may be referred to as a simulated point light source or a simulated light source, and is used to simulate a light supplement effect on the target model. Optionally, the embodiment may determine the second position of the virtual light source based on the first position according to a position-related parameter; the second position may be the position of the center point of the virtual light source and may be represented by (x0, y0, z0). Optionally, the position-related parameters may be manually input by a user, so that the user can manually and flexibly adjust the target model.
Alternatively, the embodiment may obtain a target offset, an adjustable parameter that may be represented by pivotOffset, and determine the second position based on the first position and the target offset; for example, the first position may be offset by the target offset to obtain the second position of the virtual light source, i.e., second position = first position + target offset, so that the second position may be represented as new_pivot = first position + pivotOffset.
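The offset step above can be sketched as follows. This is a minimal illustration of second position = first position + target offset; the function and parameter names are illustrative, not taken from the patent.

```python
# Hedged sketch of step S204: placing the virtual light source.
def virtual_light_position(pivot, pivot_offset):
    """Offset the model's pivot (the first position) by the target offset
    to obtain the center of the virtual light source (the second position)."""
    return tuple(p + o for p, o in zip(pivot, pivot_offset))
```

For example, a model pivoted at (1, 2, 3) with an offset of (0, 5, 0) yields a light source centered at (1, 7, 3).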
And S206, simulating the illumination of the target model based on the virtual light source at the second position to obtain a simulation result.
In the technical solution provided in step S206 of the present invention, after the second position of the virtual light source is determined based on the first position, the illumination of the target model may be simulated based on the virtual light source at the second position, so as to obtain a simulation result.
In this embodiment, at the second position, the virtual light source may be adjusted, and the illumination of the target model may be simulated based on the second position of the virtual light source, the radius (Rx, Ry, Rz) of the virtual light source, and the like, so as to obtain a simulation result, which may be represented by color; the simulation result is thus a result of the simulated illumination. Optionally, the radius of the virtual light source may be manually input by a user and is an adjustable parameter, so that the range of the virtual light source can be adjusted by adjusting its radius.
And S208, adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model.
In the technical solution provided in step S208 of the present invention, after the illumination of the target model is simulated based on the virtual light source to obtain the simulation result, the illumination map of the target model may be adjusted based on the simulation result to obtain the target illumination data of the target model.
In this embodiment, the lighting map of the target model is a map generated by baking the target model, and may be produced by a target engine, where the target engine may be a game engine. The illumination map may include color channel (RGB) data and grayscale channel (Alpha) data, and the simulation result may be superimposed with the color channel data and the grayscale channel data of the illumination map to calculate the target illumination data of the target model, where the target illumination data may be local illumination data of the target model, including final color data.
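The superposition described above (and in the optional step that multiplies a third parameter, the adjustment result, and the lightmap's two channels) can be sketched as a per-channel product. This is a hedged reading of the text, assuming the "first channel data" is the lightmap's RGB and the "second channel data" its Alpha; all names are illustrative.

```python
def compose_target_illumination(k, adjusted_rgb, lightmap_rgb, lightmap_alpha):
    """Per-channel product of the third parameter k, the adjustment result
    (adjusted color map), the lightmap RGB data, and the lightmap Alpha data,
    yielding the color data of the target illumination data."""
    return tuple(k * a * c * lightmap_alpha
                 for a, c in zip(adjusted_rgb, lightmap_rgb))
```

For instance, with k = 2.0, an adjustment result of (1, 1, 1), lightmap RGB (0.5, 0.5, 0.5), and Alpha 2.0, each output channel is 2.0.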
Optionally, the lighting map of the target model in this embodiment is a map generated after the target model is subjected to scene-baked lighting; it records textures of the scene lighting data of the target model and may be used to enhance the lighting atmosphere and artistic effect of the scene. The illumination map is essentially one or more maps applied to the target model, containing information such as indirect illumination and shadows obtained by pre-calculation through illumination-map baking (it is also possible to bake only indirect illumination without shadows). The embodiment uses the illumination map to avoid real-time illumination and shadow calculation while the game is running, thereby improving running performance; the embodiment is therefore suitable for platforms with weaker computing power, such as mobile platforms.
The method of this embodiment is applied to scene baking; the scene baking may be light scene baking, a processing method in a game scene used to simulate the illumination environment and generate an illumination map, so that the use of dynamic illumination in the game is reduced and, in turn, a large amount of illumination calculation is avoided. Alternatively, in game applications, the baked lightmap lighting and dynamic direct lighting act simultaneously.
Through steps S202 to S208 described above, a first position of a target model to be processed is obtained; a second position of a virtual light source is determined based on the first position; illumination of the target model is simulated based on the virtual light source at the second position to obtain a simulation result; and the illumination map of the target model is adjusted based on the simulation result to obtain target illumination data of the target model. That is, in this embodiment, the virtual light source is placed according to the position of the target model, the illumination of the target model is simulated using the virtual light source, and the illumination map of the target model is then adjusted to obtain the illumination data of the target model. This achieves the effect of supplementing light to the target model with the virtual light source and reduces the cases in which the dark parts of the target model are completely dark, thereby solving the technical problem of low efficiency in processing illumination data and achieving the technical effect of improving the processing efficiency of illumination data.
The above method of this embodiment is further explained below.
As an alternative embodiment, in step S206, the simulating the illumination of the target model based on the virtual light source to obtain a simulation result, includes: determining a grayscale image based on the target model and the virtual light source; coloring the gray level image to obtain a color image; the color image is determined as a simulation result.
In this embodiment, when simulating the illumination of the target model based on the virtual light source to obtain the simulation result, a grayscale image, that is, a grayscale map, may be determined based on the target model and the virtual light source; the gray values may vary uniformly. After the grayscale image is determined, it may be colored and thereby converted into a color image. That is, the embodiment converts the grayscale map obtained from the target model and the virtual light source into a color map, and then determines the simulation result of the target model based on the color map.
As an alternative embodiment, the target model includes a plurality of vertices, and determining the grayscale image based on the target model and the virtual light source includes: determining a first distance based on the second position of the virtual light source and the position of each vertex to obtain a plurality of first distances, wherein the plurality of first distances correspond to the plurality of vertices one to one; the plurality of first distances are represented as a grayscale image.
In this embodiment, the target model includes a plurality of vertices. The position of each vertex may be determined, and a first distance may then be determined based on the second position of the virtual light source and the position (x1, y1, z1) of each vertex; the first distance may be the distance from the virtual light source to the vertex. A plurality of first distances corresponding one-to-one to the plurality of vertices is thus obtained, and the plurality of first distances is represented as a grayscale image; that is, the grayscale image of this embodiment may represent the plurality of first distances.
As an optional implementation, the method further comprises: acquiring the radius of a virtual light source; determining a first distance based on the second position of the virtual light source and the position of each vertex, comprising: based on the second location, the location of each vertex, and the radius, a first distance is determined.
In this embodiment, the radius of the virtual light source may be obtained in response to a first operation instruction acting on the graphical user interface; that is, the parameters input by the user may include the radius of the virtual light source, which may be used to adjust the range of the virtual light source. The embodiment may then determine a first distance, which may be denoted by a0, based on the second position (x0, y0, z0) of the virtual light source, the position (x1, y1, z1) of each vertex, and the radius (Rx, Ry, Rz), thereby generating a grayscale image from the first distance a0 corresponding to each vertex of the target model.
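One plausible form of the first distance a0 is sketched below. The text does not give the exact formula, so the per-axis radius normalization here is an assumption; only the inputs (second position, vertex position, radius) are taken from the description.

```python
import math

def saturate(x):
    """Clamp to [0, 1], as in shader languages."""
    return max(0.0, min(1.0, x))

def first_distance(light_pos, vertex_pos, radius):
    """Distance from the virtual light source (x0, y0, z0) to a vertex
    (x1, y1, z1), normalized per axis by the light radius (Rx, Ry, Rz)
    and clamped to [0, 1]. This normalization is an assumption, not the
    patent's exact formula."""
    dx = (vertex_pos[0] - light_pos[0]) / radius[0]
    dy = (vertex_pos[1] - light_pos[1]) / radius[1]
    dz = (vertex_pos[2] - light_pos[2]) / radius[2]
    return saturate(math.sqrt(dx * dx + dy * dy + dz * dz))
```

Under this reading, vertices at the light center map to gray value 0 and vertices at or beyond the radius map to 1, which yields the uniformly varying grayscale map described above.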
As an alternative embodiment, the process of rendering the grayscale image to obtain a color image includes: converting the first distance into a second distance and a third distance based on the first parameter; determining a first color based on the second distance and a second color based on the third distance; the color image is determined based on a first color and a second color, wherein the first color is a color of a first region of the color image and the second color is a color of a second region of the color image.
In this embodiment, when the grayscale image is colored to obtain a color image, a first parameter may be obtained first; the first parameter may be a parameter for controlling distance and may be represented by Mid. The embodiment may convert the first distance into the second distance based on the first parameter: for example, if the first distance is a0 and the second distance is a1, then a1 = saturate(a0/Mid). The second distance corresponds to the first region a1 of the color image and may be used to determine the first color of that region. Optionally, the first color may be determined according to an interpolation function (lerp) and a1: first color = lerp(color1.rgb × colorIntensity.x, color2.rgb × colorIntensity.y, a1), where color1.rgb and color2.rgb are color parameters used to determine the first color, colorIntensity.x and colorIntensity.y are parameters used to determine the intensity of the first color, and the first color may include red and green.
Optionally, the embodiment further converts the first distance into a third distance based on the first parameter: for example, if the third distance is a2, then a2 = saturate((a0-Mid)/(1.0-Mid)). The third distance corresponds to a second region a2 of the color image and may be used to determine the second color of that region. The second color may be determined according to the interpolation function (lerp) and a2: second color = lerp(color.rgb, color3.rgb × colorIntensity.z, a2), where color.rgb and color3.rgb are color parameters used to determine the second color, colorIntensity.z is used to determine the intensity of the second color, and the second color may include blue. The embodiment may determine the color image based on the first color and the second color: the first region may be dyed first and the second region dyed afterward, thereby obtaining the color image. The color image may be a three-color image (red, green, and blue) with red in the center, green in the middle, and blue at the periphery.
Optionally, the first region and the second region of this embodiment are adjacent regions in the color image, where the second region may surround the first region, that is, a distance between a point in the first region and a center of the color image may be smaller than a distance between a point in the second region and the center of the color image. For example, the color image may be a circular (or approximately circular) color image, the first region may be an inner circle of the color image, the second region may be an outer circle of the color image, the color of the inner circle may be determined by the second distance a1 = saturate(a0/Mid) corresponding to the inner circle, and the color of the outer circle may be determined by the third distance a2 = saturate((a0 − Mid)/(1.0 − Mid)) corresponding to the outer circle, thereby achieving the purpose of determining the circular (or approximately circular) color image.
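The two-stage coloring described above can be sketched in Python. The helpers mirror HLSL's saturate and lerp intrinsics; the particular Mid value, colors, and intensity values below are illustrative assumptions, not values taken from the patent.

```python
def saturate(x):
    # Clamp to [0, 1], as in HLSL's saturate().
    return max(0.0, min(1.0, x))

def lerp(a, b, t):
    # Component-wise linear interpolation between two RGB triples.
    return tuple(a_i + (b_i - a_i) * t for a_i, b_i in zip(a, b))

def shade(a0, mid, color1, color2, color3, intensity):
    # Map a normalized distance a0 to an RGB color: a1 colors the
    # inner region (color1 -> color2), then a2 blends the inner
    # result toward color3 in the outer region.
    ix, iy, iz = intensity
    a1 = saturate(a0 / mid)
    a2 = saturate((a0 - mid) / (1.0 - mid))
    inner = lerp(tuple(c * ix for c in color1),
                 tuple(c * iy for c in color2), a1)
    return lerp(inner, tuple(c * iz for c in color3), a2)

# Illustrative parameters: red center, green middle, blue periphery.
red, green, blue = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)
center = shade(0.0, 0.5, red, green, blue, (1.0, 1.0, 1.0))  # red
middle = shade(0.5, 0.5, red, green, blue, (1.0, 1.0, 1.0))  # green
edge = shade(1.0, 0.5, red, green, blue, (1.0, 1.0, 1.0))    # blue
```

At a0 = 0 only the first lerp contributes, and at a0 = 1 the second lerp fully overrides it, reproducing the red-to-green-to-blue gradient described above.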
It should be noted that the color image of this embodiment is obtained by performing a rendering process on a gray-scale image determined by the target model and the virtual light source, and is used to represent a simulation result of simulating illumination of the target model. The circular (or approximately circular) shape of the color image is only an example of the embodiment of the present invention and does not limit how the color image may be set: it may also be a triangle, a rectangle, a square, or any other shape that can form a region. Any shape of color image that can be used to represent a simulation result of simulating illumination of the target model falls within the scope of this embodiment, and no particular limitation is made here.
As an optional implementation, the method further comprises: adjusting the color intensity of the color image from the first color intensity to the second color intensity; determining a color image as a simulation result, comprising: a color image of the second color intensity is determined as a simulation result.
In this embodiment, after determining the color image based on the first color and the second color, the color intensity (ColorIntensity) of the color image may be adjusted from the original first color intensity to the second color intensity. Optionally, this embodiment multiplies the color data of the color image by a coefficient to adjust the color intensity: for example, if the color data of the color image is (0.5, 0, 0) and the coefficient is 2, multiplying by the coefficient yields the color data (1, 0, 0); the resulting value may even exceed 1. After adjusting the color intensity of the color image from the first color intensity to the second color intensity, the color image of the second color intensity may be determined as the simulation result.
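The intensity adjustment amounts to a per-channel multiply with no clamping; a minimal sketch using the (0.5, 0, 0) × 2 example from the text:

```python
def adjust_intensity(rgb, coefficient):
    # Multiply each channel by the coefficient; no clamping is applied,
    # so values may exceed 1, as the embodiment notes.
    return tuple(c * coefficient for c in rgb)

# (0.5, 0, 0) scaled by a coefficient of 2 becomes (1.0, 0, 0).
brighter = adjust_intensity((0.5, 0, 0), 2)
```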
As an optional implementation, the method further comprises: obtaining an original color map of a target model; adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model, comprising: adjusting the original color map based on the simulation result to obtain an adjustment result; and superposing the adjustment result and the illumination map to obtain target illumination data.
In this embodiment, an original color map of the target model is obtained, which may be referred to as a base color map (a Diffuse map, i.e., the color map of the target model before baking, which may be represented by Raw_data.tex0.rgb).
As an alternative embodiment, adjusting the original color map based on the simulation result to obtain an adjustment result includes: and adjusting the original color map based on the second parameter and the simulation result to obtain an adjustment result.
In this embodiment, the second parameter may be obtained in response to a second operation instruction acting on the graphical user interface; that is, the parameters input by the user may include a second parameter, which is an adjustable coloring parameter that may be used to adjust the color of the virtual light source and may be represented by BaseColor. This embodiment may adjust the original color map Raw_data.tex0.rgb based on the second parameter BaseColor.rgb and the simulation result color to obtain an adjustment result, for example, mtl.albedo.
As an optional implementation manner, the step of superimposing the adjustment result and the illumination map to obtain the target illumination data includes: acquiring a product of the third parameter, the adjustment result, the first channel data of the illumination map and the second channel data of the illumination map; the product is determined as color data in the target illumination data.
In this embodiment, when the adjustment result and the illumination map are superimposed to obtain the target illumination data, a third parameter may be obtained, where the third parameter may be 0.2. First channel data and second channel data of the illumination map may also be obtained, where the first channel data may be color channel data (RGB) and the second channel data may be gray channel data (Alpha). A product of the third parameter, the adjustment result, the first channel data of the illumination map, and the second channel data of the illumination map may then be obtained; for example, 0.2 × the adjustment result × the RGB of the illumination map × the Alpha of the illumination map is the final color data in the target illumination data.
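The superposition in this step is a straight per-channel product; a sketch assuming the 0.2 scale factor from the text and illustrative adjustment-result and lightmap values:

```python
def superimpose(third_param, adjustment, lightmap_rgb, lightmap_alpha):
    # final color = third parameter * adjustment result
    #               * lightmap RGB * lightmap Alpha, per channel
    return tuple(third_param * adj * rgb * lightmap_alpha
                 for adj, rgb in zip(adjustment, lightmap_rgb))

# Illustrative inputs: a fully lit adjustment result and a white lightmap.
final_color = superimpose(0.2, (1.0, 1.0, 1.0), (1.0, 1.0, 1.0), 1.0)
```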
This embodiment optimizes the baking shader: the second position of the virtual light source can be adjusted according to the input target offset, the range of the virtual light source is determined by the input radius, the color of the virtual light source is determined by the adjustable coloring parameter, and the virtual light source computation is added to the original color map of the target model. This achieves the effect of supplementing light (brightening) on the target model through the virtual light source via manually input parameters, so that even artists with little experience can make the final effect look better through adjustment. The situation in which dark parts of the target model appear dead black is reduced, the technical problem of low efficiency in processing illumination data is solved, and the technical effect of improving the processing efficiency of illumination data is achieved.
The technical solution of the embodiment of the present invention is further described below by way of example with reference to a preferred implementation, specifically by way of example with reference to a game scene in a game application.
Lighting scene baking is a common technique in game scenes, as shown in FIG. 3. Fig. 3 is a schematic diagram of lighting scene baking in the related art, which simulates a lighting environment and generates a lighting map, where the lighting map may be a set containing multiple illumination maps, so as to reduce the use of dynamic lighting in a game and greatly reduce lighting calculation.
Fig. 4 is a schematic view of direct illumination only according to the related art. Fig. 5 is a schematic diagram of direct illumination plus indirect illumination according to the related art. As shown in figs. 4 and 5, the illumination range with both direct and indirect illumination is larger than that with direct light only, and the baked light map and the dynamic direct light should act simultaneously in the game.
Lighting baking usually requires considerable lighting experience from the user to adjust; a user with insufficient experience may bake the dark parts of the model dead black, or face the problem that the model cannot be lit up no matter how many lights are placed in the scene.
Moreover, the effect before baking and the effect after baking differ somewhat in the related art, as shown in fig. 6. Fig. 6 is a schematic diagram of the effect before baking and the effect after baking according to the related art, where the scene model is a room scene model, and information such as illumination and shadow of the room scene model differs between the two effects. As a result, the actual baking effect is difficult to master, baking and debugging often have to be repeated continuously, and the process is not very friendly to users with insufficient experience.
In this embodiment, in order to improve the user's working efficiency, the baking shader is optimized: a virtual light source is added in the shader to simulate a fill-light effect on the model, and a manual local brightening function is added. The model is finely adjusted without changing the light arrangement of the baked illumination map, so that the problem of dead-black dark parts can be well reduced and local brightening achieved.
Fig. 7 is a comparative schematic diagram of illumination data processing according to an embodiment of the present invention. As shown in fig. 7, compared with the model before processing, manual fill-light parameters are added, yielding a model whose dark parts have been adjusted.
A lighting map is essentially one or more maps that are applied to a scene model. They contain information such as indirect lighting, shading, etc. obtained by precalculation in a lighting map baking mode (only indirect lighting may be baked and shading may not be baked when baking). The illumination map can be used for avoiding real-time illumination and shadow calculation during game running, the running performance of the game is improved, and the illumination map can be applied to a computing platform with weak performance, such as a mobile platform.
Fig. 8 is a comparative schematic diagram of another illumination data processing according to an embodiment of the invention. As shown in fig. 8, a virtual light source is additionally arranged in the shader to simulate a fill-light effect on the target model to be processed, thereby realizing a local brightening function; the target model is finely adjusted without changing the light arrangement of the baked illumination map of the target model, so that the problem of dead-black dark parts can be well reduced and local brightening achieved.
The above-described method of this embodiment is further described below.
The embodiment can add an algorithm simulating a point light source to a basic color map (Diffuse map) of the target model according to the input parameters, and can adjust the position, the color and the range of the virtual light source to achieve the purpose of processing the illumination data of the target model. For example, one could let a ball turn red in the center, green in the middle, and blue in the periphery, as shown in fig. 9. Fig. 9 is a diagram illustrating a result of processing illumination data according to an embodiment of the present invention.
This embodiment may determine the position of the virtual light source by offsetting the pivot position frag.pivot of the target model according to an offset PivotOffset, where the offset is an adjustable parameter; for example, the position of the virtual light source may be new_pivot = frag.pivot + PivotOffset.
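The offset step is a per-axis vector addition; the expression in the text above is truncated, so the field layout here is an assumption and the sample pivot and offset values are hypothetical.

```python
def virtual_light_position(pivot, pivot_offset):
    # new_pivot = pivot + PivotOffset, applied per axis.
    return tuple(p + o for p, o in zip(pivot, pivot_offset))

# Hypothetical pivot and user-supplied offset values.
new_pivot = virtual_light_position((0.0, 1.0, 0.0), (0.5, 0.25, 0.0))
```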
This embodiment calculates the distance from each vertex on the target model to the virtual light source. Optionally, given the vertex position (x1, y1, z1) of the target model, the position (x0, y0, z0) of the center point of the virtual light source, and the radius (Rx, Ry, Rz) of the virtual light source, the distance a0 from each vertex on the target model to the virtual light source may be expressed as the following formula:
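The formula image itself does not survive in this text. A plausible reconstruction, under the assumption that each axis is normalized by the corresponding radius component (so that a0 = 0 at the center and a0 = 1 on the ellipsoid surface), is:

```python
import math

def vertex_distance(vertex, center, radius):
    # Distance from a vertex (x1, y1, z1) to the light-source center
    # (x0, y0, z0), with each axis divided by its radius component
    # (Rx, Ry, Rz) -- an assumed reconstruction of the missing formula.
    return math.sqrt(sum(((v - c) / r) ** 2
                         for v, c, r in zip(vertex, center, radius)))

# A vertex halfway along the x-radius yields a0 = 0.5.
a0 = vertex_distance((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (2.0, 2.0, 2.0))
```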
this embodiment may represent the distance of each vertex to the virtual light source as a gray plot, as shown in fig. 10, where fig. 10 is a schematic illustration of a gray plot and the result of a0 is uniformly varied, according to an embodiment of the present invention.
This embodiment may color the grayscale image according to the previous result: a1 = saturate(a0/Mid) and a2 = saturate((a0 − Mid)/(1.0 − Mid)), where Mid is a parameter for controlling distance, a1 is the inner circle and a2 is the outer circle. After a1 is colored by color = lerp(Color1.rgb × ColorIntensity.x, Color2.rgb × ColorIntensity.y, a1) and a2 is then colored by color = lerp(color.rgb, Color3.rgb × ColorIntensity.z, a2), a three-color image can be obtained, as shown in fig. 11, where fig. 11 is a schematic view of another grayscale image according to an embodiment of the invention, in which a1 is the inner circle and a2 the outer circle.
This embodiment may then color according to the grayscale image of the previous step, turning the grayscale image into a color image, as shown in fig. 12, where fig. 12 is a schematic diagram of converting a grayscale image into a color image according to an embodiment of the present invention. This embodiment can also adjust the intensity of the color: for example, the color (0.5, 0, 0) multiplied by 2 gives the color (1, 0, 0); the value may even exceed 1.
FIG. 13 is a schematic illustration of a comparison of a grayscale image and a color image in accordance with an embodiment of the present invention. As shown in fig. 13, the color image may be a three-color map in which, from the inside out, the colors are sequentially red, green, and blue.
Alternatively, mtl.albedo = srgb2linear3(Raw_data.tex0.rgb × BaseColor.rgb) × color, where Raw_data.tex0.rgb represents the base color map, BaseColor is an adjustable color parameter that may be used to adjust the color of the virtual light source, and color is the simulated illumination result calculated above (the color image).
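The albedo combination above tints the base color map by BaseColor, converts the result from sRGB to linear, and multiplies by the simulated illumination. A sketch follows; the patent does not spell out srgb2linear3, so the gamma-2.2 approximation used here is an assumption, and the sample inputs are illustrative.

```python
def srgb2linear3(rgb):
    # Assumed sRGB -> linear conversion via a gamma-2.2 approximation;
    # the patent's exact srgb2linear3 implementation is not given.
    return tuple(c ** 2.2 for c in rgb)

def compute_albedo(tex0_rgb, base_color_rgb, sim_color):
    # mtl.albedo = srgb2linear3(Raw_data.tex0.rgb * BaseColor.rgb) * color
    tinted = tuple(t * b for t, b in zip(tex0_rgb, base_color_rgb))
    return tuple(l * c for l, c in zip(srgb2linear3(tinted), sim_color))

# Illustrative inputs: a white base map, a magenta-ish tint, and a
# simulated color-image value that zeroes the blue channel.
albedo = compute_albedo((1.0, 1.0, 1.0), (1.0, 0.5, 1.0), (1.0, 1.0, 0.0))
```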
This embodiment may superimpose the above result mtl.albedo with the lightmap to obtain the final color: final color = 0.2 × mtl.albedo × lightmap.rgb × lightmap.alpha, where lightmap refers to the map generated after baking the scene illumination and may be provided by the game engine itself.
Fig. 14 is a comparative schematic diagram of another illumination data processing according to an embodiment of the invention. As shown in fig. 14, even artists with little experience can, through adjustment, make the final effect of the target model look better and more controllable, and local illumination can be manually adjusted after baking; this embodiment can also add fill light manually, fine-tuning without changing the light arrangement of the baked illumination map, thereby achieving the purpose of correcting dead-black dark parts.
This embodiment optimizes the baking shader: the position of the virtual light source can be adjusted according to the input offset, the range of the virtual light source is determined by the input radius, the color of the virtual light source is determined by the adjustable coloring parameter, and the virtual light source computation is added to the original color map of the target model. This achieves the effect of supplementing light (brightening) on the target model through the virtual light source via manually input parameters, so that even artists with little experience can make the final effect look better and more controllable through adjustment. The situation in which dark parts of the target model appear dead black is reduced, the technical problem of low efficiency in processing illumination data is solved, and the technical effect of improving the processing efficiency of illumination data is achieved.
The embodiment of the invention also provides a device for processing the illumination data. It should be noted that the apparatus of this embodiment may be used to execute the method for processing the illumination data shown in fig. 2 according to the embodiment of the present invention.
Fig. 15 is a schematic diagram of an illumination data processing apparatus according to an embodiment of the present invention. As shown in fig. 15, the illumination data processing device 150 includes: an acquisition unit 151, a determination unit 152, a simulation unit 153, and an adjustment unit 154.
An obtaining unit 151, configured to obtain a first position of a target model to be processed.
A determining unit 152 for determining a second position of the virtual light source based on the first position.
And the simulation unit 153 is configured to simulate, at the second position, illumination of the target model based on the virtual light source, so as to obtain a simulation result.
And an adjusting unit 154, configured to adjust the illumination map of the target model based on the simulation result, so as to obtain target illumination data of the target model.
In the embodiment, the virtual light source is arranged by combining the position of the target model, the result of illumination of the target model is simulated by using the virtual light source, and then the illumination map of the target model is adjusted to obtain the illumination data of the target model, so that the effect of supplementing light (lightening) the target model through the virtual light source is achieved, the condition that dark parts of the target model are dead black is reduced, the technical problem of low efficiency in processing the illumination data is solved, and the technical effect of improving the processing efficiency of the illumination data is achieved.
Embodiments of the present invention also provide a computer-readable storage medium. The computer readable storage medium stores a computer program, wherein when the computer program is executed by a processor, the apparatus where the computer readable storage medium is located is controlled to execute the method for processing the illumination data according to the embodiment of the present invention.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and alternatively, they may be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, and in some cases, the steps shown or described may be performed in an order different than that described herein, or they may be separately fabricated into individual integrated circuit modules, or multiple ones of them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.
Claims (14)
1. A method of processing illumination data, comprising:
acquiring a first position of a target model to be processed;
determining a second position of a virtual light source based on the first position;
simulating the illumination of the target model based on the virtual light source at the second position to obtain a simulation result;
and adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model.
2. The method of claim 1, wherein simulating illumination of the target model based on the virtual light source results in a simulation result, comprising:
determining a grayscale image based on the target model and the virtual light source;
coloring the gray level image to obtain a color image;
and determining the color image as the simulation result.
3. The method of claim 2, wherein the target model includes a plurality of vertices, and wherein determining a grayscale image based on the target model and the virtual light source includes:
determining a first distance based on the second position of the virtual light source and the position of each vertex to obtain a plurality of first distances, wherein the first distances are in one-to-one correspondence with the vertices;
representing the plurality of first distances as the grayscale image.
4. The method of claim 3,
the method further comprises the following steps: acquiring the radius of the virtual light source;
determining a first distance based on the second position of the virtual light source and the position of each of the vertices, comprising: determining the first distance based on the second location, the location of each of the vertices, and the radius.
5. The method of claim 4, wherein rendering the grayscale image to obtain a color image comprises:
converting the first distance into a second distance and a third distance based on a first parameter;
determining a first color based on the second distance and a second color based on the third distance;
determining the color image based on the first color and the second color, wherein the first color is a color of a first region of the color image and the second color is a color of a second region of the color image.
6. The method of claim 2,
the method further comprises the following steps: adjusting the color intensity of the color image from a first color intensity to a second color intensity;
determining the color image as the simulation result, including: determining the color image of the second color intensity as the simulation result.
7. The method of claim 1,
the method further comprises the following steps: obtaining an original color map of the target model;
adjusting the illumination map of the target model based on the simulation result to obtain target illumination data of the target model, including: adjusting the original color map based on the simulation result to obtain an adjustment result; and superposing the adjustment result and the illumination map to obtain the target illumination data.
8. The method of claim 7, wherein adjusting the original color map based on the simulation result to obtain an adjustment result comprises:
and adjusting the original color map based on a second parameter and the simulation result to obtain the adjustment result.
9. The method of claim 7, wherein superimposing the adjustment result and the illumination map to obtain the target illumination data comprises:
obtaining a product of a third parameter, the adjustment result, the first channel data of the illumination map, and the second channel data of the illumination map;
determining the product as color data in the target illumination data.
10. The method according to any one of claims 1 to 9, wherein obtaining a first position of the object model to be processed comprises:
obtaining the axis position of the target model;
determining the axis location as the first location.
11. The method of any one of claims 1 to 9, wherein determining the second position of the virtual light source based on the first position comprises:
and acquiring a target offset, and determining the second position based on the first position and the target offset.
12. An apparatus for processing illumination data, comprising:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring a first position of a target model to be processed;
a determining unit for determining a second position of the virtual light source based on the first position;
the simulation unit is used for simulating the illumination of the target model based on the virtual light source at the second position to obtain a simulation result;
and the adjusting unit is used for adjusting the illumination map of the target model based on the simulation result to obtain the target illumination data of the target model.
13. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, controls an apparatus in which the computer-readable storage medium is located to carry out the method of any one of claims 1 to 11.
14. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the method of any one of claims 1 to 11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111040489.1A CN113836705B (en) | 2021-09-06 | 2021-09-06 | Method, device, storage medium and electronic device for processing illumination data |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111040489.1A CN113836705B (en) | 2021-09-06 | 2021-09-06 | Method, device, storage medium and electronic device for processing illumination data |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113836705A true CN113836705A (en) | 2021-12-24 |
CN113836705B CN113836705B (en) | 2025-01-21 |
Family
ID=78962346
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111040489.1A Active CN113836705B (en) | 2021-09-06 | 2021-09-06 | Method, device, storage medium and electronic device for processing illumination data |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113836705B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004236157A (en) * | 2003-01-31 | 2004-08-19 | Nec Access Technica Ltd | Image processing device, image processing method, and image processing program |
US20100302272A1 (en) * | 2009-06-01 | 2010-12-02 | Apple Inc. | Enhancing Images Using Known Characteristics of Image Subjects |
US20160307324A1 (en) * | 2015-04-15 | 2016-10-20 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and storage medium for lighting processing on image using model data |
CN111724313A (en) * | 2020-04-30 | 2020-09-29 | 完美世界(北京)软件科技发展有限公司 | Shadow map generation method and device |
CN111862290A (en) * | 2020-07-03 | 2020-10-30 | 完美世界(北京)软件科技发展有限公司 | Radial fuzzy-based fluff rendering method and device and storage medium |
CN111899325A (en) * | 2020-08-13 | 2020-11-06 | 网易(杭州)网络有限公司 | Rendering method and device of crystal stone model, electronic equipment and storage medium |
CN112819941A (en) * | 2021-03-05 | 2021-05-18 | 网易(杭州)网络有限公司 | Method, device, equipment and computer-readable storage medium for rendering water surface |
Non-Patent Citations (3)
Title |
---|
SONG Zhendian; HOU Lantian; ZHANG Hui: "Program design for conversion between color images and grayscale images", Electronic Technology, no. 09, 25 September 2008 (2008-09-25), pages 33 - 36 *
ZHANG Na; QIN Pinle; ZENG Jianchao; LI Qi: "Grayscale image colorization algorithm based on dense neural networks", Journal of Computer Applications, no. 06, 21 January 2019 (2019-01-21), pages 1816 - 1823 *
MO Xiaofei; DING Youdong: "Pencil sketch generation using shape features", Journal of Image and Graphics, no. 02, 16 February 2013 (2013-02-16), pages 219 - 24 *
Also Published As
Publication number | Publication date |
---|---|
CN113836705B (en) | 2025-01-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112215934B (en) | Game model rendering method and device, storage medium and electronic device | |
CN111009026B (en) | Object rendering method and device, storage medium and electronic device | |
CN108564646B (en) | Object rendering method and device, storage medium and electronic device | |
CN112316420B (en) | Model rendering method, device, equipment and storage medium | |
JP3141245B2 (en) | How to display images | |
CN109448089A (en) | A kind of rendering method and device | |
CN113648652B (en) | Object rendering method and device, storage medium and electronic equipment | |
CN110443877B (en) | Model rendering method, device, terminal equipment and storage medium | |
CN113450440B (en) | Method, apparatus, computer-readable storage medium, and electronic device for rendering image | |
CN110115841B (en) | Rendering method and device for vegetation object in game scene | |
CN107657648B (en) | Real-time efficient dyeing method and system in mobile game | |
CN111862285A (en) | Rendering method and device for character skin, storage medium, and electronic device | |
CN112274934A (en) | Model rendering method, device, equipment and storage medium | |
CN114119818A (en) | Rendering method, device and device for scene model | |
CN114119848A (en) | Model rendering method and device, computer equipment and storage medium | |
CN109064431B (en) | Picture brightness adjusting method, equipment and storage medium thereof | |
CN111784814B (en) | Virtual character skin adjustment method and device | |
CN113313798B (en) | Cloud picture manufacturing method and device, storage medium and computer equipment | |
CN113440845B (en) | Virtual model rendering method and device, storage medium and electronic device | |
CN113836705B (en) | Method, device, storage medium and electronic device for processing illumination data | |
CN114549732A (en) | Model rendering method and device and electronic equipment | |
CN114187398A (en) | Processing method and device for human body illumination rendering based on normal map | |
CN114565709A (en) | Data storage management method, object rendering method and device | |
JP7301453B2 (en) | IMAGE PROCESSING METHOD, IMAGE PROCESSING APPARATUS, COMPUTER PROGRAM, AND ELECTRONIC DEVICE | |
CN103400410B (en) | A kind of scab pattern outline interactive rendering intent and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||