
CN111583365B - Processing method and device for animation element display, storage medium and terminal - Google Patents


Info

Publication number
CN111583365B
CN111583365B (application CN202010332034.6A)
Authority
CN
China
Prior art keywords
display
coefficient
animation
animation element
depth value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010332034.6A
Other languages
Chinese (zh)
Other versions
CN111583365A (en)
Inventor
陈志强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202010332034.6A priority Critical patent/CN111583365B/en
Publication of CN111583365A publication Critical patent/CN111583365A/en
Application granted granted Critical
Publication of CN111583365B publication Critical patent/CN111583365B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • G06T13/203D [Three Dimensional] animation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/50Lighting effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a processing method and apparatus for animation element display, a storage medium, and a terminal, relating to the technical field of data processing. Its main aim is to solve the problem that existing techniques cannot display semi-transparent particle animation elements meeting the requirements of different scenes, which affects the display accuracy of such elements and reduces their display efficiency. The method comprises: acquiring a focus coefficient for displaying an animation element and a display coefficient of the animation element; determining a display depth value of the animation element according to the focus coefficient and the display coefficient; and generating an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The method is mainly used for processing animation elements.

Description

Processing method and device for animation element display, storage medium and terminal
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a processing method and apparatus for displaying animation elements, a storage medium, and a terminal.
Background
With the rapid development of image processing technology, three-dimensional animation technology is also advancing rapidly. To display three-dimensional animation with a diffuse reflection effect, semi-transparent particles are configured and combined with animation elements.
At present, semi-transparent particle animation elements are displayed only according to their diffuse reflection. As a result, elements meeting the requirements of different scenes cannot be displayed, and scene requirements involving near-and-far (depth-of-field) effects cannot be met.
Disclosure of Invention
In view of this, the present invention provides a processing method and apparatus for animation element display, a storage medium, and a terminal, mainly aiming to solve the problem that existing techniques cannot display semi-transparent particle animation elements meeting different scene requirements, which affects the display accuracy of such elements and thereby reduces their display efficiency.
According to one aspect of the present invention, there is provided a processing method of animation element display, including:
acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
determining a display depth value of the animation element according to the focus coefficient and the display coefficient;
and generating an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
According to another aspect of the present invention, there is provided a processing apparatus for animation element display, comprising:
the acquisition module is used for acquiring the focal coefficients for displaying the animation elements and the display coefficients of the animation elements;
the determining module is used for determining the display depth value of the animation element according to the focal coefficient and the display coefficient;
and the generation module is used for generating an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
According to still another aspect of the present invention, there is provided a storage medium having stored therein at least one executable instruction for causing a processor to perform operations corresponding to the processing method for animation element display as described above.
According to still another aspect of the present invention, there is provided a terminal including: the device comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete communication with each other through the communication bus;
the memory is used for storing at least one executable instruction, and the executable instruction enables the processor to execute the operation corresponding to the processing method for displaying the animation elements.
By means of the above, the technical solution provided by the embodiment of the invention has at least the following advantages:
compared with the prior art that the semi-transparent particle animation elements are only displayed according to diffuse reflection degree when being displayed, the embodiment of the invention obtains the focus coefficient for displaying the animation elements and the display coefficient of the animation elements; determining a display depth value of the animation element according to the focus coefficient and the display coefficient; and generating an animation element diffuse reflection map corresponding to the animation elements according to the display depth values, so as to establish the display depth of the element animation in the three-dimensional display process by using the focus coefficients and the display coefficients of the animation elements, and generating the animation element diffuse reflection map matched with each animation element by using the determined display depth values, thereby meeting the requirements of the animation scene for depth display, improving the display accuracy of the semi-transparent particle animation elements, providing diversified display effects and further improving the display efficiency of the animation elements.
The foregoing is only an overview of the technical solution of the present invention. To allow the technical means of the invention to be understood more clearly and implemented according to the description, and to make the above and other objects, features, and advantages of the invention more apparent, specific embodiments are set forth below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a flowchart of a processing method for displaying an animation element according to an embodiment of the present invention;
FIG. 2 is a flow chart of another method for processing an animation element display according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of displaying a snowflake animation element without animation element display processing according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of displaying a snowflake animation element after processing of displaying an animation element according to an embodiment of the present invention;
FIG. 5 is a block diagram showing the composition of a processing device for displaying an animation element according to an embodiment of the present invention;
FIG. 6 is a block diagram showing another processing apparatus for displaying animation elements according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The embodiment of the invention provides a processing method for displaying animation elements, as shown in fig. 1, the method comprises the following steps:
101. The focus coefficient for displaying the animation element and the display coefficient of the animation element are acquired.
The focus coefficient describes the clearest display state of the animation element in the displayed animation. It may be expressed as the definition configured for displaying the clearest pixels, or as the distance at which the animation element is displayed most clearly in the three-dimensional effect diagram, that is, the focal distance at which the element is sharpest. The display coefficient describes the display state of the animation element relative to the display window: it may likewise be expressed as a pixel definition, or as the distance of the animation element from the display window in the three-dimensional effect diagram.
The animation element in the embodiment of the present invention is applicable to, but not limited to, semi-transparent particle animation elements, and applies to three-dimensional animation in games, entertainment (video), and similar applications; the embodiment of the present invention imposes no particular limitation. A semi-transparent particle animation element is an animation element with a diffuse reflection degree, so that elements with different degrees of transparency are displayed; it is suitable for configuring animation elements in animated special effects.
102. The display depth value of the animation element is determined according to the focus coefficient and the display coefficient.
The display depth value represents the degree of blurring with which the animation element is displayed in the animation effect diagram, and thus its depth in that diagram. In the embodiment of the invention, since the focus coefficient and the display coefficient are both a display state or a display distance of the animation element in the animation, the display depth value can be determined by comparing the two; the embodiment of the present invention imposes no particular limitation.
It should be noted that, because the display depth of animation elements in the three-dimensional effect diagram may be determined from either their display position or their pixel definition, the comparison can be made from the display-position angle or from the pixel-definition angle, and the display depth value is obtained from the comparison result. For example, the comparison result may be that the focus coefficient is smaller than, equal to, or greater than the display coefficient, and the display depth value is then determined from the comparison result according to a preset calculation method.
103. An animation element diffuse reflection map corresponding to the animation element is generated according to the display depth value.
In the animation element diffuse reflection map, animation elements exhibit different blurring and diffuse reflection effects based on their display depth values, thereby meeting scene requirements with near-and-far effects, improving the display accuracy of semi-transparent particle animation elements, and improving the display efficiency of animation elements.
Compared with the prior art, in which semi-transparent particle animation elements are displayed only according to their diffuse reflection, the processing method for animation element display provided by the embodiment of the invention acquires a focus coefficient for displaying the animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The focus coefficient and display coefficient of the animation element thus establish its display depth in the three-dimensional display process, and the determined display depth value is used to generate a diffuse reflection map matched to each animation element. This meets the depth-display requirements of the animation scene, improves the display accuracy of semi-transparent particle animation elements, provides diversified display effects, and further improves the display efficiency of animation elements.
The embodiment of the invention provides another processing method for displaying animation elements, as shown in fig. 2, the method comprises the following steps:
201. An input focus coefficient for displaying the animation element is received, and a display coefficient matching the animation element is determined from the animation element's display state relative to the display image.
For the embodiment of the invention, since the focus coefficient is the clearest display state of the animation element in the displayed animation, it may be input by the user, and the current end receives it. The display coefficient matching the animation element is then determined from the element's display state relative to the display image. The display state may be the position of the animation element relative to the three-dimensional effect diagram, or its display size within that diagram, so that a matching display coefficient can be determined from it. For example, the display coefficient may be obtained from the position difference between the display position of the semi-transparent particle animation element in the three-dimensional effect diagram and the position of the display window; the embodiment of the present invention imposes no particular limitation.
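As a hedged illustration of this step, the display coefficient could be derived from the position difference just described. The function name `display_coefficient` and the use of Euclidean distance are assumptions made for this sketch, not the patent's literal formula:

```python
import math

def display_coefficient(element_pos, window_pos):
    # Hypothetical reading of step 201: the display coefficient is the
    # distance from the semi-transparent particle's position in the
    # three-dimensional effect diagram to the display window position.
    return math.dist(element_pos, window_pos)

# A particle 5 units straight in front of the display window:
coeff = display_coefficient((0.0, 0.0, 5.0), (0.0, 0.0, 0.0))
```

Any other monotone function of the position difference would serve the same role; the embodiment leaves the exact mapping open.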
The animation element in the embodiment of the present invention is applicable to, but not limited to, semi-transparent particle animation elements, and applies to three-dimensional animation in games, entertainment (video), and similar applications; the embodiment of the present invention imposes no particular limitation. A semi-transparent particle animation element is an animation element with a diffuse reflection value, so that elements with different degrees of transparency are displayed; it is suitable for configuring animation elements in animated special effects.
202. The focus coefficient is compared with the display coefficient, and the display depth value of the animation element is determined according to the comparison result.
For the embodiment of the invention, in order to determine the display depth value accurately, the focus coefficient is compared with the display coefficient, and the display depth value of the animation element is determined according to the comparison result. From the position angle, the comparison result may be that the focus coefficient is smaller than or equal to the display coefficient, meaning the animation element's relative position is farther away than the focal position, or that the focus coefficient is larger than the display coefficient, meaning the element's relative position is nearer than the focal position. From the pixel-definition angle, a focus coefficient smaller than or equal to the display coefficient means the element's display size is smaller than or equal to its size at the focal position, with its pixel definition lower than at the focal position; a focus coefficient larger than the display coefficient means the element's display size is larger than its size at the focal position, while its pixel definition is still lower than at the focal position.
For the embodiment of the present invention, in order to make effective use of the comparison of the focus coefficient and the display coefficient when determining the display depth value, step 202 may specifically be: if the display coefficient is smaller than or equal to the focus coefficient, the display depth value is determined as the ratio of (focus coefficient minus display coefficient) to the focal distance; if the display coefficient is larger than the focus coefficient, the display depth value is determined as the ratio of (display coefficient minus focus coefficient) to the focal distance.
It should be noted that, for the calculation of the display depth value: if the display coefficient is smaller than or equal to the focus coefficient, the display depth value is the difference (focus coefficient minus display coefficient) divided by the focal distance, for example (distance between the focal point and the display window) / focal distance; if the display coefficient is greater than the focus coefficient, the display depth value is the difference (display coefficient minus focus coefficient) divided by the focal distance, for example (distance between the particle and the display window) / focal distance. The calculated display depth value lies between 0 and 1.
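The two branches above can be sketched as follows. This is a minimal illustration of the stated ratios (the function and parameter names are assumptions); the result falls in [0, 1] as long as the coefficients stay within one focal distance of each other:

```python
def display_depth(focus_coeff, display_coeff, focal_distance):
    # Display coefficient <= focus coefficient:
    # depth = (focus coefficient - display coefficient) / focal distance.
    if display_coeff <= focus_coeff:
        return (focus_coeff - display_coeff) / focal_distance
    # Display coefficient > focus coefficient:
    # depth = (display coefficient - focus coefficient) / focal distance.
    return (display_coeff - focus_coeff) / focal_distance
```

Note that the branch structure makes the depth symmetric about the focal position: an element one unit in front of it and one unit behind it get the same depth value.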
203. A display scaling value is acquired, and the display depth value is adjusted according to the display scaling value.
For the embodiment of the present invention, in order for animation elements based on the display depth value to satisfy more display scenes, a display scaling value is acquired; it may be entered by the user or preconfigured, and the embodiment of the present invention imposes no particular limitation. Specifically, the display depth value is adjusted according to the display scaling value, which improves the display accuracy of the animation element.
It should be noted that the adjusted display depth value is obtained by multiplying the display depth value by the display scaling value, where the display scaling value ranges over 0 to n. Here, Diffuse refers to the diffuse reflection map, that is, a map showing the intrinsic color of an object. A Diffuse map has LOD (Level Of Detail) levels, for example levels 0 to 10, each corresponding to one MipMap; the map width of each MipMap level is half that of the previous level, so its size is one quarter of the previous level's. The animation element diffuse reflection map is thus obtained through the acquired display scaling value in 0 to n. Since each successive MipMap level is smaller and is an averaged blur of the previous one, the blurring effect of the display is realized. In addition, the LOD coefficient supports varying an object's importance according to parameters such as its distance from the observer, its position, speed, or viewing angle, which in turn reduces the complexity of rendering the 3D model.
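The scaling adjustment and the halving of MipMap sizes described above can be shown in a short sketch; the function names are assumptions made for illustration:

```python
def lod_coefficient(display_depth, display_scale):
    # Step 203: the adjusted display depth value, used as the LOD
    # coefficient, is the display depth value (in 0..1) multiplied
    # by the display scaling value (in 0..n).
    return display_depth * display_scale

def mip_size(base_width, base_height, level):
    # Each MipMap level's width (and height) is half the previous
    # level's, so its area is one quarter of the previous level's.
    return base_width >> level, base_height >> level
```

For instance, with ten LOD levels the display scaling value would be 10, and a depth value of 0.57 maps to an LOD coefficient of 5.7, matching the worked example later in this section.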
204. The diffuse reflection coefficient of the animation element is determined according to the adjusted display depth value, and trilinear interpolation sampling is performed based on the map levels determined by the diffuse reflection coefficient to generate the animation element diffuse reflection map.
For the embodiment of the invention, in order to generate a matching animation element diffuse reflection map and meet the display effect of diffuse-reflection animation elements, the diffuse reflection coefficient of the animation element is a multi-level-of-detail display coefficient, such as the LOD coefficient of the Diffuse map. The adjusted display depth value may be used directly as the diffuse reflection coefficient, or it may be further selected manually. Once the diffuse reflection coefficient is determined, trilinear sampling is performed with the LOD coefficient: the Diffuse map is sampled using two-dimensional texture coordinates (range 0 to 1), with the LOD coefficient giving the MipMap level to sample. The coefficient may be a floating-point number, and the sampling target obtained is the semi-transparent particle animation element, that is, the animation element diffuse reflection map.
As shown in Fig. 3, when snowflake animation elements are displayed without the animation element display processing, far and near snowflakes have the same degree of blur in the display image, and the sharpness corresponding to different focal points cannot be shown. In the embodiment of the invention, the display scaling value is preset within the user-configured range 0 to n, typically as the maximum number of LOD levels; for example, with LOD levels 0 to 10 the display scaling value is 10, that is, the diffuse reflection hierarchy comprises 10 levels, and within each level the display depth value ranges over 0 to 1. The display depth value is therefore adjusted by the display scaling value to obtain the LOD coefficient of the Diffuse map, and the MipMap levels are determined from that LOD coefficient. For example, if the calculated display depth value is 0.57, multiplying it by the display scaling value 10 gives an in-hierarchy LOD coefficient of 5.7, and the two MipMap levels to sample are determined to be levels 5 and 6.
In addition, in the embodiment of the invention, trilinear interpolation sampling based on the map levels proceeds as follows: bilinear sampling is performed independently on each of the two determined MipMap levels, each yielding a color RGBA value for the pixel, and linear interpolation is then performed between the two RGBA values. The interpolation coefficient is the LOD coefficient corresponding to the adjusted display depth value minus the LOD coefficient of the lower of the two MipMap levels; for example, with the in-hierarchy LOD coefficient 5.7 obtained from the display scaling value and the lower of levels 5 and 6 being 5, the interpolation coefficient is 5.7 minus 5 = 0.7. Based on this sampling result, the animation element diffuse reflection map is obtained, and the snowflake display shown in Fig. 4 results after animation element display processing, in which the sharpness of far and near animation elements is clearly distinguished.
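A hedged sketch of the trilinear sampling just described: bilinear sampling of the two bracketing MipMap levels, then a linear blend whose weight is the LOD coefficient minus the lower level. Representing each level as a nested list of RGBA tuples is an assumption for this sketch, not the patent's data layout:

```python
import math

def bilinear_sample(mip, u, v):
    # mip: 2D grid of RGBA tuples; u, v are texture coordinates in [0, 1].
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    def lerp(a, b, t):
        return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))
    top = lerp(mip[y0][x0], mip[y0][x1], fx)
    bot = lerp(mip[y1][x0], mip[y1][x1], fx)
    return lerp(top, bot, fy)

def trilinear_sample(mips, lod, u, v):
    # Sample the two MipMap levels bracketing the LOD coefficient and
    # blend them; the weight is lod minus the lower level's coefficient
    # (e.g. 5.7 - 5 = 0.7 in the worked example above).
    lo = min(int(math.floor(lod)), len(mips) - 1)
    hi = min(lo + 1, len(mips) - 1)
    t = lod - lo
    a = bilinear_sample(mips[lo], u, v)
    b = bilinear_sample(mips[hi], u, v)
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))
```

The clamps on `lo` and `hi` keep the sampler inside the mip chain when the LOD coefficient reaches the last level; GPU texture units perform the equivalent operation in hardware.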
Further, in order to ensure that the animation element diffuse reflection map generated from the display depth value meets the user's display requirements, the embodiment of the invention further comprises: judging whether the display depth value exceeds a preset depth threshold; and if so, indicating that the display coefficient and/or the focus coefficient should be updated.
The preset depth threshold is a display safety threshold preconfigured by the user to avoid distortion of the displayed animation element. When the display depth value exceeds it, the user is instructed to update the display coefficient and/or the focus coefficient, and the corresponding display depth value is then recalculated.
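The safeguard above amounts to a simple threshold test; a minimal sketch follows, with the function name and boolean return convention being assumptions:

```python
def needs_coefficient_update(display_depth, depth_threshold):
    # Compare the computed display depth value against the user's
    # preconfigured display safety threshold; True means the display
    # and/or focus coefficient should be re-entered and the display
    # depth value recalculated.
    return display_depth > depth_threshold
```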
Compared with the prior art, in which semi-transparent particle animation elements are displayed only according to their diffuse reflection, the embodiment of the invention acquires a focus coefficient for displaying the animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The focus coefficient and display coefficient of the animation element thus establish its display depth in the three-dimensional display process, and the determined display depth value is used to generate a diffuse reflection map matched to each animation element. This meets the depth-display requirements of the animation scene, improves the display accuracy of semi-transparent particle animation elements, provides diversified display effects, and further improves the display efficiency of animation elements.
Further, as an implementation of the method shown in fig. 1, an embodiment of the present invention provides a processing apparatus for displaying an animation element, as shown in fig. 5, where the apparatus includes: an acquisition module 31, a determination module 32, a generation module 33.
An acquisition module 31 for acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
a determining module 32, configured to determine a display depth value of the animation element according to the focal coefficient and the display coefficient;
and the generating module 33 is used for generating an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
Compared with the prior art, in which semi-transparent particle animation elements are displayed only according to their diffuse reflection, the embodiment of the invention acquires a focus coefficient for displaying the animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. The focus coefficient and display coefficient of the animation element thus establish its display depth in the three-dimensional display process, and the determined display depth value is used to generate a diffuse reflection map matched to each animation element. This meets the depth-display requirements of the animation scene, improves the display accuracy of semi-transparent particle animation elements, provides diversified display effects, and further improves the display efficiency of animation elements.
Further, as an implementation of the method shown in fig. 2, an embodiment of the present invention provides another processing apparatus for displaying an animation element, as shown in fig. 6, where the apparatus includes: the device comprises an acquisition module 41, a determination module 42, a generation module 43, an adjustment module 44, a judgment module 45 and an indication module 46.
An acquisition module 41 for acquiring a focus coefficient for displaying the animation element, and a display coefficient of the animation element;
a determining module 42, configured to determine a display depth value of the animation element according to the focal coefficient and the display coefficient;
and the generating module 43 is configured to generate an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
Further, the determining module 42 is specifically configured to compare the focal coefficient and the display coefficient, and determine a display depth value of the animation element according to a comparison result.
Further, the determining module 42 is specifically configured to, if the display coefficient is less than or equal to the focus coefficient, determine the display depth value by using the ratio of the difference obtained by subtracting the display coefficient from the focus coefficient to the focus distance;
the determining module 42 is further specifically configured to, if the display coefficient is greater than the focus coefficient, determine the display depth value by using the ratio of the difference obtained by subtracting the focus coefficient from the display coefficient to the focus distance.
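For illustration only, the comparison logic described above can be sketched in Python as follows; the names `focus_coeff`, `display_coeff`, and `focus_distance` are assumed placeholders, not identifiers from the patent:

```python
def display_depth_value(focus_coeff: float, display_coeff: float,
                        focus_distance: float) -> float:
    # If the display coefficient does not exceed the focus coefficient,
    # the depth value is (focus - display) / focus_distance;
    # otherwise it is (display - focus) / focus_distance.
    if display_coeff <= focus_coeff:
        return (focus_coeff - display_coeff) / focus_distance
    return (display_coeff - focus_coeff) / focus_distance
```

In both branches the result grows with the distance between the two coefficients, so elements farther from the focal plane receive a larger depth value and hence a stronger blur.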
Further, the apparatus further comprises:
the adjustment module 44 is configured to obtain a display scaling value, and adjust the display depth value according to the display scaling value.
Further, the generating module 43 is specifically configured to determine a diffuse reflection coefficient of the animation element according to the adjusted display depth value, and to perform tri-linear interpolation sampling based on the map level determined by the diffuse reflection coefficient, so as to generate the animation element diffuse reflection map.
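A minimal sketch of how the adjustment and sampling steps might fit together; the clamping to [0, 1] and the linear mapping of the coefficient onto a mip level are illustrative assumptions, not steps stated in the patent:

```python
def adjust_depth(depth_value: float, display_scale: float) -> float:
    # Adjust the raw display depth value by the display scaling value.
    return depth_value * display_scale


def mip_level_from_depth(adjusted_depth: float, max_level: int) -> float:
    # Treat the adjusted depth as a diffuse reflection coefficient,
    # clamp it to [0, 1], and map it onto the map (mipmap) hierarchy.
    # Tri-linear sampling then blends the two nearest integer levels.
    coeff = min(max(adjusted_depth, 0.0), 1.0)
    return coeff * max_level
```

A fractional return value is deliberate: tri-linear interpolation samples the two adjacent integer map levels bilinearly and blends them by the fractional part.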
Further, the obtaining module 41 is specifically configured to receive an entered focus coefficient for displaying the animation element, and to determine the display coefficient matching the animation element according to the display state of the animation element in the display image.
Further, the apparatus further comprises:
a judging module 45, configured to judge whether the display depth value exceeds a preset depth threshold;
and the indication module 46 is configured to, if yes, indicate to update the display coefficient and/or the focus coefficient.
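The judging and indicating modules amount to a simple threshold guard; a hypothetical sketch (`depth_threshold` is an assumed preset value, not an identifier from the patent):

```python
def needs_coefficient_update(depth_value: float, depth_threshold: float) -> bool:
    # Indicate that the display coefficient and/or the focus coefficient
    # should be updated when the depth value exceeds the preset threshold.
    return depth_value > depth_threshold
```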
Compared with the prior art, in which semi-transparent particle animation elements are displayed according to diffuse reflection alone, the embodiment of the invention obtains a focus coefficient for displaying the animation element and a display coefficient of the animation element; determines a display depth value of the animation element according to the focus coefficient and the display coefficient; and generates an animation element diffuse reflection map corresponding to the animation element according to the display depth value. In this way, the display depth of the element animation during three-dimensional display is established using the focus coefficient and the display coefficient of the animation element, and a diffuse reflection map matched with each animation element is generated using the determined display depth value, thereby meeting the depth-display requirements of the animation scene, improving the display accuracy of semi-transparent particle animation elements, providing diversified display effects, and further improving the display efficiency of animation elements.
According to one embodiment of the present invention, there is provided a storage medium storing at least one executable instruction that causes a processor to perform the processing method for animation element display in any of the above method embodiments.
Fig. 7 is a schematic structural diagram of a terminal according to an embodiment of the present invention; the specific embodiments of the present invention do not limit the specific implementation of the terminal.
As shown in fig. 7, the terminal may include: a processor 502, a communication interface (Communications Interface) 504, a memory 506, and a communication bus 508.
Wherein: processor 502, communication interface 504, and memory 506 communicate with each other via communication bus 508.
The communication interface 504 is configured to communicate with network elements of other devices, such as clients or other servers.
The processor 502 is configured to execute a program 510, and may specifically perform the relevant steps in the above embodiments of the processing method for animation element display.
In particular, the program 510 may include program code, and the program code includes computer operation instructions.
The processor 502 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The one or more processors included in the terminal may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
The memory 506 is configured to store the program 510. The memory 506 may comprise high-speed RAM memory, and may also include non-volatile memory, such as at least one magnetic disk memory.
The program 510 may be specifically operable to cause the processor 502 to:
acquiring a focus coefficient for displaying the animation element and a display coefficient of the animation element;
determining a display depth value of the animation element according to the focus coefficient and the display coefficient;
and generating an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
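The three program steps can be chained end to end. The following Python sketch is illustrative only; the normalization of the depth value into a map level (and the scaling and clamping used along the way) are assumptions, not details stated in the patent:

```python
def process_animation_element(focus_coeff: float, display_coeff: float,
                              focus_distance: float, display_scale: float,
                              max_mip_level: int) -> float:
    # Step 1: display depth value from the focus and display coefficients.
    depth = abs(focus_coeff - display_coeff) / focus_distance
    # Step 2: adjust by the display scaling value.
    depth *= display_scale
    # Step 3: clamp to a diffuse reflection coefficient and select the
    # map level used to sample the animation element diffuse map.
    coeff = min(max(depth, 0.0), 1.0)
    return coeff * max_mip_level
```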
It will be appreciated by those skilled in the art that the above modules or steps of the present invention may be implemented by a general-purpose computing device. They may be centralized on a single computing device or distributed across a network formed by multiple computing devices. Optionally, they may be implemented with program code executable by computing devices, so that they may be stored in a storage device and executed by the computing devices; in some cases, the steps shown or described may be performed in an order different from that described herein. Alternatively, they may be separately fabricated into individual integrated circuit modules, or multiple modules or steps among them may be fabricated into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description covers only the preferred embodiments of the present invention and is not intended to limit the present invention; various modifications and variations may be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.

Claims (16)

1. A method for processing an animation element display, comprising:
obtaining a focus coefficient for displaying the animation element and a display coefficient of the animation element, wherein the focus coefficient characterizes the definition at which the display image is configured to be clearest, or the display distance at which the display image is in its clearest state, and the display coefficient characterizes the configured definition of the pixels of the animation element, or the display distance of the animation element relative to a display window;
determining a display depth value of the animation element according to the focus coefficient and the display coefficient, wherein the display depth value is used for representing the blurring degree of the animation element displayed in an animation effect graph;
and generating an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
2. The method of claim 1, wherein the determining a display depth value of the animation element according to the focus coefficient and the display coefficient comprises:
and comparing the focus coefficient with the display coefficient, and determining the display depth value of the animation element according to the comparison result.
3. The method of claim 2, wherein determining the display depth value of the animation element based on the comparison result comprises:
if the display coefficient is less than or equal to the focus coefficient, determining the display depth value by using the ratio of the difference obtained by subtracting the display coefficient from the focus coefficient to the focus distance;
and if the display coefficient is greater than the focus coefficient, determining the display depth value by using the ratio of the difference obtained by subtracting the focus coefficient from the display coefficient to the focus distance.
4. The method according to claim 2, wherein the method further comprises:
and acquiring a display scaling value, and adjusting the display depth value according to the display scaling value.
5. The method of claim 4, wherein generating an animation element diffuse map corresponding to the animation element from the display depth value comprises:
and determining the diffuse reflection coefficient of the animation element according to the adjusted display depth value, and performing tri-linear interpolation sampling based on the map layer determined by the diffuse reflection coefficient to generate the animation element diffuse reflection map.
6. The method of any of claims 1-5, wherein the obtaining the focus coefficient for displaying the animation element and the display coefficient of the animation element comprises:
the method comprises the steps of receiving input focal coefficients for displaying animation elements, and determining display coefficients matching the animation elements through the animation elements relative to display states in display images.
7. The method of claim 6, wherein the method further comprises:
judging whether the display depth value exceeds a preset depth threshold value or not;
if yes, indicating that the display coefficient and/or the focus coefficient should be updated.
8. A processing apparatus for displaying an animation element, comprising:
an acquisition module, configured to acquire a focus coefficient for displaying the animation element and a display coefficient of the animation element, wherein the focus coefficient characterizes the definition at which the display image is configured to be clearest, or the display distance at which the display image is in its clearest state, and the display coefficient characterizes the configured definition of the pixels of the animation element, or the display distance of the animation element relative to a display window;
a determining module, configured to determine a display depth value of the animation element according to the focus coefficient and the display coefficient, wherein the display depth value is used for representing the blurring degree of the animation element displayed in the animation effect graph;
and a generating module, configured to generate an animation element diffuse reflection map corresponding to the animation element according to the display depth value.
9. The apparatus of claim 8, wherein
the determining module is specifically configured to compare the focus coefficient with the display coefficient, and determine the display depth value of the animation element according to the comparison result.
10. The apparatus of claim 9, wherein
the determining module is specifically configured to, if the display coefficient is less than or equal to the focus coefficient, determine the display depth value by using the ratio of the difference obtained by subtracting the display coefficient from the focus coefficient to the focus distance;
the determining module is further specifically configured to, if the display coefficient is greater than the focus coefficient, determine the display depth value by using the ratio of the difference obtained by subtracting the focus coefficient from the display coefficient to the focus distance.
11. The apparatus of claim 9, wherein the apparatus further comprises:
and the adjusting module is used for acquiring the display scaling value and adjusting the display depth value according to the display scaling value.
12. The apparatus of claim 11, wherein
the generation module is specifically configured to determine a diffuse reflection coefficient of the animation element according to the adjusted display depth value, and to perform tri-linear interpolation sampling based on the map level determined by the diffuse reflection coefficient, so as to generate the animation element diffuse reflection map.
13. The apparatus according to any one of claims 8-12, wherein
the acquisition module is specifically configured to receive an entered focus coefficient for displaying the animation element, and to determine the display coefficient matching the animation element according to the display state of the animation element in the display image.
14. The apparatus of claim 13, wherein the apparatus further comprises:
the judging module is used for judging whether the display depth value exceeds a preset depth threshold value or not;
and the indication module is used for indicating to update the display coefficient and/or the focus coefficient if yes.
15. A storage medium having stored therein at least one executable instruction, the executable instruction causing a processor to perform operations corresponding to the processing method for animation element display according to any one of claims 1-7.
16. A terminal, comprising: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with each other via the communication bus;
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform operations corresponding to the processing method for animation element display according to any one of claims 1-7.
CN202010332034.6A 2020-04-24 2020-04-24 Processing method and device for animation element display, storage medium and terminal Active CN111583365B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010332034.6A CN111583365B (en) 2020-04-24 2020-04-24 Processing method and device for animation element display, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010332034.6A CN111583365B (en) 2020-04-24 2020-04-24 Processing method and device for animation element display, storage medium and terminal

Publications (2)

Publication Number Publication Date
CN111583365A CN111583365A (en) 2020-08-25
CN111583365B true CN111583365B (en) 2023-09-19

Family

ID=72124460

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010332034.6A Active CN111583365B (en) 2020-04-24 2020-04-24 Processing method and device for animation element display, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN111583365B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11232486A (en) * 1998-02-12 1999-08-27 Hitachi Ltd 3D video playback device and method
CN101533529A (en) * 2009-01-23 2009-09-16 北京建筑工程学院 Range image-based 3D spatial data processing method and device
JP2011203731A (en) * 2010-03-25 2011-10-13 Seiko Epson Corp System and method for generating aerial three-dimensional image
JP2011250297A (en) * 2010-05-28 2011-12-08 Nidec Sankyo Corp Contact image sensor
CN103984553A (en) * 2014-05-26 2014-08-13 中科创达软件股份有限公司 3D (three dimensional) desktop display method and system
CN104133624A (en) * 2014-07-10 2014-11-05 腾讯科技(深圳)有限公司 Webpage animation display method, webpage animation display device and terminal
CN104461256A (en) * 2014-12-30 2015-03-25 广州视源电子科技股份有限公司 interface element display method and system
JP2017033314A (en) * 2015-07-31 2017-02-09 凸版印刷株式会社 Image processing system, method and program
WO2018109372A1 (en) * 2016-12-14 2018-06-21 Cyclopus Method for digital image processing
CN108270971A (en) * 2018-01-31 2018-07-10 努比亚技术有限公司 A kind of method, equipment and the computer readable storage medium of mobile terminal focusing
CN110910477A (en) * 2018-08-27 2020-03-24 北京京东尚科信息技术有限公司 Page animation display method and device and computer readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003903445A0 (en) * 2003-06-26 2003-07-17 Canon Kabushiki Kaisha Optimising compositing calculations for a run of pixels
JP5214547B2 (en) * 2009-07-03 2013-06-19 富士フイルム株式会社 Image display apparatus and method, and program
JP5316385B2 (en) * 2009-12-01 2013-10-16 富士ゼロックス株式会社 Image processing apparatus and image processing program




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant