CN119672196A - Voxel adjustment method, device, electronic device and readable storage medium - Google Patents
Voxel adjustment method, device, electronic device and readable storage medium
- Publication number: CN119672196A (application CN202411742782.6A)
- Authority: CN (China)
- Prior art keywords: weight, target, normal, sampling value, texture sampling
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Classification: Image Generation (AREA)
Abstract
The embodiment of the invention provides a voxel adjustment method, a voxel adjustment device, an electronic device, and a readable storage medium, relating to the field of voxel processing. The voxel adjustment method comprises the following steps: determining the vertex normal and the surface normal of each voxel in the voxel data to be processed; determining the included angle between the vertex normal and the surface normal; when the included angle is larger than a preset angle, determining a target vertex normal based on the vertex normal and the surface normal; determining a texture sampling value based on the target vertex normal; and adjusting the voxel data to be processed based on the texture sampling value. When tri-planar mapping is used for texture mapping, extremely deformed voxels are adjusted, so that texture stretching is avoided.
Description
Technical Field
The present invention relates to the field of voxel processing, and in particular, to a voxel adjustment method, a voxel adjustment device, an electronic device, and a readable storage medium.
Background
In games, voxels are often used to generate and represent complex terrain, and can also be used to implement destruction effects such as changes in terrain after an explosion. The discrete nature of voxels makes destruction simulation intuitive and simple.
Tri-planar mapping is a texture mapping technique that avoids the stretching and distortion problems of conventional texture mapping by projecting textures onto three orthogonal planes.
Voxel and tri-planar mapping techniques provide powerful tools in game development for generating and rendering complex three-dimensional environments. Voxel technology allows for efficient representation and manipulation of three-dimensional space, while tri-planar mapping provides a flexible texture mapping approach that avoids many of the problems of conventional UV mapping. By combining these two techniques, a developer can create a more realistic and dynamic game world.
However, when tri-planar mapping is applied to voxel data that changes, extreme deformation of the voxel model can still cause texture stretching. For example, when terrain deforms, texture stretching can occur even with tri-planar mapping.
Disclosure of Invention
The invention aims to provide a voxel adjustment method that can avoid texture stretching when tri-planar mapping is used.
In order to achieve the above object, the technical scheme adopted by the embodiment of the invention is as follows:
In a first aspect, an embodiment of the present invention provides a voxel adjustment method, where the method includes:
determining voxel data to be processed;
determining the vertex normals and the surface normals of all voxels in the voxel data to be processed;
determining an included angle between the vertex normal and the surface normal;
when the included angle is larger than a preset angle, determining a target vertex normal based on the vertex normal and the surface normal;
determining a texture sampling value based on the target vertex normal;
and adjusting the voxel data to be processed based on the texture sampling value.
In an alternative embodiment, the step of determining the target vertex normal based on the vertex normal and the surface normal when the included angle is greater than a preset angle includes:
when the included angle is larger than the preset angle, determining an offset blending coefficient;
and determining the target vertex normal based on the offset blending coefficient, the vertex normal and the surface normal.
In an alternative embodiment, the step of determining the target vertex normal based on the offset blending coefficient, the vertex normal, and the surface normal includes:
calculating a first product of the offset blending coefficient and the vertex normal;
calculating a difference value between a preset value and the offset blending coefficient;
calculating a second product of the difference value and the surface normal;
and calculating the sum of the first product and the second product as the target vertex normal.
In an alternative embodiment, the step of determining a texture sampling value based on the target vertex normal includes:
calculating a first weight of the target vertex normal on the XY plane, a second weight on the YZ plane, and a third weight on the XZ plane;
normalizing the first weight, the second weight and the third weight to obtain a first target weight, a second target weight and a third target weight;
determining a first texture sampling value on the XY plane, a second texture sampling value on the YZ plane, and a third texture sampling value on the XZ plane;
And determining a texture sampling value based on the first target weight, the second target weight, the third target weight, the first texture sampling value, the second texture sampling value and the third texture sampling value.
In an alternative embodiment, the step of normalizing the first weight, the second weight and the third weight to obtain a first target weight, a second target weight and a third target weight includes:
calculating a weight sum of the first weight, the second weight and the third weight;
calculating the ratio of the first weight to the weight sum as a first target weight;
calculating the ratio of the second weight to the weight sum as a second target weight;
and calculating the ratio of the third weight to the weight sum as a third target weight.
In an alternative embodiment, the step of determining the texture sample value based on the first target weight, the second target weight, the third target weight, the first texture sample value, the second texture sample value, and the third texture sample value includes:
calculating a third product of the first target weight and the first texture sample value;
calculating a fourth product of the second target weight and the second texture sample value;
calculating a fifth product of the third target weight and the third texture sample value;
and calculating the sum of the third product, the fourth product and the fifth product as a texture sampling value.
In an alternative embodiment, the step of adjusting the voxel data to be processed based on the texture sampling value includes:
determining an original texture sampling value of the voxel data to be processed;
and replacing the original texture sampling value with the texture sampling value.
In a second aspect, an embodiment of the present invention provides a voxel adjustment device, the device including:
a determining module, configured to determine voxel data to be processed, determine the vertex normal and the surface normal of each voxel in the voxel data to be processed, determine the included angle between the vertex normal and the surface normal, determine a target vertex normal based on the vertex normal and the surface normal when the included angle is greater than a preset angle, and determine a texture sampling value based on the target vertex normal;
and an adjusting module, configured to adjust the voxel data to be processed based on the texture sampling value.
In a third aspect, an embodiment of the present invention provides an electronic device, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the voxel adjustment method when executing the computer program.
In a fourth aspect, an embodiment of the present invention provides a readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the voxel adjustment method.
The invention has the following beneficial effects:
The method determines voxel data to be processed; determines the vertex normal and the surface normal of each voxel in the voxel data to be processed; determines the included angle between the vertex normal and the surface normal; when the included angle is greater than a preset angle, determines a target vertex normal based on the vertex normal and the surface normal; determines a texture sampling value based on the target vertex normal; and adjusts the voxel data to be processed based on the texture sampling value. When tri-planar mapping is used for texture mapping, extremely deformed voxels are adjusted, so that texture stretching is avoided.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic block diagram of an electronic device according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a voxel adjustment method according to an embodiment of the present invention;
FIG. 3 is a second flowchart of a voxel adjustment method according to an embodiment of the present invention;
FIG. 4 is a third flowchart illustrating a voxel adjustment method according to an embodiment of the present invention;
FIG. 5 is a fourth schematic flowchart of a voxel adjustment method according to an embodiment of the present invention;
FIG. 6 is a fifth schematic flowchart of a voxel adjustment method according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a voxel adjustment device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that like reference numerals and letters refer to like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present invention, it should be noted that, if terms such as "upper", "lower", "inner", and "outer" indicate an orientation or positional relationship, this is based on the orientation or positional relationship shown in the drawings, or the orientation or positional relationship in which the inventive product is conventionally placed in use. Such terms are used merely for convenience of describing the present invention and simplifying the description; they do not indicate or imply that the apparatus or element referred to must have a specific orientation or be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention.
Furthermore, the terms "first," "second," and the like, if any, are used merely for distinguishing between descriptions and not for indicating or implying a relative importance.
In the description of the present invention, it should also be noted that, unless explicitly specified and limited otherwise, the terms "disposed," "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, integrally connected, mechanically connected, electrically connected, directly connected, indirectly connected through an intermediary, or in communication between two elements. The specific meaning of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
Through extensive research, the inventors found that voxels are often used in games to create and represent complex terrain, and can also be used to implement destruction effects such as terrain changes after an explosion. The discrete nature of voxels makes destruction simulation intuitive and simple.
Tri-planar mapping is a texture mapping technique that avoids the stretching and distortion problems of conventional texture mapping by projecting textures onto three orthogonal planes.
Voxel and tri-planar mapping techniques provide powerful tools in game development for generating and rendering complex three-dimensional environments. Voxel technology allows for efficient representation and manipulation of three-dimensional space, while tri-planar mapping provides a flexible texture mapping approach that avoids many of the problems of conventional UV mapping. By combining these two techniques, a developer can create a more realistic and dynamic game world.
However, when tri-planar mapping is applied to voxel data that changes, extreme deformation of the voxel model can still cause texture stretching. For example, when terrain deforms, texture stretching can occur even with tri-planar mapping.
In view of the above problems, the present embodiment provides a voxel adjustment method, apparatus, electronic device, and readable storage medium. The method determines the vertex normal and the surface normal of each voxel in the voxel data to be processed, determines the included angle between the vertex normal and the surface normal, determines a target vertex normal based on the vertex normal and the surface normal when the included angle is greater than a preset angle, determines a texture sampling value based on the target vertex normal, and adjusts the voxel data to be processed based on the texture sampling value. When tri-planar mapping is used for texture mapping, extremely deformed voxels are adjusted to avoid texture stretching. The scheme provided by this embodiment is described in detail below.
The present embodiment provides an electronic device that can adjust voxels. In one possible implementation, the electronic device may be a user terminal; for example, the electronic device may be, but is not limited to, a server, a smart phone, a Personal Computer (PC), a tablet computer, a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), an image capturing device, and the like.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the invention. The electronic device 100 may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
The electronic device 100 comprises voxel adjustment means 110, a memory 120 and a processor 130.
The memory 120 and the processor 130 are electrically connected to each other, directly or indirectly, to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The voxel adjustment device 110 includes at least one software function module, which may be stored in the memory 120 in the form of software or firmware, or embedded in the operating system (OS) of the electronic device 100. The processor 130 is configured to execute executable modules stored in the memory 120, such as the software function modules and computer programs included in the voxel adjustment device 110.
The memory 120 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), etc. The memory 120 is configured to store a program, and the processor 130 executes the program after receiving an execution instruction.
Referring to fig. 2, fig. 2 is a flowchart of a voxel adjustment method applied to the electronic device 100 of fig. 1, and the method includes various steps described in detail below.
And S201, determining voxel data to be processed.
Voxel data refers to the division of a three-dimensional space into a number of tiny volume units, each unit being called a voxel, similar to a pixel in a two-dimensional image. These voxels may contain attribute information such as color, density, texture, etc., and are commonly used to construct and represent three-dimensional objects or scenes.
And S202, determining the vertex normals and the surface normals of all voxels in the voxel data to be processed.
When the voxel data to be processed is converted into a polygonal model for rendering or further geometric processing, a surface mesh is extracted from the voxel data using an algorithm such as Marching Cubes or Surface Nets, and the vertex normal and the surface normal corresponding to each vertex and face of the mesh are obtained.
Vertex normals are typically used for smooth shading. When a vertex belongs to multiple faces, its normal is a weighted average of the normals of all faces sharing the vertex; the weights may be areas, angles, or other metrics. This averaging helps achieve a smooth transition at the vertex, making the lighting effect more natural, especially in areas where the model's curvature varies little.
The surface normal is a vector perpendicular to the polygonal surface, and for a triangular patch can be obtained by computing the vector cross-product of any two non-collinear edges within the plane.
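As an illustrative sketch of the two computations above (not code from the patent; the function names and the equal-weight averaging are assumptions — the patent notes the weights may instead be areas or angles):

```python
import math

def surface_normal(a, b, c):
    """Unit surface normal of triangle (a, b, c): cross product of two
    non-collinear edges, then normalized."""
    e1 = [b[i] - a[i] for i in range(3)]
    e2 = [c[i] - a[i] for i in range(3)]
    n = [e1[1] * e2[2] - e1[2] * e2[1],
         e1[2] * e2[0] - e1[0] * e2[2],
         e1[0] * e2[1] - e1[1] * e2[0]]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]

def vertex_normal(face_normals):
    """Vertex normal: normalized average of the normals of all faces that
    share the vertex (equal weights here for simplicity)."""
    s = [sum(n[i] for n in face_normals) for i in range(3)]
    length = math.sqrt(sum(x * x for x in s))
    return [x / length for x in s]
```

For a triangle in the XY plane with counter-clockwise winding, `surface_normal` yields the +z axis, matching the cross-product definition above.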
And S203, determining an included angle between the vertex normal and the surface normal.
Calculating the included angle between the vertex normal and a given surface normal generally involves the vector dot product; the included angle can be obtained by the following formula:
θ = arccos((A · B) / (|A| · |B|))
where θ represents the included angle between the vertex normal and the surface normal, A and B represent the vertex normal and the surface normal in vector form, respectively, and |A| and |B| represent the magnitudes of the vertex normal and the surface normal, respectively.
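The dot-product angle computation can be sketched in Python (illustrative only; the clamp guarding against floating-point drift is an addition, not part of the patent's formula):

```python
import math

def normal_angle_deg(a, b):
    """Included angle in degrees between two normal vectors,
    via theta = arccos((A . B) / (|A| |B|))."""
    dot = sum(x * y for x, y in zip(a, b))
    la = math.sqrt(sum(x * x for x in a))
    lb = math.sqrt(sum(x * x for x in b))
    # Clamp to [-1, 1] so acos never sees a value pushed out of range
    # by floating-point rounding.
    cos_theta = max(-1.0, min(1.0, dot / (la * lb)))
    return math.degrees(math.acos(cos_theta))
```

This angle is what gets compared against the preset angle in step S204 below.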
And S204, when the included angle is larger than a preset angle, determining the target vertex normal based on the vertex normal and the surface normal.
S205, determining a texture sampling value based on the target vertex normal.
And S206, adjusting the voxel data to be processed based on the texture sampling value.
The included angle between the vertex normal and the surface normal is compared with the preset angle. When the included angle is greater than the preset angle, the vertex normal of a triangular patch differs significantly from the surface normal of that patch; when existing tri-planar mapping is then used for texture mapping, the calculated weights of the three mapping planes deviate greatly, which ultimately causes texture stretching. Therefore, when the included angle between the vertex normal and the surface normal is greater than the preset threshold, a deflected vertex normal — the target vertex normal — is recalculated based on the vertex normal and the surface normal. The target vertex normal then participates in the weight calculation to obtain a texture sampling value, and the voxel data to be processed is adjusted based on that texture sampling value, thereby repairing the texture stretching caused by voxel deformation without affecting the lighting effect.
It should be noted that the preset angle may be set according to actual situations, which is not particularly limited in the embodiment of the present invention.
Tri-planar mapping processing is performed based on the target vertex normal to obtain a texture sampling value. Tri-planar mapping is a texture mapping technique used mainly in 3D graphics rendering, especially for texture mapping of very complex or non-uniform curved surfaces. It is particularly useful when conventional UV mapping struggles with models such as caves, rock surfaces, or complex organic shapes that lack distinct flat areas.
The tri-planar mapping is based on projecting textures along three major coordinate axes of the model. The texture will be "stretched" and projected onto the corresponding plane of each axis, forming three independent projections. For each point of the voxel data surface to be processed, the final texture color is determined from the projections of that point onto the three axial planes. Typically, this involves calculating the distance of the point to three planes or weighting the contributions of these three axes according to the surface normal vector. To obtain a smooth transition at the texture boundaries of different axes, seamless texture or blending at the edges is typically used to ensure that the mapping effect is naturally continuous.
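The projection-and-blend process described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: `sample2d` stands in for a texture lookup, `delta` and `m` are assumed tuning parameters, and the convention shown (each normal component weighting the plane perpendicular to that axis) is the common tri-planar convention, which differs from the axis labeling used in the claims:

```python
def triplanar_sample(sample2d, pos, normal, delta=0.0, m=1.0):
    """Blend three planar projections of a 2D texture by weights
    derived from the surface normal (standard tri-planar mapping)."""
    x, y, z = pos
    # One raw weight per projection axis.
    w = [max(abs(n) - delta, 0.0) ** m for n in normal]
    total = sum(w) or 1.0  # avoid division by zero for a degenerate normal
    w = [wi / total for wi in w]
    # Project onto the YZ (x-axis), XZ (y-axis), and XY (z-axis) planes.
    s_yz = sample2d(y, z)
    s_xz = sample2d(x, z)
    s_xy = sample2d(x, y)
    return w[0] * s_yz + w[1] * s_xz + w[2] * s_xy
```

For a normal pointing straight up the z-axis, all weight lands on the XY projection, so the function reduces to a single planar sample.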
When the included angle is greater than the preset angle, there are various implementations of determining the target vertex normal based on the vertex normal and the surface normal, and in one implementation, as shown in fig. 3, the method includes the following steps:
And S301, when the included angle is larger than the preset angle, determining an offset blending coefficient.
S302, determining a target vertex normal based on the offset blending coefficient, the vertex normal and the surface normal.
It should be noted that the offset blending coefficient is an adjustable parameter and may be set according to actual needs; the embodiment of the present invention does not specifically limit it.
The offset blending coefficient specifies the offset of each texture relative to the base texture coordinates and the weight with which two or more textures interact when blended. It can control the location of textures on the model surface and the degree to which they fuse with each other, thereby achieving a richer and finer visual effect.
There are various implementations of determining the target vertex normal based on the offset blending coefficient, the vertex normal, and the surface normal. In one implementation, as shown in FIG. 4, the method includes the following steps:
S302-1, calculating a first product of the offset blending coefficient and the vertex normal.
S302-2, calculating a difference value between a preset value and the offset blending coefficient.
S302-3, calculating a second product of the difference value and the surface normal.
S302-4, calculating the sum of the first product and the second product as the target vertex normal.
The deflected vertex normal, that is, the target vertex normal, can be calculated from the offset blending coefficient, the vertex normal, and the surface normal by the following formula:
N' = M_blend × N + (1 − M_blend) × N_face
where N is the vertex normal, N_face is the surface normal, M_blend is the offset blending coefficient, and N' is the target vertex normal.
When the included angle between the vertex normal and the surface normal is larger than the preset angle, the vertex normal is deflected toward the surface normal by a certain angle based on the above formula, yielding the deflected target vertex normal.
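A minimal sketch of this blend (illustrative; the final renormalization to unit length is an assumption of this sketch, not stated in the patent's formula):

```python
import math

def blend_vertex_normal(vertex_n, face_n, m_blend):
    """Deflect the vertex normal toward the surface normal:
    N' = m_blend * N + (1 - m_blend) * N_face, then renormalize."""
    n = [m_blend * v + (1.0 - m_blend) * f for v, f in zip(vertex_n, face_n)]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]
```

With `m_blend = 1.0` the vertex normal is kept unchanged; with `m_blend = 0.0` the result collapses onto the surface normal, so the coefficient directly controls how far the normal is deflected.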
There are various implementations of determining texture sample values based on target vertex normals, and in one implementation, as shown in FIG. 5, the method includes the steps of:
S205-1, calculating a first weight of the target vertex normal on the XY plane, a second weight on the YZ plane, and a third weight on the XZ plane.
The first weight of the target vertex normal on the XY plane can be calculated by the following formula:
b_x = max(|N'_x| − δ, 0)^m
where N'_x is the x-axis component of the target vertex normal N', N' is the deflected vertex normal, and δ and m are external tuning parameters used to correct the final effect.
The second weight of the target vertex normal on the YZ plane can be calculated by the following formula:
b_y = max(|N'_y| − δ, 0)^m
where N'_y is the y-axis component of the target vertex normal N'.
The third weight of the target vertex normal on the XZ plane can be calculated by the following formula:
b_z = max(|N'_z| − δ, 0)^m
where N'_z is the z-axis component of the target vertex normal N'.
The XY plane, also called the horizontal plane, is formed by the x-axis and the y-axis; all points on this plane have a z-coordinate equal to 0 (z = 0). The YZ plane, which is perpendicular to the x-axis, is formed by the y-axis and the z-axis; the x-coordinate of all points on this plane is 0 (x = 0). The XZ plane, which is perpendicular to the y-axis, is formed by the x-axis and the z-axis; the y-coordinate of all points on this plane is 0 (y = 0).
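Assuming the per-plane weight takes the common tri-planar form max(|component| − δ, 0)^m (the exact formula is not legible in this text, so this form is an assumption; the default values of `delta` and `m` are likewise illustrative), the three raw weights can be sketched as:

```python
def plane_weights(n_prime, delta=0.1, m=2.0):
    """Raw tri-planar weights (b_x, b_y, b_z) from the target vertex
    normal, following b = max(|component| - delta, 0) ** m."""
    nx, ny, nz = n_prime
    bx = max(abs(nx) - delta, 0.0) ** m
    by = max(abs(ny) - delta, 0.0) ** m
    bz = max(abs(nz) - delta, 0.0) ** m
    return bx, by, bz
```

Raising δ suppresses the contribution of planes the normal barely faces, and raising m sharpens the transition between planes.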
And S205-2, normalizing the first weight, the second weight and the third weight to obtain a first target weight, a second target weight and a third target weight.
S205-3, determining a first texture sampling value on the XY plane, a second texture sampling value on the YZ plane, and a third texture sampling value on the XZ plane.
S205-4, determining a texture sampling value based on the first target weight, the second target weight, the third target weight, the first texture sampling value, the second texture sampling value, and the third texture sampling value.
The specific way of normalizing the first weight, the second weight and the third weight to obtain the first target weight, the second target weight and the third target weight may be:
Calculating the weight sum of the first weight, the second weight and the third weight, calculating the ratio of the first weight to the weight sum as a first target weight, calculating the ratio of the second weight to the weight sum as a second target weight, and calculating the ratio of the third weight to the weight sum as a third target weight.
The first target weight may be calculated by the following formula:
b'_x = b_x / (b_x + b_y + b_z)
where b_x is the first weight of the target vertex normal on the XY plane, b_y is the second weight on the YZ plane, b_z is the third weight on the XZ plane, and b'_x is the first target weight.
The second target weight may be calculated by the following formula:
b'_y = b_y / (b_x + b_y + b_z)
where b'_y is the second target weight.
The third target weight may be calculated by the following formula:
b'_z = b_z / (b_x + b_y + b_z)
where b'_z is the third target weight.
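The normalization above and the final weighted sum (steps S205-2 and S205-4) can be sketched as follows; the function names are illustrative, not the patent's:

```python
def normalize_weights(bx, by, bz):
    """Divide each raw weight by the weight sum so the three
    target weights sum to 1."""
    total = bx + by + bz
    return bx / total, by / total, bz / total

def blend_samples(weights, samples):
    """Final texture sampling value: weighted sum of the three
    planar texture samples."""
    return sum(w * s for w, s in zip(weights, samples))
```

For example, raw weights (1, 1, 2) normalize to (0.25, 0.25, 0.5), and blending planar samples (1.0, 2.0, 4.0) with those target weights gives 2.75.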
There are various implementations of adjusting voxels to be processed based on texture sample values, in one implementation, as shown in fig. 6, comprising the steps of:
S206-1, determining an original texture sampling value of the voxel data to be processed.
S206-2, replacing the original texture sampling value with the texture sampling value.
Texture sampling refers to the process of obtaining color or other data from a texture map. Each vertex is typically associated with a set of texture coordinates (UV coordinates) indicating the location of the corresponding color or data on the texture image. The texture coordinates of any point inside a triangle are determined by interpolating between the vertices. The color of a screen pixel is then obtained from the texels by nearest-neighbor sampling, bilinear filtering, trilinear filtering, anisotropic filtering, or the like, yielding the original texture sampling value.
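A minimal bilinear-filtering sketch, assuming a single-channel texture stored as a list of rows and coordinates given in texel units with u, v ≥ 0 (names and storage layout are illustrative, not the patent's):

```python
def bilinear_sample(tex, u, v):
    """Bilinear filtering: blend the four nearest texels around (u, v),
    clamping at the texture edge."""
    x0, y0 = int(u), int(v)
    x1 = min(x0 + 1, len(tex[0]) - 1)
    y1 = min(y0 + 1, len(tex) - 1)
    fx, fy = u - x0, v - y0
    # Interpolate horizontally along the two rows, then vertically.
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bottom = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```

Sampling exactly on a texel center returns that texel; sampling between centers returns a smooth blend of the four neighbors.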
Referring to fig. 7, an embodiment of the present invention further provides a voxel adjustment device 110 applied to the electronic device 100 shown in fig. 1, where the voxel adjustment device 110 includes:
The determining module 111 is configured to determine voxel data to be processed, determine a vertex normal and a surface normal of each voxel in the voxel data to be processed, determine an included angle between the vertex normal and the surface normal, determine a target vertex normal based on the vertex normal and the surface normal when the included angle is greater than a preset angle, and determine a texture sampling value based on the target vertex normal;
And the adjustment module 112 is used for adjusting the voxel data to be processed based on the texture sampling value.
The invention also provides an electronic device 100, the electronic device 100 comprising a processor 130 and a memory 120. Memory 120 stores computer-executable instructions that, when executed by processor 130, implement the voxel adjustment method.
The embodiment of the present invention further provides a computer readable storage medium storing a computer program which, when executed by the processor 130, implements the voxel adjustment method.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative, for example, of the flowcharts and block diagrams in the figures that illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present invention may be integrated together to form a single part, each module may exist alone, or two or more modules may be integrated to form a single part. If the functions are implemented in the form of software functional modules and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The storage medium includes a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program code.
It is noted that relational terms such as "first" and "second" are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
The above description presents merely embodiments of the present invention, but the scope of protection of the present invention is not limited thereto. Any variation or substitution that a person skilled in the art could readily conceive of within the technical scope disclosed herein shall fall within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202411742782.6A CN119672196A (en) | 2024-11-29 | 2024-11-29 | Voxel adjustment method, device, electronic device and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN119672196A true CN119672196A (en) | 2025-03-21 |
Family
ID=94981245
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202411742782.6A Pending CN119672196A (en) | 2024-11-29 | 2024-11-29 | Voxel adjustment method, device, electronic device and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN119672196A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination |