
CN114299215A - Texture size detection method and device and rendering engine - Google Patents

Texture size detection method and device and rendering engine

Info

Publication number
CN114299215A
Authority
CN
China
Prior art keywords
texture
rendered
size
scene
virtual camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111641397.9A
Other languages
Chinese (zh)
Other versions
CN114299215B (en)
Inventor
魏博
景傲磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Yake Interactive Technology Co ltd
Original Assignee
Tianjin Yake Interactive Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Yake Interactive Technology Co ltd filed Critical Tianjin Yake Interactive Technology Co ltd
Priority to CN202111641397.9A
Publication of CN114299215A
Application granted
Publication of CN114299215B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Generation (AREA)

Abstract

The application discloses a texture size detection method and device and a rendering engine. The method comprises: acquiring data information of each texture in a scene to be rendered when a virtual camera reaches each voxel unit in the scene, wherein the data information comprises the loading size within the visible range of the virtual camera; for each texture, determining a reference size of the texture in the scene to be rendered from the plurality of loading sizes acquired as the virtual camera reaches each voxel unit; and determining, according to the determined reference sizes, whether the size of each texture is reasonable when the scene to be rendered is rendered. Compared with manual detection, the method can automatically, quickly, comprehensively and finely screen out textures with unreasonable sizes in a game scene and give reasonable sizes, greatly saving labor and time costs, markedly reducing game development cost and improving game development efficiency.

Description

Texture size detection method and device and rendering engine
Technical Field
The application relates to the technical field of rendering, in particular to a texture size detection method and device and a rendering engine.
Background
During game development, a fast-paced schedule leads artists to favor very large texture sizes for better visual effects, but unreasonably sized textures increase both memory occupation and loading time. Choosing textures of reasonable size typically requires an artist with extensive relevant experience, and even when rules are formulated at an early stage, human oversight is easy, so it is difficult to strike a reasonable balance between development efficiency and development quality.
When the reasonableness of texture sizes is checked at a later stage, the prior art usually opens and inspects textures one by one manually, which not only requires a large amount of time and labor but is also error-prone: visual inspection is difficult and its accuracy is low.
Disclosure of Invention
The embodiments of the application provide a texture size detection method and device and a rendering engine, which can automatically screen out unreasonably sized textures in a game scene, so as to overcome or at least partially overcome the defects of the prior art.
In a first aspect, a method for detecting a texture size is provided, including:
controlling a virtual camera to traverse each voxel unit in a scene to be rendered, and respectively acquiring data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered, wherein the data information comprises a loading size in a visible range of the virtual camera;
for one texture, determining the reference size of the texture in the scene to be rendered according to a plurality of loading sizes acquired when the virtual camera reaches each voxel unit in the scene to be rendered, and circularly executing the step to determine the reference size of each texture corresponding to the scene to be rendered;
and determining whether the size of each texture meets a preset condition when the scene to be rendered is rendered according to the determined reference size of each texture.
In a second aspect, there is provided an apparatus for detecting a texture size, the apparatus comprising:
the system comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for controlling a virtual camera to traverse each voxel unit in a scene to be rendered and respectively acquiring data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered, and the data information comprises a loading size in a visible range of the virtual camera;
the size determining unit is used for determining the reference size of the texture in the scene to be rendered according to a plurality of loading sizes acquired when the virtual camera reaches each voxel unit in the scene to be rendered, and determining the reference size of each texture corresponding to the scene to be rendered by circularly executing the step;
and the judging unit is used for determining whether the size of each texture meets a preset condition when the scene to be rendered is rendered according to the determined reference size of each texture.
In a third aspect, there is provided a rendering engine comprising an apparatus as described above.
In a fourth aspect, an embodiment of the present application further provides an electronic device, including: a processor; and a memory arranged to store computer executable instructions that, when executed, cause the processor to perform any of the methods described above.
In a fifth aspect, this application embodiment also provides a computer-readable storage medium storing one or more programs which, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform any of the methods described above.
The embodiment of the application adopts at least one technical scheme which can achieve the following beneficial effects:
By collecting, as the virtual camera reaches each voxel unit in a scene to be rendered, the loading size of each texture within the visible range of the virtual camera, the reference size of each texture in the scene is determined; when the game runs, the reference size of each texture is used as a judgment threshold, so that whether the size of each texture satisfies a preset condition is detected automatically. Compared with manual detection, the method can automatically, quickly, comprehensively and finely screen out unreasonably sized textures in a game scene and give reasonable size data, greatly saving labor and time costs and markedly improving detection accuracy. For game developers, larger textures can be used freely in early development, accelerating development; in later performance optimization and memory analysis, the method can efficiently locate and handle texture-related problems, significantly reducing game development cost and improving game development efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 shows a schematic flow diagram of a method of texture size detection according to an embodiment of the present application;
FIG. 2 shows a schematic flow diagram of a method of texture size detection according to another embodiment of the present application;
FIG. 3 shows a schematic structural diagram of a texture size detection apparatus according to another embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
To address the prior-art problem of having to check texture sizes one by one manually, the present application provides a texture size detection method that automatically detects whether texture sizes are reasonable. This provides great convenience in early game development: developers can use larger textures freely, accelerating development; and in later performance optimization and memory analysis the method can efficiently locate and handle texture-related problems, significantly reducing game development cost and improving development efficiency.
Fig. 1 is a schematic flow chart of a texture size detection method according to an embodiment of the present application. As can be seen from Fig. 1, the method at least includes steps S110 to S130:
step S110: and controlling a virtual camera to traverse each voxel unit in the scene to be rendered, and respectively acquiring data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered, wherein the data information comprises a loading size in a visible range of the virtual camera.
A voxel (volume pixel) is the minimum unit of digital data in the segmentation of three-dimensional space; that is, a scene to be rendered is composed of many individual voxel units.
Texture, i.e. the appearance of an object's surface, gives an intuitive first impression and can loosely be understood as a picture. In computer graphics, however, a texture is a piece of data that is not limited to 2D space: there are also one-dimensional textures, three-dimensional textures, cube map textures, array textures, and so on.
The rendering engine includes a virtual camera; for example, in the UE4 rendering engine the virtual camera is a camera actor. When the game runs, objects in the scene within the lens of the virtual camera are rendered in real time, and each object carries at least one texture to represent a real object.
In the present application, the maximum size of a texture when it is imported is taken as its import size (e.g., 2048 × 2048, denoted Max Dimensions), and the maximum size of the texture when the game is running (the maximum running size in the embodiments of the present application, e.g., 1024 × 1024, denoted Max In Game) may be set from the import size; specifically, the maximum running size may be modified by adjusting the level-of-detail bias (LOD Bias). For the same texture, the loading size may change as the virtual camera reaches different voxel units. For example, when the virtual camera reaches voxel unit A, the rendering system controls which multi-level progressive texture (MipMap) levels of the texture are loaded into memory, and the size corresponding to the loaded levels — say 512 × 512 — is taken as the loading size within the current visible range of the virtual camera; when the virtual camera reaches voxel unit B, the loading size of the same texture may be 1024 × 1024.
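As a minimal illustration of the size relationships described above — an import size, an LOD bias that caps the running size, and per-mip sizes — the following sketch may help. The function names (`mip_size`, `max_in_game`) are our own and do not correspond to any real engine API:

```python
# Illustrative sketch only; names are ours, not a real engine API.

def mip_size(import_size: int, level: int) -> int:
    """Size of mip level `level` for a square texture: each level halves the size."""
    return max(1, import_size >> level)

def max_in_game(import_size: int, lod_bias: int) -> int:
    """Maximum running size: an LOD bias of N drops the top N mip levels."""
    return mip_size(import_size, lod_bias)

if __name__ == "__main__":
    # Import size 2048x2048 with an LOD bias of 1 caps the running size at 1024.
    print(max_in_game(2048, 1))                      # 1024
    print([mip_size(2048, L) for L in range(4)])     # [2048, 1024, 512, 256]
```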
That is, for a given texture, its data information — such as the loaded multi-level progressive texture (MipMap) level — changes as the virtual camera reaches different voxel units. Therefore, the present application first collects the data information of each texture in the rendered scene when the virtual camera reaches each voxel unit in the scene to be rendered. Note that a scene contains many textures.
The data information of a texture is usually stored in the memory of the rendering engine, i.e., the data space allocated to the game process at run time. When the game runs, the scene is rendered in real time and the rendering data is stored in the rendering engine's memory; when data is needed, it can be read from that memory.
Taking a piece of texture as an example, when the virtual camera reaches a voxel unit, the related data information of the piece of texture in the rendering engine is read, including but not limited to the loading size of the piece of texture in the visible range of the virtual camera.
The visible range of the virtual camera can be understood as the range that can be shot by the virtual camera, objects within the range participate in rendering, and objects beyond the visible range do not participate in rendering.
And controlling the virtual camera to traverse each voxel unit in the scene to be rendered one by one, and collecting data information of each texture in the scene to be rendered in the voxel unit when the virtual camera traverses one voxel unit. Suppose that a scene to be rendered contains 3 voxel units (actually containing thousands of voxels), which are respectively marked as voxel unit 1, voxel unit 2, and voxel unit 3; assuming that a scene to be rendered comprises three textures which are respectively marked as a texture A, a texture B and a texture C, when a virtual camera reaches a voxel unit 1, respectively acquiring data information of the texture A, the texture B and the texture C, and respectively marking the data information as data information 1A, data information 1B and data information 1C; when the virtual camera reaches a voxel unit 2, respectively acquiring data information of a texture A, a texture B and a texture C, and respectively recording the data information as data information 2A, data information 2B and data information 2C; when the virtual camera reaches the voxel unit 3, the data information of the texture a, the texture B, and the texture C is acquired and recorded as data information 3A, data information 3B, and data information 3C, respectively.
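The traversal just described can be sketched as a nested loop over voxel units and textures. This is a minimal illustration with hypothetical names; `visible_load_size` stands in for whatever loading size the rendering engine would report at a given camera position:

```python
# Minimal sketch of the data-collection traversal described above.
# `visible_load_size` is a stand-in for the loading size the rendering
# engine reports for a texture at a given virtual-camera position.

def collect_load_sizes(voxel_units, textures, visible_load_size):
    """Record, per texture, the loading size observed at every voxel unit."""
    observed = {tex: [] for tex in textures}
    for voxel in voxel_units:          # the virtual camera visits each voxel unit
        for tex in textures:           # read data information for every texture
            observed[tex].append(visible_load_size(voxel, tex))
    return observed

if __name__ == "__main__":
    sizes = {(1, "A"): 256, (2, "A"): 128, (3, "A"): 512}
    out = collect_load_sizes([1, 2, 3], ["A"], lambda v, t: sizes[(v, t)])
    print(out)  # {'A': [256, 128, 512]}
```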
Step S120: and for one texture, determining the reference size of the texture in the scene to be rendered according to a plurality of loading sizes acquired when the virtual camera reaches each voxel unit in the scene to be rendered, and circularly executing the step to determine the reference size of each texture corresponding to the scene to be rendered.
For a piece of texture, determining a reference size of the piece of texture in a scene to be rendered according to a plurality of data information of the piece of texture acquired when the virtual camera traverses each voxel unit in the scene to be rendered.
As for the texture a, when the virtual camera reaches the voxel unit 1, the voxel unit 2, and the voxel unit 3, the data information 1A, the data information 2A, and the data information 3A are respectively obtained, and the reference size of the texture a in the scene to be rendered is determined according to the data information 1A, the data information 2A, and the data information 3A.
Specifically, the data information of the texture contains a loading size of the texture within a visible range of the virtual camera. When determining the reference size, the maximum value of the multiple load sizes may be selected as the reference size of the piece of texture in the scene to be rendered.
The reference size is a balance point that preserves the visual effect while preventing the memory burden from becoming too heavy; in the present application it is denoted Max Counted Dimensions and can be understood as the most reasonable size of a texture when the game runs.
Assume the loading size in data information 1A is 256 × 256, in data information 2A is 128 × 128, and in data information 3A is 512 × 512. When determining the reference size of this texture, the largest loading size may be selected as its reference size in the scene to be rendered; here, 512 × 512 is determined to be the reference size of texture A.
For each texture in the scene to be rendered, the step of determining the reference size of the texture in the scene to be rendered according to the multiple loading sizes obtained by the texture when the virtual camera reaches each voxel unit in the scene to be rendered is repeated, that is, the reference size of each texture in the scene to be rendered can be determined.
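Per texture, step S120 thus reduces to taking the maximum of the observed loading sizes. A minimal sketch:

```python
def reference_size(load_sizes):
    """Reference size of a texture: the largest loading size observed
    while the virtual camera traversed the voxel units."""
    return max(load_sizes)

if __name__ == "__main__":
    # Loading sizes observed at voxel units 1, 2, 3 for texture A.
    print(reference_size([256, 128, 512]))  # 512
```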
Step S130: and determining whether the sizes of the textures are reasonable or not when the scene to be rendered is rendered according to the determined reference sizes of the textures.
When the game is operated, the scene can be rendered in real time, and the method can be used for detecting the reasonability of the size of each texture in the rendering process.
Specifically, for a given texture, the determined reference size may be used as a judgment threshold: when the scene to be rendered is rendered, the running size actually used (i.e., the maximum size of the texture during game running) is compared against the reference size (Max Counted Dimensions) to determine whether the texture's size satisfies a preset condition. If so, the size of the texture is considered reasonable; if not, it is considered unreasonable.
In some embodiments of the present application, the maximum running size of a texture (Max In Game, the maximum size of the texture when the game runs) may be compared directly against the reference size to determine whether the texture's size satisfies the preset condition. For each texture, if the reference size is smaller than the maximum running size, the texture's size is determined not to satisfy the preset condition, i.e. it is unreasonable. Optionally, the size of a texture may be judged reasonable if its maximum running size falls within the product of the texture's reference size and a ratio threshold. The ratio threshold is a value greater than 0 — it may be greater than or less than 1 — so that the texture is not too large while the display effect is preserved.
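This judgment, in its ratio-threshold variant, can be sketched as follows (parameter names are ours; a threshold of 1.0 reduces to the direct comparison described first):

```python
def size_is_reasonable(max_in_game: int, reference_size: int,
                       ratio_threshold: float = 1.0) -> bool:
    """A texture's size satisfies the preset condition when its maximum
    running size does not exceed reference_size * ratio_threshold."""
    return max_in_game <= reference_size * ratio_threshold

if __name__ == "__main__":
    print(size_is_reasonable(2048, 1024))        # False: running size too large
    print(size_is_reasonable(512, 512))          # True: equal sizes are reasonable
    print(size_is_reasonable(1024, 1024, 1.25))  # True under a looser threshold
```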
As can be seen from the method shown in Fig. 1, by collecting, when the virtual camera reaches each voxel unit in a scene to be rendered, the loading size of each texture within the visible range of the virtual camera, the reference size of each texture in the scene is determined; when the game runs, the reference size of each texture is used as a judgment threshold to automatically detect whether each texture's size satisfies the preset condition. Compared with manual detection, the method can automatically, quickly, comprehensively and finely screen out unreasonably sized textures in a game scene and give reasonable size data, greatly saving labor and time costs and markedly improving detection accuracy. In early development, game developers can use larger textures freely, accelerating development; in later performance optimization and memory analysis, the method can efficiently locate and handle texture-related problems, significantly reducing game development cost and improving game development efficiency.
In some embodiments of the present application, the method further includes controlling the virtual camera to rotate in various directions when the virtual camera reaches each voxel unit in the scene to be rendered, so that the scene to be rendered is fully rendered.
When the virtual camera reaches each voxel unit of the scene to be rendered, it is controlled to rotate in all directions — for example front, back, left and right — so that no object in the voxel unit is culled from rendering, which would otherwise prevent reasonable data from being collected.
In some embodiments of the present application, the respectively obtaining data information of each texture in a scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered includes: for a piece of texture, when the virtual camera reaches a voxel unit in the scene to be rendered, reading the multilevel progressive texture level of the piece of texture loaded in the memory; and taking the texture size corresponding to the multi-level progressive texture level of the texture loaded in the memory as the loading size of the texture in the visible range of the virtual camera.
When the data information of the texture is collected, for one texture, when the virtual camera reaches a voxel unit in the scene to be rendered, the multi-level progressive texture (MipMap) level of the texture is read from the memory. For a texture, it usually includes multiple multi-level progressive texture levels, such as 4 multi-level progressive texture levels, which correspond to the texture sizes 128 × 128, 256 × 256, 512 × 512, 1024 × 1024, respectively, and the texture size corresponding to the multi-level progressive texture level loaded in the current memory is used as the loading size of the texture within the visible range of the virtual camera.
In an actual implementation, the collection of texture data information may be carried out through the rendering engine's own systems. For example, when rendering with the UE4 engine, the loading size of a texture may be collected through the UE4 Texture Streaming system: during rendering, the Texture Streaming system loads into memory only the MipMap levels required at the current virtual camera position, so reading the MipMap level currently loaded in memory yields the loading size.
In some embodiments of the present application, determining a reference size of a texture in the scene to be rendered according to a plurality of loading sizes obtained when a virtual camera reaches each voxel unit in the scene to be rendered includes: determining a maximum value of the plurality of load sizes; and taking the maximum value as a reference size of the texture in the scene to be rendered.
That is, when determining the reference size of a texture, the maximum value of the plurality of load sizes may be selected as the reference size of the piece of texture. The multiple loading sizes comprise all loading sizes when the virtual camera reaches each voxel unit in the scene to be rendered, and the maximum value of the loading sizes is selected as the reference size of the texture, so that the rendering effect of the scene can be guaranteed to the maximum extent.
In some embodiments of the present application, the determining, according to the determined reference size of each texture, whether the size of each texture in the scene to be rendered satisfies a preset condition includes: when the scene to be rendered is rendered, if the maximum operation size of a texture is larger than the reference size of the texture, determining that the size of the texture does not meet a preset condition; otherwise, determining that the size of the texture meets the preset condition.
That is, when determining whether a texture's size satisfies the preset condition: if the maximum running size (Max In Game) of the texture when rendered is greater than its reference size, the size is determined not to satisfy the preset condition, i.e. it is unreasonable. For example, if a texture's maximum running size is 2048 × 2048 while its reference size is 1024 × 1024, its size is determined to be unreasonable.
If the maximum running size of a texture when rendered is smaller than or equal to its reference size, the texture satisfies the preset condition, i.e. it is reasonable. For example, a texture whose maximum running size is 512 × 512 and whose reference size is 512 × 512 satisfies the preset condition.
In some embodiments of the present application, the method further comprises: in response to a texture size modification instruction, setting the maximum running size of a target texture whose size is unreasonable to the reference size of that target texture.
That is, if detection finds that a texture's size does not satisfy the preset condition — i.e. the texture is too large — a size modification can be prompted. Specifically, a texture size modification instruction may be received through a front-end interface, and the maximum running size of the detected unreasonable texture is changed to the given reference size, so that unreasonably sized textures can be corrected quickly, saving developers' time and labor.
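Combined with the LOD Bias mechanism mentioned earlier, the correction can be sketched as computing the bias that caps the running size at the reference size. This is illustrative only — the patent does not specify this formula, and square power-of-two sizes are assumed:

```python
def lod_bias_for(import_size: int, reference_size: int) -> int:
    """Number of top mip levels to drop so that the maximum running size
    equals the reference size (square, power-of-two sizes assumed)."""
    bias = 0
    size = import_size
    while size > reference_size:
        size >>= 1          # each extra bias level halves the size
        bias += 1
    return bias

if __name__ == "__main__":
    print(lod_bias_for(2048, 1024))  # 1
    print(lod_bias_for(2048, 512))   # 2
    print(lod_bias_for(512, 512))    # 0: already at the reference size
```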
In some embodiments of the present application, the data information further comprises basic parameters of the texture, and the method further comprises: visually displaying the basic parameters, reference size, running size and import size of each texture in the scene to be rendered.
Various basic parameters of a texture are stored in the memory of the electronic device, including but not limited to Name, Type, Format and Group. The reference size (Max Counted Dimensions) of each texture generated during rendering, the maximum running size (Max In Game) of each texture when the game runs, and the import size (Max Dimensions) of each texture when imported into the rendering engine are displayed visually. Some embodiments of the present application support visual presentation of the above parameters for review by staff.
In the visual display interface, the data information of each texture can be seen at a glance, including Name, Type, Format and Group, as well as Max Dimensions (import size), Max In Game (maximum running size) and Max Counted Dimensions (reference size). As mentioned above, Max Dimensions is the size of the texture when imported into the rendering engine, Max In Game is the maximum running size of the texture when the game runs (i.e., when the scene to be rendered is rendered), and Max Counted Dimensions is the most reasonable size when the game runs, i.e., the aforementioned reference size of the texture.
Fig. 2 is a schematic flow chart illustrating a texture size detection method according to another embodiment of the present application, and as can be seen from fig. 2, the present application includes:
and controlling the virtual camera to reach a voxel unit of a scene to be rendered, controlling the virtual camera to rotate in four directions, and acquiring basic data of each texture and a loading size in a visible range of the virtual camera in the voxel unit.
And circularly executing the steps to obtain the loading size of each texture in the scene to be rendered in the visible range of the virtual camera under each voxel unit.
For a piece of texture, the loading sizes of the texture in the visible range of the virtual camera under each voxel unit are compared, and the maximum value of the loading sizes is taken as the reference size of the piece of texture.
And circularly executing the steps to determine the reference size of each texture.
Judging whether the maximum operation size of a texture is larger than the reference size of the texture when the game is operated, if so, displaying various data of the texture on a visual interface, wherein the reference size is used as a modification reference; if not, the texture size meets the preset condition, and various data information of the texture is displayed on a visual interface.
And circularly executing the steps until all the textures are detected.
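The full pipeline of Fig. 2 can be summarized end to end as follows. This is a minimal sketch with hypothetical names, not engine code; `visible_load_size` and `max_in_game` stand in for data the rendering engine would supply:

```python
def detect_unreasonable_textures(voxel_units, textures,
                                 visible_load_size, max_in_game):
    """End-to-end sketch of Fig. 2: collect loading sizes per voxel unit,
    take the per-texture maximum as the reference size, and flag textures
    whose maximum running size exceeds that reference."""
    report = {}
    for tex in textures:
        loads = [visible_load_size(v, tex) for v in voxel_units]
        ref = max(loads)                  # reference size (Max Counted Dimensions)
        report[tex] = {
            "reference_size": ref,
            "reasonable": max_in_game[tex] <= ref,
        }
    return report

if __name__ == "__main__":
    loads = {(1, "A"): 256, (2, "A"): 512, (1, "B"): 1024, (2, "B"): 1024}
    out = detect_unreasonable_textures(
        [1, 2], ["A", "B"],
        lambda v, t: loads[(v, t)],
        {"A": 2048, "B": 1024},   # maximum running sizes (Max In Game)
    )
    print(out["A"]["reasonable"], out["B"]["reasonable"])  # False True
```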
Fig. 3 shows a schematic structural diagram of a texture size detection device, wherein the device 300 comprises:
an obtaining unit 310, configured to control a virtual camera to traverse each voxel unit in a scene to be rendered, and obtain data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered, where the data information includes a loading size within a visible range of the virtual camera;
a size determining unit 320, configured to determine, for a piece of texture, a reference size of the piece of texture in the scene to be rendered according to a plurality of loading sizes obtained when the virtual camera reaches each voxel unit in the scene to be rendered, and perform this step in a loop to determine the reference size of each piece of texture corresponding to the scene to be rendered;
the determining unit 330 is configured to determine, according to the determined reference size of each texture, whether the size of each texture meets a preset condition when the scene to be rendered is rendered.
In some embodiments of the present application, in the above apparatus, the obtaining unit 310 is configured to control the virtual camera to rotate in each direction when the virtual camera reaches each voxel unit in the scene to be rendered, so that the scene to be rendered is fully rendered.
In some embodiments of the present application, in the above apparatus, the obtaining unit 310 is configured to, for a given texture, read the multi-level progressive texture (mipmap) level of the texture loaded in the memory when the virtual camera reaches a voxel unit in the scene to be rendered; and to take the texture size corresponding to that mipmap level as the loading size of the texture within the visible range of the virtual camera.
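The mapping from a loaded mipmap level to a loading size follows the usual halving rule: each successive mip level halves the resolution of the previous one. A sketch under the assumption of a square texture whose full-resolution level is mip 0 (function and parameter names are illustrative, not an engine API):

```python
def loading_size(imported_size: int, loaded_mip: int) -> int:
    """Texture size corresponding to the mip level actually resident in memory.
    Mip 0 is the full imported size; each further level halves it (minimum 1)."""
    return max(1, imported_size >> loaded_mip)

assert loading_size(2048, 0) == 2048  # full-resolution level resident
assert loading_size(2048, 2) == 512   # only mip 2 resident within view range
```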
In some embodiments of the present application, in the above apparatus, the size determining unit 320 is configured to determine a maximum value of the plurality of loading sizes; and taking the maximum value as a reference size of the texture in the scene to be rendered.
In some embodiments of the present application, in the above apparatus, the determining unit 330 is configured to determine that a size of a texture does not satisfy a preset condition if a maximum running size of the texture is larger than a reference size of the texture when the scene to be rendered is rendered; otherwise, determining that the size of the texture meets the preset condition.
In some embodiments of the present application, the apparatus further comprises: a texture modification unit, configured to, in response to a texture size modification instruction, set the maximum size at game runtime of a target texture whose size is unreasonable to the reference size of that target texture.
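The modification performed by such a unit can be sketched as clamping a texture's runtime maximum size down to its reference size (the `Texture` record here is a hypothetical illustration, not a real engine type):

```python
from dataclasses import dataclass

@dataclass
class Texture:           # hypothetical record, not an engine API
    name: str
    max_size: int        # maximum size allowed at game runtime
    reference_size: int  # determined by the voxel traversal above

def apply_reference_size(tex: Texture) -> Texture:
    """Set the runtime maximum size of an unreasonably sized texture
    to its reference size, as done in response to a modification instruction."""
    if tex.max_size > tex.reference_size:
        tex.max_size = tex.reference_size
    return tex

t = apply_reference_size(Texture("rock_albedo", 2048, 1024))
assert t.max_size == 1024
```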
In some embodiments of the present application, the data information further comprises basic parameters of the texture; the above apparatus further comprises: a visualization unit, configured to visually display the basic parameters, the reference size, the running size, and the imported size of each texture in the scene to be rendered.
It can be understood that the above-mentioned detection apparatus for texture size can implement the steps of the detection method for texture size provided in the foregoing embodiments, and the explanations regarding the detection method for texture size are applicable to the detection apparatus for texture size, and are not repeated herein.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 4, at the hardware level, the electronic device includes a processor, and optionally further includes an internal bus, a network interface, and a memory. The memory may include volatile memory, such as a Random-Access Memory (RAM), and may further include non-volatile memory, such as at least one disk memory. Of course, the electronic device may also include hardware required for other services.
The processor, the network interface, and the memory may be connected to each other via an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 4, but that does not indicate only one bus or one type of bus.
And the memory is used for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory may include both memory and non-volatile storage and provides instructions and data to the processor.
The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to form a detection device of the texture size on the logic level. The processor is used for executing the program stored in the memory and is specifically used for executing the following operations:
controlling a virtual camera to traverse each voxel unit in a scene to be rendered, and respectively acquiring data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered, wherein the data information comprises a loading size in a visible range of the virtual camera;
for one texture, determining the reference size of the texture in the scene to be rendered according to a plurality of loading sizes acquired when the virtual camera reaches each voxel unit in the scene to be rendered, and circularly executing the step to determine the reference size of each texture corresponding to the scene to be rendered;
and determining whether the size of each texture meets a preset condition when the scene to be rendered is rendered according to the determined reference size of each texture.
The method performed by the texture size detection apparatus according to the embodiment shown in fig. 3 of the present application may be applied to or implemented by a processor. The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but also a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in RAM, flash memory, ROM, PROM, EPROM, registers, or other storage media well known in the art. The storage medium is located in a memory, and a processor reads information in the memory and completes the steps of the method in combination with hardware of the processor.
The electronic device may further execute the method executed by the texture size detection apparatus in fig. 3, and implement the functions of the texture size detection apparatus in the embodiment shown in fig. 3, which are not described herein again in this embodiment of the present application.
An embodiment of the present application further provides a computer-readable storage medium storing one or more programs, where the one or more programs include instructions, which, when executed by an electronic device including multiple application programs, enable the electronic device to perform the method performed by the texture size detection apparatus in the embodiment shown in fig. 3, and are specifically configured to perform:
controlling a virtual camera to traverse each voxel unit in a scene to be rendered, and respectively acquiring data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered, wherein the data information comprises a loading size in a visible range of the virtual camera;
for one texture, determining the reference size of the texture in the scene to be rendered according to a plurality of loading sizes acquired when the virtual camera reaches each voxel unit in the scene to be rendered, and circularly executing the step to determine the reference size of each texture corresponding to the scene to be rendered;
and determining whether the size of each texture meets a preset condition when the scene to be rendered is rendered according to the determined reference size of each texture.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal or a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (11)

1. A method for detecting texture size, comprising:
controlling a virtual camera to traverse each voxel unit in a scene to be rendered, and respectively acquiring data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered, wherein the data information comprises a loading size in a visible range of the virtual camera;
for one texture, determining the reference size of the texture in the scene to be rendered according to a plurality of loading sizes acquired when the virtual camera reaches each voxel unit in the scene to be rendered, and circularly executing the step to determine the reference size of each texture corresponding to the scene to be rendered;
and determining whether the size of each texture meets a preset condition when the scene to be rendered is rendered according to the determined reference size of each texture.
2. The method of claim 1, further comprising:
and when the virtual camera reaches each voxel unit in the scene to be rendered, controlling the virtual camera to rotate in each direction so as to fully render the scene to be rendered.
3. The method according to claim 2, wherein the separately acquiring data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered comprises:
for a piece of texture, when the virtual camera reaches a voxel unit in the scene to be rendered, reading the multilevel progressive texture level of the piece of texture loaded in the memory;
and taking the texture size corresponding to the multi-level progressive texture level of the texture loaded in the memory as the loading size of the texture in the visible range of the virtual camera.
4. The method according to claim 1, wherein the determining a reference size of a texture in the scene to be rendered according to a plurality of loading sizes of the texture obtained when a virtual camera reaches each voxel unit in the scene to be rendered comprises:
determining a maximum value of the plurality of load sizes;
and taking the maximum value as a reference size of the texture in the scene to be rendered.
5. The method according to claim 1, wherein the determining whether the size of each texture in the scene to be rendered satisfies a preset condition according to the determined reference size of each texture comprises:
when the scene to be rendered is rendered, if the maximum operation size of a texture is larger than the reference size of the texture, determining that the size of the texture does not meet a preset condition;
otherwise, determining that the size of the texture meets the preset condition.
6. The method of claim 5, further comprising:
in response to the texture size modification instruction, the maximum size of the unreasonable-sized target texture at the time of game execution is set as the reference size of the target texture.
7. The method of claim 1, wherein the data information further comprises basic parameters of a texture;
the method further comprises the following steps:
and visually displaying the basic parameters of each texture, the reference size of each texture, the running size of each texture and the imported size of each texture in the scene to be rendered.
8. A texture size rationality detection device, comprising:
the system comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for controlling a virtual camera to traverse each voxel unit in a scene to be rendered and respectively acquiring data information of each texture corresponding to the scene to be rendered when the virtual camera reaches each voxel unit in the scene to be rendered, and the data information comprises a loading size in a visible range of the virtual camera;
the size determining unit is used for determining the reference size of a texture in the scene to be rendered according to a plurality of loading sizes acquired when the virtual camera reaches each voxel unit in the scene to be rendered, and determining the reference size of each texture corresponding to the scene to be rendered by circularly executing the step;
and the judging unit is used for determining whether the size of each texture meets a preset condition when the scene to be rendered is rendered according to the determined reference size of each texture.
9. A rendering engine, characterized in that the rendering engine comprises the apparatus of claim 8.
10. An electronic device, comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the method of any of claims 1 to 7.
11. A computer readable storage medium storing one or more programs which, when executed by an electronic device comprising a plurality of application programs, cause the electronic device to perform the method of any of claims 1-7.
CN202111641397.9A 2021-12-29 2021-12-29 Texture size detection method and device and rendering engine Active CN114299215B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111641397.9A CN114299215B (en) 2021-12-29 2021-12-29 Texture size detection method and device and rendering engine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111641397.9A CN114299215B (en) 2021-12-29 2021-12-29 Texture size detection method and device and rendering engine

Publications (2)

Publication Number Publication Date
CN114299215A true CN114299215A (en) 2022-04-08
CN114299215B CN114299215B (en) 2024-08-27

Family

ID=80970845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111641397.9A Active CN114299215B (en) 2021-12-29 2021-12-29 Texture size detection method and device and rendering engine

Country Status (1)

Country Link
CN (1) CN114299215B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1321893A2 (en) * 2001-11-27 2003-06-25 Samsung Electronics Co., Ltd. Node structure for representing 3-dimensional objects using depth image
US20130057653A1 (en) * 2011-09-06 2013-03-07 Electronics And Telecommunications Research Institute Apparatus and method for rendering point cloud using voxel grid
CN105094920A (en) * 2015-08-14 2015-11-25 网易(杭州)网络有限公司 Game rendering method and device
KR101859318B1 (en) * 2017-01-09 2018-05-17 진형우 Video content production methods using 360 degree virtual camera
CN113269858A (en) * 2021-07-19 2021-08-17 腾讯科技(深圳)有限公司 Virtual scene rendering method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MANFRED WEILER等: "Level-Of-Detail Volume Rendering via 3D Textures", 《PROCEEDINGS OF THE 2000 IEEE SYMPOSIUM ON VOLUME VISUALIZATION》, 9 October 2020 (2020-10-09) *

Also Published As

Publication number Publication date
CN114299215B (en) 2024-08-27

Similar Documents

Publication Publication Date Title
CN112634201B (en) Target detection method and device and electronic equipment
KR102137263B1 (en) Image processing apparatus and method
TW201918911A (en) Method and apparatus for calculating above-the-fold rendering duration of page, and electronic device
CN114119834B (en) Rendering method, rendering device, electronic equipment and readable storage medium
US11393156B2 (en) Partially resident bounding volume hierarchy
CN112190936B (en) Game scene rendering method, device, equipment and storage medium
CN110363837B (en) Method and device for processing texture image in game, electronic equipment and storage medium
CN108280135B (en) Method and device for realizing visualization of data structure and electronic equipment
CN109388564B (en) Test method and device and electronic equipment
CN114494024A (en) Image rendering method, device and equipment and storage medium
CN114299215B (en) Texture size detection method and device and rendering engine
CN109060830B (en) Method and device for detecting impurities of display screen
CN110930520B (en) Semantic segmentation labeling method, device and equipment
CN114372928A (en) Data processing method and device and electronic equipment
CN111640109A (en) Model detection method and system
CN115049531B (en) Image rendering method and device, graphic processing equipment and storage medium
CN114299470B (en) Guideboard corner recognition method and device based on center point constraint
CN119295620A (en) Rendering method and device for removing obstructed transparent elements, storage medium, and equipment
CN117197069A (en) Model surface detection method, device, computer equipment and readable storage medium
CN115952372A (en) High-precision map data acquisition method, device, equipment and storage medium
CN115718824A (en) Method for judging position of equipment and automatically pushing equipment information through space distance
CN114490753A (en) Method, device, electronic equipment and medium for displaying map information
CN115512033A (en) 3D scene rendering method and device, electronic equipment and storage medium
CN119537167B (en) Memory leak detection method, device, electronic device, and storage medium
CN115054916B (en) Shadow making method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant