
CN117474798B - Anisotropic filtering method and device - Google Patents

Anisotropic filtering method and device

Info

Publication number
CN117474798B
CN117474798B (application CN202210832108.1A)
Authority
CN
China
Prior art keywords
texture
weight
texture element
sampling point
filtering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210832108.1A
Other languages
Chinese (zh)
Other versions
CN117474798A (en)
Inventor
杨鲤豪
冯晶
余鹏程
占克文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Granfei Intelligent Technology Co ltd
Original Assignee
Granfei Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Granfei Intelligent Technology Co ltd
Priority to CN202210832108.1A
Publication of CN117474798A
Application granted
Publication of CN117474798B
Legal status: Active
Anticipated expiration


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/40 — Analysis of texture
    • G06T7/41 — Analysis of texture based on statistical description of texture
    • G06T7/90 — Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Generation (AREA)
  • Image Processing (AREA)

Abstract


The present application relates to the field of image processing technology, and provides an anisotropic filtering method and device. The present application can optimize anisotropic filtering performance under lossless conditions. The method includes: determining a sampling point sequence corresponding to a pixel point on a texture map; the spacing between adjacent sampling points in the sampling point sequence is equal to the side length of a texture element in the texture map; determining a texture element group corresponding to each sampling point in the sampling point sequence on the texture map; obtaining the weight of each texture element in the corresponding texture element group for each sampling point; based on the weight of each texture element in each texture element group, accumulating the weight corresponding to the same texture element to obtain the total weight of each texture element; performing filtering calculation based on the total weight of each texture element and its color information to obtain the filtering calculation result corresponding to each texture element; and obtaining the color information of the pixel point according to the filtering calculation result corresponding to each texture element.

Description

Anisotropic filtering method and device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an anisotropic filtering method and apparatus.
Background
In most cases, the coordinates on the texture map and the coordinates on the screen are not in one-to-one correspondence, so texture filtering techniques are required to make the content displayed on the screen smoother. Common filtering modes include point (nearest-point) sampling, bilinear filtering, trilinear filtering, and anisotropic filtering.
When the texture map is parallel to the screen, the quadrilateral that a group of 2×2 adjacent pixel points (pixels) on the screen encloses on the texture map must be square, i.e., isotropic. In this case a bilinear filtering mode may be adopted: the 4 texture elements (texels) around the texture coordinate are blended according to their corresponding weights into new color information that becomes the value of the pixel point, so that the information displayed on the screen is smoother. However, if this quadrilateral is not square, there is an angular tilt between the texture map and the screen, i.e., anisotropy. If bilinear filtering is used in the anisotropic case, information at a distance becomes blurred, and anisotropic filtering (Anisotropic Filtering, abbreviated AF) is needed to make the textures of inclined surfaces clearer and sharper. Anisotropic filtering performs bilinear filtering additional times according to the included angle between the texture map and the screen, thereby achieving a better display effect.
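The bilinear blend described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; it assumes a grayscale texture stored as a 2D list with texel centers at integer coordinates plus 0.5, matching the unit-square texel convention used later in the description.

```python
def bilinear_filter(texture, u, v):
    """Bilinear filtering: blend the 4 texels nearest to (u, v).

    `texture` is a 2D list indexed as texture[y][x]; texel (x, y) is a
    unit square whose center sits at (x + 0.5, y + 0.5).
    """
    # Shift so that integer lattice points correspond to texel centers.
    x, y = u - 0.5, v - 0.5
    x0, y0 = int(x), int(y)          # top-left texel of the 2x2 group
    fx, fy = x - x0, y - y0          # fractional position inside the group

    # Per-texel weights: products of the 1D linear weights (sum to 1).
    w00 = (1 - fx) * (1 - fy)
    w10 = fx * (1 - fy)
    w01 = (1 - fx) * fy
    w11 = fx * fy

    return (w00 * texture[y0][x0]     + w10 * texture[y0][x0 + 1] +
            w01 * texture[y0 + 1][x0] + w11 * texture[y0 + 1][x0 + 1])
```

Sampling exactly at a texel center returns that texel's color unchanged, and sampling at the midpoint of a 2×2 group averages all four texels.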
Although the sharpness of anisotropic filtering is better than that of bilinear filtering, it comes at the cost of a significant decrease in performance. The conventional anisotropic filtering method is mainly divided into AF2x, AF4x, AF8x and AF16x: AF2x splits a pixel into two sampling points for bilinear filtering, AF4x into 4, AF8x into 8, and AF16x into 16. Taking AF16x as an example, under the conventional anisotropic filtering method the number of sampling points requiring bilinear filtering is 16 times that of ordinary bilinear filtering, which greatly affects performance.
Disclosure of Invention
Based on the above, an anisotropic filtering method and apparatus capable of optimizing anisotropic filtering performance under a lossless condition are provided.
In a first aspect, the present application provides an anisotropic filtering method. The method comprises the following steps:
determining a sampling point sequence corresponding to a pixel point on a texture map, wherein the distance between adjacent sampling points in the sampling point sequence is equal to the side length of a texture element in the texture map;
determining a texture element group corresponding to each sampling point in the sampling point sequence on the texture map;
for each sampling point in the sampling point sequence, acquiring the weight of each texture element in the corresponding texture element group;
accumulating the weights corresponding to the same texture element based on the weight of each texture element in each texture element group to obtain the total weight of each texture element;
performing filtering calculation based on the total weight of each texture element and the color information of each texture element to obtain a filtering calculation result corresponding to each texture element;
and obtaining the color information of the pixel point according to the filtering calculation result corresponding to each texture element.
In a second aspect, the application also provides an anisotropic filtering device. The device comprises:
a sampling point determining module, used for determining a sampling point sequence corresponding to the pixel point on the texture map, wherein the distance between adjacent sampling points in the sampling point sequence is equal to the side length of a texture element in the texture map;
a texel determining module, used for determining a texel group corresponding to each sampling point in the sequence of sampling points on the texture map;
a weight acquisition module, used for acquiring, for each sampling point in the sampling point sequence, the weight of each texture element in the corresponding texture element group;
a weight accumulation module, used for accumulating the weights corresponding to the same texture element based on the weight of each texture element in each texture element group to obtain the total weight of each texture element;
a filtering calculation module, used for performing filtering calculation based on the total weight of each texture element and the color information of each texture element to obtain a filtering calculation result corresponding to each texture element;
and a result processing module, used for obtaining the color information of the pixel point according to the filtering calculation result corresponding to each texture element.
The anisotropic filtering method and device determine a sampling point sequence corresponding to a pixel point on a texture map, where the distance between adjacent sampling points equals the side length of a texture element in the texture map. A texture element group corresponding to each sampling point in the sequence is determined on the texture map, and the weight of each texture element in the corresponding group is obtained for each sampling point. The weights corresponding to the same texture element are then accumulated to obtain the total weight of each texture element, filtering calculation is performed based on each texture element's total weight and color information to obtain a filtering result per texture element, and the color information of the pixel point is obtained from those per-texel filtering results. By merging the weights of identical texture elements into a total weight and performing filtering calculation with texture elements as granularity, the scheme retains all information in the calculation process (i.e., it is an information-lossless method) while avoiding repeated filtering calculation of the same texture element. Compared with the conventional anisotropic filtering method, the number of filtering calculations that truly affect performance is reduced, so anisotropic filtering performance is improved and optimized under lossless conditions.
Drawings
FIG. 1 is a schematic diagram of AF16x in an anisotropic example;
FIG. 2 is a schematic flow chart of a conventional anisotropic filtering method;
FIG. 3 is a schematic diagram of the relationship between adjacent sample points and their texels in an example;
FIG. 4 is a flow chart of an anisotropic filtering method according to an embodiment;
FIG. 5 is a flow chart of an anisotropic filtering method in an example;
FIG. 6 is a flow chart of texel weight merging in an example;
FIG. 7 is an auxiliary diagram for the performance optimization analysis in an example;
FIG. 8 is another auxiliary diagram for the performance optimization analysis in an example;
FIG. 9 (a) is an auxiliary diagram for the performance optimization analysis in another example;
FIG. 9 (b) is another auxiliary diagram for the performance optimization analysis in another example;
FIG. 10 is a block diagram of an anisotropic filtering device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The anisotropic filtering method and device provided by the embodiment of the application mainly aim to optimize the anisotropic filtering performance under the lossless condition.
The key terms involved in the embodiments of the application are anisotropic filtering (Anisotropic Filtering, AF for short), bilinear filtering (Bilinear), texture element (Texel, T for short), pixel point (Pixel, P for short), and sampling point (Sample, S for short).
For anisotropy: if the quadrilateral that a group of 2×2 adjacent pixel points on the screen encloses on the texture map is not square, there is an angular tilt between the texture map and the screen, i.e., anisotropy. In the anisotropic case, if bilinear filtering is used, information at a distance becomes blurred; anisotropic filtering is then needed to make the textures of inclined surfaces clearer and sharper. Anisotropic filtering performs bilinear filtering additional times according to the included angle between the texture map and the screen, thereby achieving a better display effect.
In connection with the foregoing background section, although the sharpness of current anisotropic filtering methods is better than that of bilinear filtering, it comes at the cost of a significant decrease in performance. Taking AF16x (i.e., one pixel is split into 16 sampling points) as an example, referring to FIG. 1, the quadrilateral corresponding to the four pixel points P0 to P3 is non-square, so a certain angle exists between the texture map and the display screen. According to this included angle, a pixel point is split into a plurality of sampling points S (which form a sampling point sequence); for example, the pixel point P0 in FIG. 1 is split into 16 sampling points S. Bilinear filtering is then performed on the 16 sampling points S to obtain 16 intermediate results, which are weighted and summed according to certain weights to obtain the final result for the pixel point P0. Therefore, although current anisotropic filtering achieves a better effect, the number of sampling points for bilinear filtering is 16 times that of ordinary bilinear filtering, which greatly affects performance.
Next, to better explain the basic idea of the present application, the flow of the conventional anisotropic filtering method is briefly described, as shown in FIG. 2. Step S201 determines whether AF is on. When AF is on, step S202 calculates the AF parameters (LOD, Ratio, Step, Sample Number, and the weight of each sampling point S): an ellipse is determined from the received 4 pixel points P, the logarithm of the ellipse's minor axis is taken as the LOD value, Ratio is the ratio of the ellipse's major axis to its minor axis, Ratio rounded up to a multiple of 2 gives the Sample Number (i.e., which AF multiple applies), and Step is the distance between two adjacent sampling points S in the UV direction, from which the coordinates of all sampling points S can be obtained. The weight of each middle sampling point S is 1/Ratio, and the weight of each of the two outermost sampling points S is (1 − (Sample Number − 2)/Ratio)/2. Once the information of all sampling points S (their coordinates and weights) has been computed, the flow proceeds to steps S203 to S206, where bilinear filtering is performed over all sampling points S with the sampling point S as granularity: in step S204, for each sampling point S, the coordinates of the 4 corresponding texture elements T are calculated and the corresponding color information is fetched from memory; in step S205, the weights of the 4 texture elements T are calculated from the coordinates of each sampling point S; and in step S206, bilinear filtering is computed from the 4 texture elements' color information and weights to obtain the bilinear filtering result of each sampling point S. When the loop ends, step S207 accumulates the bilinear filtering results of all sampling points S to obtain the final result.
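A minimal sketch of the per-sample weight scheme described for the conventional flow, under the assumption (reconstructed from the description) that each middle sampling point weighs 1/Ratio and the two outermost sampling points split the remainder equally, so all weights sum to 1:

```python
def conventional_sample_weights(ratio, n):
    """Weights of the n sampling points in the conventional AF flow.

    Middle samples each get 1/ratio; the two outermost samples share
    the remaining weight equally. With this scheme the total weight is
    (n - 2)/ratio + 2 * (1 - (n - 2)/ratio)/2 = 1 exactly.
    """
    inner = [1.0 / ratio] * (n - 2)               # the n-2 middle samples
    outer = (1.0 - (n - 2) / ratio) / 2.0         # each of the 2 end samples
    return [outer] + inner + [outer]
```

For example, with Ratio = 3.5 and Sample Number = 4, the two middle samples weigh 1/3.5 each and the two end samples absorb the remainder, and the four weights sum to exactly 1.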
The application is mainly based on the following three points:
1. The distance between adjacent sampling points S is 1;
2. Each texel T is a square with a side length of 1;
3. Based on the previous two points, it can be known that there is a repeated texel T among eight texels T corresponding to two adjacent sampling points S.
The basic idea of the application follows from point 3: two adjacent sampling points S share at least one identical texture element T, so the application avoids reusing the same texture element T in repeated filtering calculations, thereby improving performance.
Before deriving point 3, it is explained how the nearest four texture elements T are obtained from one coordinate. As shown in FIG. 3, to take texture elements T0 to T3 as the nearest four texture elements T, a square is constructed with the center points of T0 to T3 as vertices, i.e., the square enclosed by the left-side broken line in FIG. 3. If the coordinates of a sampling point S1 fall within this square, the nearest four texture elements around S1 are T0 to T3; the side length of the square enclosed by the broken line is 1. Likewise, if a sampling point S2 falls within the square enclosed by the right-side broken line in FIG. 3, the nearest four texture elements around S2 are T4 to T7.
Based on this, point 3 above is proved as follows:
Assume that the eight texture elements T corresponding to the adjacent sampling points S1, S2 are not repeated, denoted as T0 to T3 and T4 to T7 respectively. Since sampling point S1 falls within the square enclosed by the left broken line, the distance L1 from S1 to the middle black solid line satisfies L1 > 0.5. Similarly, since S2 falls within the square enclosed by the right broken line, the distance L2 from S2 to the middle black solid line satisfies L2 > 0.5. Therefore the distance between S1 and S2 satisfies L12 ≥ L1 + L2 > 1, which contradicts point 1. Hence at least one texture element T among the eight texture elements T corresponding to two adjacent sampling points S is repeated.
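The coordinate-to-texel-group mapping of FIG. 3 and the shared-texel property just proved can be checked numerically. A small sketch, assuming texel (i, j) is a unit square centered at (i + 0.5, j + 0.5) and all coordinates are non-negative:

```python
def nearest_texel_group(su, sv):
    """Integer indices of the 4 texels nearest to sample (su, sv).

    Texel (i, j) is a unit square centered at (i + 0.5, j + 0.5), so the
    sample falls in the broken-line square of the 2x2 group whose
    top-left texel index is floor(coordinate - 0.5). Assumes su, sv >= 0.5
    (int() truncates toward zero).
    """
    i0 = int(su - 0.5)  # leftmost column of the 2x2 group
    j0 = int(sv - 0.5)  # topmost row of the 2x2 group
    return {(i0, j0), (i0 + 1, j0), (i0, j0 + 1), (i0 + 1, j0 + 1)}
```

Two sampling points spaced exactly 1 apart (point 1) always yield overlapping groups, e.g. samples at (1.2, 1.0) and (2.2, 1.0) share two texels, consistent with point 3.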
Based on the previous three points, it can be seen that when an original pixel point P is split into n sampling points S on the texture map, the number of texture elements T actually performing filtering calculation is necessarily smaller than 4×n.
As can be seen from the conventional anisotropic filtering flow described above, the current method performs bilinear filtering with the sampling point S as granularity, so the number of texture elements T entering bilinear filtering calculation is 4×n, and some texture elements T are used repeatedly. The basic idea of the application is therefore to avoid repeated filtering calculations of the same texture element T, thereby improving performance. To achieve this, the conventional anisotropic filtering method must be modified: the operation is performed with texture elements T, rather than sampling points S, as granularity, and the weights and other information of all required texture elements T are calculated immediately after the information of all sampling points S is obtained.
The anisotropic filtering method provided by the application is described below in connection with specific embodiments, and the method can be executed by a terminal, a server, or other computer equipment.
In one embodiment, as shown in fig. 4, the anisotropic filtering method provided by the present application specifically includes the following steps:
in step S401, a sampling point sequence corresponding to the pixel point on the texture map is determined.
The pixel point P can be split into a plurality of sampling points on the texture map, and these sampling points form a sampling point sequence. For example, the pixel point P may be split into two (AF2x), four (AF4x), eight (AF8x), or sixteen (AF16x) sampling points on the texture map; the anisotropic filtering method provided by the present application is applicable to all AF cases. In this embodiment, the distance between adjacent sampling points in the sampling point sequence is equal to the side length of a texture element in the texture map, both being 1.
Step S402, determining a texture element group corresponding to each sampling point on the texture map in the sampling point sequence;
In this step, for each sampling point S in the sequence of sampling points, a respective plurality of corresponding texels T are found on the texture map to form a texel group.
Specifically, step S402 includes determining, for each sampling point, a plurality of texture elements on the texture map that are closest to the sampling point according to the coordinates of the sampling point on the texture map, and obtaining a corresponding texture element group from those texture elements. In particular, the four texture elements T closest to each sampling point S may be determined on the texture map as its corresponding texture element group.
Step S403, for each sampling point in the sampling point sequence, obtaining the weight of each texture element in the corresponding texture element group;
in this step, for each sampling point S in the sampling point sequence, the weight of each texel T in the corresponding texel group needs to be obtained before the subsequent filtering calculation is performed.
Step S404, based on the weight of each texture element in each texture element group, accumulating the weights corresponding to the same texture element to obtain the total weight of each texture element;
Since different sampling points S correspond to some identical texture elements T, this step adds the weights of each texture element T corresponding to different sampling points S as the total weight of the texture element T.
Step S405, filtering calculation is performed based on the total weight of each texture element and the color information of each texture element, so as to obtain a filtering calculation result corresponding to each texture element;
The filtering calculation is performed based on the total weight of each texture element and the color information of each texture element to obtain a corresponding filtering calculation result; the filtering calculation performed for each texture element may comprise bilinear filtering. Since the weights of the relevant texture elements have already been computed and merged in the preceding steps, the loop that performs filtering calculation with texture elements as granularity no longer needs to recompute that information, and the number of texture elements entering the filtering calculation is necessarily less than 4×n, so AF performance is optimized.
In some embodiments, step S405 specifically includes taking a plurality of texture elements as a loop granularity of filtering calculation, and calculating to obtain a filtering calculation result corresponding to each texture element according to the total weight and the color information of the corresponding texture element.
In this embodiment, the filtering calculation may be performed with, for example, four texture elements T as the granularity of the filtering calculation, and in each cycle, color information of the four corresponding texture elements T is extracted from the memory, and for each texture element T, a filtering calculation result corresponding to each texture element is calculated according to the total weight and the color information corresponding to each texture element T.
Calculating the filtering result for each texture element according to its total weight and color information specifically includes, for each texture element, multiplying the corresponding total weight by the color information to obtain the filtering calculation result corresponding to that texture element.
Step S406, obtaining the color information of the pixel point according to the filtering calculation result corresponding to each texture element.
The step is mainly to integrate the filtering calculation results corresponding to each texture element T to obtain the color information of the pixel point P. The step S406 may specifically include accumulating the filtering calculation results corresponding to each texture element to obtain color information of the pixel point. That is, the filtering calculation results corresponding to each of the texture elements T obtained in step S405 are added as color information of the pixel point P.
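The per-texel loop of steps S405 and S406 can be sketched as follows. This is a minimal illustration, not the patent's hardware implementation; `fetch_color` is an assumed helper standing in for the memory fetch of a texel's color:

```python
def filter_by_texels(total_weights, fetch_color):
    """Steps S405-S406: filtering with texels (not samples) as granularity.

    `total_weights` maps a texel index to its merged total weight;
    `fetch_color` returns the color of a texel. Each unique texel is
    fetched and multiplied exactly once, and the products are summed
    into the pixel's final color.
    """
    pixel_color = 0.0
    for texel, w in total_weights.items():
        pixel_color += w * fetch_color(texel)  # one multiply per unique texel
    return pixel_color
```

Because the weights were merged beforehand, the number of multiply-accumulate operations equals the number of unique texels, which is necessarily less than 4×n.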
Next, referring to an example of the anisotropic filtering method of the present application as shown in FIG. 5: step S501 determines whether AF is on. When AF is on, step S502 calculates the AF parameters (LOD, Ratio, Step, Sample Number, and the weight of each sampling point S). After the information of all sampling points S is obtained in step S502, the weight information of all texture elements T corresponding to the sampling points S must be further obtained. Because different sampling points S use some identical texture elements T, step S503 adds up the weights of each texture element T across the different sampling points S as the total weight of that texture element T. The flow then enters steps S504 to S506, which loop with texture elements as granularity; the number of texture elements T actually requiring bilinear filtering calculation is necessarily smaller than 4×n, and since the relevant weight information was already computed in step S503, steps S504 to S506 do not need to recompute it, so the AF optimization is obtained.
For the merging of the weights of the texels, in an embodiment, the adding the weights corresponding to the same texel based on the weights of each texel in the texel groups to obtain the total weight of each texel specifically includes:
Determining the texture elements with the same coordinates as the same texture element based on the coordinates of each texture element in each texture element group; the weights corresponding to the same texture element are accumulated to obtain the total weight of each texture element.
In this embodiment, for each texel T in each texel group, the texels T with the same coordinates are combined, and the weights thereof are accumulated, and the final weight is the total weight of the texel T.
Further, for each weight of each texel, in an embodiment, the obtaining the weight of each texel in the corresponding texel group for each sampling point in the sampling point sequence specifically includes:
And acquiring the weight and the coordinates of each texture element in the corresponding texture element group for each sampling point in the sampling point sequence. That is, for a plurality of sampling points S split on the texture map for the pixel point P, the coordinates of each texture element are acquired while the weights of the corresponding texture elements are acquired so that the weights of the same texture element are combined according to the coordinates of the texture elements.
In the embodiment, the weight and the coordinates of each texture element in the corresponding texture element group are obtained for each sampling point in the sampling point sequence, specifically including calculating the weight and the coordinates of each texture element in the corresponding texture element group according to the coordinates and the weights of the sampling points for each sampling point in the sampling point sequence. That is, for each sampling point S in the sampling point sequence, the sampling points S are cycled with granularity, and the coordinates and weights of the corresponding texture elements T are calculated according to the coordinates and weights of each sampling point S.
The weight combination at granularity S (corresponding to step S503) before the filtering calculation at granularity 4T is performed will be described with reference to fig. 6. After obtaining the relevant information of the sampling points S in step S502, steps S601 to S603 may be entered, the sampling points S are used as granularity and circulated, the coordinates and weights of four texture elements corresponding to each sampling point S are calculated according to the coordinates and weights of the sampling points S, then the texture elements with the same coordinates are combined to accumulate the weights, and finally the accumulated weights are the total weights of the texture elements for subsequent filtering calculation with 4T granularity.
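The weight-merging loop of steps S601 to S603 can be sketched as follows. A minimal sketch under the earlier assumptions (unit texels centered at integer coordinates plus 0.5, bilinear per-sample weights); `samples` is an assumed list of (u, v, sample_weight) tuples from step S502:

```python
from collections import defaultdict

def merge_texel_weights(samples):
    """Accumulate per-texel weights across all sampling points (step S503).

    Each sample contributes its bilinear weights, scaled by the sample's
    own weight, to its 4 nearest texels. Texels that appear under several
    samples have their weights summed, so each texel enters the later
    filtering loop exactly once.
    """
    total = defaultdict(float)  # (i, j) -> accumulated total weight
    for su, sv, sw in samples:
        x, y = su - 0.5, sv - 0.5
        i0, j0 = int(x), int(y)          # top-left texel of the 2x2 group
        fx, fy = x - i0, y - j0          # fractional position inside it
        total[(i0,     j0    )] += sw * (1 - fx) * (1 - fy)
        total[(i0 + 1, j0    )] += sw * fx * (1 - fy)
        total[(i0,     j0 + 1)] += sw * (1 - fx) * fy
        total[(i0 + 1, j0 + 1)] += sw * fx * fy
    return dict(total)
```

For two samples of weight 0.5 spaced 1 apart, the eight per-sample texel references collapse to six unique texels, the two shared texels carry doubled weight, and the total weight still sums to 1 — the lossless property.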
It can be seen that although the conventional method only loops over the sampling points S, while the method provided by the present application loops over both the sampling points S and the texture elements T and thus appears to involve more loop iterations, the number of filtering calculations that actually affect AF performance is reduced.
The efficiency with which the anisotropic filtering method provided by the application improves AF performance is analyzed below.
The anisotropic filtering method provided by the application can be suitable for all AF cases. And in the calculation process, all needed information is reserved, so that the method is a lossless optimization method. The efficiency of the optimization will vary from case to case, and the best and worst cases of the method will be mainly described below.
The two factors influencing the optimization efficiency are: ① the number n of sampling points S into which the pixel point P is split, and ② the included angle θ between the line connecting all the sampling points S and the horizontal direction. It will be appreciated that the larger the number n of sampling points S, the larger the number of texture elements T, and the smaller n, the fewer texture elements T; thus n does not affect the monotonicity of the texture-element-count function. For simplicity of calculation, n is set to 16, and the influence of the included angle θ on the number of texture elements T is mainly described. Since the application mainly reduces the number of texture elements T entering filtering calculation, the optimization efficiency is expressed as the ratio of the reduction in texture element count to 4×n (the number of texture elements T that would be filtered with S as granularity).
Referring to FIG. 7, the number of texture elements T that need to be calculated is equivalent to the area of the enclosed parallelogram. The height of the parallelogram is (n+1)·sin θ, and the length of the base varies with θ: the closer θ is to 0°, the longer the base (approaching n+1), and the closer θ is to 90°, the closer the base is to 2. Because of the symmetry of this variation, only θ ∈ [45°, 90°] is considered here. It is therefore first necessary to find the relationship between the base and the angle θ. As can be seen from FIG. 7, the length of the base equals the average number of texture elements T occupied by each row, so this average can be obtained as an expected value. Since the distance between adjacent sampling points S is 1, adjacent sampling points S are separated by cos θ horizontally and sin θ vertically.
Referring to fig. 8, if the horizontal projection of the first of two adjacent sampling points S falls within the interval of length 1−cos θ, the second sampling point S still falls within the dotted rectangle and no new texture element T is used in the horizontal direction; if the first sampling point S falls within the interval of length cos θ, the second sampling point S uses a new texture element T, in which case the row contains three texture elements T. The expected length of the base is therefore 2×(1−cos θ)+3×cos θ, and the number of texture elements T used is (2×(1−cos θ)+3×cos θ)×(n+1)sin θ. Analyzing the derivative of this function shows that the maximum lies at about 68° and the minima at about 45° and 90°.
Substituting these angles into the calculation gives a worst case of approximately 37.43 texels T and a best case of approximately 34 texels T. Compared with the 64 texels T calculated at the granularity of the sampling points S, the improvement is therefore about 41.5% and 46.875%, respectively. Further, since the formula contains the factor n+1, it can be seen that the larger n is, the smaller the influence of the added 1 on the formula, meaning that the performance improvement becomes more pronounced as the number n of sampling points S grows.
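The extremum figures above can be checked numerically. The following sketch (an illustration only, not part of the patent; the function and n = 16 come from the formulas above) evaluates the texel-count function at the angles discussed:

```python
import math

def texel_count(theta_deg: float, n: int = 16) -> float:
    """Expected texel count: base (2*(1 - cos t) + 3*cos t) times height (n + 1)*sin t."""
    t = math.radians(theta_deg)
    base = 2 * (1 - math.cos(t)) + 3 * math.cos(t)  # simplifies to 2 + cos(t)
    return base * (n + 1) * math.sin(t)

for theta in (68, 90):
    print(theta, round(texel_count(theta), 2))  # 68 -> 37.43 (worst case), 90 -> 34.0
```

The interior maximum near 68° and the value 34 at 90° match the worst- and best-case figures quoted in the text.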
The performance improvement is described below taking θ = 90° and a non-specific angle as examples.
For θ = 90°:
When anisotropic filtering generates a plurality of sampling points S, the distance between two adjacent sampling points S is 1, and each sampling point S is filtered in the bilinear manner described above, that is, the four texture elements T around the sampling point S are combined according to their corresponding weights to compute new color information for the value of the pixel point P. Taking AF4x as an example, as shown in fig. 9 (a), P is the position on the texture map corresponding to a pixel, and this pixel point P is split into four sampling points S, i.e., S0, S1, S2 and S3. Sampling point S0 uses the four nearest surrounding texture elements T for filtering, namely texture elements T0, T1, T2 and T3; sampling point S1 uses texture elements T2, T3, T4 and T5; sampling point S2 uses texture elements T4, T5, T6 and T7; and sampling point S3 uses texture elements T6, T7, T8 and T9. It can be seen that texels T2 to T7 are each used twice. Therefore, the present application calculates the texture elements T0 to T3 of sampling point S0 together with the texture elements T2 and T3 of sampling point S1, and calculates the texture elements T4 and T5 of sampling point S1, the texture elements T4 to T7 of sampling point S2 and the texture elements T6 and T7 of sampling point S3 together. In this way, repeatedly feeding the same texture elements through the multipliers is avoided, thereby improving performance.
Specifically, according to the method provided by the application, the coordinates and weights of the corresponding four texture elements T are calculated from the coordinates and weight of each sampling point S. Namely, the weights of the texture elements T0 to T3 corresponding to sampling point S0 are Weight_S0_T0, Weight_S0_T1, Weight_S0_T2 and Weight_S0_T3; the weights of the texture elements T2 to T5 corresponding to sampling point S1 are Weight_S1_T2, Weight_S1_T3, Weight_S1_T4 and Weight_S1_T5; the weights of the texture elements T4 to T7 corresponding to sampling point S2 are Weight_S2_T4, Weight_S2_T5, Weight_S2_T6 and Weight_S2_T7; and the weights of the texture elements T6 to T9 corresponding to sampling point S3 are Weight_S3_T6, Weight_S3_T7, Weight_S3_T8 and Weight_S3_T9.
Then, the texture elements T corresponding to all the sampling points S are merged according to equal coordinates, the weights of texture elements T with the same coordinates are accumulated, and the resulting weight is used as the total weight of that texture element T. Namely, the weight of texture element T0 is Weight_S0_T0, the weight of texture element T1 is Weight_S0_T1, the weight of texture element T2 is (Weight_S0_T2+Weight_S1_T2), the weight of texture element T3 is (Weight_S0_T3+Weight_S1_T3), the weight of texture element T4 is (Weight_S1_T4+Weight_S2_T4), the weight of texture element T5 is (Weight_S1_T5+Weight_S2_T5), the weight of texture element T6 is (Weight_S2_T6+Weight_S3_T6), the weight of texture element T7 is (Weight_S2_T7+Weight_S3_T7), the weight of texture element T8 is Weight_S3_T8, and the weight of texture element T9 is Weight_S3_T9.
The coordinate and weight information of all texture elements T to be used is thus obtained. The filtering then loops with 4 texels T as granularity, bilinear filtering calculation is performed according to the obtained total weights and color information, and all the filtering calculation results of the split pixel point P are accumulated to obtain the final result of the pixel point P. This flow can be expressed by the following calculation formula: P = S0*Weight_S0 + S1*Weight_S1 + S2*Weight_S2 + S3*Weight_S3 = T0*Weight_S0_T0 + T1*Weight_S0_T1 + T2*Weight_S0_T2 + T3*Weight_S0_T3 + T2*Weight_S1_T2 + T3*Weight_S1_T3 + T4*Weight_S1_T4 + T5*Weight_S1_T5 + T4*Weight_S2_T4 + T5*Weight_S2_T5 + T6*Weight_S2_T6 + T7*Weight_S2_T7 + T6*Weight_S3_T6 + T7*Weight_S3_T7 + T8*Weight_S3_T8 + T9*Weight_S3_T9 = T0*Weight_S0_T0 + T1*Weight_S0_T1 + T2*(Weight_S0_T2+Weight_S1_T2) + T3*(Weight_S0_T3+Weight_S1_T3) + T4*(Weight_S1_T4+Weight_S2_T4) + T5*(Weight_S1_T5+Weight_S2_T5) + T6*(Weight_S2_T6+Weight_S3_T6) + T7*(Weight_S2_T7+Weight_S3_T7) + T8*Weight_S3_T8 + T9*Weight_S3_T9.
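The merge-and-accumulate flow above can be sketched as follows. This is a minimal software illustration, not the patent's hardware implementation; the texel indices follow fig. 9 (a), while the uniform weights (0.25 bilinear weight × 0.25 per-sample weight) and the constant-color texture are purely hypothetical values chosen to make the lossless property visible:

```python
from collections import defaultdict

def merge_and_filter(samples, colors):
    """samples: one dict per sampling point S, mapping texel index -> weight.
    colors: texel index -> color value.
    Accumulates weights of texels shared between sampling points, then does
    one weighted sum over the merged texel set (one multiply per unique texel
    instead of one per (sample, texel) pair)."""
    total = defaultdict(float)
    for weights in samples:                 # merge texels with equal coordinates
        for texel, w in weights.items():
            total[texel] += w
    return sum(colors[texel] * w for texel, w in total.items())

# AF4x layout from fig. 9(a): S0 uses T0..T3, S1 uses T2..T5, and so on.
w = 0.0625                                  # hypothetical: 0.25 bilinear * 0.25 sample
samples = [
    {0: w, 1: w, 2: w, 3: w},               # S0
    {2: w, 3: w, 4: w, 5: w},               # S1
    {4: w, 5: w, 6: w, 7: w},               # S2
    {6: w, 7: w, 8: w, 9: w},               # S3
]
colors = {t: 1.0 for t in range(10)}        # constant-color texture
print(merge_and_filter(samples, colors))    # 1.0: the merged result is lossless
```

Only 10 unique texels survive the merge, matching the 10 multiplications (2.5 bilinear-filter equivalents) counted in the text.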
According to the above calculation, the result obtained after the optimization is unchanged, so the optimization loses no accuracy. Originally 4 filtering passes (16 multiplications) were needed; after optimization only the equivalent of 2.5 filtering passes (10 multiplications) is needed, improving performance by about 37.5%. The more sampling points the AF generates, the more significant the performance improvement; at AF16x, performance can be improved by up to about 46.875%.
Referring to fig. 9 (b), for the example in which θ is a non-specific angle and the mode is AF8x, the coordinates and weights of the four texels T corresponding to each sampling point S are similarly calculated by looping with the sampling point S as granularity. Namely, the weights of the texture elements T0, T1, T2 and T3 corresponding to sampling point S0 are Weight_S0_T0, Weight_S0_T1, Weight_S0_T2 and Weight_S0_T3; the weights of the texture elements T2, T3, T5 and T6 corresponding to sampling point S1 are Weight_S1_T2, Weight_S1_T3, Weight_S1_T5 and Weight_S1_T6; the weights of the texture elements T4, T5, T7 and T8 corresponding to sampling point S2 are Weight_S2_T4, Weight_S2_T5, Weight_S2_T7 and Weight_S2_T8; the weights of the texture elements T7, T8, T10 and T11 corresponding to sampling point S3 are Weight_S3_T7, Weight_S3_T8, Weight_S3_T10 and Weight_S3_T11; the weights of the texture elements T9, T10, T13 and T14 corresponding to sampling point S4 are Weight_S4_T9, Weight_S4_T10, Weight_S4_T13 and Weight_S4_T14; the weights of the texture elements T13, T14, T16 and T17 corresponding to sampling point S5 are Weight_S5_T13, Weight_S5_T14, Weight_S5_T16 and Weight_S5_T17; the weights of the texture elements T12, T13, T15 and T16 corresponding to sampling point S6 are Weight_S6_T12, Weight_S6_T13, Weight_S6_T15 and Weight_S6_T16; and the weights of the texture elements T15, T16, T18 and T19 corresponding to sampling point S7 are Weight_S7_T15, Weight_S7_T16, Weight_S7_T18 and Weight_S7_T19.
The texture elements T corresponding to all the sampling points S are then merged according to equal coordinates, the weights of texture elements T with the same coordinates are accumulated, and the resulting weight is used as the total weight of that texture element T. That is, the weight of texel T0 is Weight_S0_T0; the weight of texel T1 is Weight_S0_T1; the weight of texel T2 is (Weight_S0_T2+Weight_S1_T2); the weight of texel T3 is (Weight_S0_T3+Weight_S1_T3); the weight of texel T4 is Weight_S2_T4; the weight of texel T5 is (Weight_S1_T5+Weight_S2_T5); the weight of texel T6 is Weight_S1_T6; the weight of texel T7 is (Weight_S2_T7+Weight_S3_T7); the weight of texel T8 is (Weight_S2_T8+Weight_S3_T8); the weight of texel T9 is Weight_S4_T9; the weight of texel T10 is (Weight_S3_T10+Weight_S4_T10); the weight of texel T11 is Weight_S3_T11; the weight of texel T12 is Weight_S6_T12; the weight of texel T13 is (Weight_S4_T13+Weight_S5_T13+Weight_S6_T13); the weight of texel T14 is (Weight_S4_T14+Weight_S5_T14); the weight of texel T15 is (Weight_S6_T15+Weight_S7_T15); the weight of texel T16 is (Weight_S5_T16+Weight_S6_T16+Weight_S7_T16); the weight of texel T17 is Weight_S5_T17; the weight of texel T18 is Weight_S7_T18; and the weight of texel T19 is Weight_S7_T19.
Then, looping with 4 texels T as granularity, bilinear filtering calculation is performed according to the obtained total weights and color information, and all the filtering calculation results split from the pixel point P are accumulated to obtain the final result of the pixel point P. The above procedure can be expressed by the following calculation formula: P = T0*Weight_S0_T0 + T1*Weight_S0_T1 + T2*(Weight_S0_T2+Weight_S1_T2) + T3*(Weight_S0_T3+Weight_S1_T3) + T4*Weight_S2_T4 + T5*(Weight_S1_T5+Weight_S2_T5) + T6*Weight_S1_T6 + T7*(Weight_S2_T7+Weight_S3_T7) + T8*(Weight_S2_T8+Weight_S3_T8) + T9*Weight_S4_T9 + T10*(Weight_S3_T10+Weight_S4_T10) + T11*Weight_S3_T11 + T12*Weight_S6_T12 + T13*(Weight_S4_T13+Weight_S5_T13+Weight_S6_T13) + T14*(Weight_S4_T14+Weight_S5_T14) + T15*(Weight_S6_T15+Weight_S7_T15) + T16*(Weight_S5_T16+Weight_S6_T16+Weight_S7_T16) + T17*Weight_S5_T17 + T18*Weight_S7_T18 + T19*Weight_S7_T19.
The above flow shows that the original method, with S as granularity, requires 8 bilinear filter operations (32 multiplications), whereas the application, looping over 4 texels T, requires the equivalent of only 5 bilinear filter operations, i.e., 20 multiplications. Performance is improved by about 37.5%.
Therefore, the application improves performance by selecting and merging texture elements on the basis of the original AF calculation method. The main idea is to avoid repeated operations on the same texture elements, thereby dynamically reducing the work per sampling point and optimizing anisotropic filtering performance losslessly. The main advantages in the processing flow and the technical effects are as follows:
① The application is a lossless optimization method, which ensures that the information of all required texture elements T is incorporated into the final result;
② The more sampling points S the pixel is split into, the more obvious the performance improvement, which can approach 50%;
③ The original method performs bilinear calculation with S as granularity; the application modifies this to calculate with 4 texels T as granularity. In this process, the coordinate and weight calculation of the texture elements T in the original method only needs to be moved ahead of the 4T loop.
It should be understood that, although the steps in the flowcharts of the embodiments described above are shown sequentially as indicated by arrows, these steps are not necessarily performed in the order indicated. Unless explicitly stated herein, the steps are not strictly limited to that order and may be executed in other orders. Moreover, at least some of the steps in those flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which need not be executed sequentially but may be performed in turn or alternately with at least some of the other steps or stages.
Based on the same inventive concept, an embodiment of the application further provides an anisotropic filtering device for implementing the above-mentioned anisotropic filtering method. The implementation of the solution provided by the device is similar to that described for the method above, so for specific limitations in one or more embodiments of the anisotropic filtering device provided below, reference may be made to the limitations of the anisotropic filtering method above, which are not repeated here.
In one embodiment, as shown in FIG. 10, an anisotropic filtering device 1100 is provided, the device comprising:
The sampling point determining module 1001 is configured to determine a sampling point sequence corresponding to a pixel point on a texture map, where a distance between adjacent sampling points in the sampling point sequence is equal to a side length of a texture element in the texture map;
A texel determination module 1002, configured to determine a respective set of texels on the texture map for each sampling point in the sequence of sampling points;
A weight obtaining module 1003, configured to obtain, for each sampling point in the sampling point sequence, a weight of each texture element in the corresponding texture element group;
A weight accumulating module 1004, configured to accumulate weights corresponding to the same texture element based on the weight of each texture element in each texture element group, to obtain a total weight of each texture element;
A filtering calculation module 1005, configured to perform filtering calculation based on the total weight of each texture element and the color information of each texture element, to obtain a filtering calculation result corresponding to each texture element;
And a result processing module 1006, configured to obtain color information of the pixel point according to the filtering calculation result corresponding to each texture element.
In one embodiment, the weight accumulation module 1004 is configured to determine, based on the coordinates of each texture element in each texture element group, the texture elements with the same coordinates as the same texture element, and accumulate the weights corresponding to the same texture element to obtain the total weight of each texture element.
In one embodiment, the weight obtaining module 1003 is configured to obtain, for each sampling point in the sampling point sequence, a weight and coordinates of each texture element in the corresponding texture element group.
In one embodiment, the weight obtaining module 1003 is configured to calculate, for each sampling point in the sampling point sequence, a weight and a coordinate of each texture element in the corresponding texture element group according to the coordinate and the weight of the sampling point.
In one embodiment, the filtering calculation module 1005 is configured to calculate, with a plurality of texture elements as a cycle granularity of the filtering calculation, a filtering calculation result corresponding to each texture element according to the total weight and the color information of the corresponding texture element.
In one embodiment, the filtering calculation module 1005 is configured to multiply, for each texture element, the corresponding total weight and the color information to obtain a respective filtering calculation result for each texture element.
In one embodiment, the result processing module 1006 is configured to accumulate the filtering calculation results corresponding to each of the texture elements to obtain color information of the pixel point.
In one embodiment, the texel determining module 1002 is configured to determine, for each sampling point, a plurality of texels on the texture map that are closest to the sampling point according to coordinates of the sampling point on the texture map, and obtain the corresponding texel group according to the plurality of texels.
In one embodiment, the filtering calculation comprises bilinear filtering.
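For reference, the bilinear weights used for the four nearest texels of a sampling point can be derived from its fractional texture-space coordinates. The sketch below is a generic illustration under a common texel-center convention (the −0.5 offset), which is an assumption and not stated in the patent:

```python
import math

def bilinear_weights(u, v):
    """Return the four nearest texel indices and their bilinear weights for a
    sampling point at texture-space position (u, v). Assumes texel centers at
    integer-plus-half coordinates (a common convention, assumed here)."""
    x0, y0 = math.floor(u - 0.5), math.floor(v - 0.5)   # top-left texel index
    fx, fy = (u - 0.5) - x0, (v - 0.5) - y0             # fractional offsets
    return {
        (x0,     y0):     (1 - fx) * (1 - fy),
        (x0 + 1, y0):     fx * (1 - fy),
        (x0,     y0 + 1): (1 - fx) * fy,
        (x0 + 1, y0 + 1): fx * fy,
    }

w = bilinear_weights(2.75, 3.25)
assert abs(sum(w.values()) - 1.0) < 1e-12   # bilinear weights always sum to 1
```

These per-sample weights are exactly what the weight accumulation module sums when two sampling points share a texel.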
The various modules in the anisotropic filtering device described above may be implemented in whole or in part by software, hardware, or combinations thereof. The above modules may be embedded in hardware in, or independent of, a processor in the computer device, or may be stored as software in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to the above modules.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may comprise the steps of the method embodiments described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take various forms such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processor referred to in the embodiments provided in the present application may be a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, a data processing logic unit based on quantum computing, or the like, but is not limited thereto.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The foregoing examples illustrate only a few embodiments of the application and are described in detail without thereby limiting the scope of the application. It should be noted that those skilled in the art may make several variations and modifications without departing from the spirit of the application, all of which fall within the scope of protection of the application. Accordingly, the scope of protection of the application should be determined by the appended claims.

Claims (10)

1. An anisotropic filtering method, characterized in that the method comprises:
determining a sampling point sequence corresponding to a pixel point on a texture map, wherein the spacing between adjacent sampling points in the sampling point sequence is equal to the side length of a texture element in the texture map;
determining a texture element group corresponding on the texture map to each sampling point in the sampling point sequence, comprising: for each sampling point, determining, according to the coordinates of the sampling point on the texture map, a number of texture elements on the texture map that are closest to the sampling point, and obtaining the corresponding texture element group from those texture elements;
for each sampling point in the sampling point sequence, obtaining the weight of each texture element in the corresponding texture element group;
based on the weight of each texture element in each texture element group, determining texture elements with the same coordinates to be the same texture element, and accumulating the weights corresponding to the same texture element to obtain the total weight of each texture element;
performing filtering calculation based on the total weight of each texture element and the color information of each texture element to obtain a filtering calculation result corresponding to each texture element, comprising: taking a number of texture elements as the loop granularity of the filtering calculation and, for each texture element, multiplying the corresponding total weight by the color information to obtain the filtering calculation result corresponding to that texture element; and
obtaining the color information of the pixel point according to the filtering calculation results corresponding to the texture elements, comprising: accumulating the filtering calculation results corresponding to the texture elements to obtain the color information of the pixel point.
2. The method according to claim 1, characterized in that obtaining, for each sampling point in the sampling point sequence, the weight of each texture element in the corresponding texture element group comprises:
for each sampling point in the sampling point sequence, obtaining the weight and coordinates of each texture element in the corresponding texture element group.
3. The method according to claim 2, characterized in that obtaining, for each sampling point in the sampling point sequence, the weight and coordinates of each texture element in the corresponding texture element group comprises:
for each sampling point in the sampling point sequence, calculating the weight and coordinates of each texture element in the corresponding texture element group according to the coordinates and weight of the sampling point.
4. The method according to any one of claims 1 to 3, characterized in that the filtering calculation comprises bilinear filtering.
5. The method according to claim 1, characterized in that the spacing between adjacent sampling points in the sampling point sequence and the side length of a texture element in the texture map are both 1.
6. An anisotropic filtering device, characterized in that the device comprises:
a sampling point determination module, configured to determine a sampling point sequence corresponding to a pixel point on a texture map, wherein the spacing between adjacent sampling points in the sampling point sequence is equal to the side length of a texture element in the texture map;
a texture element determination module, configured to determine a texture element group corresponding on the texture map to each sampling point in the sampling point sequence, by determining, for each sampling point and according to the coordinates of the sampling point on the texture map, a number of texture elements on the texture map that are closest to the sampling point and obtaining the corresponding texture element group from those texture elements;
a weight acquisition module, configured to obtain, for each sampling point in the sampling point sequence, the weight of each texture element in the corresponding texture element group;
a weight accumulation module, configured to determine, based on the weight of each texture element in each texture element group, texture elements with the same coordinates to be the same texture element, and to accumulate the weights corresponding to the same texture element to obtain the total weight of each texture element;
a filtering calculation module, configured to perform filtering calculation based on the total weight of each texture element and the color information of each texture element to obtain a filtering calculation result corresponding to each texture element, taking a number of texture elements as the loop granularity of the filtering calculation and, for each texture element, multiplying the corresponding total weight by the color information to obtain the filtering calculation result corresponding to that texture element; and
a result processing module, configured to obtain the color information of the pixel point according to the filtering calculation results corresponding to the texture elements, by accumulating the filtering calculation results corresponding to the texture elements to obtain the color information of the pixel point.
7. The device according to claim 6, characterized in that the weight acquisition module is configured to obtain, for each sampling point in the sampling point sequence, the weight and coordinates of each texture element in the corresponding texture element group.
8. The device according to claim 7, characterized in that the weight acquisition module is configured to calculate, for each sampling point in the sampling point sequence, the weight and coordinates of each texture element in the corresponding texture element group according to the coordinates and weight of the sampling point.
9. The device according to any one of claims 6 to 8, characterized in that the filtering calculation comprises bilinear filtering.
10. The device according to claim 6, characterized in that the spacing between adjacent sampling points in the sampling point sequence and the side length of a texture element in the texture map are both 1.
CN202210832108.1A 2022-07-15 2022-07-15 Anisotropic filtering method and device Active CN117474798B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210832108.1A CN117474798B (en) 2022-07-15 2022-07-15 Anisotropic filtering method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210832108.1A CN117474798B (en) 2022-07-15 2022-07-15 Anisotropic filtering method and device

Publications (2)

Publication Number Publication Date
CN117474798A CN117474798A (en) 2024-01-30
CN117474798B true CN117474798B (en) 2025-01-21

Family

ID=89638383

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210832108.1A Active CN117474798B (en) 2022-07-15 2022-07-15 Anisotropic filtering method and device

Country Status (1)

Country Link
CN (1) CN117474798B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1845178A (en) * 2005-04-08 2006-10-11 株式会社东芝 Image plotting method and image plotting equipment using omnidirectional different pattern mapping
CN101079949A (en) * 2006-02-07 2007-11-28 索尼株式会社 Image processing apparatus and method, recording medium, and program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7221371B2 (en) * 2004-03-30 2007-05-22 Nvidia Corporation Shorter footprints for anisotropic texture filtering
CN102034262A (en) * 2009-09-27 2011-04-27 比亚迪股份有限公司 Texture filtering method and device based on anisotropy
GB2609245B (en) * 2021-07-26 2024-09-11 Imagination Tech Ltd Anisotropic texture filtering
CN114187403A (en) * 2021-11-10 2022-03-15 杭州易现先进科技有限公司 Method, system, electronic device and storage medium for repetitive texture filtering


Also Published As

Publication number Publication date
CN117474798A (en) 2024-01-30

Similar Documents

Publication Publication Date Title
US11436483B2 (en) Neural network engine with tile-based execution
WO2018040463A1 (en) Data compression and decompression methods for demura table, and mura compensation method
KR100594073B1 (en) Digital Image Scaling Method for Embedded System
KR101522985B1 (en) Apparatus and Method for Image Processing
WO2022141178A1 (en) Image processing method and apparatus
CN107257452B (en) A kind of image processing method, device and calculate equipment
US20190080441A1 (en) Image processing method and device
US10402196B2 (en) Multi-dimensional sliding window operation for a vector processor, including dividing a filter into a plurality of patterns for selecting data elements from a plurality of input registers and performing calculations in parallel using groups of the data elements and coefficients
CN109300083B (en) Wallis color homogenizing method and device through block processing
KR102494565B1 (en) Method for optimizing hardware structure of convolutional neural networks
CN117474798B (en) Anisotropic filtering method and device
CN112823378A (en) Image depth information determination method, device, equipment and storage medium
US8903190B2 (en) Median filtering method and apparatus
CN109410136B (en) Color homogenizing method and processing device based on shortest transmission path
CN112446951B (en) Three-dimensional reconstruction method, three-dimensional reconstruction device, electronic equipment and computer storage medium
CN117612470B (en) Color lookup table generating method and color correcting method
CN114219710B (en) Super-resolution image reconstruction method, device, electronic device and storage medium
CN112132914A (en) Image scale space establishing method and image processing chip
US9781357B2 (en) Method and an apparatus for generating an approximate nearest neighbor field (ANNF) for images and video sequences
US10277912B2 (en) Methods and apparatus for storing data related to video decoding
JP6116271B2 (en) Feature amount calculation apparatus, method, and program
WO2022027818A1 (en) Data batch processing method and batch processing apparatus thereof, and storage medium
WO2022024165A1 (en) Information processing device, information processing method, and recording medium
CN117710488B (en) Camera internal parameter calibration method, device, computer equipment and storage medium
CN115393405B (en) Image alignment method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: 200135, 11th Floor, Building 3, No. 889 Bibo Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant after: Granfei Intelligent Technology Co.,Ltd.

Address before: 200135 Room 201, No. 2557, Jinke Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Applicant before: Gryfield Intelligent Technology Co.,Ltd.

Country or region before: China

GR01 Patent grant