
CN113538549A - Method and system for preserving image texture during image processing - Google Patents

Method and system for preserving image texture during image processing

Info

Publication number
CN113538549A
CN113538549A (application CN202111013113.1A)
Authority
CN
China
Prior art keywords
image
normal map
normal
processed
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111013113.1A
Other languages
Chinese (zh)
Other versions
CN113538549B (en)
Inventor
林青山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Guangzhuiyuan Information Technology Co ltd
Original Assignee
Guangzhou Guangzhuiyuan Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Guangzhuiyuan Information Technology Co ltd filed Critical Guangzhou Guangzhuiyuan Information Technology Co ltd
Priority to CN202111013113.1A priority Critical patent/CN113538549B/en
Publication of CN113538549A publication Critical patent/CN113538549A/en
Application granted granted Critical
Publication of CN113538549B publication Critical patent/CN113538549B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/529 Depth or shape recovery from texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Generation (AREA)

Abstract

The application relates to a method and a system for preserving image texture when processing an image. The method comprises: confirming a region to be eliminated in an image to be processed and smearing it to obtain a smeared mask image; filling the smeared region; computing a first normal map of the image to be processed; downsampling the image to be processed by a preset scaling factor and computing a second normal map of the downsampled image; processing the first and second normal maps and merging them into a third normal map; performing texture recovery on the mask image based on the third normal map; and replacing the texture-recovered mask image into the region to be eliminated. The method and the device not only eliminate the pattern in the region to be eliminated but also preserve the image texture of that region.

Description

Method and system for preserving image texture during image processing
Technical Field
The present application relates to the field of image processing technologies, and in particular to a method and a system for preserving image texture when processing an image.
Background
In some design fields, a user may need to smear out patterns on objects in an image and replace them with user-defined patterns, for example replacing a LOGO on clothes in the image. In the prior art, such image processing generally smears the pattern on the object surface directly and then repairs the smeared image region, but the repaired region often loses its texture information; for example, after a pattern on clothes is removed, the wrinkle information of the clothes is lost as well.
Disclosure of Invention
To overcome, at least to some extent, the problem in the related art that texture information of the repaired region is lost when a smeared image region is repaired, the present application provides a method and a system for preserving image texture when processing an image.
The scheme of the application is as follows:
according to a first aspect of embodiments of the present application, there is provided a method of preserving image texture when processing an image, comprising:
confirming an area to be eliminated in an image to be processed, and smearing the area to be eliminated to obtain a smeared mask image;
filling the smeared area to be eliminated;
calculating to obtain a first normal map of the image to be processed;
based on a preset scaling coefficient, down-sampling the image to be processed, and calculating to obtain a second normal map of the down-sampled image to be processed;
processing the first normal map and the second normal map;
merging the processed first normal map and the second normal map into a third normal map;
performing texture restoration on the mask map based on the third normal map;
and replacing the texture-recovered mask image into the area to be eliminated.
Preferably, in an implementable manner of the present application, the filling of the smeared area to be eliminated includes:
selecting a pixel area adjacent to the area to be eliminated and using it to pixel-fill the area to be eliminated.
Preferably, in an implementation manner of the present application, the calculating to obtain the first normal map of the image to be processed includes:
establishing a first coordinate system, and constructing a first normal vector in the first coordinate system;
taking the gradient of the image to be processed in the horizontal direction as a component vector of the first normal vector on the x axis of the first coordinate system;
taking the gradient of the image to be processed in the vertical direction as a component vector of the first normal vector on the y axis of the first coordinate system;
taking a first preset constant as a component vector of the first normal vector on the z axis of the first coordinate system;
normalizing the first normal vector;
processing the normalized first normal vector;
and generating the first normal map according to the processed first normal vector.
Preferably, in an implementation manner of the present application, the calculating to obtain the second normal map of the downsampled image to be processed includes:
establishing a second coordinate system, and constructing a second normal vector in the second coordinate system;
taking the gradient of the downsampled image to be processed in the horizontal direction as a component vector of the second normal vector on the x axis of the second coordinate system;
taking the gradient of the downsampled image to be processed in the vertical direction as a component vector of the second normal vector on the y axis of the second coordinate system;
taking a second preset constant as a component vector of the second normal vector on the z axis of the second coordinate system; wherein the second preset constant is greater than the first preset constant;
normalizing the second normal vector;
processing the normalized second normal vector;
and generating the second normal map according to the second normal vector obtained by processing.
Preferably, in an implementable manner of the present application, the processing of the normalized first normal vector includes: suppressing large values at the two extremes of the normalized first normal vector result;
the processing of the normalized second normal vector includes: suppressing large values at the two extremes of the normalized second normal vector result.
Preferably, in an implementation manner of the present application, the method further includes:
acquiring pixel values of all points of the image to be processed;
and calculating the gradient of each point of the image to be processed based on a gradient operator according to the pixel value of each point of the image to be processed.
Preferably, in an implementable manner of the present application, the processing the first normal map and the second normal map includes:
and performing Gaussian blur on the first normal map and the second normal map, and smoothing normal changes of the first normal map and the second normal map.
Preferably, in an implementable manner of the present application, the merging the processed first normal map and the second normal map into the third normal map includes:
and mixing the processed first normal map and the second normal map in a soft light manner to obtain the third normal map.
Preferably, in an implementable manner of the present application, the texture restoration of the mask map based on the third normal map includes:
and calculating a world-light illumination result of the third normal map, and recovering the shading and gradient changes of the object-surface wrinkles in the mask map according to the world-light illumination result.
According to a second aspect of embodiments of the present application, there is provided a system for preserving texture of an image when processing the image, comprising:
the elimination module is used for confirming an area to be eliminated in the image to be processed, smearing the area to be eliminated and obtaining a smeared mask image;
the filling module is used for filling the coated area to be eliminated;
the first normal map generating module is used for calculating to obtain a first normal map of the image to be processed;
the second normal map generating module is used for down-sampling the image to be processed based on a preset scaling coefficient, and calculating to obtain a second normal map of the down-sampled image to be processed;
the normal map processing module is used for processing the first normal map and the second normal map;
the normal map merging module is used for merging the processed first normal map and the second normal map into a third normal map;
the texture recovery module is used for performing texture recovery on the mask map based on the third normal map;
and the replacing module is used for replacing the mask image after the texture recovery to the area to be eliminated.
The technical solution provided by the application can have the following beneficial effects. The method for preserving image texture when processing an image comprises: confirming a region to be eliminated in an image to be processed and smearing it to obtain a smeared mask image; filling the smeared region; computing a first normal map of the image to be processed; and downsampling the image by a preset scaling factor and computing a second normal map of the downsampled image. The first normal map is computed at the full resolution of the image to be processed and therefore captures more of its detail, while the second normal map, computed on the downsampled image, captures more of its structural information. The first and second normal maps are processed and merged into a third normal map, texture recovery is performed on the mask image based on the third normal map, and the texture-recovered mask image replaces the region to be eliminated. The pattern in the region to be eliminated is thus removed while the image texture of that region is preserved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1 is a flow chart illustrating a method for preserving texture of an image when processing the image according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart illustrating a first normal map of an image to be processed calculated in a method for retaining texture of the image when the image is processed according to another embodiment of the present application;
FIG. 3 is a schematic diagram of an image to be processed according to an embodiment of the present application;
FIG. 4 is a schematic illustration of a mask after painting as provided by one embodiment of the present application;
FIG. 5 is a schematic diagram illustrating a filled area to be eliminated after smearing an image to be processed according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the third normal map provided by one embodiment of the present application;
FIG. 7 is a schematic illustration of a resulting processed image provided by an embodiment of the present application;
fig. 8 is a schematic structural diagram of a system for preserving texture of an image when processing the image according to an embodiment of the present application.
Reference numerals: elimination module-21; filling module-22; first normal map generation module-23; second normal map generation module-24; normal map processing module-25; normal map merging module-26; texture restoration module-27; replacement module-28.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
A method of preserving texture of an image when processing the image, referring to fig. 1, comprising:
s11: confirming an area to be eliminated in the image to be processed, and smearing the area to be eliminated to obtain a smeared mask image;
Fig. 3 shows the image to be processed used in this embodiment and the following embodiments. The smeared mask is shown in fig. 4.
S12: filling the coated area to be eliminated;
specifically, the method comprises the following steps: and selecting the adjacent pixel area of the area to be eliminated for pixel filling of the area to be eliminated, so that the filled pixels have continuity. The processed image has more authenticity and better texture effect. The effect after filling is shown in fig. 5.
S13: calculating to obtain a first normal map of the image to be processed;
referring to fig. 2, specifically:
s131: establishing a first coordinate system, and constructing a first normal vector in the first coordinate system;
s132: taking the gradient of the image to be processed in the horizontal direction as a component vector of a first normal vector on the x axis of a first coordinate system;
s133: taking the gradient of the image to be processed in the vertical direction as a component vector of a first normal vector on the y axis of a first coordinate system;
s134: taking a first preset constant as a component vector of the first normal vector on the z axis of the first coordinate system;
s135: normalizing the first normal vector;
s136: processing the normalized first normal vector;
s137: and generating a first normal map according to the processed first normal vector.
The magnitude of the first preset constant determines the degree of relief of the first normal map. When the first preset constant is small, the component of the first normal vector on the z axis of the first coordinate system is small, so the normalized components in the x and y directions are large and the relief of the first normal map varies strongly, and vice versa.
The formula for normalizing the first normal vector is as follows:
n = (x, y, z) / length((x, y, z))
where n is the first normal vector, and x, y and z are its component on the x axis, its component on the y axis, and the first preset constant, respectively.
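The per-pixel construction and normalization above can be sketched as follows (the helper name and the sample gradient values are placeholders, not part of the patent):

```python
import math

def make_normal(dx, dy, z_const):
    """Build a normal vector from the horizontal gradient dx, the vertical gradient
    dy, and a preset z-axis constant, then normalize: n = (x, y, z) / length."""
    length = math.sqrt(dx * dx + dy * dy + z_const * z_const)
    return (dx / length, dy / length, z_const / length)

# A small z constant leaves larger x/y components after normalization (stronger
# relief, more detail); a larger constant flattens the map toward broad structure.
n_detail = make_normal(0.3, -0.2, 0.5)   # first map: smaller constant
n_struct = make_normal(0.3, -0.2, 2.0)   # second map: larger constant
```

Note how the same gradients produce a smaller x component in `n_struct`, matching the text's claim that a larger z constant reduces the concave-convex fluctuation.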
S14: based on a preset scaling coefficient, down-sampling the image to be processed, and calculating to obtain a second normal map of the down-sampled image to be processed;
specifically, the method comprises the following steps:
establishing a second coordinate system, and constructing a second normal vector in the second coordinate system;
taking the gradient of the down-sampled image to be processed in the horizontal direction as a component vector of a second normal vector on the x axis of a second coordinate system;
taking the gradient of the down-sampled image to be processed in the vertical direction as a component vector of a second normal vector on the y axis of a second coordinate system;
taking a second preset constant as a component vector of the second normal vector on the z axis of the second coordinate system; wherein the second preset constant is larger than the first preset constant;
normalizing the second normal vector;
processing the normalized second normal vector;
and generating a second normal map according to the second normal vector obtained by processing.
The process of generating the second normal map is substantially identical to the process of generating the first normal map.
The difference is that the image to be processed needs to be downsampled based on a preset scaling factor before the second normal map is generated.
Preferably, the preset scaling factor is 4.
The down-sampling is a process of image reduction, and the down-sampling of the image to be processed is to reduce the image to be processed.
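A minimal sketch of the factor-4 reduction, assuming block averaging (the patent only says the image is reduced; averaging is an assumed choice):

```python
def downsample(img, factor=4):
    """Reduce a 2-D grayscale image (list of rows) by averaging each
    factor x factor block into one output pixel."""
    h, w = len(img), len(img[0])
    out = []
    for by in range(h // factor):
        row = []
        for bx in range(w // factor):
            block = [img[by * factor + y][bx * factor + x]
                     for y in range(factor) for x in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# An 8x8 ramp image reduces to 2x2 with the preset scaling factor of 4.
small = downsample([[float(x + y) for x in range(8)] for y in range(8)], factor=4)
```

In practice an image library's resize (with area interpolation) would be used; the point is only that the second normal map sees a reduced image.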
In this embodiment, the second preset constant is larger than the first preset constant, and the second preset constant on the z-axis is set to be larger, so that the structural information of the image is more conveniently obtained.
The first normal map is calculated at the full resolution of the image to be processed and therefore captures more of its detail. The second normal map captures more of the image's structural information; its main purpose is to recover gradient variations of the image over a larger range.
S15: processing the first normal map and the second normal map;
s16: merging the processed first normal map and the second normal map into a third normal map;
a third normal map is shown in fig. 6.
S17: Performing texture recovery on the mask image based on the third normal map;
s18: and replacing the mask image after the texture recovery to the area to be eliminated.
The resulting processed image is shown in fig. 7.
The method of this embodiment for preserving image texture when processing an image comprises: confirming a region to be eliminated in an image to be processed and smearing it to obtain a smeared mask image; filling the smeared region; computing a first normal map of the image to be processed; and downsampling the image by a preset scaling factor and computing a second normal map of the downsampled image. The first normal map, computed at full resolution, captures more detail of the image; the second normal map, computed on the downsampled image, captures more of its structural information. The two maps are processed and merged into a third normal map, texture recovery is performed on the mask image based on the third normal map, and the texture-recovered mask image replaces the region to be eliminated. The pattern in the region to be eliminated is removed while that region's image texture is preserved.
In some embodiments, the method for preserving texture of an image when processing an image, processing the normalized first normal vector, includes: suppressing a larger value at two ends of the normalized first normal vector result;
processing the normalized second normal vector, including: and suppressing the larger value at the two ends of the normalized second normal vector result.
Since image restoration needs to retain broad gradient information rather than abruptly changing edge gradients, this embodiment further suppresses the large values at the two extremes of the normal vectors computed in the above embodiment, thereby damping abrupt edge information and yielding suppressed normal vectors.
Specifically, the suppression is performed with reference to the following formula:
n' = max(-smoothstep(0.0, 0.35, abs(n)) + 1.0, 0.0) * n
where n is the normalized first or second normal vector, and n' is the corresponding suppressed normal vector.
The method of preserving image texture in some embodiments, further comprising:
acquiring pixel values of all points of an image to be processed;
and calculating the gradient of each point of the image to be processed based on a gradient operator according to the pixel value of each point of the image to be processed.
In this embodiment, the gradients of each point of the image to be processed may be calculated based on various gradient operators. The gradient operator may be, but is not limited to, a Sobel operator or a Prewitt operator.
In this embodiment, the calculation gradient is described by taking Sobel operator as an example:
the gradient at coordinate (u, v) is:
dx = 2*P(u-1,v) + P(u-1,v+1) + P(u-1,v-1) - 2*P(u+1,v) - P(u+1,v+1) - P(u+1,v-1)
dy = P(u-1,v-1) + 2*P(u,v-1) + P(u+1,v-1) - P(u-1,v+1) - 2*P(u,v+1) - P(u+1,v+1)
where dx and dy are the horizontal and vertical gradient components at point (u, v), and P(u, v) is the pixel value at point (u, v).
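A direct transcription of the dx/dy formulas, with the image P represented as a 2-D list indexed P[u][v] (an assumed convention chosen to match the P(u, v) notation):

```python
def sobel_gradient(P, u, v):
    """Sobel gradient at interior point (u, v), exactly mirroring the formulas
    above: dx is left-column minus right-column, dy is top-row minus bottom-row."""
    dx = (2 * P[u - 1][v] + P[u - 1][v + 1] + P[u - 1][v - 1]
          - 2 * P[u + 1][v] - P[u + 1][v + 1] - P[u + 1][v - 1])
    dy = (P[u - 1][v - 1] + 2 * P[u][v - 1] + P[u + 1][v - 1]
          - P[u - 1][v + 1] - 2 * P[u][v + 1] - P[u + 1][v + 1])
    return dx, dy
```

On a flat image both components are zero; on a ramp that increases along u, dx is negative (left minus right) and dy is zero, as the sign convention of the formulas dictates.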
In some embodiments, a method for preserving texture of an image when processing the image, processing a first normal map and a second normal map, comprises:
and performing Gaussian blurring on the first normal map and the second normal map, and smoothing normal changes of the first normal map and the second normal map.
Gaussian blur, also known as Gaussian smoothing, is a widely used effect in existing image processing software, commonly applied to reduce image noise and detail. In this embodiment, Gaussian blurring is performed on the first and second normal maps to smooth their normal variations.
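A minimal separable-Gaussian sketch (kernel radius, sigma, and edge replication are assumptions; production code would call an image library's blur). Because the kernel is separable, blurring a 2-D map is just this 1-D pass applied along rows and then columns of each normal component:

```python
import math

def gaussian_kernel(sigma, radius):
    """Sampled 1-D Gaussian, normalized so the weights sum to 1."""
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur_row(row, kernel):
    """Convolve one row with the kernel, replicating edge values at the borders."""
    r = len(kernel) // 2
    n = len(row)
    return [sum(kernel[j + r] * row[min(max(i + j, 0), n - 1)]
                for j in range(-r, r + 1)) for i in range(n)]
```

A constant row passes through unchanged, while a hard step is smoothed into a gradual transition, which is exactly the smoothing of normal variation the text describes.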
In some embodiments, the method for preserving texture of an image when processing the image, merging the processed first normal map and the second normal map into a third normal map includes:
and performing soft light mixing on the processed first normal map and the processed second normal map to obtain a third normal map.
In this embodiment, the smoothed first normal map and the smoothed second normal map are combined to obtain a final normal map. Specifically, the first normal map and the second normal map are merged based on soft light blending.
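The patent does not specify which soft-light formula is used; one common variant (the W3C/Photoshop-style definition), applied per channel to values in [0, 1], might look like:

```python
def soft_light(base, blend):
    """Per-channel soft-light blend (W3C compositing variant, an assumed choice).
    blend = 0.5 is neutral; blend > 0.5 lightens the base, blend < 0.5 darkens it."""
    if blend <= 0.5:
        return base - (1.0 - 2.0 * blend) * base * (1.0 - base)
    if base <= 0.25:
        d = ((16.0 * base - 12.0) * base + 4.0) * base
    else:
        d = base ** 0.5
    return base + (2.0 * blend - 1.0) * (d - base)
```

Merging the two normal maps this way lets the structural (second) map gently modulate the detail (first) map without overwriting it, since mid-gray blend values leave the base unchanged.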
In some embodiments of the method for preserving texture of an image when processing the image, the texture recovery of the mask map based on the third normal map comprises:
calculating a world-light illumination result of the third normal map, and recovering the shading and gradient changes of the object-surface wrinkles in the mask map according to that illumination result.
In this embodiment, the shading and gradient changes of the object-surface wrinkles in the mask map are recovered by computing the world-light illumination of the third normal map.
Specifically, the computation takes white light as the world light.
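A sketch of shading a single normal under the white world light (the diffuse Lambertian model and the overhead light direction are assumptions; the patent only states that white light is used as the reference):

```python
def lambert_shade(normal, light_dir=(0.0, 0.0, 1.0), light_color=(1.0, 1.0, 1.0)):
    """Diffuse (Lambertian) shading of one unit normal under a directional light.
    The default direction (straight along +z) and pure-white color are assumed;
    the result darkens where the normal tilts away from the light, which is what
    recovers the shading of surface wrinkles."""
    ndotl = max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)
    return tuple(c * ndotl for c in light_color)
```

Evaluating this over the third normal map yields a grayscale shading image whose gradients follow the wrinkles, which can then be combined with the filled mask region.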
A system for preserving texture of an image when processing the image, referring to fig. 8, comprising:
the elimination module 21 is configured to determine an area to be eliminated in the image to be processed, and smear the area to be eliminated to obtain a smeared mask image;
a filling module 22, configured to fill the area to be eliminated after the smearing;
the first normal map generating module 23 is configured to calculate a first normal map of the to-be-processed image;
the second normal map generating module 24 is configured to perform downsampling on the image to be processed based on the preset scaling factor, and calculate to obtain a second normal map of the downsampled image to be processed;
a normal map processing module 25, configured to process the first normal map and the second normal map;
a normal map merging module 26, configured to merge the processed first normal map and the second normal map into a third normal map;
a texture recovery module 27, configured to perform texture recovery on the mask map based on the third normal map;
and a replacing module 28, configured to replace the texture-restored mask map with the area to be eliminated.
Specifically, the normal map processing module 25 is configured to perform gaussian blurring on the first normal map and the second normal map, and smooth normal changes of the first normal map and the second normal map.
Specifically, the normal map merging module 26 is configured to perform soft light mixing on the processed first normal map and the processed second normal map to obtain a third normal map.
Specifically, the texture recovery module 27 is configured to calculate a world-light illumination result of the third normal map and to recover the shading and gradient changes of the object-surface wrinkles in the mask map according to that result.
In the system of this embodiment for preserving image texture when processing an image, the elimination module confirms the area to be eliminated in the image to be processed and smears it to obtain a smeared mask image. The filling module fills the smeared area. The first normal map generation module computes a first normal map of the image to be processed; the second normal map generation module downsamples the image based on a preset scaling factor and computes a second normal map of the downsampled image. The first normal map, computed at the full resolution of the image to be processed, captures more of its detail; the second normal map, computed on the downsampled image, captures more of its structural information. The normal map processing module processes the two maps, the normal map merging module merges them into a third normal map, the texture recovery module performs texture recovery on the mask map based on the third normal map, and finally the replacement module replaces the texture-recovered mask map into the area to be eliminated. This embodiment thus removes the pattern in the area to be eliminated while preserving that area's image texture.
The system for preserving texture of an image while processing the image in some embodiments, further comprising:
the gradient calculation module is used for acquiring pixel values of all points of the image to be processed; and calculating the gradient of each point of the image to be processed based on a gradient operator according to the pixel value of each point of the image to be processed.
It is understood that the same or similar parts in the above embodiments may be mutually referred to, and the same or similar parts in other embodiments may be referred to for the content which is not described in detail in some embodiments.
It should be noted that, in the description of the present application, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Further, in the description of the present application, the meaning of "a plurality" means at least two unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
In the description herein, reference to the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, such schematic references do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it should be understood that the above embodiments are exemplary and should not be construed as limiting the present application; those of ordinary skill in the art may make variations, modifications, substitutions, and alterations to the above embodiments within the scope of the present application.

Claims (10)

1. A method for preserving the texture of an image when processing the image, comprising:
identifying an area to be eliminated in an image to be processed, and painting over the area to be eliminated to obtain a painted mask image;
filling the painted area to be eliminated;
calculating a first normal map of the image to be processed;
down-sampling the image to be processed based on a preset scaling coefficient, and calculating a second normal map of the down-sampled image;
processing the first normal map and the second normal map;
merging the processed first normal map and second normal map into a third normal map;
performing texture restoration on the mask image based on the third normal map;
and replacing the texture-restored mask image back into the area to be eliminated.
2. The method according to claim 1, wherein filling the painted area to be eliminated comprises:
selecting pixel areas adjacent to the area to be eliminated, and filling the area to be eliminated with their pixels.
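By way of illustration only (this sketch is not part of the claims), filling the painted region from its adjacent pixels can be done with a simple diffusion-style iteration; the iteration count and the wrap-around border handling of `np.roll` are assumptions, since the claim only states that adjacent pixel areas supply the fill:

```python
import numpy as np

def fill_from_neighbors(image, mask, iterations=50):
    """Fill masked pixels (mask == 1) by repeatedly averaging their
    4-neighbours; known pixels (mask == 0) are never overwritten."""
    img = image.astype(np.float64).copy()
    known = mask == 0
    for _ in range(iterations):
        up = np.roll(img, -1, axis=0)
        down = np.roll(img, 1, axis=0)
        left = np.roll(img, -1, axis=1)
        right = np.roll(img, 1, axis=1)
        img[~known] = ((up + down + left + right) / 4.0)[~known]
    return img

# Tiny demo: a 3x3 patch whose centre pixel is painted out.
img = np.array([[1.0, 1.0, 1.0],
                [1.0, 0.0, 1.0],
                [1.0, 1.0, 1.0]])
mask = np.zeros((3, 3), dtype=np.uint8)
mask[1, 1] = 1
filled = fill_from_neighbors(img, mask)
```

In practice a library inpainting routine would be a natural substitute; the claim does not prescribe a specific filling algorithm.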
3. The method of claim 1, wherein calculating the first normal map of the image to be processed comprises:
establishing a first coordinate system, and constructing a first normal vector in the first coordinate system;
taking the gradient of the image to be processed in the horizontal direction as the component of the first normal vector on the x axis of the first coordinate system;
taking the gradient of the image to be processed in the vertical direction as the component of the first normal vector on the y axis of the first coordinate system;
taking a first preset constant as the component of the first normal vector on the z axis of the first coordinate system;
normalizing the first normal vector;
processing the normalized first normal vector;
and generating the first normal map from the processed first normal vector.
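As an informal sketch of this construction (not part of the patent text): the horizontal and vertical image gradients become the x and y components of a per-pixel normal, a constant becomes the z component, and the vector is normalised. The value used for the "first preset constant" below is an assumption:

```python
import numpy as np

def normal_map_from_image(gray, z_const=1.0):
    """Build a per-pixel unit normal from a single-channel image.
    z_const stands in for the claim's first preset constant."""
    gy, gx = np.gradient(gray.astype(np.float64))  # vertical, horizontal gradients
    n = np.stack([gx, gy, np.full_like(gx, z_const)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)  # normalise to unit length
    return n  # components in [-1, 1]; remap to [0, 1] to store as an RGB map

gray = np.outer(np.arange(4.0), np.ones(4))  # vertical ramp test image
n1 = normal_map_from_image(gray)
```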
4. The method of claim 3, wherein calculating the second normal map of the down-sampled image to be processed comprises:
establishing a second coordinate system, and constructing a second normal vector in the second coordinate system;
taking the gradient of the down-sampled image to be processed in the horizontal direction as the component of the second normal vector on the x axis of the second coordinate system;
taking the gradient of the down-sampled image to be processed in the vertical direction as the component of the second normal vector on the y axis of the second coordinate system;
taking a second preset constant as the component of the second normal vector on the z axis of the second coordinate system, wherein the second preset constant is greater than the first preset constant;
normalizing the second normal vector;
processing the normalized second normal vector;
and generating the second normal map from the processed second normal vector.
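The same construction is repeated at a lower resolution with a larger z constant, which flattens the normals so that only coarse structure survives. A hedged sketch (the integer stride standing in for the "preset scaling coefficient", and the constant values, are assumptions):

```python
import numpy as np

def downsampled_normal_map(gray, scale=2, z_const=2.0):
    """Strided downsample, then gradients -> normal map; z_const should be
    larger than the constant used for the full-resolution map."""
    small = gray[::scale, ::scale].astype(np.float64)  # naive downsampling
    gy, gx = np.gradient(small)
    n = np.stack([gx, gy, np.full_like(gx, z_const)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    return n

gray = np.outer(np.arange(8.0), np.ones(8))  # vertical ramp test image
n2 = downsampled_normal_map(gray)
```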
5. The method of claim 1, wherein processing the normalized first normal vector comprises: suppressing the larger values at the two extremes of the normalized first normal vector;
and processing the normalized second normal vector comprises: suppressing the larger values at the two extremes of the normalized second normal vector.
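The claim does not say how the extreme values are suppressed; one plausible reading is a soft compression of the x/y components followed by re-normalisation. The tanh curve and its strength parameter below are assumptions for illustration:

```python
import numpy as np

def suppress_extremes(n, strength=0.5):
    """Damp normal components near -1/+1 while leaving small slopes
    almost unchanged, then restore unit length per pixel."""
    out = n.copy()
    out[..., :2] = np.tanh(n[..., :2] / strength) * strength
    out /= np.linalg.norm(out, axis=-1, keepdims=True)
    return out

nm = np.zeros((1, 1, 3))
nm[0, 0] = [0.9, 0.0, np.sqrt(1.0 - 0.81)]  # a steep normal
damped = suppress_extremes(nm)
```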
6. The method of claim 1, further comprising:
acquiring the pixel value of each point of the image to be processed;
and calculating the gradient at each point of the image to be processed from these pixel values using a gradient operator.
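The gradient operator is left unspecified; the Sobel operator is one common choice. A minimal sketch, applied as a sliding dot product (cross-correlation, the convention of most image-filtering libraries):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=np.float64)  # horizontal-gradient operator

def filter2d(gray, kernel):
    """3x3 sliding dot product with zero padding at the border."""
    h, w = gray.shape
    padded = np.pad(gray.astype(np.float64), 1)
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y + 3, x:x + 3] * kernel)
    return out

gray = np.tile(np.arange(5.0), (5, 1))  # horizontal ramp 0..4
gx = filter2d(gray, SOBEL_X)            # interior response is 8 per unit step
```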
7. The method of claim 1, wherein processing the first normal map and the second normal map comprises:
performing Gaussian blur on the first normal map and the second normal map to smooth their normal changes.
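A separable Gaussian blur, applied per channel of a normal map, smooths abrupt normal changes; the sigma below is an illustrative choice:

```python
import numpy as np

def gaussian_kernel_1d(sigma, radius=None):
    """Normalised 1-D Gaussian; a 2-D blur is two 1-D passes."""
    if radius is None:
        radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def blur_channel(ch, sigma=1.0):
    """Blur one channel: rows first, then columns (zero padding)."""
    k = gaussian_kernel_1d(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, ch)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, tmp)

blurred = blur_channel(np.ones((9, 9)))  # constant channel: interior unchanged
```

After blurring, each pixel's normal should be re-normalised to unit length before any lighting is evaluated.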
8. The method of claim 1, wherein merging the processed first normal map and second normal map into the third normal map comprises:
blending the processed first normal map and second normal map in soft-light mode to obtain the third normal map.
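"Soft light" can be read as the blend mode of that name; the formula below is the common W3C/Photoshop definition for values in [0, 1]. Which map plays the base layer and which the blend layer is an assumption:

```python
import numpy as np

def soft_light(base, blend):
    """W3C-style soft-light blend: blend values at 0.5 leave the base
    unchanged, darker values darken it, lighter values lighten it."""
    low = 2 * base * blend + base ** 2 * (1 - 2 * blend)
    high = 2 * base * (1 - blend) + np.sqrt(base) * (2 * blend - 1)
    return np.where(blend <= 0.5, low, high)

mid = soft_light(np.array([0.3]), np.array([0.5]))   # unchanged
dark = soft_light(np.array([0.5]), np.array([0.0]))  # darkened to base**2
```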
9. The method of claim 1, wherein performing texture restoration on the mask image based on the third normal map comprises:
calculating a world-light illumination result for the third normal map, and restoring the shading and gradient changes of the surface wrinkles of objects in the mask image according to the illumination result.
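One way to realise a "world-light illumination result" is Lambertian shading of the third normal map under a single directional light; the overhead light direction below is an illustrative assumption. The resulting shading can then modulate the filled mask region to bring back the shade and gradient of surface wrinkles:

```python
import numpy as np

def world_light(normal_map, light_dir=(0.0, 0.0, 1.0)):
    """Lambertian intensity max(0, n . l) per pixel, for an HxWx3 map
    of unit normals and a directional light."""
    l = np.asarray(light_dir, dtype=np.float64)
    l /= np.linalg.norm(l)
    return np.clip(normal_map @ l, 0.0, None)

flat = np.zeros((2, 2, 3))
flat[..., 2] = 1.0          # normals facing the camera
shade = world_light(flat)   # fully lit under an overhead light
```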
10. A system for preserving the texture of an image when processing the image, comprising:
an elimination module, configured to identify an area to be eliminated in an image to be processed and paint over the area to be eliminated to obtain a painted mask image;
a filling module, configured to fill the painted area to be eliminated;
a first normal map generating module, configured to calculate a first normal map of the image to be processed;
a second normal map generating module, configured to down-sample the image to be processed based on a preset scaling coefficient and calculate a second normal map of the down-sampled image;
a normal map processing module, configured to process the first normal map and the second normal map;
a normal map merging module, configured to merge the processed first normal map and second normal map into a third normal map;
a texture restoration module, configured to perform texture restoration on the mask image based on the third normal map;
and a replacing module, configured to replace the texture-restored mask image back into the area to be eliminated.
CN202111013113.1A 2021-08-31 2021-08-31 Method and system for retaining texture of image texture during image processing Active CN113538549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111013113.1A CN113538549B (en) 2021-08-31 2021-08-31 Method and system for retaining texture of image texture during image processing

Publications (2)

Publication Number Publication Date
CN113538549A true CN113538549A (en) 2021-10-22
CN113538549B CN113538549B (en) 2023-12-22

Family

ID=78122946

Country Status (1)

Country Link
CN (1) CN113538549B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2659458A1 (en) * 2010-12-30 2013-11-06 TomTom Polska SP. Z O.O. System and method for generating textured map object images
KR20140019199A (en) * 2012-08-06 2014-02-14 동명대학교산학협력단 Method of producing 3d earth globes based on natural user interface using motion-recognition infrared camera
CN104392481A (en) * 2014-11-25 2015-03-04 无锡梵天信息技术股份有限公司 Method and device for controlling specular reflection definition by mapping
CN105574918A (en) * 2015-12-24 2016-05-11 网易(杭州)网络有限公司 Material adding method and apparatus of 3D model, and terminal
US20170148205A1 (en) * 2015-11-19 2017-05-25 Adobe Systems Incorporated Creating bump and normal maps from images with multi-scale control
CN107204033A (en) * 2016-03-16 2017-09-26 腾讯科技(深圳)有限公司 The generation method and device of picture
CN107358643A (en) * 2017-07-04 2017-11-17 网易(杭州)网络有限公司 Image processing method, device, electronic equipment and storage medium
CN107808372A (en) * 2017-11-02 2018-03-16 北京奇虎科技有限公司 Image penetration management method, apparatus, computing device and computer-readable storage medium
CN109559319A (en) * 2018-10-31 2019-04-02 深圳市创梦天地科技有限公司 A kind of processing method and terminal of normal map
CN111243099A (en) * 2018-11-12 2020-06-05 联想新视界(天津)科技有限公司 Method and device for processing image and method and device for displaying image in AR (augmented reality) device
CN111583398A (en) * 2020-05-15 2020-08-25 网易(杭州)网络有限公司 Image display method and device, electronic equipment and computer readable storage medium
CN111612882A (en) * 2020-06-10 2020-09-01 腾讯科技(深圳)有限公司 Image processing method, image processing device, computer storage medium and electronic equipment
CN111773710A (en) * 2020-08-20 2020-10-16 网易(杭州)网络有限公司 Texture image processing method and device, electronic equipment and storage medium
CN111899325A (en) * 2020-08-13 2020-11-06 网易(杭州)网络有限公司 Rendering method and device of crystal stone model, electronic equipment and storage medium
JP2020197774A (en) * 2019-05-31 2020-12-10 キヤノン株式会社 Image processing method, image processing device, image-capturing device, image processing program, and memory medium
CN112241933A (en) * 2020-07-15 2021-01-19 北京沃东天骏信息技术有限公司 Face image processing method and device, storage medium and electronic equipment
CN112870707A (en) * 2021-03-19 2021-06-01 腾讯科技(深圳)有限公司 Virtual object display method in virtual scene, computer device and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIN XU ET AL: ""A general texture mapping framework for image-based 3D modeling"", 《2010 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING》 *
柏文;花金祥;张茜茜;: "浅析水利三维设计中的法线贴图工作原理与绘制方法", 科技视界, no. 18 *


Similar Documents

Publication Publication Date Title
Wei et al. Deep retinex decomposition for low-light enhancement
Liu et al. Robust color guided depth map restoration
US8406548B2 (en) Method and apparatus for performing a blur rendering process on an image
US9311901B2 (en) Variable blend width compositing
EP2380132B1 (en) Denoising medical images
CN106027851B (en) Method and system for processing images
US8433115B2 (en) System and method for multi-image based stent visibility enhancement
Fredembach et al. Hamiltonian Path based Shadow Removal.
US20130202177A1 (en) Non-linear resolution reduction for medical imagery
CN113744142B (en) Image restoration method, electronic device and storage medium
JP2002503840A (en) A new perceptual threshold determination for gradient-based local contour detection
JP2004038984A (en) Interpolated image filtering method and apparatus
WO2022088976A1 (en) Image processing method and device
CN101202928B (en) Apparatus and method for improving image clarity
US20140050417A1 (en) Image filtering based on structural information
CN113793272B (en) Image noise reduction method and device, storage medium and terminal
WO2010050914A1 (en) Method and system for enhancing image signals to alter the perception of depth by human viewers
CN103942756B (en) A kind of method of depth map post processing and filtering
CN105931213A (en) Edge detection and frame difference method-based high-dynamic range video de-ghosting method
CN113129207B (en) Picture background blurring method and device, computer equipment and storage medium
CN111263067B (en) Image processing method, device, terminal equipment and storage medium
JP5617426B2 (en) Jaggy mitigation processing apparatus and jaggy mitigation processing method
Seo Image denoising and refinement based on an iteratively reweighted least squares filter
CN108230251A (en) Combined image restoration method and device
CN113538549B (en) Method and system for retaining texture of image texture during image processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant