
CN109816741B - Method and system for generating self-adaptive virtual lip gloss - Google Patents

Method and system for generating self-adaptive virtual lip gloss Download PDF

Info

Publication number
CN109816741B
CN109816741B (application CN201711176136.8A / CN201711176136A)
Authority
CN
China
Prior art keywords
lip
color
pixel point
area
gloss
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711176136.8A
Other languages
Chinese (zh)
Other versions
CN109816741A (en)
Inventor
吴倩
胥立丰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ziguang Zhanrui Communication Technology Co Ltd
Original Assignee
Beijing Ziguang Zhanrui Communication Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ziguang Zhanrui Communication Technology Co Ltd filed Critical Beijing Ziguang Zhanrui Communication Technology Co Ltd
Priority to CN201711176136.8A priority Critical patent/CN109816741B/en
Publication of CN109816741A publication Critical patent/CN109816741A/en
Application granted granted Critical
Publication of CN109816741B publication Critical patent/CN109816741B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Generation (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a method and a system for generating a self-adaptive virtual lip gloss, wherein the method comprises the following steps: acquiring lip key points in a face image, and determining a lip area by the lip key points; counting the average color of each pixel point in the lip region, and generating a color mapping curve according to the counted average color and the reference lip color; obtaining the brightness of each pixel point in the lip area and the average brightness of the lip area, and obtaining a lip gloss intensity function according to the brightness of each pixel point in the lip area and the average brightness of the lip area, wherein the lip gloss intensity function is used for representing the relation between the brightness of each pixel point in the lip area and the added lip gloss intensity; and adding lip gloss to the pixel points in the lip area according to the color mapping curve and the lip gloss intensity function. The invention can make the added lip gloss have natural color transition and clear texture, realize full automation and high efficiency of lip gloss addition, and achieve the effect of improving the aesthetic feeling of the portrait.

Description

Method and system for generating self-adaptive virtual lip gloss
Technical Field
The invention relates to the technical field of image processing, in particular to a method and a system for generating a self-adaptive virtual lip gloss.
Background
With rising material living standards, the innate pursuit of beauty has led more and more people, especially women, to pay increasing attention to their appearance. On the one hand, all kinds of cosmetics have become indispensable daily necessities for improving one's looks and overall presence. On the other hand, a face image can also be given a made-up appearance through virtual makeup, improving the aesthetic appeal of the face. Virtual makeup has rich application scenarios, for example applying makeup while shooting with a camera application, or virtual try-on offered by cosmetics retailers. Adding lip gloss is an indispensable step of making up and is a focus of virtual makeup software.
At present, some software already offers a lip gloss function, but the degree of automation is insufficient: manual assistance is needed when adding the lip gloss, for example the extent of the lip area has to be adjusted by hand. In other cases the added lip gloss cannot adapt to the illumination and color variation of the original image, so its color transition is not natural enough; or the added lip gloss completely covers the lips of the original image without retaining their texture, so the result does not look real enough.
Disclosure of Invention
The method and the system for generating the self-adaptive virtual lip gloss can determine the lip gloss color and the lip gloss intensity to be added according to the actual color and the brightness of each pixel point in the lip area, so that the added lip gloss color has natural transition and clear texture, the lip gloss is fully automatically and efficiently added, and the effect of improving the aesthetic feeling of the portrait is achieved.
In a first aspect, the present invention provides a method for generating an adaptive virtual lip gloss, including:
acquiring lip key points in a face image, and determining a lip area by the lip key points;
counting the average color of each pixel point in the lip region, and generating a color mapping curve according to the counted average color and the reference lip color;
obtaining the brightness of each pixel point in the lip area and the average brightness of the lip area, and obtaining a lip gloss intensity function according to the brightness of each pixel point in the lip area and the average brightness of the lip area, wherein the lip gloss intensity function is used for representing the relation between the brightness of each pixel point in the lip area and the added lip gloss intensity;
and adding lip gloss to the pixel points in the lip area according to the color mapping curve and the lip gloss intensity function.
Optionally, the lip key points at least comprise left and right mouth corners, M key points uniformly distributed on an outer lip line and N key points uniformly distributed on an inner lip line;
wherein M is greater than or equal to 4, and N is greater than or equal to 4.
Optionally, the determining the lip region from the lip keypoints comprises:
fitting M key points uniformly distributed on the left and right mouth corners and the outer lip line by using spline curves to obtain a first closed area A1;
fitting the left and right mouth corners and N key points uniformly distributed on the inner lip line by using spline curves to obtain a second closed area A2;
the area obtained by subtracting the second closed area A2 from the first closed area A1 is determined as the lip area.
Optionally, after the acquiring the lip keypoints in the image and determining the lip region from the lip keypoints, the method further comprises:
extracting SIFT features of key points of lips;
judging, with an SVM classifier, whether the lip region is occluded; if the lip region is occluded, no lip gloss needs to be added; if the lip region is not occluded, the next step is performed.
Optionally, the determining, by the SVM classifier, whether the lip region is occluded includes:
inputting a feature vector composed of the SIFT features of the lip key points into an SVM classifier trained on occluded and non-occluded lip samples;
the SVM classifier then outputs a judgment of whether the lip region is occluded.
Optionally, the counting the average color of each pixel point in the lip area, and generating the color mapping curve according to the counted average color and the reference lip color includes:
counting the average color (R_M, G_M, B_M) of the pixel points of the lip region according to
R_M = (1/N)·Σ R_i, G_M = (1/N)·Σ G_i, B_M = (1/N)·Σ B_i,
where (R_i, G_i, B_i) is the actual color of a pixel point in the lip region, the sums run over the pixel points of the lip region, and N is the number of pixel points in the lip region;
determining a target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R);
in the R channel, fitting a Bezier curve through (0, 0), (R_T, R_T) and (255, 255) to obtain the R-channel color mapping curve R_d = f_R(R_s); in the G channel, fitting a Bezier curve through (0, 0), (G_T, G_T) and (255, 255) to obtain the G-channel color mapping curve G_d = f_G(G_s); in the B channel, fitting a Bezier curve through (0, 0), (B_T, B_T) and (255, 255) to obtain the B-channel color mapping curve B_d = f_B(B_s).
Optionally, determining the target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R) comprises: calculating the target lip color (R_T, G_T, B_T) from the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R) according to the following formula:
R_T = R_R × Y_M / Y_R, G_T = G_R × Y_M / Y_R, B_T = B_R × Y_M / Y_R
where Y_M is the average brightness of the pixel points of the lip region, calculated from the average color (R_M, G_M, B_M) according to Y_M = 0.299×R_M + 0.587×G_M + 0.114×B_M;
Y_R is the brightness of the reference lip color, calculated from the reference lip color (R_R, G_R, B_R) according to Y_R = 0.299×R_R + 0.587×G_R + 0.114×B_R.
Optionally, the lip gloss intensity function α(Y_i) is calculated from the brightness Y_i of each pixel point of the lip region and the average brightness Y_M of the lip region according to the following formula:
α(Y_i) = max(0, 1 - λ·|Y_i - Y_M|)
where Y_i is calculated from the actual color (R_i, G_i, B_i) of the pixel point in the lip region according to Y_i = 0.299×R_i + 0.587×G_i + 0.114×B_i;
Y_M is calculated from the average color (R_M, G_M, B_M) according to Y_M = 0.299×R_M + 0.587×G_M + 0.114×B_M;
λ is an adjustable coefficient.
Optionally, adding the lip gloss to the pixel points in the lip area according to the color mapping curve and the lip gloss intensity function includes:
calculating the final lip color of each pixel point of the lip region according to the following formula by using the color mapping curve and the lip color intensity function;
adding lip colors to the pixel points in the lip region according to the final lip color of each pixel point in the lip region;
R_new = f_R(R_i)·α(Y_i) + R_i·(1 - α(Y_i))
G_new = f_G(G_i)·α(Y_i) + G_i·(1 - α(Y_i))
B_new = f_B(B_i)·α(Y_i) + B_i·(1 - α(Y_i))
where (R_new, G_new, B_new) is the final lip color of the pixel point in the lip region;
(f_R(R_i), f_G(G_i), f_B(B_i)) is the mapped color, obtained by substituting the actual color (R_i, G_i, B_i) of the pixel point into the color mapping curves R_d = f_R(R_s), G_d = f_G(G_s) and B_d = f_B(B_s);
α(Y_i) is the first proportion, the weight of the mapped color of the pixel point in the final lip color;
(1 - α(Y_i)) is the second proportion, the weight of the actual color of the pixel point in the final lip color;
and adding lip colors to the pixel points in the lip region according to the final lip color of each pixel point in the lip region.
In a second aspect, the present invention provides a system for generating an adaptive virtual lip gloss, including:
the lip region determining unit is used for acquiring lip key points in the face image and determining a lip region by the lip key points;
the color mapping curve generating unit is used for counting the average color of each pixel point in the lip region and generating a color mapping curve according to the counted average color and the reference lip color;
the lip gloss intensity function generating unit is used for obtaining the brightness of each pixel point in the lip area and the average brightness of the lip area, and obtaining a lip gloss intensity function according to the brightness of each pixel point in the lip area and the average brightness of the lip area, wherein the lip gloss intensity function is used for representing the relation between the brightness of each pixel point in the lip area and the added lip gloss intensity;
and the lip gloss adding unit is used for adding lip gloss to pixel points in the lip area according to the color mapping curve and the lip gloss intensity function.
Optionally, the color mapping curve generating unit includes:
an average color calculation module, configured to count the average color (R_M, G_M, B_M) of the pixel points of the lip region according to R_M = (1/N)·Σ R_i, G_M = (1/N)·Σ G_i, B_M = (1/N)·Σ B_i, where (R_i, G_i, B_i) is the actual color of a pixel point in the lip region and N is the number of pixel points in the lip region;
a target lip color calculation module, configured to determine the target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R);
a mapping curve generation module, configured to fit, in the R channel, a Bezier curve through (0, 0), (R_T, R_T) and (255, 255) to obtain the R-channel color mapping curve R_d = f_R(R_s); in the G channel, a Bezier curve through (0, 0), (G_T, G_T) and (255, 255) to obtain the G-channel color mapping curve G_d = f_G(G_s); and in the B channel, a Bezier curve through (0, 0), (B_T, B_T) and (255, 255) to obtain the B-channel color mapping curve B_d = f_B(B_s).
Optionally, the lip gloss adding unit includes:
the final lip color calculating module is used for calculating the final lip color of each pixel point of the lip area according to the following formula by using the color mapping curve and the lip color intensity function;
R_new = f_R(R_i)·α(Y_i) + R_i·(1 - α(Y_i))
G_new = f_G(G_i)·α(Y_i) + G_i·(1 - α(Y_i))
B_new = f_B(B_i)·α(Y_i) + B_i·(1 - α(Y_i))
where (R_new, G_new, B_new) is the final lip color of the pixel point in the lip region;
(f_R(R_i), f_G(G_i), f_B(B_i)) is the mapped color, obtained by substituting the actual color (R_i, G_i, B_i) of the pixel point into the color mapping curves R_d = f_R(R_s), G_d = f_G(G_s) and B_d = f_B(B_s);
α(Y_i) is the first proportion, the weight of the mapped color of the pixel point in the final lip color;
(1 - α(Y_i)) is the second proportion, the weight of the actual color of the pixel point in the final lip color;
and the adding module is used for adding lip colors to the pixel points in the lip area according to the final lip color of each pixel point in the lip area.
According to the method and the system for generating the self-adaptive virtual lip gloss, provided by the embodiment of the invention, on one hand, the average color and the reference lip gloss are utilized to construct the color mapping curve for representing the relation between the actual color of each pixel point in the lip area and the target lip gloss, and then the target lip gloss of each pixel point in the lip area can be calculated according to the color mapping curve, so that the lip gloss in the lip area can be naturally transited along with the color of the original image, the added lip gloss is naturally transited in the image, and the make-up effect is better; on the other hand, the method further utilizes the brightness of each pixel point in the lip area and the average brightness of the lip area to generate the lip gloss intensity function, so that the intensity of the lip gloss required to be added to the pixel point can be determined according to the brightness of each pixel point in the lip area, the texture of the lip can be maintained after the lip gloss is added in the lip area, and the texture of the human figure is improved.
Therefore, the method in the embodiment of the invention can determine the lip color to be added and the lip color intensity in the lip area according to the actual color and brightness of each pixel point, so that the added lip color has natural transition and clear texture, the full-automatic lip color adding is realized, the virtual lip color is efficiently added on the face image, and the aesthetic feeling of the portrait is improved.
Drawings
FIG. 1 is a flow chart of a method for generating an adaptive virtual lip gloss according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for generating an adaptive virtual lip gloss according to another embodiment of the present invention;
FIG. 3 is a schematic diagram of a system for generating an adaptive virtual lip gloss according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an adaptive virtual lip gloss generating system according to another embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The embodiment of the invention provides a method for generating a self-adaptive virtual lip gloss, as shown in fig. 1, comprising the following steps:
S11, acquiring lip key points in a face image, and determining a lip area by the lip key points;
S12, counting the average color of each pixel point in the lip region, and generating a color mapping curve according to the counted average color and the reference lip color;
S13, obtaining the brightness of each pixel point in the lip area and the average brightness of the lip area, and obtaining a lip gloss intensity function according to the brightness of each pixel point in the lip area and the average brightness of the lip area, wherein the lip gloss intensity function is used for representing the relation between the brightness of each pixel point in the lip area and the added lip gloss intensity;
S14, adding lip gloss to the pixel points in the lip area according to the color mapping curve and the lip gloss intensity function.
According to the method for generating the self-adaptive virtual lip gloss, on one hand, the average color and the reference lip gloss are utilized to construct the color mapping curve for representing the relation between the actual color of each pixel point of the lip area and the target lip gloss, and then the target lip gloss of each pixel point of the lip area can be calculated according to the color mapping curve, so that the lip gloss in the lip area can be naturally transited along with the color of an original image, the added lip gloss is naturally transited in the image, and the make-up effect is better; on the other hand, the method further utilizes the brightness of each pixel point in the lip area and the average brightness of the lip area to generate the lip gloss intensity function, so that the intensity of the lip gloss required to be added to the pixel point can be determined according to the brightness of each pixel point in the lip area, the texture of the lip can be maintained after the lip gloss is added in the lip area, and the texture of the human figure is improved.
Therefore, the method in the embodiment can determine the lip color to be added and the lip color strength according to the actual color and brightness of each pixel point in the lip region, so that the added lip color has natural transition and clear texture, the lip color is fully automatically and efficiently added, and the effect of improving the aesthetic feeling of the portrait is achieved.
Optionally, as shown in fig. 2, the lip key points at least include left and right corners of the mouth, M key points uniformly distributed on the outer lip line, and N key points uniformly distributed on the inner lip line;
wherein M is greater than or equal to 4, and N is greater than or equal to 4.
Optionally, the determining the lip region from the lip keypoints comprises:
S111, fitting M key points uniformly distributed on the left and right mouth corners and the outer lip line by using spline curves to obtain a first closed area A1;
S112, fitting the left and right mouth corners and N key points uniformly distributed on the inner lip line by using spline curves to obtain a second closed area A2;
and S113, determining a region obtained by subtracting the second closed region A2 from the first closed region A1 as a lip region.
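By way of illustration, steps S111-S113 could be implemented as in the following sketch (a non-authoritative example that assumes the key points are available as (x, y) arrays and uses SciPy spline fitting and OpenCV rasterization, neither of which is prescribed by this embodiment):

```python
import numpy as np
import cv2
from scipy.interpolate import splprep, splev

def lip_mask(image_shape, outer_pts, inner_pts):
    """Rasterize the lip region A1 - A2 from the outer/inner lip key points.

    outer_pts, inner_pts: (K, 2) arrays of (x, y) key points that already
    include the left and right mouth corners.
    """
    def closed_spline(pts, samples=200):
        # Close the contour and fit a periodic spline through the key points,
        # then sample it densely so it can be rasterized as a polygon.
        pts = np.vstack([pts, pts[:1]])
        tck, _ = splprep([pts[:, 0], pts[:, 1]], s=0, per=True)
        x, y = splev(np.linspace(0.0, 1.0, samples), tck)
        return np.stack([x, y], axis=1).astype(np.int32)

    h, w = image_shape[:2]
    a1 = np.zeros((h, w), np.uint8)  # first closed area A1 (outer lip line)
    a2 = np.zeros((h, w), np.uint8)  # second closed area A2 (inner lip line)
    cv2.fillPoly(a1, [closed_spline(outer_pts)], 255)
    cv2.fillPoly(a2, [closed_spline(inner_pts)], 255)
    return cv2.subtract(a1, a2)      # lip region = A1 - A2
```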
Optionally, after the acquiring the lip keypoints in the image and determining the lip region from the lip keypoints, the method further comprises:
S03, extracting SIFT features of the lip key points;
S04, judging, with an SVM classifier, whether the lip region is occluded; if the lip region is occluded, no lip gloss needs to be added; if the lip region is not occluded, the next step is performed.
Optionally, the determining, by the SVM classifier, whether the lip region is occluded includes:
inputting a feature vector composed of the SIFT features of the lip key points into an SVM classifier trained on occluded and non-occluded lip samples;
the SVM classifier then outputs a judgment of whether the lip region is occluded.
Specifically, in this embodiment the SIFT (Scale-Invariant Feature Transform) features are computed with the SIFT algorithm, an algorithm for detecting local features that describes feature points (corner points) in an image together with their scale and orientation and is well suited to matching image feature points; the SVM (Support Vector Machine) classifier is a machine learning method built on the framework of statistical learning theory, a binary classification model with a simple structure and strong generalization capability.
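A minimal sketch of this occlusion check (assuming OpenCV 4.4+ for SIFT and scikit-learn for the SVM, and assuming a labelled set of occluded and non-occluded lip samples is available; the function names are illustrative):

```python
import numpy as np
import cv2
from sklearn.svm import SVC

sift = cv2.SIFT_create()

def lip_feature_vector(gray, lip_points, patch_size=16.0):
    # One SIFT descriptor per lip key point (128 values each), concatenated
    # into a single feature vector for the classifier.
    kps = [cv2.KeyPoint(float(x), float(y), patch_size) for x, y in lip_points]
    _, desc = sift.compute(gray, kps)
    return desc.reshape(-1)

def train_occlusion_classifier(X_train, y_train):
    # X_train: rows from lip_feature_vector(); y_train: 1 = occluded, 0 = not occluded.
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(X_train, y_train)
    return clf

def lips_occluded(clf, gray, lip_points):
    return bool(clf.predict([lip_feature_vector(gray, lip_points)])[0])
```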
With this approach, on the one hand, spline curves are fitted through the mouth-corner and lip-line key points to construct the first closed area A1 and the second closed area A2, and the lip region is determined from A1 and A2, which improves the accuracy of lip detection; on the other hand, the SVM classifier judges whether the lip region is occluded, so that, with the SIFT features and the SVM classifier, occluded lips are filtered out, the accuracy of the determined lip region is ensured, and accurate, fully automatic lip region detection is achieved.
Optionally, the counting the average color of each pixel point in the lip area, and generating the color mapping curve according to the counted average color and the reference lip color includes:
counting the average color (R_M, G_M, B_M) of the pixel points of the lip region according to
R_M = (1/N)·Σ R_i, G_M = (1/N)·Σ G_i, B_M = (1/N)·Σ B_i,
where (R_i, G_i, B_i) is the actual color of a pixel point in the lip region, the sums run over the pixel points of the lip region, and N is the number of pixel points in the lip region;
determining a target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R);
in the R channel, fitting a Bezier curve through (0, 0), (R_T, R_T) and (255, 255) to obtain the R-channel color mapping curve R_d = f_R(R_s); in the G channel, fitting a Bezier curve through (0, 0), (G_T, G_T) and (255, 255) to obtain the G-channel color mapping curve G_d = f_G(G_s); in the B channel, fitting a Bezier curve through (0, 0), (B_T, B_T) and (255, 255) to obtain the B-channel color mapping curve B_d = f_B(B_s).
Specifically, in this method the lip color mapping is fitted mainly with Bezier curves, so that it follows the color variation of the lips in the actual image; the added lip gloss therefore transitions more naturally and the texture of the lips is preserved. The color mapping curve is fitted from the target lip color (R_T, G_T, B_T), which is determined from the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R), and is consistent with the color variation of the lips; it comprises three curves, one each for the R, G and B channels. The input of each color mapping curve is the corresponding color component of a pixel point in the lip region, and the output is the new lip color component for that pixel point. Fitting the mapping with Bezier curves yields smooth curves, so the lip color transitions naturally and the lip texture remains clear; it also avoids the abrupt lip color and heavy retouching traces that result from setting the added lip color to a fixed value.
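The per-channel Bezier mapping can be implemented as a 256-entry lookup table; the following sketch (the dense-sampling approach and names are illustrative, not taken from the patent) evaluates a quadratic Bezier curve through (0, 0), a middle control point and (255, 255) and resamples it onto the 0-255 grid:

```python
import numpy as np

def bezier_lut(ctrl_x, ctrl_y, samples=1024):
    """Quadratic Bezier through (0,0), (ctrl_x, ctrl_y), (255,255) as a 0..255 LUT."""
    t = np.linspace(0.0, 1.0, samples)
    # B(t) = (1-t)^2*P0 + 2t(1-t)*P1 + t^2*P2 with P0 = (0,0), P2 = (255,255).
    x = 2.0 * t * (1.0 - t) * ctrl_x + t**2 * 255.0
    y = 2.0 * t * (1.0 - t) * ctrl_y + t**2 * 255.0
    # Resample y as a function of x on the integer grid 0..255.
    return np.interp(np.arange(256), x, y)

# One LUT per channel, with the middle control point chosen as described above;
# the mapped component of a pixel is then, e.g., lut_r[int(R_i)].
```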
Optionally, determining the target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R) comprises: calculating the target lip color (R_T, G_T, B_T) from the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R) according to the following formula:
R_T = R_R × Y_M / Y_R, G_T = G_R × Y_M / Y_R, B_T = B_R × Y_M / Y_R
where Y_M is the average brightness of the pixel points of the lip region, calculated from the average color (R_M, G_M, B_M) according to Y_M = 0.299×R_M + 0.587×G_M + 0.114×B_M;
Y_R is the brightness of the reference lip color, calculated from the reference lip color (R_R, G_R, B_R) according to Y_R = 0.299×R_R + 0.587×G_R + 0.114×B_R.
Specifically, the method of this embodiment also adjusts the brightness of the target lip color so that it matches the brightness of the actual lip region, which ensures that the added lip gloss is essentially consistent with the brightness of the lips in the original image and therefore looks natural rather than abrupt.
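A minimal, non-authoritative sketch of this brightness matching, following the scaling formula given above (the reference lip color rescaled by Y_M / Y_R and clipped to the valid range):

```python
import numpy as np

def luminance(rgb):
    # Y = 0.299*R + 0.587*G + 0.114*B, as used throughout this description.
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def target_lip_color(mean_rgb, reference_rgb):
    """Scale the reference lip color so its brightness matches the lip region's."""
    y_m = luminance(mean_rgb)        # average brightness of the lip region
    y_r = luminance(reference_rgb)   # brightness of the reference lip color
    scaled = np.asarray(reference_rgb, dtype=float) * (y_m / max(y_r, 1e-6))
    return np.clip(scaled, 0.0, 255.0)
```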
Optionally, the lip gloss intensity function α(Y_i) is calculated from the brightness Y_i of each pixel point of the lip region and the average brightness Y_M of the lip region according to the following formula:
α(Y_i) = max(0, 1 - λ·|Y_i - Y_M|)
where Y_i is calculated from the actual color (R_i, G_i, B_i) of the pixel point in the lip region according to Y_i = 0.299×R_i + 0.587×G_i + 0.114×B_i;
Y_M is calculated from the average color (R_M, G_M, B_M) according to Y_M = 0.299×R_M + 0.587×G_M + 0.114×B_M;
λ is an adjustable coefficient; for example, λ can be preset to a reference value of 0.02.
Specifically, the method of this embodiment derives the intensity function from the brightness Y_i of each pixel point of the lip region and the average brightness Y_M of the lip region. The value of the intensity function decreases as the absolute difference between the brightness of the pixel point and the average brightness of the lip region grows: when the difference is small the value is large, and when the difference is large the value is small. The output of the intensity function lies in the range [0, 1].
The intensity function characterizes the probability that a pixel point is really a lip pixel point: lip key point detection has a small error, so pixel points that are not on the lips can be taken as pixel points of the lip region. Because the brightness of such pixel points differs greatly from the brightness of genuine lip pixel points, the intensity function provides a probability-like weight for each pixel point, and the proportion of added lip gloss is adjusted accordingly. For example, when the difference between the brightness of a pixel point and the average brightness of the lip region is large, the proportion of added lip gloss is small and the original color of the pixel point is largely retained; when the difference is small, the proportion of added lip gloss is large and the beautifying effect at that pixel point is stronger.
On the other hand, the proportion of lip gloss added at each pixel point varies linearly with the brightness of the pixel point, which makes it easier to keep the texture of the lips.
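In code the intensity function is a one-liner; the short sketch below (λ = 0.02 as in the example further down) also shows how it behaves for pixels near and far from the average lip brightness:

```python
def alpha(y_i, y_m, lam=0.02):
    # alpha(Y_i) = max(0, 1 - lambda * |Y_i - Y_M|); the result lies in [0, 1].
    return max(0.0, 1.0 - lam * abs(y_i - y_m))

# A pixel close to the average brightness receives almost the full lip gloss,
# while a much brighter or darker pixel (likely not a lip pixel) keeps its color.
print(alpha(118.0, 120.0))  # 0.96
print(alpha(200.0, 120.0))  # 0.0 -> the original color is retained
```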
Optionally, adding the lip gloss to the pixel points in the lip area according to the color mapping curve and the lip gloss intensity function includes:
calculating the final lip color of each pixel point of the lip region according to the following formula by using the color mapping curve and the lip color intensity function;
adding lip colors to the pixel points in the lip region according to the final lip color of each pixel point in the lip region;
R_new = f_R(R_i)·α(Y_i) + R_i·(1 - α(Y_i))
G_new = f_G(G_i)·α(Y_i) + G_i·(1 - α(Y_i))
B_new = f_B(B_i)·α(Y_i) + B_i·(1 - α(Y_i))
where (R_new, G_new, B_new) is the final lip color of the pixel point in the lip region;
(f_R(R_i), f_G(G_i), f_B(B_i)) is the mapped color, obtained by substituting the actual color (R_i, G_i, B_i) of the pixel point into the color mapping curves R_d = f_R(R_s), G_d = f_G(G_s) and B_d = f_B(B_s);
α(Y_i) is the first proportion, the weight of the mapped color of the pixel point in the final lip color;
(1 - α(Y_i)) is the second proportion, the weight of the actual color of the pixel point in the final lip color;
and adding lip colors to the pixel points in the lip region according to the final lip color of each pixel point in the lip region.
In the method of this embodiment, the final lip color is a blend of the mapped color of each pixel point in the lip region, weighted by the first proportion, and the actual color of that pixel point, weighted by the second proportion. On the one hand, the first and second proportions can be adjusted according to the probability weight of each pixel point, giving adaptive control of the lip gloss and improving the accuracy of the region where lip gloss is added. On the other hand, the color mapping fitted with Bezier curves enters the final lip color as one factor, so that the lip color transitions naturally and the texture stays clear, and it avoids the abrupt lip color and heavy retouching traces that would result from setting the added lip color to a fixed value.
According to the above formula, when the brightness of a pixel point differs strongly from the average brightness of the lip region, the value of the intensity function is small, so the original color takes a larger share of the final color and is largely retained; in this way, pixel points that were mistakenly included in the lip region are effectively excluded from the lip gloss effect.
For example, suppose the actual RGB color of pixel point i of the lip region is (R_i, G_i, B_i) = (128, 100, 100); its brightness is Y_i = 0.299×128 + 0.587×100 + 0.114×100 = 108.372. With an average lip-region brightness Y_M = 120 and λ = 0.02, α(Y_i) = max(0, 1 - 0.02×|108.372 - 120|) ≈ 0.767.
Suppose that substituting (R_i, G_i, B_i) into the color mapping curves gives the mapped color (f_R(R_i), f_G(G_i), f_B(B_i)) = (160, 80, 80); the new color of the pixel is then:
R_new = f_R(R_i)·α(Y_i) + R_i·(1 - α(Y_i)) = 160×0.767 + 128×(1 - 0.767) = 152.544
G_new = f_G(G_i)·α(Y_i) + G_i·(1 - α(Y_i)) = 80×0.767 + 100×(1 - 0.767) = 84.66
B_new = f_B(B_i)·α(Y_i) + B_i·(1 - α(Y_i)) = 80×0.767 + 100×(1 - 0.767) = 84.66
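The numbers in this example can be reproduced with a small per-pixel blending routine, sketched below (the mapped color (160, 80, 80) is taken as given, exactly as in the example; the outputs differ slightly because α is not rounded here):

```python
def blend_pixel(actual, mapped, y_i, y_m, lam=0.02):
    """Blend the mapped lip color with the pixel's actual color using alpha(Y_i)."""
    a = max(0.0, 1.0 - lam * abs(y_i - y_m))
    return tuple(m * a + c * (1.0 - a) for m, c in zip(mapped, actual))

actual = (128, 100, 100)                        # (R_i, G_i, B_i)
y_i = 0.299 * 128 + 0.587 * 100 + 0.114 * 100   # 108.372
print(blend_pixel(actual, (160, 80, 80), y_i, y_m=120.0))
# -> approximately (152.6, 84.7, 84.7), in line with the values above
```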
optionally, the acquiring the lip key points in the face image includes:
S01, carrying out face detection on an image and generating the face image;
S02, detecting lip key points in the face image, and generating the lip key points.
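A common way to realize S01-S02 (not prescribed by this embodiment) is a face detector plus a facial landmark model; with dlib's pretrained 68-point model, for example, landmarks 48-59 lie on the outer lip line and 60-67 on the inner lip line:

```python
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# Path to the pretrained 68-point landmark model, downloaded separately.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def lip_keypoints(gray):
    """Return (outer_pts, inner_pts) for the first detected face, or None."""
    faces = detector(gray, 1)
    if not faces:
        return None
    shape = predictor(gray, faces[0])
    pts = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)])
    return pts[48:60], pts[60:68]   # outer lip line, inner lip line
```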
In summary, according to the first aspect of the method of the embodiment, the lip region can be determined through lip key point detection, and whether the lip region is blocked or not is judged through the SVM classifier, so that accurate and full-automatic lip region detection is realized; according to the second aspect, a color mapping curve can be constructed according to the average color and the reference lip color in the lip area, and the target lip color of each pixel point in the lip area can be obtained according to the color mapping curve, so that the lip color naturally transits along with the color of the original image, and the added lip color naturally transits in the image; the method of the third aspect can further determine the intensity of adding the lip gloss according to the brightness of each pixel point in the lip area, so that the texture of the lips can be maintained after the lip gloss is added.
The embodiment of the invention also provides a system for generating the self-adaptive virtual lip gloss, as shown in fig. 3, the system comprises:
a lip region determining unit 11, configured to acquire lip key points in a face image, and determine a lip region from the lip key points;
a color mapping curve generating unit 12, configured to count an average color of each pixel point in the lip region, and generate a color mapping curve according to the counted average color and the reference lip color;
the lip gloss intensity function generating unit 13 is configured to obtain the brightness of each pixel point in the lip area and the average brightness of the lip area, and obtain a lip gloss intensity function according to the brightness of each pixel point in the lip area and the average brightness of the lip area, where the lip gloss intensity function is used to represent the relationship between the brightness of each pixel point in the lip area and the added lip gloss intensity;
and the lip gloss adding unit 14 is used for adding lip gloss to the pixel points in the lip area according to the color mapping curve and the lip gloss intensity function.
The self-adaptive virtual lip gloss generation system provided by the embodiment of the invention can determine the lip gloss color and the lip gloss intensity to be added according to the actual color and the brightness of each pixel point in the lip area, so that the added lip gloss color has natural transition and clear texture, the full automation and the high efficiency of adding the lip gloss are realized, and the effect of improving the aesthetic feeling of the portrait is achieved.
Alternatively, as shown in fig. 4, the color mapping curve generating unit 12 includes:
an average color calculation module 121, configured to count the average color (R_M, G_M, B_M) of the pixel points of the lip region according to R_M = (1/N)·Σ R_i, G_M = (1/N)·Σ G_i, B_M = (1/N)·Σ B_i, where (R_i, G_i, B_i) is the actual color of a pixel point in the lip region and N is the number of pixel points in the lip region;
a target lip color calculation module 122, configured to determine the target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R);
a mapping curve generation module 123, configured to fit, in the R channel, a Bezier curve through (0, 0), (R_T, R_T) and (255, 255) to obtain the R-channel color mapping curve R_d = f_R(R_s); in the G channel, a Bezier curve through (0, 0), (G_T, G_T) and (255, 255) to obtain the G-channel color mapping curve G_d = f_G(G_s); and in the B channel, a Bezier curve through (0, 0), (B_T, B_T) and (255, 255) to obtain the B-channel color mapping curve B_d = f_B(B_s).
Optionally, the lip gloss adding unit 14 includes:
the final lip color calculating module 141 is configured to calculate a final lip color of each pixel point in the lip region according to the following formula from the color mapping curve and the lip color intensity function;
R_new = f_R(R_i)·α(Y_i) + R_i·(1 - α(Y_i))
G_new = f_G(G_i)·α(Y_i) + G_i·(1 - α(Y_i))
B_new = f_B(B_i)·α(Y_i) + B_i·(1 - α(Y_i))
where (R_new, G_new, B_new) is the final lip color of the pixel point in the lip region;
(f_R(R_i), f_G(G_i), f_B(B_i)) is the mapped color, obtained by substituting the actual color (R_i, G_i, B_i) of the pixel point into the color mapping curves R_d = f_R(R_s), G_d = f_G(G_s) and B_d = f_B(B_s);
α(Y_i) is the first proportion, the weight of the mapped color of the pixel point in the final lip color;
(1 - α(Y_i)) is the second proportion, the weight of the actual color of the pixel point in the final lip color;
and an adding module 142, configured to add a lip gloss to the pixels in the lip area according to the final lip gloss of each pixel in the lip area.
Those skilled in the art will appreciate that implementing all or part of the above-described methods in accordance with the embodiments may be accomplished by way of a computer program stored on a computer readable storage medium, which when executed may comprise the steps of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), or the like.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any changes or substitutions easily contemplated by those skilled in the art within the scope of the present invention should be included in the present invention. Therefore, the protection scope of the present invention should be subject to the protection scope of the claims.

Claims (12)

1. The method for generating the self-adaptive virtual lip gloss is characterized by comprising the following steps of:
acquiring lip key points in a face image, and determining a lip area by the lip key points;
counting the average color of each pixel point in the lip region, and generating a color mapping curve according to the counted average color and the reference lip color;
obtaining the brightness of each pixel point in the lip area and the average brightness of the lip area, and obtaining a lip gloss intensity function according to the brightness of each pixel point in the lip area and the average brightness of the lip area, wherein the lip gloss intensity function is used for representing the relation between the brightness of each pixel point in the lip area and the added lip gloss intensity;
and adding lip gloss to the pixel points in the lip area according to the color mapping curve and the lip gloss intensity function.
2. The method of claim 1, wherein the lip keypoints comprise at least left and right corners of the mouth, M keypoints evenly distributed on the outer lip line, and N keypoints evenly distributed on the inner lip line;
wherein M is greater than or equal to 4, and N is greater than or equal to 4.
3. The method of claim 2, wherein said determining a lip region from said lip keypoints comprises:
fitting M key points uniformly distributed on the left and right mouth corners and the outer lip line by using spline curves to obtain a first closed area A1;
fitting the left and right mouth corners and N key points uniformly distributed on the inner lip line by using spline curves to obtain a second closed area A2;
the area obtained by subtracting the second closed area A2 from the first closed area A1 is determined as the lip area.
4. A method according to any one of claims 1-3, wherein after the acquiring of the lip keypoints in the face image and determining the lip areas from the lip keypoints, the method further comprises:
extracting SIFT features of key points of lips;
judging, with an SVM classifier, whether the lip region is occluded; if the lip region is occluded, no lip gloss needs to be added; if the lip region is not occluded, the next step is performed.
5. The method of claim 4, wherein the determining, by the SVM classifier, whether the lip region is occluded comprises:
inputting a feature vector composed of the SIFT features of the lip key points into an SVM classifier trained on occluded and non-occluded lip samples;
the SVM classifier then outputs a judgment of whether the lip region is occluded.
6. The method of claim 1, wherein the counting the average color of each pixel point in the lip region and generating a color mapping curve based on the counted average color and the reference lip color comprises:
counting the average color (R_M, G_M, B_M) of the pixel points of the lip region according to
R_M = (1/N)·Σ R_i, G_M = (1/N)·Σ G_i, B_M = (1/N)·Σ B_i,
where (R_i, G_i, B_i) is the actual color of a pixel point in the lip region, the sums run over the pixel points of the lip region, and N is the number of pixel points in the lip region;
determining a target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R);
in the R channel, fitting a Bezier curve through (0, 0), (R_T, R_T) and (255, 255) to obtain the R-channel color mapping curve R_d = f_R(R_s); in the G channel, fitting a Bezier curve through (0, 0), (G_T, G_T) and (255, 255) to obtain the G-channel color mapping curve G_d = f_G(G_s); in the B channel, fitting a Bezier curve through (0, 0), (B_T, B_T) and (255, 255) to obtain the B-channel color mapping curve B_d = f_B(B_s); wherein the input of each color mapping curve is the corresponding color component of a pixel point in the lip region and the output is the new lip color component for that pixel point.
7. The method according to claim 6, wherein determining the target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R) comprises: calculating the target lip color (R_T, G_T, B_T) from the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R) according to the following formula:
R_T = R_R × Y_M / Y_R, G_T = G_R × Y_M / Y_R, B_T = B_R × Y_M / Y_R
where Y_M is the average brightness of the pixel points of the lip region, calculated from the average color (R_M, G_M, B_M) according to Y_M = 0.299×R_M + 0.587×G_M + 0.114×B_M;
Y_R is the brightness of the reference lip color, calculated from the reference lip color (R_R, G_R, B_R) according to Y_R = 0.299×R_R + 0.587×G_R + 0.114×B_R.
8. The method according to claim 1, wherein the lip gloss intensity function α(Y_i) is calculated from the brightness Y_i of each pixel point of the lip region and the average brightness Y_M of the lip region according to the following formula:
α(Y_i) = max(0, 1 - λ·|Y_i - Y_M|)
where Y_i is calculated from the actual color (R_i, G_i, B_i) of the pixel point in the lip region according to Y_i = 0.299×R_i + 0.587×G_i + 0.114×B_i;
Y_M is calculated from the average color (R_M, G_M, B_M) according to Y_M = 0.299×R_M + 0.587×G_M + 0.114×B_M;
λ is an adjustable coefficient.
9. The method of claim 8, wherein adding a lip gloss to pixels in the lip region according to the color mapping curve and the lip gloss intensity function comprises:
calculating the final lip color of each pixel point of the lip region according to the following formula by using the color mapping curve and the lip color intensity function;
R_new = f_R(R_i)·α(Y_i) + R_i·(1 - α(Y_i))
G_new = f_G(G_i)·α(Y_i) + G_i·(1 - α(Y_i))
B_new = f_B(B_i)·α(Y_i) + B_i·(1 - α(Y_i))
where (R_new, G_new, B_new) is the final lip color of the pixel point in the lip region;
(f_R(R_i), f_G(G_i), f_B(B_i)) is the mapped color, obtained by substituting the actual color (R_i, G_i, B_i) of the pixel point into the color mapping curves R_d = f_R(R_s), G_d = f_G(G_s) and B_d = f_B(B_s);
α(Y_i) is the first proportion, the weight of the mapped color of the pixel point in the final lip color;
(1 - α(Y_i)) is the second proportion, the weight of the actual color of the pixel point in the final lip color;
and adding lip colors to the pixel points in the lip region according to the final lip color of each pixel point in the lip region.
10. An adaptive virtual lip gloss generation system, comprising:
the lip region determining unit is used for acquiring lip key points in the face image and determining a lip region by the lip key points;
the color mapping curve generating unit is used for counting the average color of each pixel point in the lip region and generating a color mapping curve according to the counted average color and the reference lip color;
the lip gloss intensity function generating unit is used for obtaining the brightness of each pixel point in the lip area and the average brightness of the lip area, and obtaining a lip gloss intensity function according to the brightness of each pixel point in the lip area and the average brightness of the lip area, wherein the lip gloss intensity function is used for representing the relation between the brightness of each pixel point in the lip area and the added lip gloss intensity;
and the lip gloss adding unit is used for adding lip gloss to pixel points in the lip area according to the color mapping curve and the lip gloss intensity function.
11. The system according to claim 10, wherein the color mapping curve generating unit includes:
an average color calculation module, configured to count the average color (R_M, G_M, B_M) of the pixel points of the lip region according to R_M = (1/N)·Σ R_i, G_M = (1/N)·Σ G_i, B_M = (1/N)·Σ B_i, where (R_i, G_i, B_i) is the actual color of a pixel point in the lip region and N is the number of pixel points in the lip region;
a target lip color calculation module, configured to determine the target lip color (R_T, G_T, B_T) according to the average color (R_M, G_M, B_M) and the reference lip color (R_R, G_R, B_R);
a mapping curve generation module, configured to fit, in the R channel, a Bezier curve through (0, 0), (R_T, R_T) and (255, 255) to obtain the R-channel color mapping curve R_d = f_R(R_s); in the G channel, a Bezier curve through (0, 0), (G_T, G_T) and (255, 255) to obtain the G-channel color mapping curve G_d = f_G(G_s); and in the B channel, a Bezier curve through (0, 0), (B_T, B_T) and (255, 255) to obtain the B-channel color mapping curve B_d = f_B(B_s);
wherein the input of each color mapping curve is the corresponding color component of a pixel point in the lip region and the output is the new lip color component for that pixel point.
12. The system according to claim 10 or 11, wherein the lip gloss adding unit comprises:
the final lip color calculating module is used for calculating the final lip color of each pixel point of the lip area according to the following formula by using the color mapping curve and the lip color intensity function;
R_new = f_R(R_i)·α(Y_i) + R_i·(1 - α(Y_i))
G_new = f_G(G_i)·α(Y_i) + G_i·(1 - α(Y_i))
B_new = f_B(B_i)·α(Y_i) + B_i·(1 - α(Y_i))
where (R_new, G_new, B_new) is the final lip color of the pixel point in the lip region;
(f_R(R_i), f_G(G_i), f_B(B_i)) is the mapped color, obtained by substituting the actual color (R_i, G_i, B_i) of the pixel point into the color mapping curves R_d = f_R(R_s), G_d = f_G(G_s) and B_d = f_B(B_s);
α(Y_i) is the first proportion, the weight of the mapped color of the pixel point in the final lip color;
(1 - α(Y_i)) is the second proportion, the weight of the actual color of the pixel point in the final lip color;
and the adding module is used for adding lip colors to the pixel points in the lip area according to the final lip color of each pixel point in the lip area.
CN201711176136.8A 2017-11-22 2017-11-22 Method and system for generating self-adaptive virtual lip gloss Active CN109816741B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711176136.8A CN109816741B (en) 2017-11-22 2017-11-22 Method and system for generating self-adaptive virtual lip gloss

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711176136.8A CN109816741B (en) 2017-11-22 2017-11-22 Method and system for generating self-adaptive virtual lip gloss

Publications (2)

Publication Number Publication Date
CN109816741A CN109816741A (en) 2019-05-28
CN109816741B true CN109816741B (en) 2023-04-28

Family

ID=66599944

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711176136.8A Active CN109816741B (en) 2017-11-22 2017-11-22 Method and system for generating self-adaptive virtual lip gloss

Country Status (1)

Country Link
CN (1) CN109816741B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110929651B (en) * 2019-11-25 2022-12-06 北京达佳互联信息技术有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN111932332B (en) * 2020-06-04 2023-04-21 北京旷视科技有限公司 Virtual makeup testing method, virtual makeup testing device, electronic equipment and computer readable medium
CN113344836B (en) * 2021-06-28 2023-04-14 展讯通信(上海)有限公司 Face image processing method and device, computer readable storage medium and terminal
CN113469914B (en) * 2021-07-08 2024-03-19 网易(杭州)网络有限公司 Animal face beautifying method and device, storage medium and electronic equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102013103A (en) * 2010-12-03 2011-04-13 上海交通大学 Method for dynamically tracking lip in real time

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4760999B1 (en) * 2010-10-29 2011-08-31 オムロン株式会社 Image processing apparatus, image processing method, and control program
JP4831259B1 (en) * 2011-03-10 2011-12-07 オムロン株式会社 Image processing apparatus, image processing method, and control program
CN103914699B (en) * 2014-04-17 2017-09-19 厦门美图网科技有限公司 A kind of method of the image enhaucament of the automatic lip gloss based on color space
CN107093168A (en) * 2017-03-10 2017-08-25 厦门美图之家科技有限公司 Processing method, the device and system of skin area image
CN107038680B (en) * 2017-03-14 2020-10-16 武汉斗鱼网络科技有限公司 Self-adaptive illumination beautifying method and system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102013103A (en) * 2010-12-03 2011-04-13 上海交通大学 Method for dynamically tracking lip in real time

Also Published As

Publication number Publication date
CN109816741A (en) 2019-05-28

Similar Documents

Publication Publication Date Title
JP7598917B2 (en) Virtual facial makeup removal, fast face detection and landmark tracking
CN109816741B (en) Method and system for generating self-adaptive virtual lip gloss
Wang et al. Single image dehazing based on the physical model and MSRCR algorithm
CN103914699B (en) A kind of method of the image enhaucament of the automatic lip gloss based on color space
KR102485503B1 (en) Apparatus and method for recommending goods based on analysis of image database
CN106845455B (en) Image processing method, system and server based on skin color detection
JP2020526809A5 (en)
WO2018082389A1 (en) Skin colour detection method and apparatus, and terminal
CN107798661B (en) An Adaptive Image Enhancement Method
CN103430208A (en) Image processing device, image processing method, and control program
Hristova et al. Style-aware robust color transfer
CN106023276B (en) Pencil drawing method for drafting and device based on image procossing
CN104282002A (en) Quick digital image beautifying method
CN110728618A (en) Virtual makeup trying method, device and equipment and image processing method
Hussain et al. Color constancy for uniform and non-uniform illuminant using image texture
CN113781370A (en) Image enhancement method and device and electronic equipment
CN108596992B (en) Rapid real-time lip gloss makeup method
Zhang et al. Atmospheric perspective effect enhancement of landscape photographs through depth-aware contrast manipulation
Wu et al. Non-uniform low-light image enhancement via non-local similarity decomposition model
KR20220012786A (en) Apparatus and method for developing style analysis model based on data augmentation
Entok et al. Pixel-Wise Color Constancy Via Smoothness Techniques In Multi-Illuminant Scenes
Huang et al. Example-based painting guided by color features
Phoka et al. Fine tuning for green screen matting
CN115018729A (en) White box image enhancement method for content
Ghodeswar et al. CAMOUFLAGE PATTERN GENERATION USING LAB COLOR MODEL

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100191, Haidian District, Zhichun Road, Beijing No. 7 to the real building, block B, 18

Applicant after: Beijing Ziguang zhanrui Communication Technology Co.,Ltd.

Address before: 100191, Haidian District, Zhichun Road, Beijing No. 7 to the real building, block B, 18

Applicant before: BEIJING SPREADTRUM HI-TECH COMMUNICATIONS TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant