
CN117119317A - Image processing method, device, electronic equipment and readable storage medium

Info

Publication number: CN117119317A
Application number: CN202311069425.3A
Authority: CN (China)
Prior art keywords: image, pixel, color cast, determining, illumination
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 杨丹
Assignee (original and current): Vivo Mobile Communication Co Ltd
Filing: application CN202311069425.3A filed by Vivo Mobile Communication Co Ltd; published as CN117119317A

Classifications

    • H04N23/95 Computational photography systems, e.g. light-field imaging systems (H ELECTRICITY > H04 ELECTRIC COMMUNICATION TECHNIQUE > H04N PICTORIAL COMMUNICATION, e.g. TELEVISION > H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof)
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors (under H04N23/00 > H04N23/70 Circuitry for compensating brightness variation in the scene)
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals (under H04N23/00 > H04N23/80 Camera processing pipelines; Components thereof)

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image processing method, an image processing device, electronic equipment and a readable storage medium, and belongs to the technical field of image processing. The method comprises the following steps: acquiring a first image; determining a light intensity coefficient matrix according to the pixel information of the first image and the pixel information of the reference image, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for performing image processing on the first image; determining a global normal direction of the first image, and taking the global normal direction as a lighting direction for performing image processing on the first image; and carrying out image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.

Description

Image processing method, device, electronic equipment and readable storage medium
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image processing method, an image processing device, electronic equipment and a readable storage medium.
Background
At present, when a mobile terminal user shoots common life scenes such as portraits and food, ambient light has the most direct influence on how the shot object is presented, and it is difficult to obtain a high-quality, atmospheric image in a dark environment. To enhance the overall texture and atmosphere of an image, a user usually has to either prepare professional lighting equipment in advance or process the image afterwards, for example with a photo retouching application. However, professional lighting equipment is expensive, arranging the lights and deciding how to light the subject require professional knowledge, and the learning threshold is too high. Adjusting images through post-processing is cumbersome, and the adjusted images suffer from a loss of stereoscopic effect: their brightness and color temperature are flattened, so the images look distorted.
Disclosure of Invention
The embodiments of the present application aim to provide an image processing method, an image processing apparatus, an electronic device and a readable storage medium, which can quickly achieve an all-around lighting effect without complicated light arrangement, produce a more vivid lighting effect, and improve the texture and atmosphere of images.
In a first aspect, an embodiment of the present application provides an image processing method, including:
acquiring a first image;
determining a light intensity coefficient corresponding to each pixel point on the first image according to the pixel information of the first image and the pixel information of the reference image, and determining a light intensity coefficient matrix according to the light intensity coefficient corresponding to each pixel point on the first image, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for performing image processing on the first image;
determining a global normal direction of the first image, and taking the global normal direction as a lighting direction for performing image processing on the first image;
and performing image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.
In a second aspect, an embodiment of the present application provides an image processing apparatus including:
the acquisition module is used for acquiring a first image;
the first determining module is used for determining a light intensity coefficient corresponding to each pixel point on the first image according to the pixel information of the first image and the pixel information of the reference image, and determining a light intensity coefficient matrix according to the light intensity coefficient corresponding to each pixel point on the first image, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for image processing of the first image;
the second determining module is used for determining the global normal direction of the first image and taking the global normal direction as a lighting direction for performing image processing on the first image;
and the image processing module is used for carrying out image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor perform the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiments of the present application, a first image is acquired; a light intensity coefficient corresponding to each pixel point on the first image is determined according to the pixel information of the first image and the pixel information of a reference image; a light intensity coefficient matrix is determined according to the light intensity coefficients, the light intensity coefficient matrix being used to indicate the lighting intensity for performing image processing on the first image; the global normal direction of the first image is determined and taken as the lighting direction for performing image processing on the first image; and the first image is processed according to the light intensity coefficient matrix and the lighting direction to obtain a second image. In this technical solution, a lighting intensity is determined for each pixel point on the first image, and the lighting effect at that pixel point is controlled through its own coefficient rather than through one uniform lighting intensity; meanwhile, the lighting direction is controlled according to the global normal direction of the first image. By controlling the lighting direction through the global normal direction and the lighting intensity through the per-pixel light intensity coefficients, the image processing of the first image can quickly achieve an all-around lighting effect without complicated light arrangement, the lighting effect is more vivid, and the texture and atmosphere of the image are improved.
Drawings
FIG. 1 is a flow chart of an image processing method according to an embodiment of the present application;
FIG. 2 is a schematic sub-flow chart of an image processing method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an RGB coordinate system according to an embodiment of the present application;
FIG. 4 is a schematic sub-flow chart of an image processing method according to an embodiment of the present application;
FIG. 5 is another schematic sub-flow chart of an image processing method according to an embodiment of the present application;
FIG. 6 is a further schematic sub-flow chart of an image processing method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a stereoscopic enhancement mode of an image processing method according to an embodiment of the present application;
FIG. 8 is a flow chart of an image processing method according to another embodiment of the present application;
FIG. 9 is a flow chart of an image processing method according to still another embodiment of the present application;
FIG. 10 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 12 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which are obtained by a person skilled in the art based on the embodiments of the present application, fall within the scope of protection of the present application.
The terms "first", "second" and the like in the description and in the claims are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein. Objects identified by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The image processing method provided by the embodiments of the present application is described in detail below through specific embodiments and application scenarios with reference to the accompanying drawings.
Fig. 1 shows a flow chart of an image processing method according to an embodiment of the application. As shown in fig. 1, the image processing method includes steps S101 to S104. The execution subject of the image processing method according to the embodiment of the present application may be an electronic device or a functional module or a functional entity in the electronic device that can implement the image processing method. Wherein the electronic device may include, but is not limited to: smart phones, tablet computers, laptop computers (notebook computers), smart wearable devices, etc. The image processing method provided in the embodiment of the present application will be described below by taking an electronic device as an execution subject.
Step S101: a first image is acquired.
The first image is an image to be processed. In an alternative embodiment, the first image may be an image that has already been captured, for example an image stored on the electronic device. In other alternative embodiments, the first image is an image to be captured by a camera of an electronic device such as a mobile phone. If the first image is an image stored on the electronic device, the embodiment of the application can process the stored image, achieve an all-around light effect, and improve the texture and atmosphere of the stored image. If the first image is an image to be captured, the embodiment of the application can perform image processing on the first image during shooting, achieving an all-around light effect without complicated light arrangement and providing a convenient, efficient, high-quality and highly atmospheric direct shooting experience.
Step S102: according to the pixel information of the first image and the pixel information of the reference image, determining a light intensity coefficient corresponding to each pixel point on the first image, and according to the light intensity coefficient corresponding to each pixel point on the first image, determining a light intensity coefficient matrix, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for performing image processing on the first image.
When an image is lit, images captured in different illumination environments require different lighting intensities, so that the lit image does not become overexposed. The brightness of an image is the brightness of each pixel point on the image; the brightness of a pixel point is the magnitude of its pixel value, and the larger the pixel value, the brighter the image is at that pixel point. Therefore, the embodiment of the application can determine the lighting intensity for lighting the first image based on the pixel information of the first image. The brightness of each pixel point on the same image differs, and each pixel point needs a different lighting intensity. However, the image is a whole, and to avoid distortion of the lit image, not only the pixel value of each pixel point but also the pixel values of the surrounding pixel points need to be considered when calculating the lighting intensity of each pixel point. Therefore, the embodiment of the application can use a reference image as a reference object and combine it with the pixel information of the first image to determine the lighting intensity for the first image. Optionally, the pixel value of each pixel point on the reference image is a target value, such as the upper limit 255 of the pixel value.
Optionally, as shown in fig. 2, the process of determining the light intensity coefficient corresponding to each pixel point on the first image according to the pixel information of the first image and the pixel information of the reference image includes:
Step S201: and determining a pixel difference value corresponding to the first pixel point according to the pixel value of the first pixel point on the first image and the pixel value of the second pixel point corresponding to the position of the first pixel point on the reference image aiming at each first pixel point of the first image.
The RGB color mode is a color standard that obtains a wide variety of colors by varying and superimposing the three color channels red (R), green (G) and blue (B). Each pixel of a color image has three pixel components, R, G and B, which together represent one pixel. Each RGB channel has 256 levels of brightness, expressed numerically as values 0, 1, 2, ..., 255.
The pixel difference value is calculated as shown in the following formula (1):

diff_limit_light_ij = (255, 255, 255) - (r, g, b)_ij, i = 1, 2, ..., w; j = 1, 2, ..., h (1)

where diff_limit_light_ij denotes the pixel difference value, w and h denote the width and height of the first image respectively, and (r, g, b)_ij is the pixel value of the first pixel point at position (i, j) on the first image. For a color image, each pixel point has 3 color channels: the R (red) channel, the G (green) channel and the B (blue) channel; r, g and b denote the pixel components of the first pixel point on the R channel, the G channel and the B channel, i.e., the values of the pixel point on those channels. As noted above, the pixel value of each pixel point on the reference image may be 255 on every channel, i.e., the pixel components of the second pixel point on the R channel, the G channel and the B channel are all 255.
Step S202: and determining a light intensity coefficient corresponding to the first pixel point according to the pixel value of the first pixel point on the first image and the pixel difference value corresponding to the first pixel point.
In an alternative embodiment, in order to ensure that the brightness adjustment increment of each first pixel point is within the pixel difference range, the ratio of the pixel difference value corresponding to the first pixel point to the pixel value of the first pixel point on the first image may be used as the light intensity coefficient corresponding to the first pixel point.
The light intensity coefficients corresponding to the first pixel points form the light intensity coefficient matrix corresponding to the first image. The calculation expression of the light intensity coefficient matrix is shown in the following formula (2):

Light_weight = [ diff_limit_light_ij / (r, g, b)_ij ], i = 1, 2, ..., w; j = 1, 2, ..., h (2)

where Light_weight denotes the light intensity coefficient matrix and the division is taken per pixel.
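To make steps S201 and S202 concrete, the following is a minimal Python sketch of the light intensity coefficient matrix, assuming an all-255 reference image, elementwise per-channel division, and a small epsilon guard against zero-valued pixels (the patent does not specify these details):

```python
import numpy as np

def light_intensity_matrix(img: np.ndarray, ref_value: int = 255) -> np.ndarray:
    """Light intensity coefficient per pixel: (reference - pixel) / pixel.

    img: H x W x 3 uint8 first image; the reference image is assumed to be a
    constant image whose every channel equals ref_value (here 255).
    """
    img_f = img.astype(np.float64)
    diff_limit_light = ref_value - img_f               # formula (1): pixel difference
    eps = 1e-6                                         # assumed guard against division by zero
    light_weight = diff_limit_light / (img_f + eps)    # formula (2): per-pixel ratio
    return light_weight
```

Because the coefficient is the remaining headroom divided by the current pixel value, already-bright pixels get a small coefficient and dark pixels a large one, which keeps the brightness increment within the pixel difference range described above.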
Step S103: and determining a global normal direction of the first image, and taking the global normal direction as a lighting direction for performing image processing on the first image.
A lighting effect is usually optimal when the light arrives perpendicular to the object, i.e., the opposite of the object's normal direction is taken as the recommended optimal light effect direction. Therefore, the embodiment of the application takes the global normal direction of the first image as the optimal lighting direction. There may be two or more objects on the first image, with their surfaces facing different directions; the global normal direction is the normal direction of the first image as a whole, not the normal direction of a single object on the first image, and is therefore the lighting direction that is optimal for the whole first image rather than for only one object.
Before determining the global normal direction of the first image, an RGB coordinate system may be established, the three axes of which represent different colors. The normal direction of the object surface can be represented by color in combination with the RGB coordinate system. Alternatively, fig. 3 shows a schematic diagram of an RGB coordinate system. As shown in fig. 3, the x-axis represents the pixel component of the image on the B-channel, the y-axis represents the pixel component of the image on the G-channel, and the z-axis represents the pixel component of the image on the R-channel. If the color of the object is reddish, it means that the surface normal of the object is biased toward the z-axis and the normal direction coincides with the z-axis direction, i.e., the normal direction is upward. If the color of the object is greenish, it means that the surface normal of the object is biased toward the y-axis and the normal direction coincides with the y-axis direction. If the object is orange in color, it means that the surface normal of the object is biased toward the x-axis and y-axis at an oblique angle. The color of the object can be determined by the pixel components of the pixel points on the RGB channel, and if the pixel components on the R channel are far greater than the pixel components on the G channel and the B channel, the color of the object is reddish. While there may be multiple objects on an image, the surface normal directions of the objects may be taken into account comprehensively to determine the global normal direction of the image.
Referring to the RGB coordinate system shown in fig. 3, the color cast of the image may include: reddening, greenish, bluish, yellowish, and combinations of colors (e.g., reddish yellow, reddish green, reddish blue, yellowish green, bluish green, yellowish blue).
Thus, in an alternative embodiment, as shown in fig. 4, the process of determining the global normal direction of the first image comprises steps S401-S404:
step S401: a first pixel component of the first image on the R channel, a second pixel component on the G channel, and a third pixel component on the B channel are determined.
The first pixel component of the first image on the R channel may be represented by the mean of the pixel components of all pixel points on the first image on the R channel. Similarly, the second pixel component on the G channel may be represented by the mean of the pixel components of all pixel points on the G channel, and the third pixel component on the B channel by the mean of the pixel components of all pixel points on the B channel. Optionally, the first pixel component on the R channel, the second pixel component on the G channel and the third pixel component on the B channel are denoted (normalRm, normalGm, normalBm) respectively.
Step S402: and determining a color cast result of the first image according to the first pixel component, the second pixel component, the third pixel component and a preset color cast judging rule.
The color cast judging rule is used to judge the color cast of the first image according to the first pixel component, the second pixel component and the third pixel component. Optionally, the color cast judging rule includes:

if 2*normalRm - normalGm - normalBm >= 150, the color cast result of the image is reddish;

if normalRm - normalBm >= 90, the color cast result of the image is yellowish;

if normalRm - normalGm <= 25, the color cast result of the image is greenish;

if min(normalRm - normalBm, 0) + min(normalGm - normalBm, 0) < 0, the color cast result of the image is bluish.
The first pixel component of the first image on the R channel, the second pixel component on the G channel and the third pixel component on the B channel are substituted into the color cast judging rule to determine the color cast result of the first image. The color cast result of the first image may include one color cast or multiple color casts.
Step S403: and determining the color cast coding of the first image according to the color cast result of the first image.
To facilitate subsequent processing of the color cast result of the first image, the color cast can be represented using a 0-1 coding scheme. Since the colors of an image are composed of the three primary colors (RGB), the color cast code can be three bits long. For example, the color cast code corresponding to reddish is 001, the code corresponding to yellowish is 110, the code corresponding to greenish is 010, and the code corresponding to bluish is 100.
The color cast result of the first image may include one color cast or multiple color casts. When the color cast result includes one color cast, the color cast code corresponding to that color cast is taken as the color cast code of the first image; for example, if the color cast result of the first image is reddish, the color cast code of the first image is 001. When the color cast result includes multiple color casts, the sum of the color cast codes corresponding to the multiple color casts is taken as the color cast code of the first image; for example, if the color cast result of the first image is reddish and yellowish, the color cast code of the first image is the sum of the codes corresponding to reddish and yellowish: 001 + 110 = 111, i.e., the color cast code of the first image is 111.
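The judging rules and the 0-1 coding above can be sketched as follows. The thresholds and the 3-bit codes come from the text; representing each code as an (x, y, z) digit vector, assuming RGB channel order for the input image, and capping summed digits at 1 are illustrative assumptions:

```python
import numpy as np

# 3-bit cast codes from the text: reddish 001, yellowish 110, greenish 010,
# bluish 100, written as (x, y, z) digit vectors so several casts can be summed.
CAST_CODES = {
    "reddish":   np.array([0, 0, 1]),
    "yellowish": np.array([1, 1, 0]),
    "greenish":  np.array([0, 1, 0]),
    "bluish":    np.array([1, 0, 0]),
}

def color_cast_code(img: np.ndarray) -> np.ndarray:
    """Steps S401-S403: channel means -> cast result -> 3-digit cast code."""
    img_f = img.astype(np.float64)
    normal_rm = img_f[..., 0].mean()   # first pixel component (R channel mean)
    normal_gm = img_f[..., 1].mean()   # second pixel component (G channel mean)
    normal_bm = img_f[..., 2].mean()   # third pixel component (B channel mean)

    casts = []
    if 2 * normal_rm - normal_gm - normal_bm >= 150:
        casts.append("reddish")
    if normal_rm - normal_bm >= 90:
        casts.append("yellowish")
    if normal_rm - normal_gm <= 25:
        casts.append("greenish")
    if min(normal_rm - normal_bm, 0) + min(normal_gm - normal_bm, 0) < 0:
        casts.append("bluish")

    code = sum((CAST_CODES[c] for c in casts), np.zeros(3, dtype=int))
    return np.minimum(code, 1)   # assumed: cap each digit at 1 when casts overlap
```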
Step S404: and determining the global normal direction of the first image according to the color cast coding of the first image and a preset RGB coordinate system.
As described above, the opposite direction of the object's normal direction is the optimal recommended light effect direction, so the global normal direction of the first image is obtained from the color cast coding of the first image and the negative directions of the coordinate axes of the RGB coordinate system. For example, the color cast coding of the first image is taken as a first matrix; the negative direction vector of the x-axis, the negative direction vector of the y-axis and the negative direction vector of the z-axis of the RGB coordinate system are taken as a second matrix; and the global normal direction of the first image is determined from the first matrix and the second matrix. The expression of the global normal direction of the first image is shown in the following formula (3):

relight_ori = (x0, y0, z0) * (-x, -y, -z) (3)

where (x0, y0, z0) denotes the color cast coding of the first image and (-x, -y, -z) denotes the three negative direction vectors of the x, y and z axes of the RGB coordinate system.
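Continuing the sketch, formula (3) can be read as an elementwise product of the cast-code vector with the negative axis directions; that reading, plus the normalization and the fallback direction for an all-zero code, are assumptions:

```python
import numpy as np

def global_normal_direction(cast_code: np.ndarray) -> np.ndarray:
    """Formula (3): relight_ori = (x0, y0, z0) * (-x, -y, -z)."""
    neg_axes = np.array([-1.0, -1.0, -1.0])   # negative x, y, z unit directions
    relight_ori = cast_code * neg_axes        # elementwise product (assumed reading of '*')
    norm = np.linalg.norm(relight_ori)
    # assumed fallback when no cast is detected: light from straight ahead
    return relight_ori / norm if norm > 0 else np.array([0.0, 0.0, -1.0])
```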
Step S104: and carrying out image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.
In the embodiment of the application, the lighting intensity of the image processing is controlled according to the light intensity coefficient matrix and the lighting direction is controlled according to the global normal direction; performing image processing on the first image in this way improves the texture, atmosphere and stereoscopic effect of the first image and achieves an all-around lighting effect.
In some alternative embodiments, as shown in fig. 5, the process of performing image processing on the first image in step S104 includes:
step S501: and obtaining a first illumination map of the first image according to the illumination direction, a preset illumination calculation model and a soft shadow calculation model.
An illumination calculation model describes optical phenomena such as the scattering and absorption that occur when light strikes an object and enters the human visual system or a visual sensor. In some alternative embodiments, the Blinn-Phong illumination calculation model may be chosen for the calculation. In other alternative embodiments, the Phong illumination model or a BRDF (Bidirectional Reflectance Distribution Function) model may be chosen instead. Soft shadows are shadows whose boundaries transition softly and gradually, without jagged edges; the opposite of a soft shadow is a hard shadow, whose boundaries are sharp. To improve the realism of the scene in the image, the embodiment of the application selects a soft shadow calculation model for the calculation; optionally, the PCSS (Percentage Closer Soft Shadows) model may be chosen. In step S501, the first illumination map of the first image is obtained in the form of a 3D reconstruction based on the Blinn-Phong illumination calculation model and the PCSS soft shadow calculation model, combined with the lighting direction obtained in step S103.
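The patent names Blinn-Phong and PCSS without reproducing them, so the following sketch only shows a textbook Blinn-Phong term for a single global light direction; the per-pixel normal map, the view direction and the shininess value are assumptions, and the soft shadow term is omitted:

```python
import numpy as np

def blinn_phong_illumination(normals: np.ndarray,
                             light_dir: np.ndarray,
                             view_dir: np.ndarray = np.array([0.0, 0.0, 1.0]),
                             shininess: float = 32.0) -> np.ndarray:
    """Per-pixel Blinn-Phong term for an H x W x 3 unit-normal map
    and a single global light direction."""
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    h = l + v
    h = h / np.linalg.norm(h)                       # Blinn-Phong half vector
    diffuse = np.clip(normals @ l, 0.0, 1.0)        # Lambertian term max(N.L, 0)
    specular = np.clip(normals @ h, 0.0, 1.0) ** shininess
    return diffuse + specular                       # H x W illumination map (shadows omitted)
```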
Step S502: and updating the illumination information of the first illumination graph according to the light intensity coefficient matrix to obtain a second illumination graph.
The brightness of each pixel point on the first illumination map, i.e., the pixel value of each pixel point, is updated based on the light intensity coefficient matrix to obtain the second illumination map. Optionally, the expression for updating the illumination information of the first illumination map according to the light intensity coefficient matrix is shown in the following formula (4):

light′ = (1 + Light_weight * light / 255) (4)

where light′ denotes the second illumination map and light denotes the first illumination map.
Step S503: and (3) enabling the second illumination image to act on the first image to obtain a light effect image.
In the step, a second illumination map is superimposed on the first image to realize preliminary 3D light effect, and the expression is shown as the following formula (5):
img = light *img (5)
wherein img Representing the light effect image, img representing the first image.
Step S504: and performing exposure inhibition processing on the light effect image to obtain a second image.
Because the 3D light effect superimposes illumination on the basis of the original image, it can easily cause image overexposure, so exposure suppression needs to be performed on the light effect image img′ produced by the preliminary 3D light effect lighting, which further improves the visual quality of the picture.

The expression for performing the exposure suppression processing on the light effect image is shown in the following formula (6):

img_relight = img′ * 255 / max(img′) (6)

where img_relight denotes the second image and max(img′) denotes the maximum pixel value on the light effect image.
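Steps S502 to S504 chain formulas (4) to (6); below is a sketch under the assumption that the single-channel illumination map broadcasts over the RGB channels and that formula (6) is a max-normalization, as the variable names suggest:

```python
import numpy as np

def relight(img: np.ndarray, light: np.ndarray, light_weight: np.ndarray) -> np.ndarray:
    """img: H x W x 3 first image; light: H x W first illumination map;
    light_weight: light intensity coefficient matrix from formula (2)."""
    img_f = img.astype(np.float64)
    light3 = light[..., None]                        # assumed: broadcast map over RGB channels
    weight = light_weight if light_weight.ndim == 3 else light_weight[..., None]
    light2 = 1.0 + weight * light3 / 255.0           # formula (4): second illumination map
    img_fx = light2 * img_f                          # formula (5): preliminary light effect image
    peak = max(img_fx.max(), 1e-6)                   # guard against an all-black image
    img_relight = img_fx * 255.0 / peak              # formula (6), assumed max-normalization
    return np.clip(img_relight, 0, 255).astype(np.uint8)
```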
In other alternative embodiments, to enhance the stereoscopic effect of the second image, the contrast of the highlight and shadow regions may be enhanced when performing image processing on the first image. Optionally, as shown in fig. 6, the process of performing image processing on the first image includes:
step S601: and obtaining a first illumination map of the first image according to the illumination direction, a preset illumination calculation model and a soft shadow calculation model.
Step S602: and enhancing the contrast ratio of the high light area and the shadow area of the first illumination map to obtain the enhanced illumination map.
Step S603: and updating the illumination information of the enhanced illumination graph according to the light intensity coefficient matrix to obtain a second illumination graph.
Step S604: the second illumination image acts on the first image to obtain a light effect image;
step S605: and performing exposure inhibition processing on the light effect image to obtain a second image.
For steps S601, S603, S604 and S605, reference may be made to the embodiment shown in fig. 5, and details are not repeated here.
For step S602, as shown in fig. 7, the contrast between the highlight region and the shadow region may be enhanced by brightening the highlight region and darkening the shadow region of the first illumination map. Optionally, the contrast of the highlight region and the shadow region of the first illumination map is enhanced according to formula (7), in which light_enhance denotes the enhanced illumination map and max(light) denotes the maximum pixel value of the first illumination map.
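Formula (7) itself did not survive extraction. As a stand-in, the sketch below uses light^2 / max(light), a curve that leaves the brightest pixels almost unchanged while pulling shadow values further down, consistent with fig. 7 and with the appearance of max(light) in the text; the exact curve is an assumption:

```python
import numpy as np

def enhance_illumination_contrast(light: np.ndarray) -> np.ndarray:
    """Boost highlight/shadow contrast of the first illumination map (assumed curve)."""
    m = light.max()
    # light * light / m keeps values near m, darkens low values proportionally more
    return light * light / m if m > 0 else light
```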
The image processing method provided by the embodiment of the application can be applied to the real-time shooting process of a terminal, providing the user with a convenient, efficient direct shooting experience with high quality, moderate brightness and good atmosphere, and avoiding the poor shooting experience caused by dark ambient light. When the method is applied to the real-time shooting process of a terminal, the brightness of the ambient light can be detected during shooting: if the ambient light brightness is detected to be normal or high, the image is captured and processed directly; if the ambient light is detected to be dim, the light supplementing lamp of the terminal is invoked for auxiliary lighting, and the image is captured and processed with the light supplementing lamp assisting. The embodiment of the application can set different ambient light brightness levels for ambient light of different brightness, and invoke the light supplementing lamp of the terminal when the ambient light brightness level is the target level (for example, the third level shown in fig. 8). The steps of the real-time shooting process to which the method is applied are shown in fig. 8 and include the following steps:
Step S801: in response to activating the photographing apparatus, an ambient light level of the photographing environment is determined.
Step S802: in the case where the ambient light level is the first level or the second level, the first image is photographed.
Step S803: and calling a light supplementing lamp of the shooting equipment under the condition that the ambient light intensity level is the third level, and shooting a first image under the condition that the light supplementing lamp is used for assisting in lighting.
Step S804: and determining a light intensity coefficient matrix according to the pixel information of the first image and the pixel information of the reference image, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for performing image processing on the first image.
Step S805: and determining a global normal direction of the first image, and taking the global normal direction as a lighting direction for performing image processing on the first image.
Step S806: and carrying out image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.
For steps S804 to S806, reference may be made to the embodiments shown in fig. 1 to 7, and details are not repeated here.
For step S801, the photographing apparatus may be an electronic apparatus having a camera, such as a cellular phone, a camera, or the like. The user opens the shooting device, and the shooting device detects the starting operation and determines the ambient light brightness level of the current environment.
The shooting ambient light level at which a user is typically located can be divided into three categories: a dim light environment, a normal brightness environment, and a highlight environment. The camera can be used for shooting directly in normal brightness environment and high brightness environment, and auxiliary lighting is needed in dark light environment, such as soft light auxiliary lighting of a shooting device. Corresponding to a dark light environment, a normal brightness environment and a highlight environment, three environment light brightness levels are set in the embodiment of the application: third level, second level, and first level. In the case where the ambient light level is the first level or the second level, an image may be directly captured, in the case where the ambient light level is the third level, a light supplement lamp of the photographing apparatus, such as a soft light, is invoked, and in the case where the soft light assists in lighting, the first image is captured.
The brightness of images captured in environments of different brightness differs: for example, the mean brightness of an image captured in a dim environment is 0-85, the mean brightness of an image captured in a normal environment is 86-170, and the mean brightness of an image captured in a highlight scene is 171-255. Therefore, the embodiment of the application can determine the ambient light brightness level from the mean and deviation of the image on its gray-scale map: when the image brightness is abnormal, the mean brightness deviates from the mean point of the environment and the variance is small. Accordingly, the embodiment of the application can convert the first image into a gray-scale map and determine the ambient light brightness level at the time the first image was captured from the mean and mean deviation of the gray-scale map.
Wherein an expression for converting the first image into a gray-scale image is shown in the following formula (8):
img_Gray = 0.3*imgR + 0.59*imgG+ 0.11*imgB (8)
where img_Gray denotes the gray-scale map, imgR denotes the pixel component of the first image on the R channel, imgG the pixel component on the G channel, and imgB the pixel component on the B channel.
Since auxiliary lighting is required when the ambient light level is the third level, 85 is taken as the reference point for measuring the mean and the mean deviation. Specifically, the mean value D and the mean deviation M of img_Gray from 85 are calculated as shown in the following formulas (9) to (12):

μ_a = (1/N) * Σ_{i=1..N} (x_i - 85) (9)

D = |μ_a| (10)

M_a = (1/N) * Σ_{g=0..255} |g - 85 - μ_a| * Hist(g) (11)

M = |M_a| (12)

where x_i denotes the gray pixel values of the gray-scale map img_Gray, N = W*H with W and H respectively the width and height of the first image img, and Hist is the gray histogram of img_Gray.
The luminance parameter K is calculated from the above mean value and mean deviation as shown in the following formula (13):

K = D / M (13)

The ambient light level Light_level is then judged from K and the sign of μ_a, as shown in formula (14). When Light_level = -1, the scene is judged to be a dim light scene, and the soft light of the photographing apparatus is automatically invoked to assist the lighting; normal brightness scenes and highlight scenes can be processed directly without soft light assistance.
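The dim-light test of formulas (8) to (13) can be sketched as follows; the grayscale weights and the reference point 85 come from the text, while the concrete decision used for formula (14) (dim when the mean offset is negative and K exceeds 1) is an assumption modeled on the classic brightness-cast test:

```python
import numpy as np

def ambient_light_level(img: np.ndarray, ref: float = 85.0) -> int:
    """Return -1 (dim), 0 (normal) or 1 (highlight) for an H x W x 3 RGB image."""
    img_f = img.astype(np.float64)
    gray = 0.3 * img_f[..., 0] + 0.59 * img_f[..., 1] + 0.11 * img_f[..., 2]  # formula (8)
    n = gray.size                                        # N = W * H
    mu_a = (gray - ref).sum() / n                        # formula (9): signed mean offset
    d = abs(mu_a)                                        # formula (10)
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    levels = np.arange(256)
    m_a = (np.abs(levels - ref - mu_a) * hist).sum() / n # formula (11): mean deviation
    m = abs(m_a)                                         # formula (12)
    k = d / m if m > 0 else 0.0                          # formula (13): luminance parameter
    if k > 1.0:                                          # formula (14), assumed thresholding
        return -1 if mu_a < 0 else 1
    return 0
```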
According to the image processing method above, when a user shoots, the ambient light brightness level is determined first; if the current environment is judged to be a dim light scene, automatic soft light compensation is applied, then the adaptive light intensity coefficient matrix and the optimal lighting direction are calculated, and all-around lighting is performed together with stereoscopic enhancement and overexposure suppression. This provides the user with a convenient and efficient direct shooting mode with high texture, moderate brightness and good atmosphere, avoiding the poor shooting experience caused by dark ambient light.
In an alternative embodiment, a light effect trigger control is set in a shooting mode of an electronic device such as a mobile phone, and a user triggers image processing by clicking the light effect trigger control. Therefore, the image processing method according to the embodiment of the present application may further include: in response to starting the photographing apparatus, determining whether a light effect trigger operation is detected, and in response to detecting the light effect trigger operation, determining an ambient light level of the photographing environment. Namely, the embodiment of the application starts image processing only when the user clicks the light effect trigger control, and does not start image processing when the light effect trigger operation is not detected, thereby saving the resources of shooting equipment.
Fig. 9 shows a flow chart of an image processing method according to still another embodiment of the present application. In this embodiment, after the first image is subjected to the light effect processing, the scene corresponding to the first image may be further determined, and if the scene corresponding to the first image is a designated scene, for example, a food scene, the color temperature of the second image after the light effect processing may be adjusted, so as to further improve the atmosphere feel and texture of the image.
As shown in fig. 9, the image processing method includes steps S901 to S908.
Step S901: in response to activating the photographing apparatus, an ambient light level of the photographing environment is determined.
Step S902: in the case where the ambient light level is the first level or the second level, the first image is photographed.
Step S903: and calling a light supplementing lamp of the shooting equipment under the condition that the ambient light intensity level is the third level, and shooting a first image under the condition that the light supplementing lamp is used for assisting in lighting.
Step S904: and determining a light intensity coefficient matrix according to the pixel information of the first image and the pixel information of the reference image, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for performing image processing on the first image.
Step S905: and determining a global normal direction of the first image, and taking the global normal direction as a lighting direction for performing image processing on the first image.
Step S906: and carrying out image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.
Step S907: and determining the scene category corresponding to the second image.
Step S908: and under the condition that the scene category corresponding to the second image is the appointed category, adjusting the color temperature of the second image according to the self-adaptive color temperature adjusting strategy, and obtaining a third image.
For steps S901 to S906, reference may be made to the embodiments shown in fig. 1 to 8, and details are not repeated here to avoid repetition.
For step S907, the object on the second image may be identified by an object identification algorithm, and the scene category corresponding to the second image is determined according to the identification result. For example, if the object on the second image is identified as food, the scene category corresponding to the second image is determined to be a food scene or food mode; if the object on the second image is identified as a person, the scene category is determined to be a portrait scene or portrait mode; and if the second image is identified as a landscape, the scene category is determined to be a landscape scene or landscape mode. In other alternative embodiments, the scene category corresponding to the second image may also be determined according to a selection operation of the user: if the user opens the shooting device and selects the portrait mode, the scene category corresponding to the first image is determined to be a portrait scene, and accordingly the scene category corresponding to the second image is determined to be a portrait scene.
For step S908, the specified category may be, for example, a food scene. Color temperature is a measure of the color components contained in light and can be adjusted by varying the ratio of the different color components. Optionally, the color temperature of the second image may be adjusted according to the following procedure:
Determining a fourth pixel component of the second image on the R channel, a fifth pixel component on the G channel, and a sixth pixel component on the B channel;
determining a color temperature adjustment parameter according to the fourth pixel component, the fifth pixel component and the sixth pixel component;
and adjusting the color temperature of the second image according to the fourth pixel component, the fifth pixel component, the sixth pixel component and the color temperature adjusting parameter.
The fourth pixel component of the second image on the R channel, the fifth pixel component on the G channel and the sixth pixel component on the B channel refer to the pixel means of the second image on the R, G and B channels, denoted (img_relight_Rm, img_relight_Gm, img_relight_Bm).

The color temperature adjustment parameter Color_level is determined from the fourth pixel component, the fifth pixel component and the sixth pixel component as shown in formula (15), and the color temperature of the second image is adjusted according to the fourth pixel component, the fifth pixel component, the sixth pixel component and the color temperature adjustment parameter as shown in formula (16).
according to the image processing method, when a user shoots, the ambient light intensity level is determined firstly, if the current environment is determined to be a dim light scene according to the ambient light intensity level, automatic soft light compensation is carried out on the dim light scene, then the self-adaptive light intensity coefficient matrix and the optimal lighting direction are calculated, the 3D light effect lighting is carried out in an omnibearing mode in combination with the three-dimensional improvement and overexposure inhibition operation, and the self-adaptive color temperature adjustment is carried out on a target scene such as a food scene, so that a convenient and efficient direct shooting mode with high texture, moderate brightness and good atmosphere is provided for the user, and bad shooting experience caused by dark ambient light and poor environment during shooting is avoided.
Fig. 10 shows a schematic configuration diagram of an image processing apparatus according to an embodiment of the present application. As shown in fig. 10, the image processing apparatus 1000 includes:
an acquisition module 1001, configured to acquire a first image;
a first determining module 1002, configured to determine, according to pixel information of the first image and pixel information of the reference image, a light intensity coefficient corresponding to each pixel point on the first image, and determine, according to the light intensity coefficient corresponding to each pixel point on the first image, a light intensity coefficient matrix, where the light intensity coefficient matrix is used to indicate a lighting intensity for performing image processing on the first image;
a second determining module 1003, configured to determine a global normal direction of the first image, and use the global normal direction as a lighting direction for performing image processing on the first image;
the image processing module 1004 is configured to perform image processing on the first image according to the light intensity coefficient matrix and the lighting direction, so as to obtain a second image.
The image processing apparatus acquires a first image; determines a light intensity coefficient matrix according to the pixel information of the first image and the pixel information of a reference image, the light intensity coefficient matrix being used to indicate the lighting intensity for performing image processing on the first image; determines the global normal direction of the first image and takes it as the lighting direction; and performs image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image. In this technical solution, a lighting intensity is determined for each pixel point on the first image, so the lighting effect at each pixel point is controlled through its own coefficient rather than through one uniform lighting intensity; the lighting direction is controlled according to the global normal direction of the first image, which is the lighting direction optimal for the whole first image rather than for a local part of it. By controlling the lighting direction through the global normal direction and the lighting intensity through the per-pixel light intensity coefficients, the image processing of the first image can quickly achieve an all-around lighting effect without complicated light arrangement, the lighting effect is more vivid, and the texture and atmosphere of the image are improved.
In an alternative embodiment, the pixel information of the first image includes a pixel value of each pixel point on the first image, and the pixel information of the reference image includes a pixel value of each pixel point on the reference image;
the first determining module is used for: for each first pixel point of the first image, determining a pixel difference value corresponding to the first pixel point according to a pixel value of the first pixel point on the first image and a pixel value of a second pixel point corresponding to the position of the first pixel point on the reference image; and taking the ratio of the pixel difference value corresponding to the first pixel point to the pixel value of the first pixel point on the first image as the light intensity coefficient corresponding to the first pixel point.
In an alternative embodiment, the second determining module is configured to: determining a first pixel component of the first image on the R channel, a second pixel component on the G channel, and a third pixel component on the B channel; determining a color cast result of the first image according to the first pixel component, the second pixel component, the third pixel component and a preset color cast judging rule; determining a color cast code of the first image according to the color cast result of the first image; and determining the global normal direction of the first image according to the color cast coding of the first image and a preset RGB coordinate system.
In alternative embodiments, the color cast result includes one or more of the following: reddish, greenish, bluish and yellowish;
the second determining module is used for: under the condition that the color cast result only comprises any color cast, taking the color cast code corresponding to the color cast as the color cast code of the first image; or, in the case that the color cast result includes more than one color cast, taking the sum of color cast codes corresponding to more than one color cast as the color cast code of the first image.
In an alternative embodiment, the second determining module is configured to: obtain a first matrix according to the color cast coding of the first image; obtain a second matrix according to the negative direction vector of the x-axis, the negative direction vector of the y-axis and the negative direction vector of the z-axis of a preset RGB coordinate system; and determine the global normal direction of the first image according to the first matrix and the second matrix.
In an alternative embodiment, the image processing module is configured to: obtaining a first illumination map of a first image according to the illumination direction, a preset illumination calculation model and a soft shadow calculation model; updating illumination information of the first illumination map according to the light intensity coefficient matrix to obtain a second illumination map; the second illumination image acts on the first image to obtain a light effect image; and performing exposure inhibition processing on the light effect image to obtain a second image.
In an alternative embodiment, the image processing module is configured to: enhancing the contrast ratio of the high light area and the shadow area of the first illumination map to obtain an enhanced illumination map; and updating the illumination information of the enhanced illumination graph according to the light intensity coefficient matrix to obtain a second illumination graph.
In an alternative embodiment, the acquisition module is configured to: in response to starting the photographing apparatus, acquire an original image collected by the photographing apparatus; convert the original image into a gray-scale map; determine the mean value and the mean deviation of the gray-scale map from K; determine the ambient light brightness level of the shooting environment according to that mean value and mean deviation; and, when the ambient light brightness level is the target level, invoke the light supplementing lamp of the photographing apparatus and shoot with the light supplementing lamp assisting the lighting to obtain the first image, where the mean brightness interval corresponding to the target level is (0, K).
In an alternative embodiment, the apparatus further comprises a color temperature adjustment module for: determining a scene category corresponding to the first image; and under the condition that the scene category corresponding to the first image is the appointed category, adjusting the color temperature of the second image according to the self-adaptive color temperature adjusting strategy to obtain a third image.
In an alternative embodiment, the color temperature adjustment module is configured to: determining a fourth pixel component of the second image on the R channel, a fifth pixel component on the G channel, and a sixth pixel component on the B channel; determining a color temperature adjustment parameter according to the fourth pixel component, the fifth pixel component and the sixth pixel component; and adjusting the color temperature of the second image according to the fourth pixel component, the fifth pixel component, the sixth pixel component and the color temperature adjusting parameter.
The image processing apparatus in the embodiment of the application may be an electronic device, or a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal or a device other than a terminal. By way of example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), and may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine or a self-service machine, etc.; the embodiments of the present application are not specifically limited in this respect.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The image processing device provided in the embodiment of the present application can implement each process implemented by the embodiments of the methods of fig. 1 to 9, and in order to avoid repetition, a description is omitted here.
Optionally, as shown in fig. 11, the embodiment of the present application further provides an electronic device 1100, including a processor 1101 and a memory 1102, where the memory 1102 stores a program or instructions that can be executed on the processor 1101, and the program or instructions implement each step of the above-mentioned image processing method embodiment when executed by the processor 1101, and achieve the same technical effect, so that repetition is avoided, and no further description is given here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device.
Fig. 12 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1200 includes, but is not limited to: radio frequency unit 1201, network module 1202, audio output unit 1203, input unit 1204, sensor 1205, display unit 1206, user input unit 1207, interface unit 1208, memory 1209, and processor 1210.
Those skilled in the art will appreciate that the electronic device 1200 may also include a power source (e.g., a battery) for powering the various components, and the power source may be logically connected to the processor 1210 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 12 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, or combine some components, or have a different arrangement of components, which is not described here again.
The processor 1210 is configured to acquire a first image; determining a light intensity coefficient corresponding to each pixel point on the first image according to the pixel information of the first image and the pixel information of the reference image, and determining a light intensity coefficient matrix according to the light intensity coefficient corresponding to each pixel point on the first image, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for performing image processing on the first image; determining a global normal direction of the first image, and taking the global normal direction as a lighting direction for performing image processing on the first image; and carrying out image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.
In the embodiment of the present application, the processor 1210 determines a corresponding lighting intensity for each pixel point on the first image and controls the lighting effect at that pixel point through this intensity, instead of lighting the whole image with a single uniform intensity. When lighting is performed, the lighting direction is controlled according to the global normal direction of the first image, which is the optimal lighting direction for the first image as a whole rather than a locally optimal direction on the first image. By controlling the lighting direction through the global normal direction and the lighting intensity through the per-pixel light intensity coefficients, lighting effects in all directions can be realized quickly, without the complicated lighting arrangement otherwise needed for performing image processing on the first image; the lighting effect is more vivid, and the texture and sense of atmosphere of the image are improved.
Optionally, the processor 1210 is further configured to determine, for each first pixel of the first image, a pixel difference value corresponding to the first pixel according to a pixel value of the first pixel on the first image and a pixel value of a second pixel corresponding to a position of the first pixel on the reference image; and taking the ratio of the pixel difference value corresponding to the first pixel point to the pixel value of the first pixel point on the first image as the light intensity coefficient corresponding to the first pixel point.
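In code, this per-pixel computation reduces to one elementwise operation over the two images (Python sketch below). The sign convention (reference minus first image) and the epsilon guard against zero-valued pixels are assumptions.

import numpy as np

def light_intensity_coefficient_matrix(first_image, reference_image, eps=1e-6):
    """Per-pixel coefficient: the pixel difference between the reference image
    and the first image, divided by the first-image pixel value."""
    first = first_image.astype(np.float64)
    ref = reference_image.astype(np.float64)
    return (ref - first) / (first + eps)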
Optionally, the processor 1210 is further configured to determine a first pixel component of the first image on an R channel, a second pixel component on a G channel, and a third pixel component on a B channel; determining a color cast result of the first image according to the first pixel component, the second pixel component, the third pixel component and a preset color cast judging rule; determining a color cast code of the first image according to the color cast result of the first image; and determining the global normal direction of the first image according to the color cast coding of the first image and a preset RGB coordinate system.
Optionally, the processor 1210 is further configured to, in the case that the color cast result includes only one color cast, use the color cast code corresponding to that color cast as the color cast code of the first image; or, in the case that the color cast result includes more than one color cast, use the sum of the color cast codes corresponding to the more than one color cast as the color cast code of the first image.
Optionally, the processor 1210 is further configured to obtain a first matrix according to the color cast encoding of the first image; obtain a second matrix according to the negative direction vector of the x axis, the negative direction vector of the y axis and the negative direction vector of the z axis of a preset RGB coordinate system; and determine the global normal direction of the first image according to the first matrix and the second matrix.
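The following sketch (Python) illustrates one way the color cast coding could map onto the negative axes of an RGB coordinate system to yield a global normal. The detection thresholds, the code-to-vector table (which folds the first and second matrices into a single lookup) and the fallback direction are all assumptions for illustration.

import numpy as np

# Assumed mapping from a color cast to a code vector along the negative axes
# of the RGB coordinate system (x ~ R, y ~ G, z ~ B).
CAST_VECTORS = {
    "reddish":   np.array([-1.0,  0.0,  0.0]),
    "greenish":  np.array([ 0.0, -1.0,  0.0]),
    "bluish":    np.array([ 0.0,  0.0, -1.0]),
    "yellowish": np.array([-1.0, -1.0,  0.0]),  # yellow = red + green excess
}

def global_normal(img_rgb, thresh=1.15):
    """Detect color casts from per-channel means (rule assumed), sum the
    corresponding code vectors, and normalize to get the global normal."""
    img = img_rgb.astype(np.float64)
    r, g, b = (img[..., c].mean() for c in range(3))
    avg = (r + g + b) / 3.0
    casts = [name for name, cond in [
        ("reddish",   r > thresh * avg),
        ("greenish",  g > thresh * avg),
        ("bluish",    b > thresh * avg),
        ("yellowish", r > avg and g > avg and b * thresh < avg),
    ] if cond]
    code = sum((CAST_VECTORS[c] for c in casts), start=np.zeros(3))
    if not code.any():
        code = np.array([0.0, 0.0, -1.0])  # fallback direction, assumed
    return code / np.linalg.norm(code)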
Optionally, the processor 1210 is further configured to obtain a first illumination map of the first image according to the lighting direction, the preset illumination calculation model and the soft shadow calculation model; update illumination information of the first illumination map according to the light intensity coefficient matrix to obtain a second illumination map; apply the second illumination map to the first image to obtain a light effect image; and perform exposure inhibition processing on the light effect image to obtain the second image.
Optionally, the processor 1210 is further configured to enhance the contrast between the highlight region and the shadow region of the first illumination map to obtain an enhanced illumination map; and update the illumination information of the enhanced illumination map according to the light intensity coefficient matrix to obtain the second illumination map.
Optionally, the processor 1210 is further configured to acquire, in response to the shooting device being started, an original image captured by the shooting device; convert the original image into a gray map; determine the mean value of the gray map and the average deviation of the gray map from K; determine the ambient light brightness level of the shooting environment according to the mean value and the average deviation; and, in the case that the ambient light brightness level is the target level, invoke the light supplementing lamp of the shooting device and shoot with the light supplementing lamp assisting in lighting to obtain the first image; wherein the ambient brightness interval corresponding to the target level is (0, K).
Optionally, the processor 1210 is further configured to determine a scene category corresponding to the first image; and, in the case that the scene category corresponding to the first image is the specified category, adjust the color temperature of the second image according to the adaptive color temperature adjustment strategy to obtain a third image.
Optionally, the processor 1210 is further configured to determine a fourth pixel component of the second image on the R channel, a fifth pixel component on the G channel, and a sixth pixel component on the B channel; determine a color temperature adjustment parameter according to the fourth pixel component, the fifth pixel component and the sixth pixel component; and adjust the color temperature of the second image according to the fourth pixel component, the fifth pixel component, the sixth pixel component and the color temperature adjustment parameter.
It should be appreciated that, in embodiments of the present application, the input unit 1204 may include a graphics processing unit (GPU) 12041 and a microphone 12042, where the graphics processor 12041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 1206 may include a display panel 12061, and the display panel 12061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1207 includes at least one of a touch panel 12071 and other input devices 12072. The touch panel 12071, also called a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 12072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse and a joystick, which are not described in detail herein.
The memory 1209 may be used to store software programs as well as various data. The memory 1209 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions (such as a sound playing function, an image playing function, etc.) required for at least one function. Further, the memory 1209 may include volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct rambus RAM (DRRAM). The memory 1209 in embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 1210 may include one or more processing units; optionally, processor 1210 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, and the like, and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into processor 1210.
The embodiment of the application also provides a readable storage medium, and the readable storage medium stores a program or an instruction, which when executed by a processor, implements each process of the above image processing method embodiment, and can achieve the same technical effects, so that repetition is avoided, and no further description is provided here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, which comprises a processor and a communication interface, wherein the communication interface is coupled with the processor, and the processor is used for running programs or instructions to realize the processes of the embodiment of the image processing method, and can achieve the same technical effects, so that repetition is avoided, and the description is omitted here.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-a-chip.
Embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the above-described image processing method embodiments, and achieve the same technical effects, and for avoiding repetition, a detailed description is omitted herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, the functions may be performed in a substantially simultaneous manner or in the reverse order. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or by hardware alone, though in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, or the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk), comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the methods according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Those of ordinary skill in the art may devise many other forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (18)

1. An image processing method, comprising:
acquiring a first image;
determining a light intensity coefficient corresponding to each pixel point on the first image according to the pixel information of the first image and the pixel information of the reference image, and determining a light intensity coefficient matrix according to the light intensity coefficient corresponding to each pixel point on the first image, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for performing image processing on the first image;
determining a global normal direction of the first image, and taking the global normal direction as a lighting direction for performing image processing on the first image;
and performing image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.
2. The method of claim 1, wherein the pixel information of the first image comprises a pixel value for each pixel on the first image, and the pixel information of the reference image comprises a pixel value for each pixel on the reference image;
the determining the light intensity coefficient corresponding to each pixel point on the first image according to the pixel information of the first image and the pixel information of the reference image includes:
For each first pixel point of the first image, determining a pixel difference value corresponding to the first pixel point according to a pixel value of the first pixel point on the first image and a pixel value of a second pixel point corresponding to the position of the first pixel point on a reference image;
and taking the ratio of the pixel difference value corresponding to the first pixel point to the pixel value of the first pixel point on the first image as the light intensity coefficient corresponding to the first pixel point.
3. The method of claim 1, wherein the determining the global normal direction of the first image comprises:
determining a first pixel component of the first image on an R channel, a second pixel component on a G channel, and a third pixel component on a B channel;
determining a color cast result of the first image according to the first pixel component, the second pixel component, the third pixel component and a preset color cast judging rule;
determining a color cast code of the first image according to the color cast result of the first image;
and determining the global normal direction of the first image according to the color cast coding of the first image and a preset RGB coordinate system.
4. The method according to claim 3, wherein the color cast result comprises one or more of the following: reddish, greenish, bluish and yellowish;
The determining the color cast coding of the first image according to the color cast result of the first image comprises the following steps:
in the case that the color cast result includes only one color cast, taking the color cast code corresponding to the color cast as the color cast code of the first image;
or,
in the case that the color cast result includes more than one color cast, taking the sum of the color cast codes corresponding to the more than one color cast as the color cast code of the first image.
5. A method according to claim 3, wherein determining the global normal direction of the first image from the color cast coding of the first image and a preset RGB coordinate system comprises:
obtaining a first matrix according to the color cast coding of the first image;
obtaining a second matrix according to a negative direction vector of an x axis, a negative direction vector of a y axis and a negative direction vector of a z axis of a preset RGB coordinate system;
and determining the global normal direction of the first image according to the first matrix and the second matrix.
6. The method of claim 1, wherein the performing image processing on the first image according to the matrix of light intensity coefficients and the lighting direction to obtain a second image comprises:
obtaining a first illumination map of the first image according to the lighting direction, a preset illumination calculation model and a soft shadow calculation model;
updating illumination information of the first illumination map according to the light intensity coefficient matrix to obtain a second illumination map;
applying the second illumination map to the first image to obtain a light effect image;
and performing exposure inhibition processing on the light effect image to obtain a second image.
7. The method of claim 6, wherein updating the illumination information of the first illumination map according to the light intensity coefficient matrix to obtain a second illumination map comprises:
enhancing the contrast between the highlight region and the shadow region of the first illumination map to obtain an enhanced illumination map;
and updating the illumination information of the enhanced illumination map according to the light intensity coefficient matrix to obtain the second illumination map.
8. The method of any of claims 1-7, wherein the acquiring the first image comprises:
in response to a shooting device being started, acquiring an original image captured by the shooting device;
converting the original image into a gray map;
determining the mean value of the gray map and the average deviation of the gray map from K;
determining the ambient light brightness level of the shooting environment according to the mean value and the average deviation;
invoking a light supplementing lamp of the shooting device in the case that the ambient light brightness level is a target level, and shooting with the light supplementing lamp assisting in lighting to obtain the first image; wherein the ambient brightness interval corresponding to the target level is (0, K).
9. An image processing apparatus, comprising:
the acquisition module is used for acquiring a first image;
the first determining module is used for determining a light intensity coefficient corresponding to each pixel point on the first image according to the pixel information of the first image and the pixel information of the reference image, and determining a light intensity coefficient matrix according to the light intensity coefficient corresponding to each pixel point on the first image, wherein the light intensity coefficient matrix is used for indicating the lighting intensity for image processing of the first image;
the second determining module is used for determining the global normal direction of the first image and taking the global normal direction as a lighting direction for performing image processing on the first image;
and the image processing module is used for carrying out image processing on the first image according to the light intensity coefficient matrix and the lighting direction to obtain a second image.
10. The apparatus of claim 9, wherein the pixel information of the first image comprises a pixel value for each pixel on the first image, and the pixel information of the reference image comprises a pixel value for each pixel on the reference image;
the first determining module is used for:
for each first pixel point of the first image, determining a pixel difference value corresponding to the first pixel point according to a pixel value of the first pixel point on the first image and a pixel value of a second pixel point corresponding to the position of the first pixel point on a reference image;
and taking the ratio of the pixel difference value corresponding to the first pixel point to the pixel value of the first pixel point on the first image as the light intensity coefficient corresponding to the first pixel point.
11. The apparatus of claim 9, wherein the second determining module is configured to:
determining a first pixel component of the first image on an R channel, a second pixel component on a G channel, and a third pixel component on a B channel;
determining a color cast result of the first image according to the first pixel component, the second pixel component, the third pixel component and a preset color cast judging rule;
Determining a color cast code of the first image according to the color cast result of the first image;
and determining the global normal direction of the first image according to the color cast coding of the first image and a preset RGB coordinate system.
12. The apparatus of claim 11, wherein the color cast result comprises one or more of the following: reddish, greenish, bluish and yellowish;
the second determining module is configured to:
in the case that the color cast result includes only one color cast, taking the color cast code corresponding to the color cast as the color cast code of the first image;
or,
in the case that the color cast result includes more than one color cast, taking the sum of the color cast codes corresponding to the more than one color cast as the color cast code of the first image.
13. The apparatus of claim 11, wherein the second determining module is configured to:
obtaining a first matrix according to the color cast coding of the first image;
obtaining a second matrix according to a negative direction vector of an x axis, a negative direction vector of a y axis and a negative direction vector of a z axis of a preset RGB coordinate system;
and determining the global normal direction of the first image according to the first matrix and the second matrix.
14. The apparatus of claim 9, wherein the image processing module is configured to:
obtaining a first illumination map of the first image according to the lighting direction, a preset illumination calculation model and a soft shadow calculation model;
updating illumination information of the first illumination map according to the light intensity coefficient matrix to obtain a second illumination map;
applying the second illumination map to the first image to obtain a light effect image;
and performing exposure inhibition processing on the light effect image to obtain a second image.
15. The apparatus of claim 14, wherein the image processing module is configured to:
enhancing the contrast between the highlight region and the shadow region of the first illumination map to obtain an enhanced illumination map;
and updating the illumination information of the enhanced illumination map according to the light intensity coefficient matrix to obtain the second illumination map.
16. The apparatus of any one of claims 9-15, wherein the acquisition module is configured to:
in response to a shooting device being started, acquiring an original image captured by the shooting device;
converting the original image into a gray map;
determining the mean value of the gray map and the average deviation of the gray map from K;
determining the ambient light brightness level of the shooting environment according to the mean value and the average deviation;
invoking a light supplementing lamp of the shooting device in the case that the ambient light brightness level is a target level, and shooting with the light supplementing lamp assisting in lighting to obtain the first image; wherein the ambient brightness interval corresponding to the target level is (0, K).
17. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the image processing method of any of claims 1-8.
18. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the image processing method according to any of claims 1-8.
CN202311069425.3A 2023-08-23 2023-08-23 Image processing method, device, electronic equipment and readable storage medium Pending CN117119317A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311069425.3A CN117119317A (en) 2023-08-23 2023-08-23 Image processing method, device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311069425.3A CN117119317A (en) 2023-08-23 2023-08-23 Image processing method, device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN117119317A (en) 2023-11-24

Family

ID=88810491

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311069425.3A Pending CN117119317A (en) 2023-08-23 2023-08-23 Image processing method, device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN117119317A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination