Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1 and fig. 2, an image processing method according to an embodiment of the present disclosure includes:
01: acquiring preview images of a plurality of different scenes;
02: detecting, in each preview image, a continuously distributed low-brightness region whose brightness values are lower than those of other regions; and
03: generating a notification that the lens 10 is dirty when the coincidence degree between the low-brightness regions of the plurality of preview images is higher than a preset threshold.
The image processing method according to the embodiment of the present application can be applied to the terminal 100 according to the embodiment of the present application. The terminal 100 includes a lens 10 and a processor 20, and the processor 20 can be configured to implement steps 01, 02, and 03. That is, the processor 20 can be configured to acquire preview images of a plurality of different scenes; detect, in each preview image, a continuously distributed low-brightness region whose brightness values are lower than those of other regions; and generate a notification that the lens 10 is dirty when the coincidence degree between the low-brightness regions of the plurality of preview images is higher than a preset threshold.
Specifically, the terminal 100 may be a mobile phone, a tablet computer, a single-lens reflex camera, a laptop computer, a smart watch, smart glasses, a smart headset, or another terminal. The mobile phone shown in fig. 2 is taken as an example for illustration, and it is understood that the specific form of the terminal 100 is not limited to the mobile phone. The lens 10 may be any device on the terminal 100 that receives light to perform imaging; for example, the lens 10 may be a front camera, a rear camera, a side camera, an under-screen camera, or the like, without limitation. The processor 20 may be, for example, an application processor or an image processor of the terminal 100.
In step 01, preview images of a plurality of different scenes are acquired. A preview image may be a picture obtained by the lens 10 for the user to preview before the user confirms shooting, or a picture captured after the user confirms shooting. The plurality may be any number greater than one, such as two, three, four, or five. Different scenes refer to different pictures in the preview images, including pictures with different contents, pictures taken at different angles, and the like. It is to be understood that the plurality of preview images may be obtained at different times while the lens 10 is opened once, or may be obtained separately over a plurality of openings of the lens 10. In the example shown in fig. 3a, 3b, and 3c, the plurality of preview images are preview image P1, preview image P2, and preview image P3, and the scenes of the three preview images are different.
In step 02, a low-luminance region, which has lower luminance values than other regions and is continuously distributed, is detected in each preview image. The preview image may be an image stored in a YUV format, in which each pixel carries a brightness value, so the processor 20 may obtain the positions and brightness values of all pixels in each preview image and thereby determine the distribution of the low-brightness region. Specifically, the low-luminance region is a region having lower luminance values than other regions in the same preview image, and it is also a continuous region; that is, each pixel in the low-luminance region is adjacent to at least one other pixel in the same low-luminance region. In the example shown in fig. 3a, 3b, and 3c, the low-luminance region in the preview image P1 is D1, the low-luminance region in the preview image P2 is D2, and the low-luminance region in the preview image P3 is D3.
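The detection of a continuous low-luminance region in step 02 can be sketched as follows. This is an illustrative example only: the thresholding rule (pixels more than a `margin` below the image mean), the 4-neighbour connectivity, and the choice of keeping the largest connected component are assumptions made for illustration, not limitations of the embodiment.

```python
import numpy as np
from collections import deque

def largest_low_luminance_region(y, margin=40.0):
    """Return a boolean mask of the largest 4-connected region whose
    luminance lies more than `margin` below the image mean.
    The threshold rule and the default `margin` are illustrative."""
    dark = y < (y.mean() - margin)       # candidate low-luminance pixels
    h, w = y.shape
    visited = np.zeros((h, w), dtype=bool)
    best = np.zeros((h, w), dtype=bool)
    for sr in range(h):
        for sc in range(w):
            if dark[sr, sc] and not visited[sr, sc]:
                # breadth-first search over 4-connected neighbours,
                # so the region found is continuously distributed
                comp = []
                queue = deque([(sr, sc)])
                visited[sr, sc] = True
                while queue:
                    r, c = queue.popleft()
                    comp.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if 0 <= nr < h and 0 <= nc < w \
                                and dark[nr, nc] and not visited[nr, nc]:
                            visited[nr, nc] = True
                            queue.append((nr, nc))
                if len(comp) > best.sum():
                    best = np.zeros((h, w), dtype=bool)
                    rows, cols = zip(*comp)
                    best[rows, cols] = True
    return best
```

In practice such labeling would typically be delegated to an optimized routine, but the sketch makes the "continuously distributed" condition concrete.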
In step 03, when the coincidence degree of the low-luminance regions of the plurality of preview images is higher than a preset threshold, a notification message that the lens 10 is dirty is generated. When dirt is present on the lens 10, the dirt blocks part of the light entering the lens 10, so that a region darker than the other regions appears in the image formed by the lens 10. Because the position of the dirt usually does not change within a short time, this darker region is likely to be present when different scenes are shot, at a substantially unchanged position. The dirt may be grease, particles, dust, water stains, oil stains, or the like; it may be present on the lens group of the lens 10, or on a cover plate or screen located in the light path of the lens 10. If the coincidence degree of the low-brightness regions of the preview images is higher than the preset threshold, the low-brightness regions in the preview images are considered likely to be caused by dirt on the lens 10, and a prompt message that the lens 10 is dirty is generated so that the user can conveniently remove the dirt. The coincidence degree may be a ratio of the area where the low-luminance regions of the different preview images overlap to the area of the low-luminance regions; the larger this ratio, the higher the coincidence degree. The preset threshold may be set when the terminal 100 leaves the factory, or may be adjusted by the user; for example, the preset threshold may be a value such as 0.5, 0.6, or 0.8, which is not limited herein.
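The coincidence-degree test of step 03 can be sketched as below. The specific definition used (overlapping area divided by the union of the low-luminance areas) and the default threshold of 0.6 are illustrative assumptions consistent with, but not mandated by, the description above.

```python
import numpy as np

def coincidence_degree(low_masks):
    """Coincidence degree of low-luminance regions across preview images:
    the area where all the masks overlap, divided by the total area they
    cover (their union). This exact definition is an assumption."""
    inter = np.logical_and.reduce(low_masks)
    union = np.logical_or.reduce(low_masks)
    return float(inter.sum()) / float(union.sum()) if union.any() else 0.0

def lens_is_dirty(low_masks, preset_threshold=0.6):
    """Step 03: flag the lens as dirty when the coincidence degree is
    higher than the preset threshold (0.6 here is illustrative)."""
    return coincidence_degree(low_masks) > preset_threshold
```

With masks that overlap almost completely across several different scenes, the degree approaches 1 and the dirty-lens notification would be generated.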
After generating the notification information that the lens 10 is dirty, the terminal 100 may respond to the notification information. Specifically, the notification information may be displayed on a display screen of the terminal 100, played through a speaker of the terminal 100, or conveyed by a specific vibration of the terminal 100 to prompt the user, which is not limited herein.
In summary, in the image processing method and the terminal 100 according to the embodiment of the present application, the low-luminance areas in a plurality of preview images are detected; if the degree of coincidence of the low-luminance areas in the plurality of preview images is higher than the preset threshold, it is determined that the low-luminance areas are formed because dirt blocks the light, and a prompt message indicating that the lens 10 is dirty is generated to notify the user.
In some embodiments, before step 02 is implemented, the image processing method further comprises: receiving a user instruction that sets whether to respond to the prompt message that the lens 10 is dirty. If the user inputs an instruction setting that the dirty-lens prompt message should be responded to, the terminal 100 responds to the prompt message after it is generated in step 03; in one example, the terminal 100 is set by default to prompt the user that the lens 10 is dirty. The user may also input an instruction setting that the prompt message need not be responded to. For example, in some cases the user may intentionally apply pigment or other foreign matter to the lens 10 to shoot an image with a personalized or special effect; by setting the prompt message to be ignored in advance, the user avoids being disturbed by the dirty-lens prompt when subsequently using the terminal 100, which improves the user experience.
Referring to fig. 4, in some embodiments, the image processing method further includes:
04: dividing the preview image into a plurality of brightness areas according to a plurality of preset brightness value intervals, wherein each brightness value interval is associated with a corresponding adjustment ratio; and
05: adjusting the brightness value of the corresponding brightness area according to the adjustment ratio.
Referring to fig. 2, in some embodiments, the processor 20 may be further configured to perform steps 04 and 05; that is, the processor 20 may be configured to divide the preview image into a plurality of luminance zones according to a plurality of preset luminance value intervals, where each luminance value interval is associated with a corresponding adjustment ratio, and to adjust the brightness value of the corresponding luminance zone according to the adjustment ratio.
When the lens 10 is dirty, the influence of the dirt on the preview image may not be limited to the low-brightness area: the lens 10 may also automatically adjust parameters such as sensitivity and aperture because of the dirt, so that the overall brightness of the preview image looks poor. In addition, the low-brightness area may stand out because of its obviously low brightness, affecting the appearance of the preview image. By processing the preview image through steps 04 and 05, the brightness of the preview image can be readjusted to restore, as far as possible, the shooting effect obtained without dirt.
Specifically, in step 04, the preview image is divided into a plurality of luminance areas according to a plurality of preset luminance value intervals. For different preview images, the luminance value intervals may differ, and so may the number of divided luminance regions. In one example, three luminance value intervals may be set according to the distribution range of the luminance values of all pixels of the preview image, yielding three luminance regions. For example, if the luminance values of all pixels of the preview image fall within [80, 230], the three luminance value intervals may be [80, 130), [130, 180), and [180, 230], and the three divided luminance regions may be a low-luminance region, a medium-luminance region, and a high-luminance region, respectively. Each luminance region may be a single continuous region, or may include a plurality of discrete, spaced-apart sub-regions, which is not limited herein. The widths of the luminance value intervals may be the same or different, and the number of luminance value intervals and luminance regions is not limited to the three discussed above; it may be any number greater than or equal to two.
In step 05, the brightness value of the corresponding brightness region is adjusted according to the adjustment ratio. The adjustment ratio is associated with the corresponding brightness value interval; the adjustment ratios associated with different brightness value intervals may be partially the same or entirely different. The brightness value of each pixel after adjustment may be expressed by the formula: adjusted brightness value = original brightness value × (1 + adjustment ratio), and the influence of contamination in different luminance regions can be compensated by varying the adjustment ratio. In one example, the adjustment ratio of the low-luminance region may be a number greater than zero or a negative number close to zero, the adjustment ratio of the medium-luminance region may be zero or close to zero, and the adjustment ratio of the high-luminance region may be a number less than zero, so that the luminance values of pixels in different luminance regions are processed differently. For example, the adjustment ratios of the low-, medium-, and high-luminance regions may be 0.1, 0, and -0.1, respectively, or -0.1, -0.2, and -0.3, or 0.2, 0.1, and -0.2, and the like. In the example shown in fig. 5, a relatively significant low-luminance region D1 exists in the preview image P1 before adjustment and strongly affects the overall appearance of the preview image P1; after the processing of steps 04 and 05, the adjusted preview image P1' is obtained, in which the significant low-luminance region no longer exists, so that the influence of the dirt on the imaging effect of the lens 10 is compensated and the imaging quality of the lens 10 is improved.
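The zone-wise adjustment of steps 04 and 05 can be sketched as follows, using the worked example above ([80, 130), [130, 180), [180, 230] with ratios 0.1, 0, and -0.1). Treating the intervals as half-open (with the top interval's bound raised to 256 so the maximum value is included) and clipping the result to [0, 255] are illustrative choices, not requirements of the embodiment.

```python
import numpy as np

def adjust_by_intervals(y, zones):
    """Steps 04 and 05 sketch: `zones` is a list of ((lo, hi), ratio)
    pairs; pixels with lo <= value < hi form one luminance region and are
    scaled as: adjusted = original * (1 + ratio)."""
    out = y.astype(float).copy()
    for (lo, hi), ratio in zones:
        region = (y >= lo) & (y < hi)          # one luminance region
        out[region] = y[region] * (1.0 + ratio)
    return np.clip(out, 0.0, 255.0)

# The worked example above: brighten the low region, leave the middle
# region unchanged, darken the high region.
example_zones = [((80, 130), 0.1), ((130, 180), 0.0), ((180, 256), -0.1)]
```

A pixel at 100 becomes 110, a pixel at 150 stays at 150, and a pixel at 200 becomes 180, compensating the darkened and over-exposed zones differently.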
In another embodiment, the area of the preview image other than the low-luminance area may be divided into a plurality of adjustment areas according to the distance between each pixel and the low-luminance area; the low-luminance area and the plurality of adjustment areas are associated with different adjustment coefficients, and the luminance values of the pixels in the different areas are adjusted by their respective coefficients. The low-luminance area is most affected by the contamination, and among the other areas, those closer to the low-luminance area are more affected, with more serious distortion of their luminance values. Dividing the other areas by their distance from the low-luminance area and assigning different adjustment coefficients therefore restores, as far as possible, the luminance distribution the preview image would have without contamination.
In some embodiments, the terminal 100 may simultaneously display the preview image before the processing of steps 01 to 05 and the preview image after that processing, so that the user can conveniently compare the two. The user may select one of the preview images to be saved or edited, or select both, which better reflects the user's intent.
Referring to fig. 6, in some embodiments, the image processing method further includes:
06: improving the contrast of the preview image; and/or
07: improving the color saturation of the preview image.
Referring to fig. 2, in some embodiments, the processor 20 may be further configured to perform step 06 and/or step 07, that is, the processor 20 may be configured to enhance the contrast of the preview image; and/or processor 20 may be used to enhance the color saturation of the preview image.
By further improving the contrast and/or color saturation of the preview image, defects such as an overall washed-out appearance and missing color gradation caused by the dirt on the lens 10 can be compensated, further improving the quality of the preview image. Specifically, the processor 20 may generate a first preview image processed by increasing both the contrast and the color saturation, a second preview image processed by increasing the contrast only, and a third preview image processed by increasing the color saturation only, so that the user can select which one or more of the first, second, and third preview images to save, providing more choices for the user and improving the user experience.
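The contrast and saturation enhancement of steps 06 and 07 can be sketched as follows. Stretching luminance about a mid-grey pivot of 128 and scaling each pixel's chroma relative to its own grey (channel-mean) value are illustrative implementation choices, not the method prescribed by the embodiment.

```python
import numpy as np

def boost_contrast(y, factor=1.2):
    """Step 06 sketch: stretch luminance about mid-grey (128); a factor
    above 1 increases contrast. Pivot and factor are assumptions."""
    return np.clip((y.astype(float) - 128.0) * factor + 128.0, 0.0, 255.0)

def boost_saturation(rgb, factor=1.2):
    """Step 07 sketch: scale each pixel's chroma, i.e. its offset from
    its own grey (channel-mean) value."""
    grey = rgb.astype(float).mean(axis=-1, keepdims=True)
    return np.clip(grey + (rgb.astype(float) - grey) * factor, 0.0, 255.0)
```

Applying the first function, the second, or both in sequence yields the second, third, and first preview images described above, respectively.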
Referring to fig. 7, in some embodiments, the image processing method further includes:
08: detecting a brightness value within a preset range of the periphery of a light source when the light source exists in the preview image; and
09: generating a prompt message that the lens 10 is dirty when a banded region connected with the light source exists in the preset range and the brightness value of the banded region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold.
Referring to fig. 2, in some embodiments, the processor 20 may be configured to perform steps 08 and 09; that is, the processor 20 may be configured to detect brightness values within a preset range around a light source when the light source exists in the preview image, and to generate a prompt message that the lens 10 is dirty when a banded region connected with the light source exists in the preset range and the brightness value of the banded region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold.
When a light source exists in the preview image, the brightness of the whole preview image may be relatively high, making it difficult to distinguish an obvious low-brightness region; in this case, whether the lens 10 is dirty may be determined from the influence of the dirt on the light source in the preview image. Specifically, when there is no dirt, light passing through the lens group is usually focused accurately on the image sensor of the lens 10. If there is dirt on the lens 10, the dirt may change the refraction angle of the light or cause the light to scatter, so that the light cannot be focused accurately on the image sensor, producing halo, flare, and the like; in particular, for scenes containing light sources, a light band may appear at the light source position. Steps 08 and 09 determine that the lens 10 is dirty by detecting a band-shaped area around the light source whose brightness value is greater than that of the other areas, so that the user can learn of the dirty condition of the lens 10 in advance.
Specifically, in step 08, when a light source is present in the preview image, the brightness values within a preset range around the light source are detected. Whether a light source exists in the preview image may be determined by detecting whether a small area with extremely high brightness exists, for example, an area whose pixels reach the maximum brightness value, or whose RGB pixel values are all close to 255, which is not limited herein. When a light source exists, the brightness values within the preset range around it are detected, for example, the brightness values of a preset number of pixels centered on the position of the light source. The preset range may be set by the user, or may be set according to the accuracy currently required for the captured image: the higher the accuracy requirement, the larger the preset range, and the lower the requirement, the smaller the preset range. Of course, when it is determined that no light source is present in the preview image, whether the lens 10 is dirty may be determined through steps 02 and 03.
In step 09, when a band-shaped region connected to the light source exists in the preset range and the brightness value of the band-shaped region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold, a notification message indicating that the lens 10 is dirty is generated. Specifically, the band-shaped region may be a substantially strip-shaped region that is connected to the light source; its luminance value exceeds the luminance value of the remaining regions within the preset range by more than the luminance threshold, while remaining less than the luminance value of the light source. For example, as shown in fig. 8, in the preview image P4, the luminance value of the pixels where the light source L is located is 255; the pixels of the strip-shaped region D4 are connected to the pixels of the light source L; the luminance values of the pixels of the strip-shaped region D4 are in the range of 200 to 245; and the luminance values of the pixels of the rest of the preset range are in the range of 100 to 150. Since the difference between the luminance values of the strip-shaped region D4 and those of the rest of the preset range exceeds the luminance threshold, the strip-shaped region D4 is likely to have been generated by dirt on the lens 10. Therefore, a notice that the lens 10 is dirty can be generated at this time.
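The banded-region test of step 09 can be sketched as below. The one-pixel dilation used to test adjacency between the band and the light source, and the default brightness threshold of 50, are illustrative assumptions; the masks themselves would come from an earlier segmentation step not shown here.

```python
import numpy as np

def band_indicates_dirt(y, band, source, window, lum_threshold=50.0):
    """Step 09 sketch. `band`, `source`, and `window` (the preset range)
    are boolean masks over the luminance image `y`. The band indicates
    dirt when it touches the light source, its mean brightness exceeds
    that of the rest of the preset range by more than `lum_threshold`,
    and it stays darker than the source itself."""
    # dilate the source mask by one pixel in the four axis directions
    dil = source.copy()
    dil[1:, :] |= source[:-1, :]
    dil[:-1, :] |= source[1:, :]
    dil[:, 1:] |= source[:, :-1]
    dil[:, :-1] |= source[:, 1:]
    touches = bool((dil & band).any())
    rest = window & ~band & ~source      # remaining area of the preset range
    band_mean = float(y[band].mean())
    brighter = band_mean - float(y[rest].mean()) > lum_threshold
    below_source = band_mean < float(y[source].mean())
    return touches and brighter and below_source
```

With the fig. 8 values (source at 255, band at about 220, remainder at about 120), all three conditions hold and the dirty-lens prompt would be generated.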
Referring to fig. 9, in some embodiments, the image processing method further includes:
010: calibrating the brightness value of the banded region according to the brightness values of the remaining regions; and/or
011: calibrating the color of the banded region according to the color of the remaining regions.
As described above, the strip-shaped region is likely to have been generated by dirt, so its information is likely to be distorted. Because the remaining regions within the preset range are close to the strip-shaped region, information for the strip-shaped region derived from the information of the remaining regions is more accurate, making the processed preview image more faithful.
In step 010, the brightness value of the banded region is calibrated according to the brightness values of the remaining regions. Specifically, the average of the brightness values of all pixels of the remaining regions may be computed and substituted for the brightness values of the pixels of the banded region; alternatively, the brightness values of the pixels in the remaining regions adjacent to the banded region may be substituted for the brightness values of the pixels in the banded region; alternatively, the brightness difference between the light source and the remaining regions may be calculated, and the brightness values of the banded region computed according to a rule that decreases in the direction away from the light source, so as to bring the brightness values of the banded region closer to their real values. Other specific manners of calibrating the brightness values of the banded region using the brightness values of the remaining regions are also possible, and are not limited herein.
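The first calibration option of step 010, replacing the band with the mean brightness of the remaining regions, can be sketched as follows; the masks are assumed to come from the detection of step 09.

```python
import numpy as np

def calibrate_band_luminance(y, band, rest):
    """Step 010, first option: replace the luminance of the banded
    region with the mean luminance of the remaining regions inside the
    preset range. `band` and `rest` are boolean masks over `y`."""
    out = y.astype(float).copy()
    out[band] = float(y[rest].mean())    # flatten the bright band
    return out
```

The other two options (copying adjacent pixels, or ramping down the brightness away from the light source) would follow the same masked-assignment pattern with a different right-hand side.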
In step 011, the color of the banded region is calibrated according to the color of the remaining regions. Specifically, the color of the pixels in the remaining regions adjacent to the banded region may be substituted for the color of the pixels in the banded region; alternatively, the color that appears most frequently among the pixels of the remaining regions may be substituted for the color of the pixels in the banded region, so as to bring the color of the banded region closer to the real color. For example, the light source may be the sun, the banded region may in reality be part of a blue sky, and the remaining regions may also show the color of a blue sky, so the real color of the banded region can be well restored from the color of the remaining regions. Other specific manners of calibrating the color of the banded region using the color of the remaining regions are also possible, and are not limited herein.
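The mode-colour option of step 011 can be sketched as follows; using `numpy.unique` over the remaining pixels to find the most frequent colour is one possible implementation, assumed here for illustration.

```python
import numpy as np

def calibrate_band_color(rgb, band, rest):
    """Step 011, mode-colour option: paint the banded region with the
    colour that appears most often in the remaining regions. `rgb` is an
    (H, W, 3) array; `band` and `rest` are boolean masks."""
    colors, counts = np.unique(rgb[rest].reshape(-1, 3), axis=0,
                               return_counts=True)
    out = rgb.copy()
    out[band] = colors[counts.argmax()]    # most frequent colour wins
    return out
```

In the blue-sky example above, the dominant sky colour of the remaining regions would be written over the flare band, restoring its real appearance.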
In the example shown in fig. 8 and 10, the preview image P4 is processed by the image processing method to obtain the preview image P4'. In the preview image P4', the brightness and color of the band-shaped region D4 (the region surrounded by the dotted line in fig. 10, excluding the light source L) are closer to those of its surroundings, so that the influence of dirt on the image quality is eliminated and the scene is presented more faithfully.
Referring to fig. 11, the present application also provides a non-volatile computer-readable storage medium 200, where the computer-readable storage medium 200 contains computer-executable instructions 201, and when the computer-executable instructions 201 are executed by one or more processors 300, the processors 300 are caused to execute the image processing method according to any embodiment of the present application.
For example, when the computer-executable instructions 201 are executed by the processor 300, the processor 300 may be configured to perform the steps of:
01: acquiring preview images of a plurality of different scenes;
02: detecting, in each preview image, a continuously distributed low-brightness region whose brightness values are lower than those of other regions; and
03: generating a notification that the lens 10 is dirty when the coincidence degree between the low-brightness regions of the plurality of preview images is higher than a preset threshold.
As another example, when the computer-executable instructions 201 are executed by the processor 300, the processor 300 may be configured to perform the steps of:
08: detecting a brightness value within a preset range of the periphery of a light source when the light source exists in the preview image; and
09: generating a prompt message that the lens 10 is dirty when a banded region connected with the light source exists in the preset range and the brightness value of the banded region exceeds the brightness value of the remaining regions in the preset range by more than a preset brightness threshold.
In the computer-readable storage medium 200 according to the embodiment of the present application, the low-brightness regions in a plurality of preview images are detected; if the degree of coincidence of the low-brightness regions in the plurality of preview images is higher than the preset threshold, it is determined that the low-brightness regions are formed because dirt blocks the light, and a notification message indicating that the lens 10 is dirty is generated to notify the user.
In the description herein, reference to the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," "some examples," or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, different embodiments or examples, and features of different embodiments or examples, described in this specification can be combined by those skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.