
CN108182671B - Single image defogging method based on sky area identification - Google Patents


Info

Publication number
CN108182671B
CN108182671B CN201810072673.6A
Authority
CN
China
Prior art keywords
pixel
sky
image
sky area
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810072673.6A
Other languages
Chinese (zh)
Other versions
CN108182671A (en)
Inventor
顾振飞
王书旺
陈凡
袁小燕
单祝鹏
丁梦悍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Vocational College Of Information Technology
Original Assignee
Nanjing Vocational College Of Information Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Vocational College Of Information Technology filed Critical Nanjing Vocational College Of Information Technology
Priority to CN201810072673.6A priority Critical patent/CN108182671B/en
Publication of CN108182671A publication Critical patent/CN108182671A/en
Application granted granted Critical
Publication of CN108182671B publication Critical patent/CN108182671B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G06T5/30 Erosion or dilatation, e.g. thinning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30181 Earth observation
    • G06T2207/30192 Weather; Meteorology

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a single image defogging method based on sky area identification. The method identifies the sky area and the non-sky area in a foggy color image, estimates a global atmospheric light value within the sky area, defogs the non-sky area according to that value, and combines the results for the sky area and the non-sky area to obtain the total defogged image. Because the atmospheric light is estimated within the identified sky area, the accuracy of the atmospheric light estimate is effectively improved, and the visual effect of the defogged image improves with it. Because defogging is applied only to the non-sky area, negative effects such as color cast, over-enhancement, and halo in the sky area are avoided.

Description

Single image defogging method based on sky area identification
Technical Field
The invention belongs to the technical field of digital image defogging, and particularly relates to a single image defogging method based on sky region identification, used for defogging color images without color deviation.
Background
When existing image defogging methods process a foggy color image that contains a sky region, two defects arise: 1) the sky region itself is, in effect, dense fog, so defogging it is physically unjustified; eliminating the fog in the sky region cannot reveal any scene detail hidden behind it; 2) existing defogging methods aim to remove the excess atmospheric-light (approximately white) component of each pixel and thereby gain information by raising saturation, contrast, and the like; since a sky region should be smooth and nearly texture-free, forcing such an information gain onto it inevitably causes negative effects such as over-enhancement and color cast.
Disclosure of Invention
In order to solve the problems, the invention provides a single image defogging method based on sky area identification, which can effectively eliminate fog in an image and simultaneously avoid the effects of color cast, over-enhancement or halo and the like in a sky area.
The specific technical scheme of the invention is as follows: a single image defogging method based on sky region identification comprises the following steps:
step 1, identifying a sky area and a non-sky area in a foggy color image;
step 2, estimating a global atmospheric light value in the sky area;
step 3, obtaining a transmission map of the non-sky area, and obtaining a defogging map of the non-sky area by using an atmospheric scattering model according to the global atmospheric light value;
and 4, combining the defogging images of the sky area and the non-sky area in the foggy color image to obtain a total defogged image.
Further, in step 1, the specific steps of identifying the sky area and the non-sky area in the foggy color image are as follows:
step 1.1, extracting a sky characteristic value of each pixel in the foggy color image by using the following formula:

[sky feature formula; rendered as an image in the source]

wherein F(x, y) represents the sky feature value of pixel (x, y), ∇(x, y) the gradient at pixel (x, y), V(x, y) the luminance of pixel (x, y), S(x, y) the saturation of pixel (x, y), V̄ the mean luminance of the whole image, λ₁ the gradient feature threshold, λ₂ the saturation feature threshold, and e the natural constant;
step 1.2, optimizing the sky characteristic value of each pixel in the foggy color image by using the following formula:
F_refined(x, y) ← dilate(erode(F(x, y)))

wherein F_refined(x, y) represents the sky feature optimization value of pixel (x, y), dilate(·) represents the dilation operator, and erode(·) represents the erosion operator;
and 1.3, sequentially judging each pixel in the foggy color image, if the sky feature optimization value of the pixel is more than 1.2 times of the sky feature optimization mean value of the whole image, judging that the pixel belongs to a sky area, and if not, judging that the pixel belongs to a non-sky area.
Further, the step 2 of calculating a global atmospheric light value according to the sky area specifically comprises: computing the mean saturation of the neighborhood of each pixel in the sky area, sorting these means in descending order, taking the pixels whose means fall in the top 1% as the candidate atmospheric light pixel set, computing the mean intensity over that set, and using this value as the global atmospheric light value.
Further, in step 3, a transmission map of the non-sky region is obtained, and a defogging map of the non-sky region is obtained by using an atmospheric scattering model according to the global atmospheric light value, which specifically includes the following steps:
step 3.1, calculating the rough transmittance of the pixels in the non-sky area by using the following formula to construct a rough transmittance map of the non-sky area:
t_rough′(x, y) = 1 − min_{(x′,y′)∈Ω(x,y)} min_{c∈{R,G,B}} I_c(x′, y′) / L_∞

wherein t_rough′(x, y) denotes the rough transmittance of pixel (x, y) in the rough transmission map of the non-sky region, Ω(x, y) denotes the neighborhood of pixel (x, y) in the rough transmission map of the non-sky region, I_c(x′, y′) represents the intensity value of any one of the R, G, B channels of the pixel in the foggy color image corresponding to any pixel (x′, y′) in the neighborhood Ω(x, y), and L_∞ represents the global atmospheric light value;
step 3.2, optimizing the rough transmittance of the pixels in the non-sky area by adopting a guide total variation model to obtain an optimized transmittance graph of the non-sky area, wherein the expression of the guide total variation model is as follows:
[guided total variation energy functional; rendered as an image in the source]

wherein t′(x, y) represents the optimized transmittance of pixel (x, y) in the optimized transmission map of the non-sky region; the model involves the gradient of pixel (x, y) in the optimized transmission map of the non-sky region, the gradient of the corresponding pixel in the grayscale image of the foggy color image, and squared two-norms of these quantities;
and 3.3, obtaining a defogging image of the non-sky area by adopting an atmospheric scattering model according to the global atmospheric light value and the optimized transmission image of the non-sky area.
Further, in step 4, a total defogged image is obtained by combining the defogging images of the sky area and the non-sky area in the foggy color image, which specifically comprises: setting the rough transmittance of the pixels of the sky area to 1 and constructing a rough transmission map of the sky area; optimizing the rough transmittance of the pixels of the sky area with the guided total variation model to obtain an optimized transmission map of the sky area; obtaining a defogging map of the sky area with the atmospheric scattering model from the global atmospheric light value and the optimized transmission map of the sky area; and combining the defogging map of the sky area with the defogging map of the non-sky area to obtain the total defogged image.
Further, the method comprises a step 5 of performing brightness consistency correction on the total defogged image by using the following formula to obtain a defogged corrected image:
[brightness-consistency correction formula; rendered as an image in the source]

wherein R(x, y) represents the intensity value of pixel (x, y) in the defogged corrected image, J(x, y) the intensity value of the corresponding pixel in the total defogged image, Ω(x, y) the neighborhood of that pixel in the total defogged image, and J_c(x′, y′) the intensity value of any one of the R, G, B channels of the pixel (x′, y′) in the total defogged image corresponding to any pixel (x′, y′) in the neighborhood Ω(x, y).
The invention has the beneficial effects that: the method identifies the sky area and the non-sky area in a foggy color image, estimates a global atmospheric light value in the sky area, defogs the non-sky area according to the global atmospheric light value to obtain a defogging map of the non-sky area, and finally combines it with the sky area of the foggy color image to obtain the total defogged image. Because the global atmospheric light value is estimated in the identified sky area, the accuracy of the atmospheric light estimate is effectively improved, and with it the visual effect of the defogged image. Because defogging is applied only to the non-sky area, negative effects such as color cast, over-enhancement, and halo in the sky area are avoided.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Fig. 2 is a flow chart showing the intermediate results obtained by defogging a specific image according to the embodiment of the invention.
Fig. 3 is a diagram illustrating an effect of identifying a sky area according to an embodiment of the present invention.
Fig. 4 is a comparison graph of the effect of the defogging process on the first foggy image according to the embodiment of the present invention and the existing image defogging method.
Fig. 5 is a comparison graph of the effect of the defogging process on the second foggy image according to the embodiment of the present invention and the existing image defogging method.
Detailed Description
The method of the invention is further illustrated with reference to the accompanying drawings and specific examples.
As shown in fig. 1, a defogging method for a single image based on sky region identification according to an embodiment of the present invention includes the following steps:
step 1, identifying a sky area and a non-sky area in a foggy color image, specifically:
step 1.1, extracting a sky feature value of each pixel in the foggy color image by using the following formula, to obtain the sky feature map shown in (b) of fig. 2:

[sky feature formula; rendered as an image in the source]

wherein F(x, y) represents the sky feature value of pixel (x, y), ∇(x, y) the gradient at pixel (x, y), V(x, y) the luminance of pixel (x, y), S(x, y) the saturation of pixel (x, y), V̄ the mean luminance of the whole image, λ₁ the gradient feature threshold, λ₂ the saturation feature threshold, and e the natural constant.
In this embodiment, λ₁ = 0.005 and λ₂ = 0.04.
Step 1.2, in order to eliminate noise in the sky feature map, apply an erosion followed by a dilation (a morphological opening) to obtain the optimized sky feature map shown in (c) of fig. 2. Specifically, the sky feature value of each pixel in the foggy color image is optimized by the following formula:

F_refined(x, y) ← dilate(erode(F(x, y)))

wherein F_refined(x, y) represents the sky feature optimization value of pixel (x, y), dilate(·) represents the dilation operator, and erode(·) represents the erosion operator.
Step 1.3, compute the whole-image mean F̄_refined of the optimized sky feature. Examine each pixel of the foggy color image in turn: if its sky feature optimization value F_refined(x, y) is greater than 1.2 times F̄_refined, pixel (x, y) is judged to belong to the sky region I_sky; otherwise it belongs to the non-sky region I_non-sky.
Step 1 of the embodiment of the present invention can effectively identify the sky region and the non-sky region; the effect is shown in fig. 3, where (a), (b), (c), and (d) in fig. 3 are foggy color images containing a sky region, and (e), (f), (g), and (h) in fig. 3 are the corresponding sky region identification results.
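The morphological refinement and thresholding of steps 1.2 and 1.3 can be sketched as follows. The sky feature formula of step 1.1 is rendered as an image in the source, so a precomputed feature map is taken as input here; the 3 × 3 structuring element and the NumPy-only morphology are assumptions for the sketch, not the patent's implementation.

```python
import numpy as np

def erode(img, k=3):
    """Grayscale erosion: windowed minimum over a k x k square, edge-padded."""
    p = k // 2
    padded = np.pad(img, p, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].min()
    return out

def dilate(img, k=3):
    """Grayscale dilation: windowed maximum over a k x k square, edge-padded."""
    p = k // 2
    padded = np.pad(img, p, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].max()
    return out

def sky_mask(feature_map, ratio=1.2, k=3):
    """Steps 1.2-1.3: open the feature map (erode, then dilate) to suppress
    noise, then mark as sky every pixel whose optimized feature exceeds
    `ratio` times the whole-image mean of the opened map."""
    f_refined = dilate(erode(feature_map, k), k)
    return f_refined > ratio * f_refined.mean()
```

The opening removes small bright specks from the feature map before the 1.2 × mean threshold is applied, so isolated bright non-sky pixels do not survive into the sky mask.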
Step 2, compute the mean saturation of the 15 × 15 neighborhood of each pixel in the sky region I_sky, sort these means in descending order, select the pixels whose means fall in the top 1% to form the candidate atmospheric light pixel set, compute the mean intensity over this set, and take that value as the global atmospheric light value L_∞.
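A sketch of this atmospheric-light estimate under stated assumptions: saturation is computed in its HSV form, the image is an H × W × 3 float array in [0, 1], and the candidate mean is collapsed to a single scalar; the patent does not pin down these details.

```python
import numpy as np

def estimate_atmospheric_light(hazy, sky, win=15, top_frac=0.01):
    """Step 2 sketch. hazy: H x W x 3 floats in [0, 1]; sky: boolean mask.
    For each sky pixel, average the HSV saturation over a win x win
    neighborhood; the pixels whose neighborhood mean lands in the top 1%
    (descending order, as the text states) form the candidate set, and the
    mean intensity over the candidates is returned as L_inf."""
    mx = hazy.max(axis=2)
    mn = hazy.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)
    p = win // 2
    padded = np.pad(sat, p, mode="edge")
    nbr_mean = np.empty_like(sat)
    for i in range(sat.shape[0]):
        for j in range(sat.shape[1]):
            nbr_mean[i, j] = padded[i:i + win, j:j + win].mean()
    ys, xs = np.nonzero(sky)
    order = np.argsort(nbr_mean[ys, xs])[::-1]          # descending sort
    k = max(1, int(round(top_frac * len(order))))
    cand = order[:k]
    return float(hazy[ys[cand], xs[cand]].mean())       # scalar L_inf (assumption)
```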
And 3, solving a transmission map of the non-sky area, and obtaining a defogging map of the non-sky area by using an atmospheric scattering model according to the global atmospheric light value.
Step 3.1, combining a dark channel prior method [1], calculating the rough transmittance of pixels in the non-sky area by using the following formula, and constructing a rough transmittance map of the non-sky area:
t_rough′(x, y) = 1 − min_{(x′,y′)∈Ω(x,y)} min_{c∈{R,G,B}} I_c(x′, y′) / L_∞

wherein t_rough′(x, y) denotes the rough transmittance of pixel (x, y) in the rough transmission map of the non-sky region, Ω(x, y) denotes the neighborhood of pixel (x, y) in the rough transmission map of the non-sky region, I_c(x′, y′) represents the intensity value of any one of the R, G, B channels of the pixel in the foggy color image corresponding to any pixel (x′, y′) in the neighborhood Ω(x, y), and L_∞ represents the global atmospheric light value.
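Step 3.1 can be sketched as a per-pixel channel minimum followed by a windowed minimum, in the spirit of the dark channel prior [1]; the lower clip at 0.05 is an added numerical guard, not part of the patent's formula.

```python
import numpy as np

def rough_transmission(hazy, L_inf, win=15):
    """Step 3.1 sketch (dark channel prior [1]): 1 minus the minimum of
    I_c / L_inf over the three channels and a win x win neighborhood.
    The lower clip at 0.05 is an assumption to keep later division stable."""
    dark = hazy.min(axis=2) / L_inf          # per-pixel channel minimum
    p = win // 2
    padded = np.pad(dark, p, mode="edge")
    t = np.empty_like(dark)
    for i in range(dark.shape[0]):
        for j in range(dark.shape[1]):
            t[i, j] = 1.0 - padded[i:i + win, j:j + win].min()
    return np.clip(t, 0.05, 1.0)
```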
Step 3.2, optimizing the rough transmittance of the pixels in the non-sky area by adopting a guided total variation model proposed in the document [2], so as to obtain an optimized transmittance map of the non-sky area, wherein the expression of the guided total variation model is as follows:
[guided total variation energy functional; rendered as an image in the source]

wherein t′(x, y) represents the optimized transmittance of pixel (x, y) in the optimized transmission map of the non-sky region; the model involves the gradient of pixel (x, y) in the optimized transmission map of the non-sky region, the gradient of the corresponding pixel in the grayscale image of the foggy color image, and squared two-norms of these quantities.
Step 3.3, from the global atmospheric light value L_∞ and the optimized transmission map of the non-sky region, obtain the defogging map of the non-sky region using the following formula:

J′(x, y) = (I′(x, y) − L_∞) / t′(x, y) + L_∞

wherein J′(x, y) represents the intensity value of pixel (x, y) in the defogged image of the non-sky region, and I′(x, y) represents the intensity value of the pixel in the foggy color image corresponding to pixel (x, y) in the defogged image of the non-sky region.
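The recovery of step 3.3 inverts the standard atmospheric scattering model I = J·t + L_∞(1 − t); the clamps on t and on the output range in this sketch are assumptions for numerical safety.

```python
import numpy as np

def recover(hazy, t, L_inf):
    """Step 3.3 sketch: invert the atmospheric scattering model
    I = J * t + L_inf * (1 - t)  =>  J = (I - L_inf) / t + L_inf,
    applied per channel with a shared transmission map."""
    t3 = np.clip(t, 0.05, 1.0)[..., None]    # guard against tiny transmittance
    return np.clip((hazy - L_inf) / t3 + L_inf, 0.0, 1.0)
```

Composing a hazy image from a known scene and transmission map and then calling `recover` returns the original scene, which is a quick sanity check on the inversion.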
And 4, combining the defogging maps of the sky area and the non-sky area of the foggy color image to obtain a total defogged image.
To keep the boundary between the sky area and the non-sky area in the total defogged image from changing abruptly, which would make the halo effect obvious, step 4 of the embodiment of the invention also applies processing to the identified sky area, specifically as follows:
step 4.1, set coarse transmittance t of pixel (x, y) in sky regionrough *And (x, y) is 1, and a rough transmission map of the sky area is constructed.
Step 4.2, optimizing the rough transmittance of the pixels in the sky area by adopting a guide total variation model to obtain an optimized transmission graph of the sky area, wherein the expression of the guide total variation model is as follows:
[guided total variation energy functional; rendered as an image in the source]

wherein t*(x, y) represents the optimized transmittance of pixel (x, y) in the optimized transmission map of the sky area; the model involves the gradient of t*(x, y), the gradient of the corresponding pixel in the grayscale image of the foggy color image, and squared two-norms of these quantities.
Step 4.3, from the global atmospheric light value L_∞ and the optimized transmission map of the sky region, obtain the defogging map of the sky region using the following formula:

J*(x, y) = (I*(x, y) − L_∞) / t*(x, y) + L_∞

wherein J*(x, y) represents the intensity value of pixel (x, y) in the defogged image of the sky region, and I*(x, y) represents the intensity value of the pixel in the foggy color image corresponding to pixel (x, y) in the defogged image of the sky region.
And 4.4, combining the defogging image of the sky area with the defogging image of the non-sky area to obtain a total defogging image.
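Step 4.4 reduces to a per-pixel selection between the two defogged images, e.g.:

```python
import numpy as np

def merge_defogged(defog_sky, defog_non_sky, sky):
    """Step 4.4 sketch: take the sky-region result inside the boolean sky
    mask and the non-sky result everywhere else (both H x W x 3 arrays)."""
    return np.where(sky[..., None], defog_sky, defog_non_sky)
```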
It should be noted that, the step 3 and the step 4 may be combined into the same step, and specifically include:
the rough transmittance of the pixels in the foggy color image is found by the following formula to construct a rough transmittance map of the foggy color image, as shown in (e) of fig. 2:
t_rough(x, y) = 1 − min_{(x′,y′)∈Ω(x,y)} min_{c∈{R,G,B}} I_c(x′, y′) / L_∞

wherein t_rough(x, y) represents the rough transmittance of pixel (x, y) in the rough transmission map, Ω(x, y) represents the neighborhood of pixel (x, y) in the rough transmission map, I_c(x′, y′) represents the intensity value of any one of the R, G, B channels of the pixel in the foggy color image corresponding to any pixel (x′, y′) in the neighborhood Ω(x, y), and L_∞ represents the global atmospheric light value.
The rough transmittance of the pixels in the rough transmission map is then optimized with the guided total variation model to obtain the optimized transmission map, as shown in (f) of fig. 2. The expression of the guided total variation model is:

[guided total variation energy functional; rendered as an image in the source]

wherein t(x, y) represents the optimized transmittance of pixel (x, y) in the optimized transmission map; the model involves the gradient of t(x, y), the gradient of the corresponding pixel in the grayscale image of the foggy color image, and squared two-norms of these quantities.
From the global atmospheric light value L_∞ and the optimized transmission map, the total defogged image is obtained using the following formula, as shown in (g) of fig. 2:

J(x, y) = (I′(x, y) − L_∞) / t(x, y) + L_∞

wherein J(x, y) represents the intensity value of pixel (x, y) in the total defogged image, and I′(x, y) represents the intensity value of the pixel in the foggy color image corresponding to pixel (x, y) in the total defogged image.
In this step, the transmittance of the non-sky region is estimated with the dark channel prior, and the rough transmittance of the pixels in the sky region is set to 1, giving the rough transmittance of the whole image. The rough transmittance of the whole image is then optimized with the guided total variation model to obtain the optimized transmittance. Finally, the defogged image is obtained from the global atmospheric light value and the optimized transmittance. The guided total variation model serves to optimize the rough transmittance of the whole image: it applies edge-preserving smoothing so that the transmittance does not jump in regions of abrupt depth change, in particular along the edge where the sky region meets the non-sky region. Because the model is guided by the grayscale version of the foggy color image, and the sky area of that image is nearly texture-free, the rough transmittance inside the sky region (set to 1) is changed little, so the resulting defogging map of the sky region suffers neither over-enhancement nor color cast.
The embodiment of the invention also comprises a step 5 of correcting the brightness consistency of the total defogged image by using the following formula to obtain a defogged corrected image:
[brightness-consistency correction formula; rendered as an image in the source]

wherein R(x, y) represents the intensity value of pixel (x, y) in the defogged corrected image, J(x, y) the intensity value of the corresponding pixel in the total defogged image, Ω(x, y) the neighborhood of that pixel in the total defogged image, and J_c(x′, y′) the intensity value of any one of the R, G, B channels of the pixel (x′, y′) in the total defogged image corresponding to any pixel (x′, y′) in the neighborhood Ω(x, y).
This step is simple, effective, and efficient. Its core idea is to adaptively reduce pixel intensity in over-bright areas of the image and raise it in over-dark areas, thereby achieving brightness-consistency correction of the image.
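The correction formula itself is rendered as an image in the source; the following is only a hypothetical illustration of the stated idea (lower intensity where the neighborhood is over-bright, raise it where over-dark), with the neighborhood size and strength parameters invented for the sketch.

```python
import numpy as np

def brightness_consistency(img, win=15, strength=0.5):
    """Hypothetical sketch only, not the patent's formula: pull each pixel's
    local-mean brightness toward the global mean, which lowers intensity in
    over-bright regions and raises it in over-dark ones. `win` and
    `strength` are invented parameters."""
    gray = img.mean(axis=2)
    p = win // 2
    padded = np.pad(gray, p, mode="edge")
    local = np.empty_like(gray)
    for i in range(gray.shape[0]):
        for j in range(gray.shape[1]):
            local[i, j] = padded[i:i + win, j:j + win].mean()
    shift = strength * (gray.mean() - local)   # negative where too bright
    return np.clip(img + shift[..., None], 0.0, 1.0)
```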
Figs. 4 and 5 show the results of processing the same two foggy color images with the embodiment of the present invention and four existing methods: (a) in figs. 4 and 5 are foggy degraded images containing a sky area; (b) are the defogged images produced by the He method [1]; (c) by the Gu method [6]; (d) by the Tarel method [4]; (e) by the Meng method [5]; and (f) by the present invention. As can be seen from figs. 4 and 5, the method of the present invention achieves a better visual effect: fog is effectively removed from the image without adverse effects such as color cast, over-enhancement, or halo in the sky area.
Reference documents:
[1] He K, Sun J, Tang X. Single Image Haze Removal Using Dark Channel Prior [J]. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2011, 33(12): 2341-2353.
[2] Ju M, Zhang D, Wang X. Single image dehazing via an improved atmospheric scattering model [J]. Visual Computer, 2017, 33(12): 1613-1625.
[3] Gu Zhen-fei, Ju M, Zhang D. A novel Retinex image enhancement approach via brightness channel prior and change of detail prior [J]. Pattern Recognition & Image Analysis, 2017, 27(2): 234-242.
[4] Tarel J P, Hautière N. Fast visibility restoration from a single color or gray level image [C] // IEEE 12th International Conference on Computer Vision, 2009: 2201-2208.
[5] Meng G, Wang Y, Duan J, et al. Efficient Image Dehazing with Boundary Constraint and Contextual Regularization [C] // IEEE International Conference on Computer Vision. IEEE, 2014: 617-624.
[6] Gu Zhen-fei, Ju Ming-ye, Zhang Deng-yin. A single image dehazing method using average saturation prior [J]. Mathematical Problems in Engineering, 2017, 2017: 1-17.

Claims (3)

1. a single image defogging method based on sky region identification is characterized by comprising the following steps:
step 1, identifying a sky area and a non-sky area in a foggy color image; the method comprises the following specific steps:
step 1.1, extracting a sky characteristic value of each pixel in the foggy color image by using the following formula:
[sky feature formula; rendered as an image in the source]

wherein F(x, y) represents the sky feature value of pixel (x, y), ∇(x, y) the gradient at pixel (x, y), V(x, y) the luminance of pixel (x, y), S(x, y) the saturation of pixel (x, y), V̄ the mean luminance of the whole image, λ₁ the gradient feature threshold, λ₂ the saturation feature threshold, and e the natural constant;
step 1.2, optimizing the sky characteristic value of each pixel in the foggy color image by using the following formula:
F_refined(x, y) ← dilate(erode(F(x, y)))

wherein F_refined(x, y) represents the sky feature optimization value of pixel (x, y), dilate(·) represents the dilation operator, and erode(·) represents the erosion operator;
step 1.3, sequentially judging each pixel in the foggy color image, if the sky feature optimization value of the pixel is more than 1.2 times of the sky feature optimization mean value of the whole image, judging that the pixel belongs to a sky area, otherwise, judging that the pixel belongs to a non-sky area;
step 2, estimating a global atmospheric light value in the sky area, specifically: computing the mean saturation of the neighborhood of each pixel in the sky area, sorting these means in descending order, selecting the pixels whose means fall in the top 1% to form the candidate atmospheric light pixel set, computing the mean intensity over this set, and taking that value as the global atmospheric light value;
step 3, obtaining a transmission map of the non-sky area, and obtaining a defogging map of the non-sky area by using an atmospheric scattering model according to the global atmospheric light value, wherein defogging is performed only on the non-sky area;
the method comprises the following specific steps:
step 3.1, calculating the rough transmittance of the pixels in the non-sky area by using the following formula to construct a rough transmittance map of the non-sky area:
t_rough′(x, y) = 1 − min_{(x′,y′)∈Ω(x,y)} min_{c∈{R,G,B}} I_c(x′, y′) / L_∞

wherein t_rough′(x, y) denotes the rough transmittance of pixel (x, y) in the rough transmission map of the non-sky area, Ω(x, y) denotes the neighborhood of pixel (x, y) in the rough transmission map of the non-sky area, I_c(x′, y′) represents the intensity value of any one of the R, G, B channels of the pixel in the foggy color image corresponding to any pixel (x′, y′) in the neighborhood Ω(x, y), and L_∞ represents the global atmospheric light value;
step 3.2, optimizing the rough transmittance of the pixels in the non-sky area by adopting a guide total variation model to obtain an optimized transmission graph of the non-sky area, wherein the expression of the guide total variation model is as follows:
[guided total variation energy functional; rendered as an image in the source]

wherein t′(x, y) represents the optimized transmittance of pixel (x, y) in the optimized transmission map of the non-sky region; the model involves the gradient of t′(x, y), the gradient of the corresponding pixel in the grayscale image of the foggy color image, and squared two-norms of these quantities;
3.3, obtaining a defogging image of the non-sky area by adopting an atmospheric scattering model according to the global atmospheric light value and the optimized transmission image of the non-sky area;
and 4, combining the defogging images of the sky area and the non-sky area in the foggy color image to obtain a total defogged image.
2. The method of claim 1, wherein the step 4 of obtaining the total defogged image by combining the defogging images of the sky area and the non-sky area in the foggy color image comprises: setting the rough transmittance of the pixels of the sky area to 1 and constructing a rough transmission map of the sky area; optimizing the rough transmittance of the pixels of the sky area with the guided total variation model to obtain an optimized transmission map of the sky area; obtaining a defogging map of the sky area with the atmospheric scattering model from the global atmospheric light value and the optimized transmission map of the sky area; and combining the defogging map of the sky area with the defogging map of the non-sky area to obtain the total defogged image.
3. The method of claim 1, further comprising a step 5 of performing a brightness uniformity correction on the total defogged image to obtain a defogged corrected image, wherein the brightness uniformity correction is performed according to the following formula:

[equation image FDA0003480314830000025: brightness uniformity correction formula; not recoverable from the source]

wherein R(x, y) represents the intensity value of pixel (x, y) in the defogged corrected image, J(x, y) represents the intensity value of the corresponding pixel (x, y) in the total defogged image, Ω(x, y) represents a neighborhood of that pixel (x, y) in the total defogged image, and J_c(x', y') represents the intensity value of any one of the R, G, B channels of any pixel (x', y') in the neighborhood Ω(x, y).
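The correction formula itself survives only as an image in the source. A common form consistent with the symbols defined above divides each pixel by the maximum channel intensity over its neighborhood Ω(x, y); the sketch below assumes that form, so the exact formula in the patent may differ:

```python
import numpy as np

def brightness_uniformity_correction(J, radius=1, eps=1e-6):
    """Assumed form R(x, y) = J(x, y) / max over Omega(x, y) and channels c of J_c.

    J      : total defogged image, float array of shape (H, W, 3) in [0, 1]
    radius : neighborhood half-width, so Omega is a (2r+1) x (2r+1) window (assumed)
    eps    : guard against division by zero
    """
    H, W, _ = J.shape
    m = J.max(axis=2)                       # max over the R, G, B channels
    pad = np.pad(m, radius, mode='edge')    # replicate borders for the window max
    local_max = np.zeros_like(m)
    for dy in range(2 * radius + 1):        # sliding-window maximum via shifts
        for dx in range(2 * radius + 1):
            np.maximum(local_max, pad[dy:dy + H, dx:dx + W], out=local_max)
    return np.clip(J / (local_max[..., None] + eps), 0.0, 1.0)
```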
CN201810072673.6A 2018-01-25 2018-01-25 Single image defogging method based on sky area identification Active CN108182671B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810072673.6A CN108182671B (en) 2018-01-25 2018-01-25 Single image defogging method based on sky area identification

Publications (2)

Publication Number Publication Date
CN108182671A CN108182671A (en) 2018-06-19
CN108182671B true CN108182671B (en) 2022-04-22

Family

ID=62551272

Country Status (1)

Country Link
CN (1) CN108182671B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109523480B (en) * 2018-11-12 2022-05-06 上海海事大学 A dehazing method, device, computer storage medium and terminal for sea fog images
CN110310241B (en) * 2019-06-26 2021-06-01 长安大学 Method for defogging traffic image with large air-light value by fusing depth region segmentation
CN110555814A (en) * 2019-08-30 2019-12-10 深圳市商汤科技有限公司 Image defogging processing method and device and storage medium
CN112017128B (en) * 2020-08-24 2024-05-03 东南大学 Image self-adaptive defogging method
CN112465720B (en) * 2020-11-27 2024-02-23 南京邮电大学 Image defogging method and device based on image sky segmentation and storage medium
CN113240602A (en) * 2021-05-17 2021-08-10 Oppo广东移动通信有限公司 Image defogging method and device, computer readable medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013040857A1 (en) * 2011-09-20 2013-03-28 Fujitsu Limited Exposure enhancement method and apparatus for a defogged image
CN103279931A (en) * 2013-06-03 2013-09-04 中国人民解放军国防科学技术大学 Defogged image denoising method based on transmissivity
CN106780380A (en) * 2016-12-09 2017-05-31 电子科技大学 A kind of image defogging method and system
CN107203981A (en) * 2017-06-16 2017-09-26 南京信息职业技术学院 Image defogging method based on fog concentration characteristics

Similar Documents

Publication Publication Date Title
CN108182671B (en) Single image defogging method based on sky area identification
CN107767354B (en) An Image Dehazing Algorithm Based on Dark Primitive Color Prior
CN101783012B (en) An Automatic Image Dehazing Method Based on Dark Channel Color
CN105654436B (en) A kind of backlight image enhancing denoising method based on prospect background separation
WO2016206087A1 (en) Low-illumination image processing method and device
CN108876743A (en) A kind of image rapid defogging method, system, terminal and storage medium
CN111861896A (en) A UUV-Oriented Color Compensation and Restoration Method for Underwater Images
CN109919859B (en) A kind of outdoor scene image defogging enhancement method, computing device and storage medium thereof
CN105550999A (en) Video image enhancement processing method based on background reuse
CN108537756A (en) Single image to the fog method based on image co-registration
CN107958465A (en) A kind of single image to the fog method based on depth convolutional neural networks
CN111462022B (en) Underwater image sharpness enhancement method
CN109087254A (en) Unmanned plane image haze sky and white area adaptive processing method
CN108537760B (en) Infrared image enhancement method based on atmospheric scattering model
CN107067375A (en) A kind of image defogging method based on dark channel prior and marginal information
CN117252773A (en) Image enhancement method and system based on adaptive color correction and guided filtering
CN109272475B (en) Method for rapidly and effectively repairing and strengthening underwater image color
CN108133462A (en) A kind of restored method of the single image based on gradient fields region segmentation
CN114693548B (en) Dark channel defogging method based on bright area detection
CN108765355B (en) Foggy day image enhancement method based on variation Retinex model
CN113379631B (en) Image defogging method and device
CN112750089B (en) Optical remote sensing image defogging method based on local block maximum and minimum pixel prior
Al-Ameen Expeditious contrast enhancement for grayscale images using a new swift algorithm
CN102629369A (en) Single color image shadow removal method based on illumination surface modeling
Chen et al. An adaptive image dehazing algorithm based on dark channel prior

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant