CN116112807B - Exposure fusion method based on single-view satellite image
- Publication number: CN116112807B (application CN202310049415.7A)
- Authority: CN (China)
- Prior art keywords: mapping, image, exposure, neutral, fusion
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Landscapes: Image Processing (AREA)
Abstract
The invention discloses an exposure fusion method based on a single-view satellite image. The single-view satellite image is tone mapped through three different tone mapping tables: a bright part mapping that shows bright part details, a neutral mapping that shows neutral details, and a dark part mapping that shows dark part details, so as to simulate the exposure sequence required by exposure fusion. To achieve a smooth transition at the fusion boundaries, a multi-scale Laplacian pyramid is built for the satellite image and the fusion operation is carried out at every pyramid level. In keeping with the characteristics of satellite images, the fusion mainly considers the exposure degree of the satellite image and secondarily considers saturation and edge texture, from which the weights of the bright part mapping, the neutral mapping and the dark part mapping in the fusion are calculated; the exposure fusion of the scene's satellite image is then completed. The method thereby solves the problem that a satellite sensor can only provide an image obtained by a single observation and cannot establish an exposure sequence from multiple exposures for exposure fusion.
Description
[ Field of technology ]
The invention belongs to the field of satellite remote sensing image processing, and particularly relates to an exposure fusion method based on a single-view satellite image.
[ Background Art ]
The radiometric resolution of a satellite sensor describes the minimum difference in radiation intensity that the satellite can resolve, and it is usually expressed as the coding bit depth; for example, a sensor using a 16-bit code can record 65536 levels of radiation intensity, which belongs to high dynamic range (HDR) imaging. Most display devices today use 8 bits per channel and can show at most 256 levels of radiation intensity, so tone mapping is typically required to output high dynamic range satellite imagery on low dynamic range devices.
Existing tone mapping algorithms aim to selectively retain different information from the HDR data, for example linear mapping, the Reinhard operator, the filmic curve, ACES, and so on. However, tone mapping inevitably loses image information, especially some bright details (such as snow, cloud cover and artificial reflectors) and dark details (such as asphalt roads, water bodies, buildings, vegetation and mountain shadows), which is unfavorable for the interpretation of satellite images.
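As a minimal illustration of the information loss described above, the following Python sketch (assuming a 16-bit reflectance-coded image held in a NumPy array; the function name is hypothetical) applies a plain linear mapping from 16 bits to 8 bits and shows how many distinct input levels survive:

```python
import numpy as np

def linear_tone_map(dn16):
    """Naive linear mapping of a 16-bit image (0..65535) to 8 bits (0..255).
    Roughly 256 input levels collapse onto each output level, which is why
    bright and dark details are easily lost."""
    return (dn16.astype(np.float64) / 65535.0 * 255.0).round().astype(np.uint8)

# Example: a full 16-bit ramp ends up with only 256 distinct display values.
ramp = np.arange(0, 65536, dtype=np.uint16)
print(np.unique(linear_tone_map(ramp)).size)   # 256
```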
Exposure fusion can overcome the information loss caused by using a tone mapping algorithm directly. For example, a digital camera that supports HDR can expose the same scene several times, at different exposure times or under different illumination conditions, to form an exposure sequence containing different details, and an image fusion algorithm then fuses all images in the sequence. The fused image retains the details of both the under-exposed and the over-exposed images, carries complete ground-feature information, matches human visual perception, and to a certain extent solves the information loss of tone mapping.
However, a satellite sensor can only provide an image acquired by a single observation, so an exposure sequence cannot be built from multiple exposures and exposure fusion cannot be applied directly. To preserve the integrity of satellite image information, developing an exposure fusion technique suited to satellite images is therefore of great significance for subsequent satellite image interpretation.
[ Invention ]
Aiming at the problem that a satellite sensor cannot establish an exposure sequence, the invention provides an exposure fusion method based on a single-view satellite image. The method solves the problem that the satellite sensor can only provide an image obtained by a single observation and cannot establish an exposure sequence from multiple exposures for exposure fusion, and it avoids the loss of image information that occurs when the satellite image is tone mapped, thereby preserving the integrity of the satellite image information and facilitating subsequent satellite image interpretation.
The invention provides an exposure fusion method based on a single-view satellite image, which is realized by the following technical scheme:
S1, inputting a satellite image, wherein the satellite image is encoded with a 16-bit depth and the coded value is the reflectance multiplied by ten thousand and rounded;
S2, establishing bright part mapping, neutral mapping and dark part mapping tone mapping tables by Catmull-Rom spline interpolation, and applying the three tone mappings to the input image to obtain the satellite images after bright part mapping, neutral mapping and dark part mapping, i.e. the exposure sequence of the image;
S3, calculating the saturation, edge texture and exposure degree of the bright part mapping image, the neutral mapping image and the dark part mapping image respectively;
S4, calculating the fusion weights of the satellite images after bright part mapping, neutral mapping and dark part mapping respectively from the saturation, edge texture and exposure degree calculated in S3;
S5, normalizing the three weight values calculated in S4 to obtain normalized weights, so that the normalized weights sum to 1 at each pixel;
S6, constructing a Laplacian image pyramid for the normalized fusion weights calculated in S5;
S7, constructing a Laplacian image pyramid for the satellite images after bright part mapping, neutral mapping and dark part mapping, with the same number of levels as the pyramid established in S6;
S8, replicating the weight Laplacian pyramid obtained in S6 twice along the channel dimension so that it is aligned with the image Laplacian pyramids obtained in S7, and carrying out image fusion by multiplying it with the image Laplacian pyramids of the bright part mapping, the neutral mapping and the dark part mapping.
In particular, the Catmull-Rom spline segment P used in S2 for the i-th control point is determined by that point together with its three neighboring control points (i-1, i+1, i+2), and the equation P is as follows:
a_x = -x_{i-1} + 3x_i - 3x_{i+1} + x_{i+2}
b_x = 2x_{i-1} - 5x_i + 4x_{i+1} - x_{i+2}
c_x = x_{i+1} - x_{i-1}
d_x = 2x_i
P = 0.5 × (a_x·t³ + b_x·t² + c_x·t + d_x)   (1),
In formula (1), x is the coordinate of the corresponding control point, t is the interpolation parameter within the segment, and a, b, c and d are intermediate variables in the calculation.
In particular, the interpolation points used by the bright portion mapping function in S2 are as follows:
[bright portion mapping interpolation points (see FIG. 2)]
The interpolation points used by the neutral mapping function are as follows:
[neutral mapping interpolation points (see FIG. 3)]
The interpolation points used by the dark portion mapping function are as follows:
[dark portion mapping interpolation points (see FIG. 4)]
In particular, the tone mapping table in S2 is constructed with 0 as a starting point, 1 as a step length, and the maximum value of the interpolation point x as an end point.
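The following Python sketch shows one way such a tone mapping table could be built from (x, y) interpolation points with Catmull-Rom splines, following formula (1) and the table construction just described (start 0, step 1, end at the maximum x). The function names, the NumPy implementation and the duplication of the end points at the boundaries are illustrative assumptions, not the patent's reference implementation:

```python
import numpy as np

def catmull_rom_segment(p0, p1, p2, p3, t):
    """Evaluate one Catmull-Rom segment between p1 and p2 (formula (1));
    p0..p3 are scalar control-point values and t is in [0, 1]."""
    a = -p0 + 3.0 * p1 - 3.0 * p2 + p3
    b = 2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3
    c = p2 - p0
    d = 2.0 * p1
    return 0.5 * (a * t**3 + b * t**2 + c * t + d)

def build_tone_lut(ctrl_x, ctrl_y):
    """Build a tone mapping table sampled at 0, 1, 2, ..., max(ctrl_x)
    (start 0, step 1) from ascending (x, y) control points."""
    ctrl_x = np.asarray(ctrl_x, dtype=np.float64)
    ctrl_y = np.asarray(ctrl_y, dtype=np.float64)
    # Duplicate the first and last control points so every segment has
    # four neighbours (an assumed boundary treatment).
    y_pad = np.concatenate(([ctrl_y[0]], ctrl_y, [ctrl_y[-1]]))
    codes = np.arange(int(ctrl_x[-1]) + 1)
    lut = np.empty(codes.size, dtype=np.float64)
    for n, c in enumerate(codes):
        j = int(np.clip(np.searchsorted(ctrl_x, c, side="right") - 1,
                        0, ctrl_x.size - 2))          # segment lower index
        t = (c - ctrl_x[j]) / max(ctrl_x[j + 1] - ctrl_x[j], 1e-12)
        lut[n] = catmull_rom_segment(y_pad[j], y_pad[j + 1],
                                     y_pad[j + 2], y_pad[j + 3], t)
    return lut
```

A table produced this way is applied to a coded image simply by indexing, e.g. `mapped = lut[np.clip(img, 0, lut.size - 1)]`.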
In particular, the saturation S in S3 is calculated according to the following formula:
S = std(R, G, B)   (2),
In formula (2), R, G and B are the red, green and blue bands of the satellite image respectively, and std denotes the standard deviation across the three bands;
The edge texture C is calculated as follows:
C = |Gray ⊗ Laplacian_{3×3}|   (3),
In formula (3), Gray is the gray-scale image of the image to be processed, Laplacian_{3×3} is the 3×3 Laplacian operator, and ⊗ denotes convolution;
The exposure degree E is calculated according to the following formula:
E = g(R) × g(G) × g(B)   (4),
In formula (4), g(i) = exp(-(i - 0.5)² / (2 × 0.2²)) is a Gaussian function with mean 0.5 and standard deviation 0.2, applied to each band of the tone mapped image.
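A compact NumPy/SciPy sketch of the three quality measures in formulas (2)-(4) is given below. The gray-scale conversion, the exact 3×3 Laplacian kernel and the assumption that the tone mapped image is a float RGB array scaled to [0, 1] are illustrative choices, not values fixed by the patent:

```python
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN_3X3 = np.array([[0.0,  1.0, 0.0],
                          [1.0, -4.0, 1.0],
                          [0.0,  1.0, 0.0]])   # assumed 3x3 Laplacian kernel

def saturation(img):
    """Formula (2): per-pixel standard deviation over the R, G, B bands."""
    return img.std(axis=2)

def edge_texture(img):
    """Formula (3): magnitude of the 3x3 Laplacian response on the gray image."""
    gray = img.mean(axis=2)                    # assumed gray-scale conversion
    return np.abs(convolve(gray, LAPLACIAN_3X3, mode="nearest"))

def exposure_degree(img, mean=0.5, sigma=0.2):
    """Formula (4): product of per-band Gaussian well-exposedness terms."""
    g = np.exp(-((img - mean) ** 2) / (2.0 * sigma ** 2))
    return g[..., 0] * g[..., 1] * g[..., 2]
```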
In particular, the weight calculation formula in S4 is as follows:
W_{i,j,k} = (C_{i,j,k})^cr × (S_{i,j,k})^sr × (E_{i,j,k})^er   (5),
In formula (5), k = 1, 2, 3 indexes the satellite images after bright part mapping, neutral mapping and dark part mapping, i and j are the row and column indices of the pixel, and cr, sr and er are the exponent weights given to the edge texture, the saturation and the exposure degree respectively.
In particular, the normalized weight W'_{i,j,k} in S5 is calculated according to the following formula:
W'_{i,j,k} = W_{i,j,k} / (W_{i,j,1} + W_{i,j,2} + W_{i,j,3})   (6).
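A sketch of the per-pixel weighting and normalization of formulas (5) and (6), reusing the measure functions from the previous sketch, might look as follows; the small epsilon guarding against an all-zero denominator is an added safeguard, and the exponents are passed in rather than fixed:

```python
def fusion_weights(seq, cr, sr, er, eps=1e-12):
    """Formulas (5) and (6): per-pixel weights of the bright, neutral and dark
    mapped images, normalized so they sum to 1 at every pixel.
    `seq` is a list of three float RGB images in [0, 1]."""
    w = np.stack([edge_texture(im) ** cr *
                  saturation(im) ** sr *
                  exposure_degree(im) ** er for im in seq], axis=0)
    w += eps                                   # avoid division by zero
    return w / w.sum(axis=0, keepdims=True)    # shape (3, H, W)
```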
In particular, the mathematical definition of the i-th level of the Laplacian image pyramid in S6 is as follows:
W_i = G_i - UP(G_{i+1}) ⊗ g_{5×5}   (7),
In formula (7), G_i is the image at the i-th level (the i-th level of the Gaussian pyramid of the normalized weights), W_i is the i-th level of the weight Laplacian pyramid, UP is upsampling, which maps the pixel at position (x, y) in the source image to position (2x+1, 2y+1), ⊗ denotes convolution, and g_{5×5} is a Gaussian convolution kernel of size 5×5 pixels.
In particular, the mathematical definition of the i-th level of the Laplacian image pyramid constructed in S7 from the satellite images after bright part mapping, neutral mapping and dark part mapping is as follows:
I_i = G_i - UP(G_{i+1}) ⊗ g_{5×5}   (8),
In formula (8), G_i is the image at the i-th level (the i-th level of the Gaussian pyramid of the mapped satellite image), I_i is the i-th level of the image Laplacian pyramid, UP is upsampling, which maps the pixel at position (x, y) in the source image to position (2x+1, 2y+1), ⊗ denotes convolution, and g_{5×5} is a Gaussian convolution kernel of size 5×5 pixels.
Specifically, the image R_i after fusion at each level in S8 is calculated according to the following formula:
R_i = w_i · I_i + R_{i-1}   (9),
In formula (9), R_{i-1} is the fused image at level i-1, w_i is the i-th level of the weight Laplacian pyramid replicated twice along the channel dimension, and I_i is the i-th level of the image Laplacian pyramid of the bright part mapping, the neutral mapping and the dark part mapping.
The invention provides an exposure fusion method based on a single-view satellite image. The single-view satellite image is tone mapped through three different tone mapping tables: a bright part mapping showing bright part details, a neutral mapping showing neutral details, and a dark part mapping showing dark part details, so as to simulate the exposure sequence required by exposure fusion. To achieve a smooth transition at the fusion boundaries, a multi-scale Laplacian pyramid is built for the satellite image and the fusion operation is carried out at every pyramid level. In keeping with the characteristics of satellite images, the fusion mainly considers the exposure degree of the satellite image and secondarily considers saturation and edge texture; from these the weights of the bright part mapping, the neutral mapping and the dark part mapping in the fusion are calculated, and the exposure fusion of the scene's satellite image is completed. The proposed method solves the problem that the satellite sensor can only provide an image obtained by a single observation and cannot establish an exposure sequence from multiple exposures for exposure fusion, adapts to complex illumination conditions and various imaging environments, and enhances dark part and bright part details in the synthesized image while keeping the overall brightness.
[ Description of the drawings ]
FIG. 1 is a flow chart of an exposure fusion method based on a single-view satellite image;
FIG. 2 is the bright portion mapping tone mapping table according to an embodiment of the present invention;
FIG. 3 is a neutral-mapped tone mapping table according to an embodiment of the present invention;
FIG. 4 is a dark portion mapping tone mapping table according to an embodiment of the present invention;
FIG. 5 is a satellite image after bright portion mapping according to an embodiment of the present invention;
FIG. 6 is a neutral mapped satellite image according to an embodiment of the present invention;
FIG. 7 is a satellite image after dark portion mapping according to an embodiment of the present invention;
FIG. 8 is a saturation gray-scale map of a bright portion map, a neutral map, and a dark portion map according to an embodiment of the present invention;
FIG. 9 is a gray scale image of the edge texture of the bright portion map, the neutral map and the dark portion map according to an embodiment of the present invention;
FIG. 10 is a gray scale plot of exposure levels for a bright portion map, a neutral map, and a dark portion map according to an embodiment of the present invention;
FIG. 11 is a weight gray scale map of a bright portion map, a neutral map and a dark portion map according to an embodiment of the present invention;
FIG. 12 is a normalized weight gray scale according to an embodiment of the present invention;
FIG. 13 is a Laplacian pyramid gray-scale map of the bright portion mapping according to an embodiment of the invention;
FIG. 14 is a Laplacian pyramid gray scale map of a neutral map in accordance with an embodiment of the present invention;
FIG. 15 is a Laplacian pyramid gray scale map of dark portion mapping according to one embodiment of the invention;
fig. 16 is a final fused image according to an embodiment of the present invention.
[ Detailed description of the invention ]
It should be noted that this work relates to the National Natural Science Foundation of China Young Scientists Fund project "Knowledge representation and reuse for geographic-ontology-based remote sensing information extraction: taking urban land surface cover as an example" (National Natural Science Foundation of China, 2021-2023, No. 42001331), in which the importance of remote sensing image processing and color fidelity is noted; in this context the invention provides a sub-meter satellite true color image synthesis method, system and device.
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and a specific embodiment. The satellite image used in the embodiment is a Beijing-2 image after atmospheric correction; the image is encoded as unsigned 16-bit integers whose values are the reflectance multiplied by ten thousand, and the scene contains clouds with high brightness, water bodies and mountains with low brightness, and urban areas with neutral brightness.
Referring to FIG. 1, the present invention provides an exposure fusion method based on a single-view satellite image, comprising the following steps.
S1, reading the satellite image;
S2, establishing bright part mapping, neutral mapping and dark part mapping tone mapping tables by Catmull-Rom spline interpolation, and applying the three tone mappings to the input image to obtain the satellite images after bright part mapping, neutral mapping and dark part mapping, i.e. the exposure sequence of the image.
As shown in FIG. 2, FIG. 2 is the bright portion mapping tone mapping table, built from the interpolation points used by the bright portion mapping function in S2.
As shown in FIG. 3, FIG. 3 is the neutral mapping tone mapping table, built from the interpolation points used by the neutral mapping function.
As shown in FIG. 4, FIG. 4 is the dark portion mapping tone mapping table, built from the interpolation points used by the dark portion mapping function.
As shown in FIGS. 5-7, these are the satellite images after bright portion mapping, neutral mapping and dark portion mapping, respectively.
S3, calculating the saturation, edge texture and exposure degree of the bright part mapping, neutral mapping and dark part mapping images of the satellite image, wherein the saturation S is calculated according to the following formula:
S=std(R,G,B) (2),
in the formula (2), R, G and B are respectively red, green and blue wave bands in the satellite image, std is the standard deviation, and after calculation, FIG. 8 is obtained, and FIG. 8 is a saturation gray scale map of bright part mapping, neutral mapping and dark part mapping;
The edge texture C is calculated as follows:
C = |Gray ⊗ Laplacian_{3×3}|   (3),
In formula (3), Gray is the gray-scale image of the image to be processed, Laplacian_{3×3} is the 3×3 Laplacian operator, and ⊗ denotes convolution; after calculation, FIG. 9 is obtained, and FIG. 9 is the edge texture gray-scale map of the bright portion mapping, the neutral mapping and the dark portion mapping;
The exposure degree E is calculated according to the following formula:
E = g(R) × g(G) × g(B)   (4),
In formula (4), g(i) = exp(-(i - 0.5)² / (2 × 0.2²)) is a Gaussian function with mean 0.5 and standard deviation 0.2; after calculation, FIG. 10 is obtained, and FIG. 10 is the exposure degree gray-scale map of the bright portion mapping, the neutral mapping and the dark portion mapping.
S4, calculating the fusion weights of the satellite images after bright part mapping, neutral mapping and dark part mapping respectively from the saturation, edge texture and exposure degree calculated in S3, the weight calculation formula being as follows:
W_{i,j,k} = (C_{i,j,k})^5 × (S_{i,j,k})^3 × (E_{i,j,k})^2   (5),
In formula (5), k = 1, 2, 3 indexes the satellite images after bright part mapping, neutral mapping and dark part mapping, i and j are the row and column indices of the pixel, and the exponents 5, 3 and 2 are the weights given in this embodiment to the edge texture, the saturation and the exposure degree respectively; after calculation, the weight gray-scale map of FIG. 11 is obtained, and FIG. 11 shows the weight gray-scale maps of the bright portion mapping, the neutral mapping and the dark portion mapping.
S5, normalizing the three weight values calculated in S4 to obtain normalized weights W'_{i,j,k}, so that they sum to 1 at each pixel, the normalized weight W'_{i,j,k} being calculated according to the following formula:
W'_{i,j,k} = W_{i,j,k} / (W_{i,j,1} + W_{i,j,2} + W_{i,j,3})   (6),
after calculation, fig. 12 is obtained, and fig. 12 is a normalized weight gray scale map.
S6, constructing a Laplacian image pyramid for the normalized fusion weights calculated in S5; the mathematical definition of the i-th level of the Laplacian image pyramid is as follows:
W_i = G_i - UP(G_{i+1}) ⊗ g_{5×5}   (7),
In formula (7), G_i is the image at the i-th level (the i-th level of the Gaussian pyramid of the normalized weights), W_i is the i-th level of the weight Laplacian pyramid, UP is upsampling, which maps the pixel at position (x, y) in the source image to position (2x+1, 2y+1), ⊗ denotes convolution, and g_{5×5} is a Gaussian convolution kernel of size 5×5 pixels.
S7, constructing a Laplacian image pyramid for the satellite images after bright part mapping, neutral mapping and dark part mapping, with the same number of levels as the pyramid established in S6; the mathematical definition of the i-th level of the Laplacian image pyramid constructed from the mapped satellite images in S7 is as follows:
I_i = G_i - UP(G_{i+1}) ⊗ g_{5×5}   (8),
In formula (8), G_i is the image at the i-th level (the i-th level of the Gaussian pyramid of the mapped satellite image), I_i is the i-th level of the image Laplacian pyramid, UP is upsampling, which maps the pixel at position (x, y) in the source image to position (2x+1, 2y+1), ⊗ denotes convolution, and g_{5×5} is a Gaussian convolution kernel of size 5×5 pixels.
Fig. 13-15 are obtained after the above calculation, and fig. 13, 14, 15 are laplacian pyramid gray scale maps of the bright portion map, the neutral map, and the dark portion map, respectively.
S8, replicating the weight Laplacian pyramid obtained in S6 twice along the channel dimension so that it is aligned with the image Laplacian pyramids obtained in S7, and carrying out image fusion by multiplying it with the image Laplacian pyramids of the bright part mapping, the neutral mapping and the dark part mapping, wherein the image R_i after fusion at each level is calculated according to the following formula:
R_i = w_i · I_i + R_{i-1}   (9),
In formula (9), R_{i-1} is the fused image at level i-1, w_i is the i-th level of the weight Laplacian pyramid replicated twice along the channel dimension, and I_i is the i-th level of the image Laplacian pyramid of the bright part mapping, the neutral mapping and the dark part mapping. The final fused image is shown in FIG. 16; it retains both the bright part details and the dark part details of the scene, overcomes the prior-art problem that the satellite sensor can only provide an image acquired by a single observation and cannot use multiple exposures to establish an exposure sequence for exposure fusion (and the resulting loss of image information), preserves the integrity of the satellite image information, and facilitates subsequent satellite image interpretation.
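Putting the steps together, an end-to-end sketch of S1-S8 for one scene could look like the following, reusing the helper functions from the earlier snippets. The dictionary of look-up tables, the five pyramid levels and the [0, 1] scaling of the tone mapped values are assumptions, while the exponents 5, 3 and 2 are those used in this embodiment:

```python
def exposure_fuse(dn_image, luts, levels=5):
    """S1-S8 for a single 16-bit satellite scene.
    `dn_image`: (H, W, 3) uint16 array of coded values (reflectance x 10000).
    `luts`: dict with 'bright', 'neutral' and 'dark' tone mapping tables,
            each scaled to [0, 1]."""
    seq = []
    for key in ("bright", "neutral", "dark"):                 # S2: exposure sequence
        lut = luts[key]
        seq.append(lut[np.clip(dn_image, 0, lut.size - 1)])
    weights = fusion_weights(seq, cr=5.0, sr=3.0, er=2.0)     # S3-S5
    weight_pyrs = [laplacian_pyramid(weights[k], levels) for k in range(3)]  # S6
    image_pyrs = [laplacian_pyramid(im, levels) for im in seq]               # S7
    return fuse_pyramids(weight_pyrs, image_pyrs)                            # S8
```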
Claims (7)
1. The exposure fusion method based on the single-view satellite image is characterized by comprising the following steps of:
S1, inputting a satellite image, wherein the satellite image is encoded with a 16-bit depth and the coded value is the reflectance multiplied by ten thousand and rounded;
S2, establishing bright part mapping, neutral mapping and dark part mapping tone mapping tables by Catmull-Rom spline interpolation, and applying the three tone mappings to the input image to obtain the satellite images after bright part mapping, neutral mapping and dark part mapping, i.e. the exposure sequence of the image; the Catmull-Rom spline segment P in S2 for the i-th control point is determined by that point together with its three neighboring control points (i-1, i+1, i+2), and the equation P is as follows:
a_x = -x_{i-1} + 3x_i - 3x_{i+1} + x_{i+2}
b_x = 2x_{i-1} - 5x_i + 4x_{i+1} - x_{i+2}
c_x = x_{i+1} - x_{i-1}
d_x = 2x_i
P = 0.5 × (a_x·t³ + b_x·t² + c_x·t + d_x)   (1),
In formula (1), x is the coordinate of the corresponding control point, t is the interpolation parameter within the segment, and a, b, c and d are intermediate variables in the calculation;
The interpolation points used by the bright portion mapping function in S2 are as follows:
[bright portion mapping interpolation points (see FIG. 2)]
The interpolation points used by the neutral mapping function are as follows:
[neutral mapping interpolation points (see FIG. 3)]
The interpolation points used by the dark portion mapping function are as follows:
[dark portion mapping interpolation points (see FIG. 4)];
The tone mapping table in the S2 is constructed by taking 0 as a starting point, 1 as a step length and the maximum value of the interpolation point x as an end point;
S3, respectively calculating saturation, edge texture and exposure degree of the bright part mapping image, the neutral mapping image and the dark part mapping image;
S4, respectively calculating fusion weights of the satellite images after the bright part mapping, the neutral mapping and the dark part mapping according to the saturation, the edge texture and the exposure degree calculated in S3;
S5, normalizing the 3 weight values calculated in S4 to obtain normalized weights, so that the sum of the normalized weights at each pixel is 1;
S6, constructing a Laplacian image pyramid for the normalized fusion weights calculated in S5;
S7, constructing a Laplacian image pyramid for the satellite images after the bright part mapping, the neutral mapping and the dark part mapping on the basis of each level pyramid established in the S6, wherein the layer number is consistent with the S6;
And S8, copying the weight Laplacian pyramid obtained in the step S6 twice along the channel, aligning the weight Laplacian pyramid with the image Laplacian pyramid obtained in the step S7, and carrying out image fusion in a way of multiplying the weight Laplacian pyramid with the image Laplacian pyramid of the bright part mapping, the neutral mapping and the dark part mapping.
2. The exposure fusion method based on a single-view satellite image according to claim 1, wherein the saturation S in S3 is calculated according to the following formula:
S = std(R, G, B)   (2),
In formula (2), R, G and B are the red, green and blue bands of the satellite image respectively, and std denotes the standard deviation across the three bands;
The edge texture C is calculated as follows:
C = |Gray ⊗ Laplacian_{3×3}|   (3),
In formula (3), Gray is the gray-scale image of the image to be processed, Laplacian_{3×3} is the 3×3 Laplacian operator, and ⊗ denotes convolution;
The exposure degree E is calculated according to the following formula:
E = g(R) × g(G) × g(B)   (4),
In formula (4), g(i) = exp(-(i - 0.5)² / (2 × 0.2²)) is a Gaussian function with mean 0.5 and standard deviation 0.2.
3. The exposure fusion method based on a single-view satellite image according to claim 1, wherein the weight calculation formula in S4 is as follows:
W_{i,j,k} = (C_{i,j,k})^cr × (S_{i,j,k})^sr × (E_{i,j,k})^er   (5),
In formula (5), k = 1, 2, 3 indexes the satellite images after bright part mapping, neutral mapping and dark part mapping, i and j are the row and column indices of the pixel, and cr, sr and er are the exponent weights given to the edge texture, the saturation and the exposure degree respectively.
4. The exposure fusion method based on a single-view satellite image according to claim 1, wherein the normalized weight W'_{i,j,k} in S5 is calculated according to the following formula:
W'_{i,j,k} = W_{i,j,k} / (W_{i,j,1} + W_{i,j,2} + W_{i,j,3})   (6).
5. The exposure fusion method based on single-view satellite images according to claim 1, wherein the mathematical definition of the ith layer of the laplace image pyramid in S6 is as follows:
W_i = G_i - UP(G_{i+1}) ⊗ g_{5×5}   (7),
In formula (7), G_i is the image at the i-th level, W_i is the weight pyramid of the i-th level, UP is upsampling, which maps the pixel at position (x, y) in the source image to position (2x+1, 2y+1), ⊗ denotes convolution, and g_{5×5} is a Gaussian convolution kernel of size 5×5 pixels.
6. The exposure fusion method based on single-view satellite images according to claim 1, wherein mathematical definitions of the i-th layer of the laplace image pyramid constructed by the satellite images after the bright part mapping, the neutral mapping and the dark part mapping in S7 are as follows:
I_i = G_i - UP(G_{i+1}) ⊗ g_{5×5}   (8),
In formula (8), G_i is the image at the i-th level, I_i is the image pyramid of the i-th level, UP is upsampling, which maps the pixel at position (x, y) in the source image to position (2x+1, 2y+1), ⊗ denotes convolution, and g_{5×5} is a Gaussian convolution kernel of size 5×5 pixels.
7. The exposure fusion method based on the single-view satellite image according to claim 1, wherein the image Ri after each level of fusion in S8 is calculated according to the following formula:
R_i = w_i · I_i + R_{i-1}   (9),
In formula (9), R_{i-1} is the fused image at level i-1, w_i is the i-th level of the weight Laplacian pyramid replicated twice along the channel dimension, and I_i is the i-th level of the image Laplacian pyramid of the bright part mapping, the neutral mapping and the dark part mapping.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310049415.7A (CN116112807B) | 2023-02-01 | 2023-02-01 | Exposure fusion method based on single-view satellite image |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN116112807A | 2023-05-12 |
| CN116112807B | 2024-07-23 |
Family ID: 86266964

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202310049415.7A (Active) | CN116112807B | 2023-02-01 | 2023-02-01 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN116112807B |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106530263A (en) * | 2016-10-19 | 2017-03-22 | 天津大学 | Single-exposure high-dynamic range image generation method adapted to medical image |
CN107845128A (en) * | 2017-11-03 | 2018-03-27 | 安康学院 | A kind of more exposure high-dynamics image method for reconstructing of multiple dimensioned details fusion |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101394487B (en) * | 2008-10-27 | 2011-09-14 | 华为技术有限公司 | Image synthesizing method and system |
CN115063331B (en) * | 2022-06-14 | 2024-04-12 | 安徽大学 | Ghost-free multi-exposure image fusion method based on multi-scale block LBP operator |
Also Published As
Publication number | Publication date |
---|---|
CN116112807A (en) | 2023-05-12 |
Legal Events

| Code | Title |
|---|---|
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |