
CN115564694A - Image processing method and device, computer readable storage medium and electronic device - Google Patents


Info

Publication number
CN115564694A
CN115564694A
Authority
CN
China
Prior art keywords
image
pyramid
fused
weight
pyramid image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211144770.4A
Other languages
Chinese (zh)
Inventor
李海军 (Li Haijun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211144770.4A
Publication of CN115564694A
Legal status: Pending

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/70 Denoising; Smoothing
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G06T2207/20172 Image enhancement details
    • G06T2207/20208 High dynamic range [HDR] image processing
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The disclosure provides an image processing method, an image processing device, a computer readable storage medium and an electronic device, and relates to the technical field of images. The image processing method comprises the following steps: acquiring a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain a Gaussian pyramid and a Laplacian pyramid of each image to be fused; fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid; fusing pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid; and generating fused images corresponding to the plurality of images to be fused by utilizing the fused Gaussian pyramid and the fused Laplacian pyramid. The present disclosure can improve image quality.

Description

Image processing method and device, computer readable storage medium and electronic device
Technical Field
The present disclosure relates to the field of imaging technologies, and in particular, to an image processing method, an image processing apparatus, a computer-readable storage medium, and an electronic device.
Background
With the development of imaging technology and the popularization of imaging devices, users' requirements on image quality have become higher and higher, and various image processing algorithms have emerged. Among them, HDR (High Dynamic Range) imaging technology can fuse images with different exposure degrees together to preserve both highlight and shadow details in a high-light-ratio environment.
At present, in the image fusion process, some schemes may suffer from poor quality of the fused image.
Disclosure of Invention
The present disclosure provides an image processing method, an image processing apparatus, a computer-readable storage medium, and an electronic device, thereby overcoming, at least to some extent, the problem of poor quality of an image after fusion.
According to a first aspect of the present disclosure, there is provided an image processing method including: acquiring a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain a Gaussian pyramid and a Laplacian pyramid of each image to be fused; fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid; fusing pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid; and generating fused images corresponding to the plurality of images to be fused by using the fused Gaussian pyramid and the fused Laplacian pyramid.
According to a second aspect of the present disclosure, there is provided an image processing method including: acquiring a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain a Gaussian pyramid of each image to be fused; fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid; determining a Laplacian pyramid of a target image to be fused in a plurality of images to be fused; and generating fused images corresponding to the plurality of images to be fused by using the fused Gaussian pyramid and the Laplacian pyramid of the target image to be fused.
According to a third aspect of the present disclosure, there is provided an image processing method including: acquiring a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain a Laplacian pyramid of each image to be fused; fusing pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid; determining a Gaussian pyramid of a target image to be fused in a plurality of images to be fused; and generating fused images corresponding to the plurality of images to be fused by utilizing the Gaussian pyramid and the fused Laplacian pyramid of the target image to be fused.
According to a fourth aspect of the present disclosure, there is provided an image processing apparatus comprising: the image decomposition module is used for acquiring a plurality of images to be fused of the same shooting scene and decomposing each image to be fused to obtain a Gaussian pyramid and a Laplacian pyramid of each image to be fused; the first fusion module is used for fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid; the second fusion module is used for fusing pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid; and the image generation module is used for generating fused images corresponding to the plurality of images to be fused by utilizing the fused Gaussian pyramid and the fused Laplacian pyramid.
According to a fifth aspect of the present disclosure, there is provided an image processing apparatus comprising: the image decomposition module is used for acquiring a plurality of images to be fused of the same shooting scene and decomposing each image to be fused to obtain a Gaussian pyramid of each image to be fused; the hierarchical fusion module is used for fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid; the Laplacian pyramid determining module is used for determining a Laplacian pyramid of a target image to be fused in the plurality of images to be fused; and the image generation module is used for generating fused images corresponding to the plurality of images to be fused by utilizing the fused Gaussian pyramid and the Laplacian pyramid of the target image to be fused.
According to a sixth aspect of the present disclosure, there is provided an image processing apparatus comprising: the image decomposition module is used for acquiring a plurality of images to be fused of the same shooting scene and decomposing each image to be fused to obtain a Laplacian pyramid of each image to be fused; the hierarchical fusion module is used for fusing pyramid images positioned in the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid; the Gaussian pyramid determining module is used for determining a Gaussian pyramid of a target image to be fused in the plurality of images to be fused; and the image generation module is used for generating fused images corresponding to the plurality of images to be fused by utilizing the Gaussian pyramid of the target image to be fused and the fused Laplacian pyramid.
According to a seventh aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements any one of the image processing methods described above.
According to an eighth aspect of the present disclosure, there is provided an electronic device comprising a processor; a memory for storing one or more programs which, when executed by the processor, cause the processor to implement any of the image processing methods described above.
In some embodiments of the present disclosure, pyramid images located in the same layer of the Gaussian pyramids of the images to be fused are fused, and/or pyramid images located in the same layer of their Laplacian pyramids are fused, and a processed image is generated by combining the fused pyramids. Through this layer-by-layer fusion of pyramid images, image information at the same scale can be combined well, the detail rendition of the images is enhanced, the fusion is finer, and transitions at image edges are smooth, so that the image fusion scheme can effectively improve image quality.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It should be apparent that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived by those of ordinary skill in the art without inventive effort. In the drawings:
FIG. 1 shows a schematic diagram of the image processing stages of an embodiment of the present disclosure;
FIG. 2 illustrates a process diagram of pyramid-based image decomposition and reconstruction in accordance with an embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of a Gaussian pyramid of an embodiment of the disclosure;
fig. 4 shows a schematic diagram of generation of a laplacian pyramid in an embodiment of the disclosure;
FIG. 5 schematically shows a flow chart of an image processing method according to an embodiment of the present disclosure;
FIG. 6 schematically illustrates a diagram of fusion weights according to an embodiment of the disclosure;
fig. 7 schematically shows a flowchart of the entire procedure of the image processing method of the embodiment of the present disclosure;
FIG. 8 schematically shows a flow chart of an image processing method according to another embodiment of the present disclosure;
FIG. 9 schematically illustrates a flow diagram of an image processing method according to yet another embodiment of the present disclosure;
fig. 10 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure;
fig. 11 schematically shows a block diagram of an image processing apparatus according to another embodiment of the present disclosure;
fig. 12 schematically shows a block diagram of an image processing apparatus according to yet another embodiment of the present disclosure;
fig. 13 schematically illustrates a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all steps. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation. In addition, all of the following terms "first", "second", "third", "fourth", etc. are for distinguishing purposes only and should not be construed as limiting the present disclosure.
When an electronic device captures an image, underexposure or overexposure may occur; for example, when the shooting scene is a sunset environment, a single-frame capture is very likely to be underexposed or overexposed. In this case, exposure bracketing may be used, and after subsequent HDR composition, an image with details in both the bright and dark portions can be obtained. However, how to select appropriate information from the multiple frames and organize and fuse it reasonably is critical to the presentation of the final image. The scheme of the present disclosure focuses on the fusion process of multi-frame images and provides a new image processing scheme to improve the quality of the fused image.
The image processing scheme of the embodiments of the present disclosure may be implemented by an electronic device, that is, the electronic device may perform the respective steps of the image processing method described below, and the image processing apparatus described below may be configured within the electronic device. The image processing scheme of the present disclosure may be implemented by an image signal processor equipped in an electronic device, for example. In addition, the present disclosure is not limited to the type of electronic device, and may include, but is not limited to, a smartphone, a tablet, a smart wearable device, a personal computer, a server, and the like.
Fig. 1 shows a schematic diagram of the stages in application of an image processing scheme of an embodiment of the present disclosure. Referring to fig. 1, the input images of the embodiment of the present disclosure are a plurality of images of different exposure degrees, the number of the input images is two or more, and the input images are images generated for the same shooting scene. The input image may be an image captured by an imaging module provided in the electronic apparatus, or may be an image acquired by the electronic apparatus from the outside (i.e., another apparatus). The present disclosure does not limit the image source, the image content, the image size, the bit width of the image data, etc. of the input image.
The image processing process of the present disclosure may be employed to process an input image to obtain a fused image. Subsequently, the electronic device may transmit the fused image back to an ISP pipeline (image signal processing pipeline), and continue to perform processing procedures such as brightness enhancement, tone mapping, denoising, and recognition. The present disclosure does not limit the subsequent processing of the fused image.
The image processing scheme of the embodiments of the disclosure can be applied to scenarios such as camera preview and video stream processing. For example, when an image with a large exposure degree and an image with a small exposure degree are fused, the image with the small exposure degree is usually dark but preserves highlight detail, and can therefore supplement the dynamic range of the image with the large exposure degree.
In order to better explain the image processing scheme of the embodiment of the present disclosure, the pyramid processing manner in the image field is explained below with reference to fig. 2 to 4.
Referring to fig. 2, for an original image, Gaussian blurring may be performed using a Gaussian kernel, and the blurred image is then downsampled to obtain an image of reduced scale. Repeating this process yields a sequence of images of successively smaller scales, which together form the Gaussian pyramid. Fig. 3 shows a schematic diagram of a Gaussian pyramid: after repeated Gaussian blurring and downsampling, images of layer 0 (Level 0), layer 1 (Level 1), layer 2 (Level 2), layer 3 (Level 3), layer 4 (Level 4), and so on are obtained, and these images of different scales form the Gaussian pyramid.
The present disclosure does not limit the down-sampling manner, such as a down-sampling manner that removes odd rows and odd columns (or removes even rows and even columns), or a random down-sampling manner, or a pooled down-sampling manner.
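As an illustration, the construction just described can be sketched in NumPy. The 5x5 separable kernel, the edge-replicating border handling, and the drop-odd-rows/columns downsampling are choices of this sketch; the disclosure does not fix any of them:

```python
import numpy as np

# 1-D factor of a common 5x5 Gaussian kernel; an illustrative choice,
# since the disclosure does not limit the Gaussian kernel used.
KERNEL = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0

def gaussian_blur(img):
    """Separable 5x5 Gaussian blur with edge replication at the borders."""
    pad = np.pad(img, 2, mode="edge")
    # Convolve horizontally, then vertically, with the 1-D kernel.
    tmp = sum(KERNEL[k] * pad[:, k:k + img.shape[1]] for k in range(5))
    return sum(KERNEL[k] * tmp[k:k + img.shape[0], :] for k in range(5))

def build_gaussian_pyramid(img, levels):
    """Blur, then drop odd rows and columns, repeated `levels - 1` times."""
    pyramid = [np.asarray(img, dtype=np.float64)]
    for _ in range(levels - 1):
        blurred = gaussian_blur(pyramid[-1])
        pyramid.append(blurred[::2, ::2])  # keep even rows and columns
    return pyramid
```

Each level halves both dimensions, matching the Level 0 to Level 4 progression of fig. 3.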
Except for the top layer, the Laplacian pyramid can be obtained by subtracting, from each layer of the Gaussian pyramid, the upsampled image of the layer above it (the layer with the smaller scale). The present disclosure does not limit the manner of upsampling, which may, for example, fill in rows and columns. In addition, it is understood that Gaussian blurring is also performed after upsampling.
Referring to fig. 4, if the original image is denoted G0, downsampling G0 by a factor of 2 gives G1, downsampling G1 by 2 gives G2, and downsampling G2 by 2 gives G3. G0 to G3 form the Gaussian pyramid.
The top layer G3 is taken directly as LP3. G3 is interpolated (upsampled) to obtain G*3, and LP2 is obtained by subtracting G*3 from G2. Likewise, G2 is interpolated and enlarged to obtain G*2, and LP1 is obtained by subtracting G*2 from G1; G1 is interpolated to obtain G*1, and LP0 is obtained by subtracting G*1 from G0. LP0 to LP3 form the Laplacian pyramid, which can be understood as a pyramid consisting of the residual images produced by the subtractions above.
Fig. 4 illustrates an example of a 4-level pyramid structure, i.e., 4 different scales of image data. However, it should be understood that the present disclosure does not limit the number of layers of the pyramid, and the number of layers of the pyramid can be determined by integrating the factors of the processing efficiency of the device, the accuracy of the task requirement, and the like.
After the gaussian pyramid and laplacian pyramid of the original image are determined, the two may be combined to generate a processed image. Specifically, the gaussian pyramid and the laplacian pyramid can be fused in order from small scale to large scale, and the processed image is finally generated after upsampling and convolution operation of a gaussian kernel.
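Putting figs. 2 to 4 together, the decomposition and its inverse can be sketched as follows (self-contained, so it repeats a small blur helper). The repetition-based upsampling is one illustrative option; because the same `up` step is used in both decomposition and reconstruction, the reconstruction is exact:

```python
import numpy as np

K = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0  # illustrative separable kernel

def blur(img):
    p = np.pad(img, 2, mode="edge")
    t = sum(K[i] * p[:, i:i + img.shape[1]] for i in range(5))
    return sum(K[i] * t[i:i + img.shape[0], :] for i in range(5))

def down(img):
    return blur(img)[::2, ::2]           # blur, then drop odd rows/columns

def up(img, shape):
    big = img.repeat(2, axis=0).repeat(2, axis=1)[:shape[0], :shape[1]]
    return blur(big)                     # blur again after upsampling

def laplacian_pyramid(img, levels):
    """G0..Gn by repeated downsampling; LP_i = G_i - up(G_{i+1}); top kept."""
    gauss = [np.asarray(img, dtype=np.float64)]
    for _ in range(levels - 1):
        gauss.append(down(gauss[-1]))
    lap = [g - up(gauss[i + 1], g.shape) for i, g in enumerate(gauss[:-1])]
    return lap + [gauss[-1]]             # e.g. G3 doubles as LP3

def reconstruct(lap):
    """Invert the decomposition from the smallest scale upward."""
    img = lap[-1]
    for level in reversed(lap[:-1]):
        img = level + up(img, level.shape)   # G_i = LP_i + up(G_{i+1})
    return img
```

The upward pass in `reconstruct` mirrors the small-scale-to-large-scale order described above.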
The following describes the image processing method of the embodiments of the present disclosure, taking as an example an electronic device executing the image processing procedure. It should be noted that the image processing method of the embodiments of the present disclosure is applied to RAW data. Processing based on RAW data can preserve more image detail than RGB or similar formats, because the RAW data has not undergone lossy conversion.
Fig. 5 schematically shows a flowchart of an image processing method of an exemplary embodiment of the present disclosure. Referring to fig. 5, the image processing method may include the steps of:
s52, obtaining a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain a Gaussian pyramid and a Laplacian pyramid of each image to be fused.
In an exemplary embodiment of the present disclosure, the image to be fused is an image to be subjected to pyramid decomposition. That is, the subject of image decomposition of the present disclosure is the image to be fused.
The multiple images to be fused of the same shooting scene may be images obtained by aligning brightness of multiple original images shot in the same scene. The method and the device have no requirement on shooting scenes, and the scheme can be applied to any shooting scene.
First, the electronic device may acquire a plurality of original images captured based on different exposure levels in the same shooting scene.
According to some embodiments of the present disclosure, the different exposure degrees may correspond to at least two of an overexposed exposure value, a normally exposed exposure value, and an underexposed exposure value. That is, the acquired plurality of original images may include at least two types of images among an overexposed image, a normally exposed image, and an underexposed image.
Because an overexposed image has good dark-region detail and a good signal-to-noise ratio, while an underexposed image has good bright-region detail, combining overexposed and underexposed original images for image fusion makes it possible to subsequently obtain a high-quality HDR image.
On the one hand, the present disclosure does not limit the degree of overexposure or underexposure; that is, the specific exposure values used for the different exposure degrees are not limited. On the other hand, the number of original images in the embodiments of the present disclosure is two or more, and the specific number is not limited.
It is noted that the plurality of original images may also include overexposed images and not underexposed images, or may include underexposed images and not overexposed images.
Next, the electronic device may perform brightness alignment on the original images by using a proportional relationship between exposure degrees of the original images to obtain a plurality of images to be fused of the same shooting scene.
It will be appreciated that for any original image, there will be an exposure time and a gain, and in the case of characterizing the exposure level by the exposure time and the gain, the exposure level may be defined as the product of the exposure time and the gain. In this case, the proportional relationship of the exposure degrees between the two original images may be a ratio of the exposure degrees of the two original images.
Taking two original images as an example, let Ratio be the ratio of the exposure degree of the original image with the larger exposure to that of the original image with the smaller exposure. Brightness alignment can then be performed using Formula 1:
L = S × Ratio (Formula 1)
where S is the original image with the smaller exposure degree, and L is the image obtained by brightness-aligning S, i.e., the image to be fused corresponding to S.
In the two-image embodiment, L and the original image with the larger exposure degree are the images to be fused in the present disclosure, and the bit width of L is greater than that of the original image with the larger exposure degree.
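A minimal sketch of Formula 1, assuming the exposure degree is the product of exposure time and gain as stated above; the function and parameter names are hypothetical:

```python
import numpy as np

def align_brightness(s_img, t_short, gain_short, t_long, gain_long):
    """Formula 1: L = S * Ratio, where Ratio is the ratio of the larger
    exposure degree (time * gain) to the smaller one."""
    ratio = (t_long * gain_long) / (t_short * gain_short)
    # Working in float: the aligned image needs a larger bit width than S.
    return np.asarray(s_img, dtype=np.float64) * ratio
```

For example, a frame exposed for 10 ms at unit gain, aligned to a 40 ms frame at unit gain, is simply scaled by Ratio = 4.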
For the case that there are more than two original images, in one embodiment, the original image with the maximum exposure degree may be used as a reference image for brightness alignment, and the images to be fused corresponding to the original images are determined respectively by combining the above formula 1.
In another embodiment, the original image with the best image quality in the plurality of original images may be used as a reference image for luminance alignment, and the images to be fused corresponding to the original images are determined respectively by combining the above formula 1. The image quality can be determined by one or more evaluation indexes of signal-to-noise ratio, under-exposure degree, over-exposure degree and contrast, and the determination process of the image quality is not limited by the disclosure.
In addition, when the difference between the exposure degrees of the original images is smaller than a difference threshold, the electronic device may directly take the original images as the images to be fused.
In particular, the electronic device may determine a difference between the exposure levels of the original images and compare the difference to a difference threshold. If the determined difference is smaller than the difference threshold, the electronic device may directly use the original image as the image to be fused in order to reduce the calculation cost of the algorithm. If the difference is determined to be greater than or equal to the difference threshold, the above-mentioned brightness alignment process may be performed to obtain a plurality of images to be fused.
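The decision above might look like the following sketch; the helper name, aligning to the most-exposed frame, and comparing the max-min exposure spread against the threshold are assumptions for illustration:

```python
import numpy as np

def select_fusion_inputs(originals, exposures, diff_threshold):
    """Hypothetical helper: skip brightness alignment when exposure degrees
    (exposure time * gain per frame) already differ by less than the threshold."""
    if max(exposures) - min(exposures) < diff_threshold:
        # Difference below threshold: use the originals directly to save compute.
        return [np.asarray(img, dtype=np.float64) for img in originals]
    ref = max(exposures)  # align to the most-exposed frame, one option in the text
    return [np.asarray(img, dtype=np.float64) * (ref / e)
            for img, e in zip(originals, exposures)]
```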
After obtaining a plurality of images to be fused of the same shooting scene, the electronic device may decompose each image to be fused to obtain a gaussian pyramid and a laplacian pyramid of each image to be fused.
Specifically, the Gaussian pyramid of the image to be fused can be determined through Gaussian blurring and downsampling; the upsampled image of the layer above is then subtracted from each layer of the Gaussian pyramid, and the Laplacian pyramid of the image to be fused is constructed from the resulting residual images. The specific process is shown in figs. 2 to 4 and is not repeated here.
It should be noted that the present disclosure does not limit the gaussian kernel, the down-sampling and up-sampling modes, the pyramid layer number, and the like used in the gaussian blur.
After the images to be fused are decomposed, the number of layers of the Gaussian pyramid of each image to be fused is the same, and the number of layers of the Laplacian pyramid of each image to be fused is the same.
And S54, fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid.
The gaussian pyramid is a pyramid consisting of a plurality of pyramid images with sizes ranging from large to small. For the gaussian pyramid of each image to be fused, the electronic device may fuse the pyramid images of each layer by layer to obtain a fused gaussian pyramid.
For example, the plurality of images to be fused include an image a to be fused and an image B to be fused. Taking the four-layer pyramid as an example, the gaussian pyramid of the image to be fused a may include a first-layer pyramid image a11, a second-layer pyramid image a12, a third-layer pyramid image a13, and a fourth-layer pyramid image a14, and the gaussian pyramid of the image to be fused B may include a first-layer pyramid image B11, a second-layer pyramid image B12, a third-layer pyramid image B13, and a fourth-layer pyramid image B14. The fusion of pyramid images located at the same layer in the gaussian pyramid in the present disclosure refers to the fusion of a11 and b11, the fusion of a12 and b12, the fusion of a13 and b13, and the fusion of a14 and b14, respectively.
The following describes a process of fusing two pyramid images in the same layer in each gaussian pyramid, taking the fusion of the two pyramid images as an example. It will be appreciated that the process of fusing two pyramid images described below can be extended to a fusion process between multiple pyramid images. That is, after determining the manner of fusing two pyramid images, the same processing manner may be adopted to perform the process of fusing two pyramid images, so as to obtain the fusion result of multiple pyramid images.
The multiple images to be fused obtained in step S52 may include a first image to be fused and a second image to be fused, and the exposure degree of the original image corresponding to the first image to be fused is greater than the exposure degree of the original image corresponding to the second image to be fused. The Gaussian pyramid of the first image to be fused comprises a first pyramid image, and an image in the Gaussian pyramid of the second image to be fused, which is positioned on the same layer as the first pyramid image, is a second pyramid image. In this case, the process of the electronic device fusing the first pyramid image with the second pyramid image may include: determining a fusion weight of the first pyramid image and the second pyramid image, and fusing the first pyramid image and the second pyramid image by using the fusion weight.
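A per-pixel weighted blend is the usual form of such a fusion; the convention that the weight map gives the share of the first (more exposed) pyramid image is an assumption of this sketch:

```python
import numpy as np

def fuse_layer(first, second, weight):
    """Blend two same-layer pyramid images with a per-pixel weight in [0, 1];
    the second image receives the complementary weight 1 - w."""
    w = np.clip(np.asarray(weight, dtype=np.float64), 0.0, 1.0)
    return (w * np.asarray(first, dtype=np.float64)
            + (1.0 - w) * np.asarray(second, dtype=np.float64))
```

Applied layer by layer (a11 with b11, a12 with b12, and so on), this produces the fused pyramid.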
The process of determining the fusion weight of the first pyramid image and the second pyramid image of the present disclosure is exemplified below.
The first manner of determining the fusion weight in the present disclosure:
the electronic device can determine a difference between the first pyramid image and the second pyramid image and determine a fusion weight for the first pyramid image and the second pyramid image based on the difference.
Whether a moving object exists in the image can be determined through the difference between the first pyramid image and the second pyramid image, and therefore weight calculation based on the moving object is achieved. And then, the weight is utilized to realize image fusion, so that the motion information of the image can be reflected, the representation of the motion details of the fused image is further improved, and the image quality of the fused image is improved.
The electronic device may determine a difference in pixel values between each pixel point in the first pyramid image and a pixel point corresponding to a location in the second pyramid image to generate a first difference weight map.
For example, for the pixel e at (i, j) in the first pyramid image, the pixel f at (i, j) in the second pyramid image is also determined, and then the difference between the pixel e and the pixel f is calculated. And (i, j) is the coordinate of the pixel point in the pyramid image, all the pixel points in the pyramid image are traversed, the difference between the pyramid images is calculated, and a first difference weight map can be generated.
It is understood that computing the difference, as referred to in this disclosure, includes taking the absolute value of the result of the subtraction; the difference is therefore a value greater than or equal to 0.
for the process of determining the difference between the pixel values of each pixel in the first pyramid image and the pixel corresponding to the position in the second pyramid image, in some embodiments of the present disclosure, the electronic device may directly subtract the two pixel values and then take the absolute value to obtain the difference between the pixel values of the pixels at the same position.
In addition, the direct subtraction of pixel values may result in the calculation result being too discrete, resulting in the problem of unsmooth image area transition. To solve this problem, in other embodiments of the present disclosure, the determination of the difference may be implemented by combining the pixel regions to which the pixel points belong.
First, the electronic device may determine a neighboring pixel point of a first pixel point in the first pyramid image, and the first pixel point and the neighboring pixel point of the first pixel point constitute a first pixel region. For example, the first pixel region is a region composed of the first pixel and 8 neighboring pixels.
Next, the electronic device may determine a second pixel point in the second pyramid image corresponding to the position of the first pixel point, for example, both positions are (i, j) in the pyramid image. And determining adjacent pixel points of the second pixel points, wherein the second pixel points and the adjacent pixel points of the second pixel points form a second pixel area. For example, similarly, the second pixel region is a region composed of the second pixel point and 8 neighboring pixel points around the second pixel point.
The electronics can then determine a difference in pixel values between the first pixel region and the second pixel region as a difference in pixel values between the first pixel point and the second pixel point.
For the process of determining the difference in pixel values between the first pixel region and the second pixel region, the electronic device may traverse each pixel point in the first pixel region, and determine the difference in pixel values between the pixel point in the first pixel region and the pixel point corresponding to the position in the second pixel region. For a pixel area composed of 9 pixels, the position of a first pixel point in the first pixel area corresponds to the position of a second pixel point in the second pixel area, the position of a pixel point at the upper left corner of the first pixel point in the first pixel area corresponds to the position of a pixel point at the upper left corner of the second pixel point in the second pixel area, and the position of a pixel point at the lower right corner of the first pixel point in the first pixel area corresponds to the position of a pixel point at the lower right corner of the second pixel point in the second pixel area.
The electronic device may determine a difference in pixel values between the first pixel region and the second pixel region according to a difference in pixel values between a pixel point in the first pixel region and a pixel point corresponding to a position in the second pixel region.
Specifically, differences in pixel values between pixels in the first pixel region and pixels corresponding to positions in the second pixel region may be accumulated, and an accumulated result may be determined as a difference in pixel values between the first pixel region and the second pixel region. Alternatively, the difference between the pixel values of the pixel points in the first pixel region and the pixel points corresponding to the positions in the second pixel region may be averaged, and the average value may be determined as the difference between the pixel values of the first pixel region and the second pixel region. In addition, the difference of the pixel values between the first pixel region and the second pixel region may also be characterized by using a statistical value, such as variance of the difference, which is not limited by the present disclosure.
After one pixel point is calculated and the calculation moves on to the next pixel point, the pixel regions overlap, which yields smooth transitions in the resulting image.
After the difference of the pixel values between each pixel point in the first pyramid image and the pixel point corresponding to the position in the second pyramid image is determined, a first difference weight map can be constructed according to the difference of each pixel point. The first difference weight map is consistent with the first pyramid image in size, and each value on the first difference weight map is a pixel point weight value and is the difference of pixel values of pixel points at corresponding positions.
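The region-based difference described above can be illustrated with a short NumPy sketch. The function name `first_difference_weight_map` is hypothetical, the 3×3 neighbourhood (radius 1) and mean aggregation are assumptions for the example — the disclosure also permits accumulation or variance over the region:

```python
import numpy as np

def first_difference_weight_map(pyr_a, pyr_b, radius=1):
    # Absolute pixel difference between two same-size pyramid images,
    # averaged over each pixel's (2*radius+1)^2 neighbourhood so that
    # adjacent regions overlap and transitions stay smooth.
    diff = np.abs(pyr_a.astype(np.int64) - pyr_b.astype(np.int64))
    padded = np.pad(diff, radius, mode="edge")
    h, w = diff.shape
    k = 2 * radius + 1
    out = np.zeros((h, w), dtype=np.float64)
    for dy in range(k):          # sum the k x k shifted copies
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)         # mean difference per pixel region
```

The result has the same size as the pyramid images, matching the description of the first difference weight map.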
After generating the first difference weight map, the electronic device can determine a fusion weight of the first pyramid image and the second pyramid image using the first difference weight map.
The electronic device may map the weight values of the pixel points in the first difference weight map to a predetermined bit width range to obtain a second difference weight map. Specifically, the electronic device may determine whether the pixel point weight values in the first difference weight map already fall within the predetermined bit width range; if so, no processing is performed, and the second difference weight map is the first difference weight map; if not, the pixel point weight values are mapped to the predetermined bit width range to obtain the second difference weight map. For example, if the predetermined bit width range is 8 bits (i.e., 0-255) and the pixel point weight values are 10 bits (0-1023), the electronic device can map the 10-bit data to 8 bits using a mapping curve. The mapping curve is a pre-constructed curve whose horizontal axis spans the 10-bit range and whose vertical axis spans the 8-bit range; the construction process and the specific form of the curve are not limited by the present disclosure. In addition, the mapping curve here is specific to the difference weight map; weight maps of different types may use different mapping curves in the embodiments of the present disclosure.
To facilitate subsequent weighting calculation, the electronic device may then perform normalization on the second difference weight map, and determine a result of the normalization as a fusion weight of the first pyramid image and the second pyramid image.
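The mapping-plus-normalization step can be sketched as follows. The function name `map_and_normalize` is hypothetical, and a simple linear mapping curve is assumed — the disclosure deliberately leaves the curve's construction and shape open:

```python
import numpy as np

def map_and_normalize(weight_map, src_max=1023, dst_max=255):
    # Map weight values from the source range (e.g. 10-bit, 0-1023)
    # to the predetermined bit width range (e.g. 8-bit, 0-255) using
    # a linear "mapping curve" -- an assumption for this sketch.
    mapped = np.clip(weight_map, 0, src_max).astype(np.float64)
    mapped = mapped * dst_max / src_max
    # Normalize to [0, 1] so the result can serve directly as the
    # fusion weight w1 (the other image then gets w2 = 1 - w1).
    return mapped / dst_max
```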
A second embodiment of the present disclosure to determine fusion weights:
the electronic device may adjust the pixel data of the second pyramid image to a bit width consistent with the bit width of the first pyramid image to generate a third pyramid image. In this way, the third pyramid image is aligned in luminance with the first pyramid image, and the overexposed region is ignored.
In view of the fact that the exposure degree of the original image corresponding to the first image to be fused is greater than that of the original image corresponding to the second image to be fused, after the luminance alignment processing, the bit width of the pixel data of the second pyramid image is greater than the bit width of the pixel data of the first pyramid image. In this case, the electronic device may perform a clipping operation on the pixel data of the second pyramid image to adjust its bit width to be consistent with that of the first pyramid image. The clipping operation may be called a Clip operation: values less than 0 are set to 0, and values greater than Max are set to Max. Max is the maximum value corresponding to the bit width of the first pyramid image, e.g., 1024. In addition, Max may also be, for example, 1024 - 64 = 960 when the black level is taken into account.
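The Clip operation itself is simple to express. The function name `clip_to_bit_width` is hypothetical; the default of 960 follows the black-level example above:

```python
import numpy as np

def clip_to_bit_width(pixels, max_val=960):
    # Clip operation from the disclosure: values below 0 are set to
    # 0, values above Max are set to Max. Max may be 1024 for 10-bit
    # data, or 1024 - 64 = 960 when a black level of 64 is considered.
    return np.clip(pixels, 0, max_val)
```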
After generating the third pyramid image, the electronic device can determine a fusion weight of the first pyramid image and the second pyramid image from the third pyramid image.
First, the electronic device may mean filter the third pyramid image (the size of the filter kernel is, for example, 5 × 5) to obtain a fourth pyramid image.
Next, the electronic device may determine the difference between the maximum value corresponding to the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first luminance weight map. That is, the fourth pyramid image is subtracted from MAX, and the result of the subtraction is the first luminance weight map. MAX is a matrix whose size is identical to that of the fourth pyramid image, and each of its values is the Max described above (e.g., 1024).
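The mean filtering followed by the MAX subtraction can be sketched as below. The function name `first_luminance_weight_map` is hypothetical; the 5×5 kernel comes from the example above, while edge padding at the borders is an assumption of this sketch:

```python
import numpy as np

def first_luminance_weight_map(third_pyramid, max_val=1024, k=5):
    # Mean-filter the brightness-aligned, clipped third pyramid image
    # with a k x k kernel to obtain the fourth pyramid image, then
    # subtract it from MAX. Bright (possibly overexposed) regions
    # therefore receive small luminance weights.
    pad = k // 2
    padded = np.pad(third_pyramid.astype(np.float64), pad, mode="edge")
    h, w = third_pyramid.shape
    fourth = np.zeros((h, w), dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            fourth += padded[dy:dy + h, dx:dx + w]
    fourth /= k * k              # mean filter result
    return max_val - fourth      # MAX minus the fourth pyramid image
```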
The electronic device can then determine a fusion weight of the first pyramid image and the second pyramid image using the first luminance weight map.
The electronic device may map the pixel point weight values in the first luminance weight map to a predetermined bit width range to obtain a second luminance weight map. Specifically, the electronic device may determine whether the pixel point weight values in the first luminance weight map already fall within the predetermined bit width range; if so, no processing is performed, and the second luminance weight map is the first luminance weight map; if not, the pixel point weight values are mapped to the predetermined bit width range to obtain the second luminance weight map. For example, if the predetermined bit width range is 8 bits (i.e., 0-255) and the pixel point weight values are 10 bits (0-1023), the electronic device can map the 10-bit data to 8 bits using a mapping curve. The mapping curve is a pre-constructed curve whose horizontal axis spans the 10-bit range and whose vertical axis spans the 8-bit range; the construction process and the specific form of the curve are not limited by the present disclosure. In addition, the mapping curve here is specific to the luminance weight map; weight maps of different types may use different mapping curves in the embodiments of the present disclosure.
To facilitate subsequent weighting calculation, the electronic device may then perform normalization on the second luminance weight map, and determine the result after the normalization as the fusion weight of the first pyramid image and the second pyramid image.
A third embodiment of the present disclosure to determine fusion weights:
the electronic device may adjust the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image to generate a third pyramid image. This process is the same as that in the second embodiment, and is not described again.
After generating the third pyramid image, the electronic device may determine a maximum value of pixel values in a bayer (bayer) unit to which each pixel point in the third pyramid image belongs, and generate a first local maximum weight map from the maximum value corresponding to each pixel point. That is to say, the maximum value of the pixel value in the bayer unit to which the pixel point belongs may be used to replace the pixel value of the pixel point, and all the pixel points are traversed, so as to obtain the first local maximum value weight map.
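The per-pixel replacement by the bayer-unit maximum can be sketched as follows. The function name `first_local_max_weight_map` is hypothetical, and the sketch assumes each bayer unit is an aligned, non-overlapping 2×2 block:

```python
import numpy as np

def first_local_max_weight_map(img):
    # Replace every pixel value with the maximum pixel value of the
    # 2x2 bayer unit the pixel belongs to; traversing all pixels
    # yields the first local maximum weight map.
    h, w = img.shape
    out = np.empty_like(img)
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            out[y:y + 2, x:x + 2] = img[y:y + 2, x:x + 2].max()
    return out
```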
Next, the electronic device may determine a fusion weight of the first pyramid image and the second pyramid image using the first local maximum weight map.
The electronic device may map the pixel point weight values in the first local maximum weight map to a predetermined bit width range to obtain a second local maximum weight map. Specifically, the electronic device may determine whether the pixel point weight values in the first local maximum weight map already fall within the predetermined bit width range; if so, no processing is performed, and the second local maximum weight map is the first local maximum weight map; if not, the pixel point weight values are mapped to the predetermined bit width range to obtain the second local maximum weight map. For example, if the predetermined bit width range is 8 bits (i.e., 0-255) and the pixel point weight values are 10 bits (0-1023), the electronic device can map the 10-bit data to 8 bits using a mapping curve. The mapping curve is a pre-constructed curve whose horizontal axis spans the 10-bit range and whose vertical axis spans the 8-bit range; the construction process and the specific form of the curve are not limited by the present disclosure. In addition, the mapping curve here is specific to the local maximum weight map; weight maps of different types may use different mapping curves in the embodiments of the present disclosure.
To facilitate subsequent weighting calculation, the electronic device may then perform normalization processing on the second local maximum weight map, and determine a result after the normalization processing as a fusion weight of the first pyramid image and the second pyramid image.
A fourth embodiment of the present disclosure to determine fusion weights:
the electronic device may adjust the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image to generate a third pyramid image. This process is the same as that in the second embodiment, and is not described again.
On one hand, the electronic device may perform mean filtering on the third pyramid image to obtain a fourth pyramid image, and determine the difference between the maximum value corresponding to the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first luminance weight map. The pixel point weight values in the first luminance weight map are then mapped to the predetermined bit width range to obtain a second luminance weight map. This process is the same as that in the second embodiment, and is not described again.
On the other hand, the electronic device may determine a maximum value of pixel values in a bayer unit to which each pixel point in the third pyramid image belongs, and generate the first local maximum value weight map from the maximum value of pixel values in the bayer unit to which each pixel point in the third pyramid image belongs. And mapping the pixel point weight value in the first local maximum value weight map to the preset bit width range to obtain a second local maximum value weight map. This process is the same as that in the third embodiment, and is not described again.
Then, the electronic device may multiply the second luminance weight map by the second local maximum weight map, normalize the result of the multiplication, and determine the result of the normalization as the fusion weight of the first pyramid image and the second pyramid image.
A fifth embodiment of the present disclosure to determine the fusion weight:
the electronic device may adjust the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image to generate a third pyramid image. This process is the same as that in the second embodiment, and is not described again.
After generating the third pyramid image, the electronic device can determine a fusion weight of the first pyramid image and the second pyramid image from the first pyramid image, the second pyramid image, and the third pyramid image.
Specifically, the electronic device may determine a difference in pixel value between each pixel point in the first pyramid image and a pixel point corresponding to a position in the second pyramid image to generate a first difference weight map. This process is the same as that in the first embodiment, and is not described again.
The electronic device can determine a fusion weight of the first pyramid image and the second pyramid image from the first difference weight map and the third pyramid image.
On one hand, the electronic device may perform mean filtering on the third pyramid image to obtain a fourth pyramid image, and determine a difference between a bit width of the first pyramid image and a pixel value of each pixel point in the fourth pyramid image to obtain a first luminance weight map. And mapping the pixel point weight in the first brightness weight map to a preset bit width range to obtain a second brightness weight map. This process is the same as that in the second embodiment, and is not described again.
On the other hand, the electronic device may map the pixel point weight value in the first difference weight map to the predetermined bit width range to obtain a second difference weight map. This process is the same as that in the first embodiment, and is not described again.
Then, the electronic device may multiply the second luminance weight map by the second difference weight map, normalize the result of the multiplication, and determine the result of the normalization as the fusion weight of the first pyramid image and the second pyramid image.
A sixth embodiment of the present disclosure to determine fusion weights:
the electronic device may adjust the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image to generate a third pyramid image. This process is the same as that in the second embodiment, and is not described again.
After generating the third pyramid image, the electronic device can determine a fusion weight of the first pyramid image and the second pyramid image from the first pyramid image, the second pyramid image, and the third pyramid image.
Specifically, the electronic device may determine a difference in pixel value between each pixel point in the first pyramid image and a pixel point corresponding to a position in the second pyramid image to generate a first difference weight map. This process is the same as that in the first embodiment, and is not described again.
The electronic device can determine a fusion weight of the first pyramid image and the second pyramid image from the first difference weight map and the third pyramid image.
In one aspect, the electronic device may determine a maximum value of pixel values in a bayer cell to which each pixel point in the third pyramid image belongs, and generate the first local maximum weight map from the maximum value of pixel values in the bayer cell to which each pixel point in the third pyramid image belongs. And mapping the weight value of the pixel point in the first local maximum value weight map to the preset bit width range to obtain a second local maximum value weight map. This process is the same as that in the third embodiment, and is not described again.
On the other hand, the electronic device may map the weight value of the pixel point in the first difference weight map to the predetermined bit width range to obtain a second difference weight map. This process is the same as that in the first embodiment, and is not described again.
Then, the electronic device may multiply the second local maximum weight map by the second difference weight map, normalize the result of the multiplication, and determine the result of the normalization as a fusion weight of the first pyramid image and the second pyramid image.
A seventh embodiment of the present disclosure to determine the fusion weight:
the electronic device may adjust the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image to generate a third pyramid image. This process is the same as that in the second embodiment, and is not described again.
After generating the third pyramid image, the electronic device can determine a fusion weight of the first pyramid image and the second pyramid image from the first pyramid image, the second pyramid image, and the third pyramid image.
Specifically, the electronic device may determine a difference in pixel value between each pixel point in the first pyramid image and a pixel point corresponding to a position in the second pyramid image to generate a first difference weight map. This process is the same as that in the first embodiment, and is not described again.
The electronic device can determine a fusion weight of the first pyramid image and the second pyramid image from the first difference weight map and the third pyramid image.
On one hand, the electronic device may perform mean filtering on the third pyramid image to obtain a fourth pyramid image, and determine a difference between a bit width of the first pyramid image and a pixel value of each pixel point in the fourth pyramid image to obtain a first luminance weight map. And mapping the pixel point weight in the first brightness weight map to a preset bit width range to obtain a second brightness weight map. This process is the same as that in the second embodiment, and is not described again.
On the other hand, the electronic device may determine a maximum value of pixel values in a bayer unit to which each pixel point in the third pyramid image belongs, and generate the first local maximum value weight map from the maximum value of pixel values in the bayer unit to which each pixel point in the third pyramid image belongs. And mapping the pixel point weight value in the first local maximum value weight map to the preset bit width range to obtain a second local maximum value weight map. This process is the same as that in the third embodiment, and is not described again.
In another aspect, the electronic device may map the weight value of the pixel point in the first difference weight map to the predetermined bit width range to obtain a second difference weight map. This process is the same as that in the first embodiment, and is not described again.
Then, the electronic device may multiply the second difference weight map, the second luminance weight map, and the second local maximum value weight map, normalize the multiplication result, and determine the normalization result as a fusion weight of the first pyramid image and the second pyramid image.
Referring to fig. 6, in the process of determining the fusion weight of the first pyramid image and the second pyramid image, one or more of the pixel difference weight, the luminance weight, and the local maximum weight may be combined by multiplying them together to determine an intermediate weight, and the intermediate weight is then normalized to obtain the final fusion weight.
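The multiply-then-normalize combination can be sketched as below. The function name `combined_fusion_weight` is hypothetical, and normalizing by the peak value of the product is an assumption — the disclosure only requires that the intermediate weight be normalized before use:

```python
import numpy as np

def combined_fusion_weight(diff_w, lum_w, local_max_w):
    # Multiply the bit-width-mapped weight maps elementwise to form
    # the intermediate weight, then normalize it into [0, 1].
    intermediate = diff_w * lum_w * local_max_w
    peak = intermediate.max()
    return intermediate / peak if peak > 0 else intermediate
```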
It should be understood that any one of the above seven embodiments may be implemented to determine the fusion weight. The fusion weight determined by the present disclosure in the above manner is the fusion weight of the first pyramid image, denoted as w1. The fusion weight of the second pyramid image is denoted as w2, where w2 = 1 - w1. If the first pyramid image is denoted as L1 and the second pyramid image is denoted as L2, the fusion result L' of the first pyramid image and the second pyramid image can be expressed as formula 2:
L' = L1 × w1 + L2 × w2 (formula 2)
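Formula 2 maps directly to a per-layer weighted blend. The function name `fuse_pyramid_layer` is hypothetical:

```python
import numpy as np

def fuse_pyramid_layer(l1, l2, w1):
    # Formula 2: L' = L1 * w1 + L2 * w2, with w2 = 1 - w1.
    # w1 may be a scalar or a per-pixel weight map the same size
    # as the pyramid images.
    w1 = np.asarray(w1, dtype=np.float64)
    return l1 * w1 + l2 * (1.0 - w1)
```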
And traversing each layer of pyramid image of the Gaussian pyramid to perform the fusion process of the first pyramid image and the second pyramid image, thereby obtaining the fused Gaussian pyramid.
And S56, fusing the pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid.
For the laplacian pyramid of each image to be fused, the electronic device may fuse the pyramid images of each layer by layer to obtain a fused laplacian pyramid.
For example, the laplacian pyramid of the image A to be fused may include a first-layer pyramid image a21, a second-layer pyramid image a22, a third-layer pyramid image a23, and a fourth-layer pyramid image a24, and the laplacian pyramid of the image B to be fused may include a first-layer pyramid image b21, a second-layer pyramid image b22, a third-layer pyramid image b23, and a fourth-layer pyramid image b24. Fusing pyramid images located at the same layer in the laplacian pyramids in the present disclosure refers to fusing a21 with b21, a22 with b22, a23 with b23, and a24 with b24, respectively.
The fusion process of the pyramid images for the laplacian pyramid is similar to the fusion process of the first pyramid image and the second pyramid image described in step S54. That is to say, in the fusion process of the pyramid images of the laplacian pyramid, the fusion weight can be determined by one of the seven embodiments for determining the fusion weight, and the fusion of the pyramid images of the laplacian pyramid is realized by combining the fusion weight. And will not be described in detail herein.
Furthermore, it is understood that the present disclosure does not limit the execution order of step S54 and step S56. That is, step S56 may be performed first and then step S54 may be performed, or both steps may be performed simultaneously.
And S58, generating fused images corresponding to the plurality of images to be fused by using the fused Gaussian pyramid and the fused Laplacian pyramid.
After determining the fused gaussian pyramid and the fused laplacian pyramid, the electronic device may fuse the fused gaussian pyramid and the fused laplacian pyramid, and generate a fused image corresponding to the multiple images to be fused through upsampling and gaussian kernel calculation as shown in fig. 2, so as to obtain an image with improved quality corresponding to the multiple original images.
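The reconstruction step can be sketched as follows. The function name `collapse_pyramids` is hypothetical, and nearest-neighbour upsampling is used for brevity — the disclosure upsamples with a Gaussian kernel as shown in fig. 2:

```python
import numpy as np

def collapse_pyramids(fused_gaussian_top, fused_laplacians):
    # Reconstruction sketch: start from the coarsest fused Gaussian
    # level, repeatedly upsample by 2 and add the fused Laplacian
    # detail of the next finer level.
    img = fused_gaussian_top
    for lap in reversed(fused_laplacians):   # list ordered finest-first
        img = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1) + lap
    return img
```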
The following describes the entire process of the image processing method according to the embodiment of the present disclosure with reference to fig. 7 by taking two input images as an example.
In step S702, the electronic device acquires a long-exposure raw image and a short-exposure raw image, and performs luminance alignment on the long-exposure raw image and the short-exposure raw image.
In step S704, the electronic device may perform pyramid decomposition on the image after the brightness alignment to obtain a gaussian pyramid and a laplacian pyramid.
In step S706, the electronic device may perform Clip processing on the luminance-aligned pyramid image corresponding to the short-exposure image to generate a short-frame luminance map having a bit width consistent with that of the long-exposure image.
In step S708, the electronic device can calculate a fusion weight in conjunction with the short frame luminance map. For example, the dimension formed by the fusion weight includes the pixel difference weight, the luminance weight, and the local maximum weight, and the three are multiplied and normalized to obtain the fusion weight.
In step S710, the electronic device may perform pyramid image fusion layer by layer according to the corresponding fusion weights of the pyramid images of different layers.
In step S712, the electronic device may perform image reconstruction using the gaussian pyramid and the laplacian pyramid generated by layer-by-layer fusion to generate fused images corresponding to the long-exposure raw image and the short-exposure raw image acquired in step S702.
In the image processing method, the Gaussian pyramid and the Laplacian pyramid are each fused layer by layer. On one hand, a fusion result with better detail expression and smooth transitions can be obtained, achieving a high dynamic range and eliminating ghosting (ghost) artifacts; on the other hand, the fusion weight construction strategy combining the pixel difference, the luminance, and the local maximum value can effectively detect moving objects and balance the luminance effect, further improving the image quality of the fused image.
In addition, although the image processing procedure is mainly directed to raw domain images, it should be noted that the pyramid layer-by-layer fusion concept and the fusion weight determination concept of the present disclosure may also be applied to RGB and RYB images, for example, and the present disclosure is not limited thereto.
In the image processing process, the process of layer-by-layer fusion is performed on both the gaussian pyramid and the laplacian pyramid. However, in other embodiments of the present disclosure, the layer-by-layer fusion process may also be performed on one of the gaussian pyramid and the laplacian pyramid.
Further, the present disclosure also provides another image processing method.
Fig. 8 schematically shows a flowchart of an image processing method according to another embodiment of the present disclosure. Referring to fig. 8, the image processing method may include the steps of:
s82, acquiring a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain the Gaussian pyramid of each image to be fused.
The process of obtaining the gaussian pyramid of each image to be fused and each image to be fused is the same as that in step S52, and is not described herein again.
And S84, fusing the pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid.
The process of layer-by-layer fusing the gaussian pyramid is the same as the process of step S54, and is not described herein again.
S86, determining the Laplacian pyramid of the target image to be fused in the plurality of images to be fused.
In an exemplary embodiment of the present disclosure, the target image to be fused may be any one of the images to be fused; it may be the image whose original image has the maximum, minimum, or medium exposure degree among the images to be fused; or it may be the image with the highest image quality among the images to be fused, where the image quality may be determined by one or more evaluation indexes among signal-to-noise ratio, underexposure degree, overexposure degree, and contrast. The present disclosure does not limit the manner in which the target image to be fused is determined.
In the case that the gaussian pyramid of the target image to be fused is determined in step S82, the laplacian pyramid of the target image to be fused can be obtained by using the gaussian pyramid of the target image to be fused. This process will not be described in detail.
And S88, generating fused images corresponding to the multiple images to be fused by using the fused Gaussian pyramid and the Laplacian pyramid of the target image to be fused.
Under the condition that the fused gaussian pyramid and the laplacian pyramid of the target image to be fused are determined, the electronic device can generate fused images corresponding to the multiple images to be fused by using the fused gaussian pyramid and the laplacian pyramid.
In the image processing method from step S82 to step S88, the Gaussian pyramids are fused layer by layer, so that not only can a fused image with good detail expression and smooth transition be obtained, but also low-frequency noise can be effectively reduced, and image quality is improved.
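As an illustration of steps S82 to S88, the following sketch builds simplified Gaussian pyramids, fuses them layer by layer, and reconstructs the result with the Laplacian pyramid of the target image. Every detail here is an assumption rather than the patented implementation: 2x2 block averaging stands in for a proper Gaussian blur and subsampling, nearest-neighbour upsampling stands in for a Gaussian pyr-up, and a plain per-layer average stands in for the weighted fusion of step S84.

```python
import numpy as np

def pyr_down(img):
    # Halve resolution by averaging 2x2 blocks (stand-in for Gaussian
    # blur + subsample).
    h, w = img.shape
    return img[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def pyr_up(img, shape):
    # Nearest-neighbour upsample back to `shape`.
    up = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return up[:shape[0], :shape[1]]

def gaussian_pyramid(img, levels):
    pyr = [img.astype(np.float64)]
    for _ in range(levels - 1):
        pyr.append(pyr_down(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gauss = gaussian_pyramid(img, levels)
    lap = [gauss[i] - pyr_up(gauss[i + 1], gauss[i].shape)
           for i in range(levels - 1)]
    lap.append(gauss[-1])  # top level keeps the residual low-pass layer
    return lap

def fuse_s82_to_s88(images, target_idx, levels=3):
    # S82/S84: decompose each image and average the Gaussian pyramids
    # per layer (placeholder for the weighted fusion of step S84).
    pyrs = [gaussian_pyramid(im, levels) for im in images]
    fused_gauss = [np.mean([p[l] for p in pyrs], axis=0) for l in range(levels)]
    # S86: Laplacian pyramid of the target image to be fused.
    lap = laplacian_pyramid(images[target_idx], levels)
    # S88: start from the top fused Gaussian layer and re-add the
    # target's Laplacian detail while upsampling.
    out = fused_gauss[-1]
    for l in range(levels - 2, -1, -1):
        out = pyr_up(out, fused_gauss[l].shape) + lap[l]
    return out
```

With identical input images the pipeline reduces to an exact Laplacian reconstruction, which is a convenient sanity check for the pyramid operators.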
Furthermore, the present disclosure also provides another image processing method.
Fig. 9 schematically shows a flowchart of an image processing method according to still another embodiment of the present disclosure. Referring to fig. 9, the image processing method may include the steps of:
S92, obtaining a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain the Laplacian pyramid of each image to be fused.
The process of obtaining the Laplacian pyramid of each image to be fused is the same as that in step S52, and is not repeated herein.
And S94, fusing the pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid.
The process of performing layer-by-layer fusion on the Laplacian pyramids is the same as the process of step S56, i.e., similar to the process of fusing the first pyramid image and the second pyramid image in step S54, and is not repeated herein.
And S96, determining a Gaussian pyramid of the target image to be fused in the plurality of images to be fused.
In an exemplary embodiment of the present disclosure, the target image to be fused may be any one of the images to be fused; it may be the image corresponding to the maximum, minimum, or medium original exposure degree among the images to be fused; or it may be the image with the highest image quality among the images to be fused, where the image quality may be determined by one or more evaluation indexes among signal-to-noise ratio, underexposure degree, overexposure degree, and contrast. The present disclosure does not limit the manner in which the target image to be fused is determined.
The process of determining the Gaussian pyramid of the target image to be fused is similar to the related description in step S52, and is not repeated here.
And S98, generating a fused image corresponding to the plurality of images to be fused by utilizing the Gaussian pyramid of the target image to be fused and the fused Laplacian pyramid.
Under the condition that the fused Laplacian pyramid and the Gaussian pyramid of the target image to be fused are determined, the electronic device can generate the fused image corresponding to the multiple images to be fused by using the fused Laplacian pyramid and the Gaussian pyramid of the target image to be fused.
In the image processing method from step S92 to step S98, by performing layer-by-layer fusion on the Laplacian pyramids, not only can a fused image with good detail expression and smooth transition be obtained, but also high-frequency noise can be effectively reduced, and image quality is improved.
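The reconstruction of step S98 can be sketched in the same spirit: starting from the top layer of the target image's Gaussian pyramid, the fused Laplacian detail is added back layer by layer while upsampling. The nearest-neighbour `up` operator, the fine-to-coarse list ordering, and the function name are simplifying assumptions; the patent does not fix these details.

```python
import numpy as np

def up(img, shape):
    # Nearest-neighbour upsample to `shape` (stand-in for a proper
    # Gaussian pyr-up).
    u = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return u[:shape[0], :shape[1]]

def reconstruct_s98(target_gauss_top, fused_laplacian):
    # S98: start from the top layer of the target image's Gaussian
    # pyramid and re-add the fused Laplacian detail layer by layer.
    # `fused_laplacian` is ordered fine -> coarse, coarsest excluded.
    out = target_gauss_top
    for lap in reversed(fused_laplacian):
        out = up(out, lap.shape) + lap
    return out
```

When the fused Laplacian pyramid comes from a single image, the routine reduces to an exact pyramid reconstruction of that image.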
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, an image processing apparatus is also provided in the present exemplary embodiment.
Fig. 10 schematically shows a block diagram of an image processing apparatus of an exemplary embodiment of the present disclosure. Referring to fig. 10, the image processing apparatus 10 according to an exemplary embodiment of the present disclosure may include an image decomposition module 101, a first fusion module 103, a second fusion module 105, and an image generation module 107.
Specifically, the image decomposition module 101 may be configured to obtain a plurality of images to be fused in the same shooting scene, and decompose each image to be fused to obtain a Gaussian pyramid and a Laplacian pyramid of each image to be fused; the first fusion module 103 may be configured to fuse pyramid images located in the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid; the second fusion module 105 may be configured to fuse pyramid images located in the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid; the image generation module 107 may be configured to generate a fused image corresponding to the multiple images to be fused by using the fused Gaussian pyramid and the fused Laplacian pyramid.
According to an exemplary embodiment of the present disclosure, the process of the image decomposition module 101 acquiring a plurality of images to be fused of the same shooting scene may be configured to perform: acquiring a plurality of original images acquired based on different exposure degrees in the same shooting scene; and aligning the brightness of the plurality of original images by utilizing the proportional relation of the exposure degrees among the original images to obtain a plurality of images to be fused of the same shooting scene.
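The brightness alignment described above can be sketched as a per-image gain derived from the exposure ratios. Treating the exposure degree as a scalar exposure value, scaling every image toward the longest exposure, and the function name are all assumptions; the patent only states that the proportional relation of the exposure degrees is used, and a real pipeline would also clip to the sensor bit width.

```python
import numpy as np

def align_brightness(originals, exposures):
    # Scale each original by the ratio between the reference (longest)
    # exposure and its own exposure, so all images share one brightness.
    ref = max(exposures)
    aligned = []
    for img, ev in zip(originals, exposures):
        aligned.append(np.asarray(img, dtype=np.float64) * (ref / ev))
    return aligned
```

For example, a frame exposed for 1/120 s gets a 4x gain relative to a 1/30 s reference frame.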
According to an exemplary embodiment of the disclosure, the plurality of images to be fused include a first image to be fused and a second image to be fused, the exposure degree of the original image corresponding to the first image to be fused is greater than the exposure degree of the original image corresponding to the second image to be fused, the Gaussian pyramid of the first image to be fused includes a first pyramid image, and an image in the Gaussian pyramid of the second image to be fused, which is located on the same layer as the first pyramid image, is a second pyramid image. In this case, the process of the first fusion module 103 fusing the first pyramid image with the second pyramid image may be configured to perform: determining a fusion weight of the first pyramid image and the second pyramid image; and fusing the first pyramid image with the second pyramid image using the fusion weight.
According to an exemplary embodiment of the disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image may be configured to perform: determining a difference between the first pyramid image and the second pyramid image, and determining a fusion weight of the first pyramid image and the second pyramid image according to the difference.
According to an exemplary embodiment of the disclosure, the first fusion module 103 may be configured to perform: determining the difference of pixel values between each pixel point in the first pyramid image and a pixel point corresponding to the position in the second pyramid image to generate a first difference weight map; determining a fusion weight of the first pyramid image and the second pyramid image using the first difference weight map.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image using the first difference weight map may be configured to perform: mapping the pixel point weight value in the first difference weight map to a preset bit width range to obtain a second difference weight map; and carrying out normalization processing on the second difference weight graph, and determining the normalization processing result as the fusion weight of the first pyramid image and the second pyramid image.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image may be configured to perform: adjusting the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image so as to generate a third pyramid image; determining a fusion weight of the first pyramid image and the second pyramid image according to the third pyramid image.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image according to the third pyramid image may be configured to perform: performing mean filtering on the third pyramid image to obtain a fourth pyramid image; determining a fusion weight of the first pyramid image and the second pyramid image according to the fourth pyramid image.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image according to the fourth pyramid image may be configured to perform: determining the difference value between the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first brightness weight map; determining a fusion weight of the first pyramid image and the second pyramid image using the first luminance weight map.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image using the first luminance weight map may be configured to perform: mapping the weighted value of the pixel point in the first brightness weight map to a preset bit width range to obtain a second brightness weight map; and performing normalization processing on the second brightness weight graph, and determining the normalization processing result as the fusion weight of the first pyramid image and the second pyramid image.
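The chain described above (bit-width alignment to a third pyramid image, mean filtering to a fourth pyramid image, bit-width-minus-pixel-value as the first brightness weight map, then normalisation) can be sketched as below. The 3x3 kernel, the left-shift bit-width alignment, and the linear normalisation are assumptions not fixed by the text.

```python
import numpy as np

def brightness_fusion_weight(second, first_bit_width=12, second_bit_width=10, k=3):
    # Third pyramid image: shift the second image up to the first
    # image's bit width (left-shift alignment is an assumption).
    third = second.astype(np.int64) << (first_bit_width - second_bit_width)
    # Fourth pyramid image: k x k mean filter with edge padding.
    pad = k // 2
    padded = np.pad(third, pad, mode='edge')
    h, w = third.shape
    fourth = np.zeros_like(third, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            fourth += padded[dy:dy + h, dx:dx + w]
    fourth /= k * k
    # First brightness weight map: difference between the maximum value
    # at the first image's bit width and each filtered pixel value.
    max_val = (1 << first_bit_width) - 1
    first_brightness = np.clip(max_val - fourth, 0, max_val)
    # Normalised weight in [0, 1].
    return first_brightness / max_val
```

Dark regions of the short-exposure image thus get weights near 1, and saturated regions get weights near 0.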
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image according to the third pyramid image may be configured to perform: determining the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image, and generating a first local maximum value weight map according to the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image; determining a fusion weight of the first pyramid image and the second pyramid image using the first local maximum value weight map.
According to an exemplary embodiment of the disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image using the first local maximum value weight map may be configured to perform: mapping the pixel point weight value in the first local maximum value weight map to a preset bit width range to obtain a second local maximum value weight map; and carrying out normalization processing on the second local maximum value weight map, and determining the normalization processing result as the fusion weight of the first pyramid image and the second pyramid image.
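A minimal sketch of the first local maximum value weight map follows: every pixel is replaced by the maximum pixel value inside its Bayer unit, and the map is then normalised. A 2x2 Bayer unit (e.g. RGGB), even image dimensions, and division by the bit-width maximum are assumptions; the patent only says the per-unit maximum is used.

```python
import numpy as np

def bayer_local_max_weight(third, bit_width=12):
    # Group pixels into 2x2 Bayer units and take each unit's maximum.
    h, w = third.shape
    units = third[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    local_max = units.max(axis=(1, 3))
    # Broadcast the unit maximum back to every pixel of the unit.
    weight = np.repeat(np.repeat(local_max, 2, axis=0), 2, axis=1)
    # Normalise by the maximum value at the given bit width.
    return weight / ((1 << bit_width) - 1)
```

Because the four pixels of a Bayer unit share one weight, the map changes smoothly across color channels within each unit.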
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image according to the third pyramid image may be configured to perform: performing mean filtering on the third pyramid image to obtain a fourth pyramid image; determining the difference value between the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first brightness weight map; determining the maximum value of the pixel value in the Bayer unit to which each pixel point in the third pyramid image belongs, and generating a first local maximum value weight map according to the maximum value of the pixel value in the Bayer unit to which each pixel point in the third pyramid image belongs; determining a fusion weight of the first pyramid image and the second pyramid image using the first brightness weight map and the first local maximum value weight map.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image using the first brightness weight map and the first local maximum value weight map may be configured to perform: mapping the pixel point weight value in the first brightness weight map to a preset bit width range to obtain a second brightness weight map; mapping the pixel point weight value in the first local maximum value weight map to a preset bit width range to obtain a second local maximum value weight map; and multiplying the second brightness weight map by the second local maximum value weight map, carrying out normalization processing on the multiplication result, and determining the normalization processing result as the fusion weight of the first pyramid image and the second pyramid image.
According to an exemplary embodiment of the disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image may be configured to perform: adjusting the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image so as to generate a third pyramid image; determining a fusion weight of the first pyramid image and the second pyramid image according to the first pyramid image, the second pyramid image and the third pyramid image.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image according to the first pyramid image, the second pyramid image and the third pyramid image may be configured to perform: determining the difference of pixel values between each pixel point in the first pyramid image and a pixel point corresponding to the position in the second pyramid image to generate a first difference weight map; and determining the fusion weight of the first pyramid image and the second pyramid image according to the first difference weight map and the third pyramid image.
According to an exemplary embodiment of the disclosure, the process of the first fusion module 103 determining the difference in pixel value between each pixel point in the first pyramid image and the pixel point corresponding to the position in the second pyramid image may be configured to perform: determining adjacent pixel points of a first pixel point in the first pyramid image, wherein the first pixel point and the adjacent pixel points of the first pixel point form a first pixel area; determining a second pixel point corresponding to the first pixel point in the second pyramid image, and determining an adjacent pixel point of the second pixel point, wherein the second pixel point and the adjacent pixel point of the second pixel point form a second pixel area; and determining the difference of the pixel values between the first pixel area and the second pixel area as the difference of the pixel values between the first pixel point and the second pixel point.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the difference in pixel values between the first pixel region and the second pixel region may be configured to perform: traversing each pixel point in the first pixel region, and determining the difference of pixel values between the pixel point in the first pixel region and the pixel point corresponding to the position in the second pixel region; and determining the difference of the pixel values between the first pixel area and the second pixel area according to the difference of the pixel values between the pixel points in the first pixel area and the pixel points corresponding to the positions in the second pixel area.
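The neighbourhood-based pixel difference of the two preceding paragraphs can be sketched as a sum of absolute differences over co-located regions. SAD is one plausible aggregation; the patent only says that the per-pixel differences within the two regions are combined into a region difference. The (2*radius+1)^2 window and edge padding are likewise assumptions.

```python
import numpy as np

def region_difference(first, second, radius=1):
    # For each pixel, sum the absolute differences between its region
    # in the first pyramid image and the co-located region in the
    # second pyramid image (edge-padded at the borders).
    a = np.pad(first.astype(np.int64), radius, mode='edge')
    b = np.pad(second.astype(np.int64), radius, mode='edge')
    h, w = first.shape
    k = 2 * radius + 1
    out = np.zeros((h, w), dtype=np.int64)
    for dy in range(k):
        for dx in range(k):
            out += np.abs(a[dy:dy + h, dx:dx + w] - b[dy:dy + h, dx:dx + w])
    return out
```

Compared with a plain per-pixel difference, the region sum suppresses isolated noisy pixels when the weight is derived from it.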
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image according to the first difference weight map and the third pyramid image may be configured to perform: performing mean filtering on the third pyramid image to obtain a fourth pyramid image; determining the difference value between the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first brightness weight map; determining a fusion weight of the first pyramid image and the second pyramid image using the first brightness weight map and the first difference weight map.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image using the first brightness weight map and the first difference weight map may be configured to perform: mapping the pixel point weight value in the first brightness weight map to a preset bit width range to obtain a second brightness weight map; mapping the pixel point weight value in the first difference weight map to a preset bit width range to obtain a second difference weight map; and multiplying the second brightness weight map by the second difference weight map, normalizing the multiplied result, and determining the normalized result as the fusion weight of the first pyramid image and the second pyramid image.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image according to the first difference weight map and the third pyramid image may be configured to perform: determining the maximum value of the pixel value in the Bayer unit to which each pixel point in the third pyramid image belongs, and generating a first local maximum value weight map according to the maximum value of the pixel value in the Bayer unit to which each pixel point in the third pyramid image belongs; determining a fusion weight of the first pyramid image and the second pyramid image using the first local maximum value weight map and the first difference weight map.
According to an exemplary embodiment of the disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image using the first local maximum value weight map and the first difference weight map may be configured to perform: mapping the pixel point weight value in the first local maximum value weight map to a preset bit width range to obtain a second local maximum value weight map; mapping the pixel point weight value in the first difference weight map to a preset bit width range to obtain a second difference weight map; and multiplying the second local maximum value weight map by the second difference weight map, normalizing the multiplied result, and determining the normalized result as the fusion weight of the first pyramid image and the second pyramid image.
According to an exemplary embodiment of the present disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image according to the first difference weight map and the third pyramid image may be configured to perform: performing mean filtering on the third pyramid image to obtain a fourth pyramid image; determining the difference value between the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first brightness weight map; determining the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image, and generating a first local maximum value weight map according to the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image; determining a fusion weight of the first pyramid image and the second pyramid image using the first difference weight map, the first brightness weight map, and the first local maximum value weight map.
According to an exemplary embodiment of the disclosure, the process of the first fusion module 103 determining the fusion weight of the first pyramid image and the second pyramid image using the first difference weight map, the first brightness weight map, and the first local maximum value weight map may be configured to perform: mapping the pixel point weight value in the first difference weight map to a preset bit width range to obtain a second difference weight map; mapping the pixel point weight value in the first brightness weight map to a preset bit width range to obtain a second brightness weight map; mapping the pixel point weight value in the first local maximum value weight map to a preset bit width range to obtain a second local maximum value weight map; and multiplying the second difference weight map, the second brightness weight map and the second local maximum value weight map, normalizing the multiplication result, and determining the normalization result as the fusion weight of the first pyramid image and the second pyramid image.
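The combination of the three weight maps described above might be realised as follows: each map is linearly scaled to the preset bit-width range, the scaled maps are multiplied element-wise, and the product is normalised to [0, 1]. The linear scaling used for the bit-width mapping and the max-based normalisation are assumptions.

```python
import numpy as np

def combine_weight_maps(diff_map, brightness_map, local_max_map, bit_width=10):
    max_val = (1 << bit_width) - 1

    def to_bits(m):
        # Linearly map a weight map onto [0, 2**bit_width - 1].
        m = m.astype(np.float64)
        return m * max_val / max(m.max(), 1e-12)

    # Element-wise product of the three mapped (second) weight maps.
    prod = to_bits(diff_map) * to_bits(brightness_map) * to_bits(local_max_map)
    # Normalise the product; the result is the fusion weight.
    return prod / max(prod.max(), 1e-12)
```

Multiplying the maps means a pixel must score well on all three criteria (difference, brightness, local maximum) to receive a large fusion weight.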
Further, another image processing apparatus is also provided in an exemplary embodiment of the present disclosure.
Fig. 11 schematically shows a block diagram of an image processing apparatus according to another exemplary embodiment of the present disclosure. Referring to fig. 11, the image processing apparatus 11 according to an exemplary embodiment of the present disclosure may include an image decomposition module 111, a hierarchical fusion module 113, a Laplacian pyramid determination module 115, and an image generation module 117.
Specifically, the image decomposition module 111 may be configured to obtain a plurality of images to be fused in the same shooting scene, and decompose each image to be fused to obtain a Gaussian pyramid of each image to be fused; the hierarchical fusion module 113 may be configured to fuse pyramid images located in the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid; the Laplacian pyramid determination module 115 may be configured to determine a Laplacian pyramid of a target image to be fused in the multiple images to be fused; the image generation module 117 may be configured to generate a fused image corresponding to the plurality of images to be fused by using the fused Gaussian pyramid and the Laplacian pyramid of the target image to be fused.
The process of layer-by-layer pyramid image fusion performed by the hierarchical fusion module 113 is similar to the process of fusing the first pyramid image and the second pyramid image by the first fusion module 103, and is not described herein again.
Further, still another image processing apparatus is provided in an exemplary embodiment of the present disclosure.
Fig. 12 schematically shows a block diagram of an image processing apparatus according to still another exemplary embodiment of the present disclosure. Referring to fig. 12, the image processing apparatus 12 according to an exemplary embodiment of the present disclosure may include an image decomposition module 121, a hierarchical fusion module 123, a Gaussian pyramid determination module 125, and an image generation module 127.
Specifically, the image decomposition module 121 may be configured to obtain multiple images to be fused in the same shooting scene, and decompose each image to be fused to obtain a Laplacian pyramid of each image to be fused; the hierarchical fusion module 123 may be configured to fuse pyramid images located in the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid; the Gaussian pyramid determination module 125 may be configured to determine a Gaussian pyramid of a target image to be fused in the multiple images to be fused; the image generation module 127 may be configured to generate a fused image corresponding to the plurality of images to be fused by using the Gaussian pyramid of the target image to be fused and the fused Laplacian pyramid.
The process of layer-by-layer pyramid image fusion performed by the hierarchical fusion module 123 is similar to the process of fusing the first pyramid image and the second pyramid image by the first fusion module 103, and is not described herein again.
Since each functional module of the image processing apparatus in the embodiment of the present disclosure is the same as that in the embodiment of the method described above, it is not described herein again.
Fig. 13 shows a schematic diagram of an electronic device suitable for use in implementing exemplary embodiments of the present disclosure. It should be noted that the electronic device shown in Fig. 13 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
The electronic device of the present disclosure includes at least a processor and a memory for storing one or more programs, which when executed by the processor, cause the processor to implement the image processing method of the exemplary embodiments of the present disclosure.
Specifically, as shown in Fig. 13, the electronic device 130 may include: processor 1310, internal memory 1321, external memory interface 1322, Universal Serial Bus (USB) interface 1330, charge management module 1340, power management module 1341, battery 1342, antenna 1, antenna 2, mobile communication module 1350, wireless communication module 1360, audio module 1370, sensor module 1380, display screen 1390, camera module 1391, indicator 1392, motor 1393, button 1394, Subscriber Identity Module (SIM) card interface 1395, and so forth. The sensor module 1380 may include a depth sensor, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
It is to be understood that the illustrated structure of the embodiments of the present disclosure does not constitute a specific limitation to the electronic device 130. In other embodiments of the present disclosure, the electronic device 130 may include more or fewer components than illustrated, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 1310 may include one or more processing units. For example, the processor 1310 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. Additionally, a memory may be provided in the processor 1310 for storing instructions and data.
The electronic device 130 may implement a camera function via the ISP, camera module 1391, video codec, GPU, display screen 1390, application processor, and so on. In some embodiments, the electronic device 130 may include 1 or N camera modules 1391, where N is a positive integer greater than 1. If the electronic device 130 includes N cameras, one of the N cameras is the primary camera.
Internal memory 1321 may be used to store computer-executable program code, including instructions. The internal memory 1321 may include a program storage area and a data storage area. The external memory interface 1322 may be used for connecting an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 130.
The present disclosure also provides a computer-readable storage medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer readable storage medium carries one or more programs which, when executed by one of the electronic devices, cause the electronic device to implement the method as described in the embodiments of the present disclosure.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes illustrated in the above figures are not intended to indicate or limit the temporal order of the processes. In addition, it is also readily understood that these processes may be performed, for example, synchronously or asynchronously in multiple modules.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (31)

1. An image processing method, comprising:
acquiring a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain a Gaussian pyramid and a Laplacian pyramid of each image to be fused;
fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid;
fusing pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid;
and generating fused images corresponding to the plurality of images to be fused by utilizing the fused Gaussian pyramid and the fused Laplacian pyramid.
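By way of illustration only (the claims recite no particular implementation), the four steps of claim 1 can be sketched in NumPy. The box-filter downsampling, nearest-neighbour upsampling, fixed level count, and constant blend weight `w` below are simplifying assumptions standing in for the Gaussian kernel and the per-pixel fusion weights addressed by the later claims:

```python
import numpy as np

def downsample(img):
    # stand-in for Gaussian blur + decimation: 2x2 box filter, stride 2
    h, w = img.shape
    return img[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(img, shape):
    # nearest-neighbour expansion back to the finer level's shape
    up = img.repeat(2, axis=0).repeat(2, axis=1)
    return up[:shape[0], :shape[1]]

def build_pyramids(img, levels):
    # Gaussian pyramid by repeated downsampling; Laplacian levels are the
    # residuals between each level and its upsampled coarser neighbour
    gauss = [img.astype(np.float64)]
    for _ in range(levels - 1):
        gauss.append(downsample(gauss[-1]))
    lap = [g - upsample(gn, g.shape) for g, gn in zip(gauss[:-1], gauss[1:])]
    return gauss, lap

def fuse_level(a, b, w):
    # same-layer fusion with a (here constant) weight
    return w * a + (1.0 - w) * b

def fuse_and_reconstruct(img_a, img_b, levels=3, w=0.5):
    ga, la = build_pyramids(img_a, levels)
    gb, lb = build_pyramids(img_b, levels)
    fused_top = fuse_level(ga[-1], gb[-1], w)              # fused Gaussian level
    fused_lap = [fuse_level(x, y, w) for x, y in zip(la, lb)]
    out = fused_top                                        # collapse the pyramid
    for lvl in reversed(fused_lap):
        out = upsample(out, lvl.shape) + lvl
    return out
```

With `w = 1.0` the sketch reconstructs the first input exactly, which is a convenient sanity check of the decompose-then-reconstruct round trip.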
2. The image processing method according to claim 1, wherein acquiring a plurality of images to be fused of the same shooting scene comprises:
acquiring a plurality of original images acquired based on different exposure degrees in the same shooting scene;
and carrying out brightness alignment on the plurality of original images by utilizing the proportional relation of the exposure degrees among the original images so as to obtain a plurality of images to be fused of the same shooting scene.
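As one illustrative reading of claim 2 (not the claimed implementation; the 10-bit ceiling and the choice of reference frame are assumptions introduced here), brightness alignment by the proportional relation of exposure degrees might look like:

```python
import numpy as np

def align_brightness(raws, exposures, ref_index=0, max_val=1023.0):
    """Scale each raw frame by the exposure ratio to a reference frame.

    `max_val` is a hypothetical sensor ceiling (10-bit here); values that
    would exceed it after scaling are clipped.
    """
    ref_exp = exposures[ref_index]
    aligned = []
    for img, exp in zip(raws, exposures):
        gain = ref_exp / exp  # proportional relation of exposure degrees
        aligned.append(np.clip(np.asarray(img, dtype=np.float64) * gain, 0.0, max_val))
    return aligned
```

After alignment, same-scene frames captured at different exposures occupy a common brightness scale, so their pyramid levels can be compared and fused directly.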
3. The method according to claim 2, wherein the plurality of images to be fused include a first image to be fused and a second image to be fused, the exposure degree of the original image corresponding to the first image to be fused is greater than the exposure degree of the original image corresponding to the second image to be fused, the Gaussian pyramid of the first image to be fused includes a first pyramid image, and an image in the same layer as the first pyramid image in the Gaussian pyramid of the second image to be fused is a second pyramid image; wherein fusing the first pyramid image with the second pyramid image comprises:
determining a fusion weight of the first pyramid image and the second pyramid image;
fusing the first pyramid image with the second pyramid image using the fusion weight.
4. The image processing method of claim 3, wherein determining the blending weight of the first pyramid image and the second pyramid image comprises:
determining a difference between the first pyramid image and the second pyramid image, and determining a fusion weight of the first pyramid image and the second pyramid image according to the difference.
5. The method of claim 4, wherein determining a difference between the first pyramid image and the second pyramid image, and determining a fusion weight for the first pyramid image and the second pyramid image based on the difference comprises:
determining the difference of pixel values between each pixel point in the first pyramid image and a pixel point corresponding to the position in the second pyramid image so as to generate a first difference weight map;
determining a fusion weight of the first pyramid image and the second pyramid image using the first difference weight map.
6. The method of claim 5, wherein determining the blending weight of the first pyramid image and the second pyramid image using the first difference weight map comprises:
mapping the weighted value of the pixel point in the first difference weight map to a preset bit width range to obtain a second difference weight map;
and normalizing the second difference weight graph, and determining the result of the normalization processing as the fusion weight of the first pyramid image and the second pyramid image.
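One plausible rendering of claims 5 and 6 (per-pixel absolute difference, mapping into a preset bit-width range, then normalisation) is sketched below; the min-max mapping and the 8-bit target range are assumptions, as the claims do not fix either choice:

```python
import numpy as np

def difference_weights(p1, p2, out_bits=8):
    # first difference weight map: absolute per-pixel difference
    diff = np.abs(p1.astype(np.float64) - p2.astype(np.float64))
    # second difference weight map: map into the preset bit-width range
    top = float(2 ** out_bits - 1)
    span = diff.max() - diff.min()
    second = (diff - diff.min()) / span * top if span > 0 else np.zeros_like(diff)
    # normalise to [0, 1] so the map can serve directly as a fusion weight
    return second / top
```

Large differences between the two same-layer pyramid images (for example, ghosting from scene motion) thus receive weights near 1, steering the fusion toward one frame in those regions.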
7. The image processing method of claim 3, wherein determining the blending weight of the first pyramid image and the second pyramid image comprises:
adjusting the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image so as to generate a third pyramid image;
determining a fusion weight of the first pyramid image and the second pyramid image according to the third pyramid image.
8. The method of claim 7, wherein determining the blending weight of the first pyramid image and the second pyramid image from the third pyramid image comprises:
performing mean filtering on the third pyramid image to obtain a fourth pyramid image;
determining a fusion weight of the first pyramid image and the second pyramid image according to the fourth pyramid image.
9. The method of claim 8, wherein determining the blending weight of the first pyramid image and the second pyramid image from the fourth pyramid image comprises:
determining the difference value between the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first brightness weight map;
determining a fusion weight of the first pyramid image and the second pyramid image using the first luminance weight map.
10. The method of claim 9, wherein determining the blending weight of the first pyramid image and the second pyramid image using the first luminance weight map comprises:
mapping the pixel point weight value in the first brightness weight map to a preset bit width range to obtain a second brightness weight map;
and normalizing the second brightness weight graph, and determining the result of the normalization processing as the fusion weight of the first pyramid image and the second pyramid image.
11. The method of claim 7, wherein determining the blending weight of the first pyramid image and the second pyramid image from the third pyramid image comprises:
determining the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image, and generating a first local maximum value weight map according to the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image;
determining a fusion weight of the first pyramid image and the second pyramid image using the first local maximum weight map.
12. The method according to claim 11, wherein determining the fusion weight of the first pyramid image and the second pyramid image using the first local maximum weight map comprises:
mapping the weight value of the pixel point in the first local maximum value weight map to a preset bit width range to obtain a second local maximum value weight map;
and carrying out normalization processing on the second local maximum value weight map, and determining the result of the normalization processing as the fusion weight of the first pyramid image and the second pyramid image.
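The first local maximum weight map of claims 11 and 12 can be illustrated as follows (a sketch only; the non-overlapping 2x2 grouping is an assumption about how Bayer units tile the image):

```python
import numpy as np

def bayer_local_max(img):
    """Per-pixel maximum over the 2x2 Bayer unit each pixel belongs to."""
    h, w = img.shape
    assert h % 2 == 0 and w % 2 == 0, "expects whole 2x2 Bayer units"
    units = img.reshape(h // 2, 2, w // 2, 2)
    m = units.max(axis=(1, 3))                    # one maximum per Bayer unit
    return m.repeat(2, axis=0).repeat(2, axis=1)  # broadcast back to per-pixel
```

Taking the maximum over a Bayer unit makes the weight insensitive to which colour channel a given pixel samples, so highlight regions are detected consistently across R, G, and B sites.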
13. The method of claim 7, wherein determining the blending weight of the first pyramid image and the second pyramid image from the third pyramid image comprises:
performing mean filtering on the third pyramid image to obtain a fourth pyramid image;
determining the difference value between the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first brightness weight map;
determining the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image, and generating a first local maximum value weight map according to the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image;
determining a fusion weight of the first pyramid image and the second pyramid image using the first luminance weight map and the first local maximum weight map.
14. The method according to claim 13, wherein determining the fusion weight of the first pyramid image and the second pyramid image using the first luminance weight map and the first local maximum weight map comprises:
mapping the pixel point weight value in the first brightness weight map to a preset bit width range to obtain a second brightness weight map;
mapping the weighted value of the pixel point in the first local maximum value weighted graph to the preset bit width range to obtain a second local maximum value weighted graph;
and multiplying the second brightness weight map by the second local maximum weight map, normalizing the multiplied result, and determining the normalized result as the fusion weight of the first pyramid image and the second pyramid image.
15. The image processing method of claim 3, wherein determining the blending weight of the first pyramid image and the second pyramid image comprises:
adjusting the pixel data of the second pyramid image to be consistent with the bit width of the first pyramid image so as to generate a third pyramid image;
determining a fusion weight of the first pyramid image and the second pyramid image according to the first pyramid image, the second pyramid image, and the third pyramid image.
16. The method of claim 15, wherein determining the fusion weight of the first pyramid image and the second pyramid image from the first pyramid image, the second pyramid image, and the third pyramid image comprises:
determining the difference of pixel values between each pixel point in the first pyramid image and a pixel point corresponding to the position in the second pyramid image to generate a first difference weight map;
determining a fusion weight of the first pyramid image and the second pyramid image according to the first difference weight map and the third pyramid image.
17. The image processing method of claim 5 or 16, wherein determining the difference in pixel value between each pixel point in the first pyramid image and the pixel point corresponding to the position in the second pyramid image comprises:
determining adjacent pixel points of a first pixel point in the first pyramid image, wherein the first pixel point and the adjacent pixel points of the first pixel point form a first pixel region;
determining a second pixel point corresponding to the position of the first pixel point in the second pyramid image, and determining adjacent pixel points of the second pixel point, wherein the second pixel point and the adjacent pixel points of the second pixel point form a second pixel region;
determining a difference in pixel values between the first pixel region and the second pixel region as a difference in pixel values between the first pixel point and the second pixel point.
18. The image processing method of claim 17, wherein determining the difference in pixel values between the first pixel region and the second pixel region comprises:
traversing each pixel point in the first pixel region, and determining the difference of pixel values between the pixel point in the first pixel region and the pixel point corresponding to the position in the second pixel region;
and determining the difference of the pixel values between the first pixel region and the second pixel region according to the difference of the pixel values between the pixel points in the first pixel region and the pixel points corresponding to the positions in the second pixel region.
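The region comparison of claims 17 and 18 amounts to a neighbourhood difference; one illustrative form (a sum of absolute differences over a square window, with edge padding — both assumptions, since the claims specify neither the aggregation nor the border handling) is:

```python
import numpy as np

def region_difference(img1, img2, radius=1):
    """Sum of absolute differences over the (2*radius+1)^2 neighbourhood
    of each pixel, used as that pixel's difference value."""
    assert img1.shape == img2.shape
    diff = np.abs(img1.astype(np.float64) - img2.astype(np.float64))
    padded = np.pad(diff, radius, mode='edge')  # replicate borders
    h, w = diff.shape
    k = 2 * radius + 1
    out = np.zeros_like(diff)
    for dy in range(k):          # accumulate the k*k shifted copies
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out
```

Comparing regions rather than single pixels suppresses sensor noise: an isolated noisy pixel contributes only one term out of nine to the window sum.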
19. The method of claim 16, wherein determining the fusion weight of the first pyramid image and the second pyramid image from the first difference weight map and the third pyramid image comprises:
performing mean filtering on the third pyramid image to obtain a fourth pyramid image;
determining the difference value between the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first brightness weight map;
determining a fusion weight of the first pyramid image and the second pyramid image using the first luminance weight map and the first difference weight map.
20. The method of claim 19, wherein determining the fusion weight of the first pyramid image and the second pyramid image using the first luminance weight map and the first difference weight map comprises:
mapping the weighted value of the pixel point in the first brightness weight map to a preset bit width range to obtain a second brightness weight map;
mapping the weighted value of the pixel point in the first difference weight map to the preset bit width range to obtain a second difference weight map;
and multiplying the second brightness weight map and the second difference weight map, normalizing the multiplied result, and determining the normalized result as the fusion weight of the first pyramid image and the second pyramid image.
21. The method of claim 16, wherein determining the blending weight of the first pyramid image and the second pyramid image from the first difference weight map and the third pyramid image comprises:
determining the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image, and generating a first local maximum value weight map according to the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image;
determining a fusion weight of the first pyramid image and the second pyramid image using the first local maximum weight map and the first difference weight map.
22. The method of claim 21, wherein determining the fusion weight of the first pyramid image and the second pyramid image using the first local maximum weight map and the first difference weight map comprises:
mapping the pixel point weight value in the first local maximum value weight map to a preset bit width range to obtain a second local maximum value weight map;
mapping the weighted value of the pixel point in the first difference weight map to the preset bit width range to obtain a second difference weight map;
and multiplying the second local maximum value weight map by the second difference weight map, normalizing the multiplied result, and determining the normalized result as the fusion weight of the first pyramid image and the second pyramid image.
23. The method of claim 16, wherein determining the blending weight of the first pyramid image and the second pyramid image from the first difference weight map and the third pyramid image comprises:
performing mean filtering on the third pyramid image to obtain a fourth pyramid image;
determining the difference value between the bit width of the first pyramid image and the pixel value of each pixel point in the fourth pyramid image to obtain a first brightness weight map;
determining the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image, and generating a first local maximum value weight map according to the maximum value of the pixel value in the Bayer unit to which each pixel point belongs in the third pyramid image;
determining a fusion weight of the first pyramid image and the second pyramid image using the first difference weight map, the first luminance weight map, and the first local maximum weight map.
24. The image processing method of claim 23, wherein determining the fusion weight of the first pyramid image and the second pyramid image using the first difference weight map, the first luminance weight map, and the first local maximum weight map comprises:
mapping the pixel point weight value in the first difference weight map to a preset bit width range to obtain a second difference weight map;
mapping the weighted value of the pixel point in the first brightness weighted graph to the preset bit width range to obtain a second brightness weighted graph;
mapping the weighted value of the pixel point in the first local maximum value weighted graph to the preset bit width range to obtain a second local maximum value weighted graph;
multiplying the second difference weight map, the second luminance weight map, and the second local maximum weight map, normalizing a result of the multiplication, and determining a result of the normalization as a fusion weight of the first pyramid image and the second pyramid image.
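The three-map combination of claim 24 can be sketched as an element-wise product followed by normalisation; reading "normalisation" as scaling by the maximum of the product is an assumption of this sketch, not a construction of the claim:

```python
import numpy as np

def combine_weights(diff_w, luma_w, localmax_w, eps=1e-12):
    """Multiply the difference, luminance, and local-maximum weight maps,
    then normalise the product into [0, 1]."""
    prod = diff_w * luma_w * localmax_w
    return prod / (prod.max() + eps)  # eps guards against an all-zero product
```

The resulting map can serve as the weight of one pyramid image, with `1 - weight` applied to its same-layer counterpart during fusion.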
25. An image processing method, comprising:
acquiring a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain a Gaussian pyramid of each image to be fused;
fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid;
determining a Laplacian pyramid of a target image to be fused in the plurality of images to be fused;
and generating fused images corresponding to the plurality of images to be fused by utilizing the fused Gaussian pyramid and the Laplacian pyramid of the target image to be fused.
26. An image processing method, comprising:
acquiring a plurality of images to be fused of the same shooting scene, and decomposing each image to be fused to obtain a Laplacian pyramid of each image to be fused;
fusing pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid;
determining a Gaussian pyramid of a target image to be fused in the plurality of images to be fused;
and generating fused images corresponding to the plurality of images to be fused by utilizing the Gaussian pyramid of the target image to be fused and the fused Laplacian pyramid.
27. An image processing apparatus characterized by comprising:
the image decomposition module is used for acquiring a plurality of images to be fused of the same shooting scene and decomposing each image to be fused to obtain a Gaussian pyramid and a Laplacian pyramid of each image to be fused;
the first fusion module is used for fusing pyramid images positioned on the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid;
the second fusion module is used for fusing pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid;
and the image generation module is used for generating fused images corresponding to the plurality of images to be fused by utilizing the fused Gaussian pyramid and the fused Laplacian pyramid.
28. An image processing apparatus characterized by comprising:
the image decomposition module is used for acquiring a plurality of images to be fused of the same shooting scene and decomposing each image to be fused to obtain a Gaussian pyramid of each image to be fused;
the hierarchical fusion module is used for fusing pyramid images positioned in the same layer in the Gaussian pyramid of each image to be fused to obtain a fused Gaussian pyramid;
the Laplacian pyramid determining module is used for determining a Laplacian pyramid of a target image to be fused in the plurality of images to be fused;
and the image generation module is used for generating fused images corresponding to the plurality of images to be fused by utilizing the fused Gaussian pyramid and the Laplacian pyramid of the target image to be fused.
29. An image processing apparatus characterized by comprising:
the image decomposition module is used for acquiring a plurality of images to be fused of the same shooting scene and decomposing each image to be fused to obtain a Laplacian pyramid of each image to be fused;
the hierarchical fusion module is used for fusing pyramid images positioned on the same layer in the Laplacian pyramid of each image to be fused to obtain a fused Laplacian pyramid;
the Gaussian pyramid determining module is used for determining a Gaussian pyramid of a target image to be fused in the plurality of images to be fused;
and the image generation module is used for generating fused images corresponding to the plurality of images to be fused by utilizing the Gaussian pyramid of the target image to be fused and the fused Laplacian pyramid.
30. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the image processing method according to any one of claims 1 to 26.
31. An electronic device, comprising:
a processor;
a memory for storing one or more programs which, when executed by the processor, cause the processor to implement the image processing method of any of claims 1 to 26.
CN202211144770.4A 2022-09-20 2022-09-20 Image processing method and device, computer readable storage medium and electronic device Pending CN115564694A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211144770.4A CN115564694A (en) 2022-09-20 2022-09-20 Image processing method and device, computer readable storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211144770.4A CN115564694A (en) 2022-09-20 2022-09-20 Image processing method and device, computer readable storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN115564694A true CN115564694A (en) 2023-01-03

Family

ID=84740314

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211144770.4A Pending CN115564694A (en) 2022-09-20 2022-09-20 Image processing method and device, computer readable storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN115564694A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118632138A (en) * 2024-07-05 2024-09-10 深圳市立浦科技有限公司 Data processing method and system for image sensor
CN118632138B (en) * 2024-07-05 2025-03-14 深圳市立浦科技有限公司 Data processing method and system of image sensor

Similar Documents

Publication Publication Date Title
CN111028189B (en) Image processing method, device, storage medium and electronic equipment
JP6169186B2 (en) Image processing method and apparatus, and photographing terminal
CN108335279B (en) Image fusion and HDR imaging
EP2368226B1 (en) High dynamic range image combining
KR101699919B1 (en) High dynamic range image creation apparatus of removaling ghost blur by using multi exposure fusion and method of the same
Rao et al. A Survey of Video Enhancement Techniques.
CN106920221B (en) Take into account the exposure fusion method that Luminance Distribution and details are presented
CN110033418B (en) Image processing method, image processing device, storage medium and electronic equipment
WO2022116988A1 (en) Image processing method and apparatus, and device and storage medium
CN108259774A (en) Image combining method, system and equipment
EP3123710B1 (en) System and method of fast adaptive blending for high dynamic range imaging
CN111242860B (en) Super night scene image generation method and device, electronic equipment and storage medium
CN113674193B (en) Image fusion method, electronic device and storage medium
CN114820405A (en) Image fusion method, device, equipment and computer readable storage medium
CN110047060B (en) Image processing method, image processing device, storage medium and electronic equipment
US20240155248A1 (en) Method and apparatus for generating high-dynamic-range image, and electronic device
CN113259594A (en) Image processing method and device, computer readable storage medium and terminal
CN113962859A (en) Panorama generation method, device, equipment and medium
CN112927162A (en) Low-illumination image oriented enhancement method and system
CN113793257B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN115564694A (en) Image processing method and device, computer readable storage medium and electronic device
CN115471413A (en) Image processing method and device, computer readable storage medium and electronic device
WO2023137956A1 (en) Image processing method and apparatus, electronic device, and storage medium
CN110351489B (en) Method and device for generating HDR image and mobile terminal
CN115147304A (en) Image fusion method and device, electronic equipment, storage medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination