CN100562894C - An image synthesis method and device - Google Patents

Publication number
CN100562894C
CN100562894C · CNB2007100647399A · CN200710064739A
Authority
CN
China
Prior art keywords
image
pixel
frequency component
value
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CNB2007100647399A
Other languages
Chinese (zh)
Other versions
CN101021945A (en)
Inventor
沈操
Current Assignee
Zhongxing Micro Technology Co ltd
Original Assignee
Vimicro Corp
Priority date
Filing date
Publication date
Application filed by Vimicro Corp filed Critical Vimicro Corp
Priority to CNB2007100647399A
Publication of CN101021945A
Application granted
Publication of CN100562894C

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an image synthesis method and device. The method provided by the invention comprises: A. obtaining the luminance information of each pixel in each original image respectively; B. extracting the high-frequency component of the corresponding pixel from the luminance information of the pixels; C. determining the value of the pixel at the corresponding position in the composite image according to the high-frequency component of each pixel in each original image. In addition, in step B, the low-frequency component of the corresponding pixel can be obtained while the high-frequency component of each pixel is extracted, and the low-frequency component of the pixel at the corresponding position in the composite image is then determined from the low-frequency components of the pixels in the original images. According to the method provided by the invention, several multi-focus images can be quickly and effectively synthesized into a large depth-of-field image that is clear in every region.

Description

Image synthesis method and device
Technical Field
The invention relates to the field of information fusion, in particular to an image synthesis method and device.
Background
In recent years, image acquisition devices such as digital cameras and digital video cameras have rapidly become popular, and desired images can be captured with them. However, because every image acquisition device has a limited depth of field, the captured image is not completely clear. An object is imaged clearly on the imaging surface only within a limited distance before and after the focus point of the device, and this range of clear imaging is the depth of field. Because the depth of field of existing image acquisition devices is small, only one distance in a scene can be focused accurately, so the obtained image is clear only at the focus position. For example, when focusing on the foreground, the foreground of the obtained image is clear while the background is blurred; conversely, when focusing on the background, an image with a blurred foreground and a clear background is obtained. Therefore, a method for extending the depth of field of an image is needed.
To solve the above problem, the depth of field can be enlarged by the optical approach of reducing the aperture, but this approach has certain limitations: (1) an image acquisition device with an adjustable aperture costs more to manufacture than one without, so some devices omit the aperture adjustment mechanism to save cost; (2) although reducing the aperture enlarges the depth of field, it also reduces the light reaching the sensor, so the image cannot be properly exposed; and if the exposure time is increased to compensate, the relative position between the scene and the image acquisition device must remain unchanged, which easily produces image blur in a moving scene or when the device shakes slightly.
In order to overcome the defect that the optical processing method expands the depth of field, the image processing method is adopted to obtain a high-quality image with larger depth of field at present, wherein the image synthesis technology is mainly adopted. The image synthesis technology is an image processing technology that acquires a plurality of information of the same subject and fuses the information into a new image by using different imaging modes of different sensors in various image acquisition devices. More reliable and accurate images can be obtained through image synthesis technology for observation or further processing. In recent years, image synthesis technology has become a very important and useful image analysis and computer vision technology. The synthesis of multi-focus images, which is one of image synthesis techniques, is to record a plurality of focused images of the same scene under the same imaging conditions, and then synthesize the images to obtain a synthesized image with a large depth of field and all targets being focused clearly.
At present, there is a multi-sensor image fusion method based on an optimal wavelet filter bank, disclosed in Chinese patent publication No. CN 1794300A. In this method, edge feature amplitude and edge connection probability features related to visual characteristics are used to fuse the high-frequency components of the original images, and a weighted average is used to fuse their low-frequency components. The original images are fused to obtain the frequency components of the composite image, and the high-frequency and low-frequency information is then reconstructed into the composite image through the inverse wavelet transform. The high-frequency and low-frequency components are superposed multiple times according to certain rules during the inverse wavelet transform, so the amount of computation is large, and the wavelet filter bank coefficients must also be calculated. Although this method can obtain a composite image of fairly good quality, the image quality of each region is reduced compared with the clear region of the corresponding original image, so the quality of the composite image deviates somewhat from the ideal composite image.
In addition, the multi-focus image fusion method based on block segmentation disclosed in Chinese patent publication No. CN 1177298C segments a plurality of original images with different focus points into equal-sized block regions and, after obtaining the low-frequency and high-frequency components of the original images, takes the average of the sums of the absolute values of the ratios of the high-frequency component to the low-frequency component at each point in a block region as the local contrast of that block region. The local contrast reflects the difference between the focused (clear) region and the defocused (blurred) region of the image, and the whole original image is divided into clear block regions, blurred block regions, and boundary regions using the local contrast of the block regions. For the clear and blurred block regions, because the original images are complementary in these two kinds of regions, the clear block region is directly selected as the corresponding block region of the fused composite image during the fusion processing. For each pixel point in the boundary region, the sum A_Z of the low-frequency components of all pixel points in the neighborhood of that pixel point is obtained, and the absolute value of the high-frequency component of the pixel point is compared against A_Z to obtain the contrast of the pixel point; finally, for each position in the boundary region, the original image in which the pixel point at that position has the maximum contrast is selected, and the gray value of the pixel point at the corresponding position of that original image is taken as the gray value of the corresponding pixel point in the composite image.
Although this method can obtain a large depth-of-field image that is clear in all regions, it also has certain drawbacks: block region selection must be carried out according to the local contrast of each block region or the contrast of individual pixel points, and computing the local contrast of the block regions and the contrast of the pixel points is also expensive. In addition, because the method operates on block regions rather than on each pixel point, some pixel points in some block regions of the synthesized image still do not receive the gray value of the sharpest corresponding pixel point among the original images, so an ideal result cannot be achieved.
In summary, the existing methods cannot quickly and effectively obtain a large depth-of-field image in which every pixel point is clear.
Disclosure of Invention
In view of this, the present invention provides an image synthesis method and an image synthesis device, so as to solve the problem that the prior art cannot quickly and effectively obtain a large depth-of-field image in which every pixel point is clear.
The invention provides an image synthesis method, which comprises the following steps: A. respectively obtaining the brightness information of each pixel point in each original image; B. extracting high-frequency components of corresponding pixel points from the brightness information of the pixel points; C. determining the value of the pixel point at the corresponding position in the synthesized image according to the high-frequency component of each pixel point in each original image; the step C comprises the following steps: c11, obtaining a high-frequency component intensity value of each pixel point in each original image; c12, comparing the high-frequency component intensity values of the pixel points at the same positions in different original images, and determining the original image with the maximum high-frequency component intensity value of each pixel point at each position; and C13, determining the value of the corresponding position pixel point in the synthetic image according to the original image with the maximum high-frequency component intensity value of each position pixel point.
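The comparison-based steps C11 through C13 can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the patent does not fix a particular high-pass filter, so a 3x3 Laplacian (taken in absolute value) is assumed here as the high-frequency component intensity, and NumPy arrays are assumed as the image representation.

```python
import numpy as np

def highfreq_intensity(lum):
    """High-frequency component intensity of each pixel (step C11).

    Assumption: a 3x3 Laplacian response in absolute value stands in
    for whatever high-pass filter an implementation would choose.
    """
    p = np.pad(lum.astype(np.float64), 1, mode="edge")
    lap = (4.0 * p[1:-1, 1:-1]
           - p[:-2, 1:-1] - p[2:, 1:-1]
           - p[1:-1, :-2] - p[1:-1, 2:])
    return np.abs(lap)

def synthesize_by_comparison(images, lums):
    """Steps C11-C13: per position, take the pixel value from the
    original image whose high-frequency intensity is largest there."""
    inten = np.stack([highfreq_intensity(l) for l in lums])  # C11
    best = np.argmax(inten, axis=0)                          # C12
    out = np.zeros_like(images[0])
    for i, img in enumerate(images):                         # C13
        out[best == i] = img[best == i]
    return out
```

For a color image, `images` would hold the RGB planes while `lums` holds the luminance planes; for a monochrome image the two coincide.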
The step C13 includes: directly taking the value of the pixel point at the corresponding position in the original image with the maximum high-frequency component intensity value of the pixel point at each position as the value of the pixel point at the corresponding position in the synthesized image; or setting a mask plane with the same size as the original image, setting the value of the corresponding position point in the mask plane according to the original image with the maximum high-frequency component intensity value of each position pixel point, performing smooth filtering on the mask plane after the high-frequency component intensity values of all the pixel points in each original image are compared to obtain a smooth plane, setting a threshold according to the values of all the points on the smooth plane, comparing the value of each point in the smooth plane with the threshold, and taking the value of the corresponding position pixel point in the original image as the value of the corresponding position pixel point in the synthesized image according to the comparison result.
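The mask-plane variant of step C13 might look like the following two-image sketch. The box-filter smoother and the 0.5 threshold are assumptions, since the text leaves the smoothing filter and the threshold rule open.

```python
import numpy as np

def box_smooth(mask, r=1):
    """Mean filter over a (2r+1) x (2r+1) window (assumed smoother)."""
    m, n = mask.shape
    p = np.pad(mask.astype(np.float64), r, mode="edge")
    out = np.zeros((m, n))
    k = 2 * r + 1
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + m, dx:dx + n]
    return out / (k * k)

def synthesize_with_mask(img_a, img_b, inten_a, inten_b, thresh=0.5):
    # Mask plane, same size as the originals: 1 where image A has the
    # larger high-frequency intensity, 0 where image B does.
    mask = (inten_a >= inten_b).astype(np.float64)
    # Smooth the mask plane after all comparisons are done.
    smooth = box_smooth(mask)
    # Compare the smooth plane against the threshold and take the
    # corresponding original's pixel value (0.5 is an assumed midpoint).
    return np.where(smooth >= thresh, img_a, img_b)
```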
The present invention further provides an image synthesis method, which comprises: A. respectively obtaining the brightness information of each pixel point in each original image; B. extracting high-frequency components of corresponding pixel points from the brightness information of the pixel points; C. determining the value of the pixel point at the corresponding position in the synthesized image according to the high-frequency component of each pixel point in each original image; the step C comprises: calculating the high-frequency component intensity value of each pixel point in each original image, calculating the weighting coefficient of each pixel point according to the high-frequency component intensity values of the pixel points at the same position in each original image, and obtaining the value of each pixel point in the synthesized image according to the weighting coefficients; wherein, among the N original images of size m × n, the weighting coefficient k_i(p, q) of the pixel point at position (p, q) in the ith original image is:
$$k_i(p,q) = \frac{\mathrm{abs\_edge}_i(p,q)}{\sum_{i=1}^{N} \mathrm{abs\_edge}_i(p,q)}$$
wherein abs_edge_i(p, q) is the high-frequency component intensity value of the pixel point at position (p, q) in the ith original image, i is a positive integer from 1 to N, p is a positive integer from 1 to m, and q is a positive integer from 1 to n.
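A minimal sketch of the weighted variant, assuming grayscale NumPy arrays; the `eps` guard against an all-zero denominator is an added safeguard, not part of the formula.

```python
import numpy as np

def synthesize_weighted(images, intensities, eps=1e-12):
    """Per pixel: k_i(p,q) = abs_edge_i(p,q) / sum over i of
    abs_edge_i(p,q); the composite value is the k_i-weighted sum."""
    inten = np.stack(intensities).astype(np.float64)  # shape (N, m, n)
    k = inten / (inten.sum(axis=0) + eps)             # weighting coefficients
    imgs = np.stack(images).astype(np.float64)
    return (k * imgs).sum(axis=0)                     # (m, n) composite
```

With two images valued 10 and 20 whose intensities are 3 and 1 everywhere, the weights are 0.75 and 0.25, giving a composite value of 12.5.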
The invention provides another image synthesis method, which comprises the following steps: A. respectively obtaining the brightness information of each pixel point in each original image; B. extracting high-frequency components of corresponding pixel points from the brightness information of the pixel points; C. determining the value of the pixel point at the corresponding position in the synthesized image according to the high-frequency component of each pixel point in each original image; the step C comprises the following steps: c21, taking two original images; c22, synthesizing the two images into an intermediate synthetic image; c23, judging whether all the original images are synthesized, if so, taking the obtained intermediate synthetic image as a final synthetic image, otherwise, continuing to execute C24 and continuing the next iteration process; c24, selecting the intermediate synthetic image and an original image which is not subjected to the synthetic operation, and returning to execute C22; wherein the step C22 includes: obtaining a high-frequency component intensity value of each pixel point in the two images; determining an image with the maximum high-frequency component intensity value of the pixel points at the same position in the two images, and determining the value of the pixel point at the corresponding position in the intermediate synthetic image according to the image with the maximum high-frequency component intensity value; or calculating the weighting coefficient of each pixel point according to the high-frequency component intensity values of the pixel points at the same positions in the two images, and obtaining the value of each pixel point of the intermediate synthetic image according to the weighting coefficient.
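The iterative procedure C21 through C24 reduces to a left fold over the image list; `synth_two` stands for a caller-supplied two-image synthesis routine, whose choice is deliberately left open here.

```python
def iterative_synthesis(images, synth_two):
    """Steps C21-C24: repeatedly synthesize the current intermediate
    image with the next original image until all are consumed."""
    inter = images[0]                  # C21: first member of the first pair
    for img in images[1:]:             # C24: next unsynthesized original
        inter = synth_two(inter, img)  # C22: two-image synthesis
    return inter                       # C23: all synthesized -> final image
```

As a stand-in rule for testing, an elementwise maximum can play the role of `synth_two`; a real implementation would pass one of the two-image comparison or weighting routines.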
In any of the above image synthesis methods, when extracting the high-frequency component of the corresponding pixel from the luminance information of the pixel in step B, the method further includes: extracting low-frequency components of pixel points in an original image; the step C is further followed by: and determining the value of the pixel point at the corresponding position in the low-frequency synthetic image according to the values of the pixel points with the low-frequency components at the same positions in all the original images, and processing the low-frequency synthetic image and the synthetic image to obtain the synthetic image containing the high-frequency components and the low-frequency components.
The above determining the value of the pixel point at the corresponding position in the low-frequency synthesized image includes: taking the weighted average of the values of all pixel points with low-frequency components at the corresponding positions of all the original images as the values of the pixel points at the corresponding positions in the low-frequency synthesized image; or, the value of the pixel point at the corresponding position of one of the original images with the low-frequency component at the corresponding position of each original image is used as the value of the pixel point at the corresponding position in the low-frequency synthetic image.
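The first option above (a weighted average of the low-frequency components) can be sketched as follows; equal weights are assumed by default, since the text does not fix the weighting.

```python
import numpy as np

def synthesize_lowfreq(lowfreqs, weights=None):
    """Weighted average of the low-frequency components of the pixel
    points at the same position across all original images."""
    lows = np.stack(lowfreqs).astype(np.float64)      # shape (N, m, n)
    if weights is None:                               # assumed: equal weights
        weights = np.full(len(lowfreqs), 1.0 / len(lowfreqs))
    w = np.asarray(weights, dtype=np.float64).reshape(-1, 1, 1)
    return (w * lows).sum(axis=0)
```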
An image synthesizing apparatus according to the present invention includes: the system comprises a brightness information extraction unit, a high-frequency component extraction unit and a high-frequency synthesis unit, wherein the brightness information extraction unit is used for extracting the brightness information of each pixel point in each original image; the high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point in each original image according to the brightness information of each pixel point in each original image; the high-frequency synthesis unit is used for determining the value of the pixel point at the corresponding position in the synthesized image according to the high-frequency component of each pixel point in each original image; the high-frequency synthesis unit comprises a high-frequency component intensity value calculation unit, a comparison unit and an assignment unit, wherein the high-frequency component intensity value calculation unit is used for calculating a high-frequency component intensity value of a corresponding pixel point according to the high-frequency component of each pixel point in each original image, the comparison unit is used for comparing the high-frequency component intensity values of the pixel points at the same positions in different original images to determine an original image with the maximum high-frequency component intensity value of each pixel point, and the assignment unit is used for determining the value of the pixel point at the corresponding position in the synthesized image according to the original image with the maximum high-frequency component intensity value of each pixel point.
In the image synthesis device, the assignment unit comprises a mask plane setting unit, a smoothing filtering unit and a threshold value comparison unit, wherein the mask plane setting unit is used for setting a mask plane with the same size as that of an original image, and determining the value of a corresponding position point on the mask plane according to the original image with the maximum high-frequency component intensity value of each position pixel point; the smoothing filtering unit is used for performing smoothing filtering on the mask plane to obtain a smooth plane after the comparison of the high-frequency component intensity values of all the pixel points in each original image is finished; the threshold comparison unit is used for setting a threshold according to the values of all the points on the smooth plane, comparing the value of each point on the smooth plane with the threshold, and taking the value of the pixel point at the corresponding position of the corresponding original image as the value of the pixel point at the corresponding position in the synthesized image according to the comparison result.
Another image synthesizing apparatus according to the present invention includes: the system comprises a brightness information extraction unit, a high-frequency component extraction unit and a high-frequency synthesis unit, wherein the brightness information extraction unit is used for extracting the brightness information of each pixel point in each original image; the high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point in each original image according to the brightness information of each pixel point in each original image; the high-frequency synthesis unit is used for determining the value of the pixel point at the corresponding position in the synthesized image according to the high-frequency component of each pixel point in each original image; the high-frequency synthesis unit comprises a high-frequency component intensity value calculation unit, a weighting coefficient calculation unit and a weighting unit, wherein the high-frequency component intensity value calculation unit is used for calculating a high-frequency component intensity value of a corresponding pixel point according to the high-frequency component of each pixel point in each original image, the weighting coefficient calculation unit is used for calculating a weighting coefficient of each pixel point in each original image according to the high-frequency component intensity value of each pixel point in each original image, and the weighting unit is used for weighting and synthesizing the values of the pixel points of all the original images at the same position into the value of the pixel point at the corresponding position in the synthesized image according to the weighting coefficient of each pixel point in each original image.
The present invention provides still another image synthesizing apparatus, comprising: a brightness information extraction unit, a high-frequency component extraction unit and a high-frequency synthesis unit, wherein the brightness information extraction unit is used for extracting the brightness information of each pixel point in each original image; the high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point in each original image according to the brightness information of each pixel point in each original image; the high-frequency synthesis unit is used for determining the value of the pixel point at the corresponding position in the synthesized image according to the high-frequency component of each pixel point in each original image; the high-frequency synthesis unit includes a first high-frequency component intensity value calculation unit, a first storage unit, a third storage unit, a two-image synthesis unit, a second storage unit, a second brightness information extraction unit, a second high-frequency component extraction unit, a second high-frequency component intensity value calculation unit, and a counting unit.
The first high-frequency component intensity value calculation unit is used for calculating a high-frequency component intensity value of a corresponding pixel point according to the high-frequency component of each pixel point in each original image; the first storage unit is used for storing the high-frequency component intensity value of each pixel point in each original image, sequentially numbering the high-frequency component intensity values by taking the original image as a unit, and providing the high-frequency component intensity value of each pixel point in the original image with the number corresponding to the current count value when receiving the current count value provided by the counting unit; the third storage unit is used for storing each original image, numbering each original image in sequence, and providing the original image with the number corresponding to the current count value when receiving the current count value provided by the counting unit; the two-image synthesis unit is used for determining the value of the pixel point at the corresponding position in the intermediate synthesis image according to the high-frequency component intensity value of each pixel point in the two images and providing a counting notification message; the second storage unit is used for storing the intermediate synthetic image, providing the intermediate synthetic image according to the current counting value provided by the counting unit and outputting the intermediate synthetic image as a final synthetic image according to the output notification message provided by the counting unit; the second brightness information extraction unit is used for extracting the brightness information of each pixel point in the intermediate synthetic image; the second high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point according to the brightness information of each pixel point in the intermediate synthetic image; 
the second high-frequency component intensity value calculation unit is used for calculating the high-frequency component intensity value of each pixel point according to the high-frequency component of each pixel point in the intermediate synthetic image; and the counting unit is used for adding 1 to the counting value according to the counting notification message to serve as a current counting value, judging whether all the original images are synthesized or not according to the current counting value, if so, sending an output notification message to the second storage unit, and otherwise, respectively providing the current counting value to the first storage unit and the third storage unit.
Still another image synthesizing apparatus provided by the present invention includes: a first storage unit, a second storage unit, a brightness information extraction unit, a high-frequency component extraction unit, a high-frequency component intensity value calculation unit, a two-image synthesis unit and a counting unit, wherein the first storage unit is used for storing each original image, sequentially numbering the original images and providing the original image corresponding to the current count value according to the current count value provided by the counting unit; the second storage unit is used for storing the intermediate synthetic image, directly providing the intermediate synthetic image to the brightness information extraction unit, and outputting the intermediate synthetic image as a final synthetic image according to the output notification message provided by the counting unit; the brightness information extraction unit is used for extracting the brightness information of each pixel point in the image; the high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point according to the brightness information of each pixel point in the image; the high-frequency component intensity value calculation unit is used for calculating the high-frequency component intensity value of each pixel point according to the high-frequency component of each pixel point in the image; the two-image synthesis unit is used for determining the value of a corresponding pixel point in an intermediate synthesis image according to the high-frequency component intensity value of each pixel point in the two images, providing the intermediate synthesis image and providing a counting notification message; and the counting unit is used for adding 1 to the count value according to the counting notification message to obtain a current count value, judging whether all the original images are synthesized according to the current count value, if so, sending an output notification message to the second storage unit, and otherwise, respectively providing the current count value to the first storage unit and the second storage unit.
Any of the image synthesizing apparatuses described above further includes: the low-frequency component extraction unit is connected with the brightness information extraction unit or the high-frequency component extraction unit and is used for acquiring the low-frequency components of the pixel points in each original image and selecting the pixel points with the low-frequency components at the same position in different original images; the low-frequency synthesis unit is connected with the low-frequency component extraction unit and used for determining the value of the pixel point at the corresponding position in the low-frequency synthesized image according to the values of the pixel points at the same position in all the original images with the low-frequency component; the synthesis unit is connected with the low-frequency synthesis unit and the high-frequency synthesis unit or the two-image synthesis unit and is used for processing the synthesis image and the low-frequency synthesis image to obtain a synthesis image containing high-frequency components and low-frequency components.
The invention uses a pixel-based image synthesis scheme to expand the depth of field. In the scheme provided by the invention, the brightness information of each pixel point in each original image is obtained respectively, the high-frequency component of the corresponding pixel point is extracted from the brightness information of the pixel point, and the value of the pixel point at the corresponding position in the synthesized image is determined according to the high-frequency component of each pixel point in each original image. In the invention, calculating the high-frequency component intensity value of each pixel point in the original images involves only simple operations, and determining the value of each pixel point in the synthesized image needs only simple comparisons or elementary arithmetic operations. Therefore, compared with the prior art, the scheme provided by the invention greatly reduces the amount of computation in the image synthesis process, and the required synthesized image can be obtained conveniently and quickly. Because the scheme provided by the invention operates on each individual pixel point, the details of the synthesized image obtained with this scheme are clearer than those obtained with the prior art.
In addition, in the method of the invention, the corresponding pixel points in the synthesized image are not generated from the values of the high-frequency and low-frequency components of each pixel point in the original images; instead, the values of the pixel points at the corresponding positions in the original images are directly taken as the values of the pixel points at the corresponding positions in the synthesized image, or the values of the pixel points at the corresponding positions in the original images are weighted and averaged as the values of the pixel points at the corresponding positions in the synthesized image. Therefore, compared with the prior art, the method of the invention does not cause the details of the synthesized image to deviate from the details of the original images through operations such as wavelet transformation performed on the frequency components.
In summary, the scheme provided by the invention can quickly and effectively obtain the synthetic image with high depth of field.
Drawings
FIG. 1 is a flow chart of image synthesis in the present invention;
FIG. 2 is a flow chart of image synthesis using a comparison method according to the present invention;
FIG. 3 is a flow chart of a second implementation of a comparison method for synthesizing an image in accordance with the present invention;
FIG. 4 is a flow chart of comparative compositing of two images in accordance with the present invention;
FIG. 5 is a flow chart of comparative synthesis of three images according to the present invention;
FIG. 6 is a flow chart of image synthesis using a weighting method according to the present invention;
FIG. 7 is a flow chart of image synthesis using an iterative method according to the present invention;
FIG. 8 is a schematic diagram of the image synthesis using an iterative method according to the present invention;
FIG. 9 is a schematic structural diagram of a first image synthesis apparatus according to the present invention;
FIG. 10 is a first schematic structural diagram of the high-frequency synthesizing unit in the first image synthesizing apparatus according to the present invention;
FIG. 11 is a schematic structural diagram of an exemplary embodiment of the assignment unit;
FIG. 12 is a second schematic structural diagram of the high-frequency synthesizing unit in the first image synthesizing apparatus of the present invention;
FIG. 13 is a third schematic structural diagram of the high-frequency synthesizing unit in the first image synthesizing apparatus of the present invention;
FIG. 14 is a schematic structural diagram of a second image synthesis apparatus according to the present invention;
FIG. 15 is a schematic structural diagram of a third image synthesizer according to the present invention.
Detailed Description
The image synthesis method provided by the invention comprises the steps of firstly obtaining the brightness information of each pixel point of each original image in a plurality of original images; then extracting high-frequency components from the brightness information of the pixel points; and determining the value of the pixel point at the corresponding position in the synthesized image according to the high-frequency component of each pixel point.
The method is suitable for all monochrome images and color images, and for monochrome images, the value of a pixel point refers to the gray value of the pixel point in the monochrome images; for a color image, the value of a pixel point refers to the Red, Green, Blue (RGB) value of the pixel point in the color image. The implementation steps of the method provided by the present invention are specifically described below by taking a color image with a size of m × n as an example, where m and n are both positive integers.
Please refer to fig. 1, which is a flowchart illustrating image synthesis according to the present invention, and the specific implementation steps are as follows:
S100: Respectively obtain the brightness information Y(p, q) of each pixel point in each of the plurality of original images, where p is any positive integer from 1 to m and q is any positive integer from 1 to n.
Here, since the RGB color space is ideal for hardware implementation, the original image is represented in the RGB color space. When the human eye observes a colored object, hue, saturation and brightness can be used to describe it. The human eye is more sensitive to luminance than to chrominance, and luminance is a key parameter in describing color perception, so the luminance information of an image is an important parameter characterizing the image. However, since brightness is only a subjective description in the RGB color space, the RGB color space is often converted into the luminance-chrominance (YUV) space; because luminance and chrominance are separated in the YUV space, the luminance information of the pixel points of the original image can be extracted there.
Here, Y(p, q) is the weighted sum of the RGB components of the pixel point at the corresponding position of the original image, that is:

Y(p, q) = a1*R(p, q) + a2*G(p, q) + a3*B(p, q)

where a1, a2, and a3 are the weighting coefficients of the red (R), green (G), and blue (B) components of the pixel point at the corresponding position in the original image, respectively.
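As a concrete illustration, the luminance conversion above can be sketched in Python. The invention leaves the values of a1, a2, and a3 open; the ITU-R BT.601 luma coefficients used below are only an assumed, common choice.

```python
# Sketch of S100: brightness information Y(p, q) as a weighted sum of RGB.
# The weights a1, a2, a3 (ITU-R BT.601 luma coefficients) are an assumed
# choice; the invention does not fix their values.
A1, A2, A3 = 0.299, 0.587, 0.114

def luminance(image):
    """Return Y(p, q) = a1*R + a2*G + a3*B for an m x n grid of (R, G, B) tuples."""
    return [[A1 * r + A2 * g + A3 * b for (r, g, b) in row] for row in image]

img = [[(255, 0, 0), (255, 255, 255)]]  # one pure-red pixel, one white pixel
Y = luminance(img)                       # white maps to full brightness 255
```

Since the three coefficients sum to 1, a pure-white pixel keeps its full brightness, so the overall brightness scale of the image is preserved by the conversion.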
S101: since the high-frequency component of the image can represent the detailed part of the image, the detailed part of the image refers to the information such as the edge and other sharp changes of the image, and the like, the high-frequency component of the corresponding pixel point is extracted from the brightness information Y (p, q) of each pixel point, which is equivalent to the extraction of the detailed information of the image at the corresponding pixel point.
The high-frequency components of the pixel points can be extracted by a high-pass filter: after the original image passes through a high-pass filter with a set threshold, the high-frequency components of all pixel points are obtained.
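A minimal sketch of this extraction step, assuming one simple high-pass filter (luminance minus its 3 × 3 neighbourhood mean); the invention does not prescribe a particular filter, so this kernel is purely illustrative:

```python
def high_frequency(Y):
    """Assumed high-pass filter: each pixel's luminance minus the mean of its
    3x3 neighbourhood (edges clamped). Flat regions give ~0; edges give
    large magnitudes, i.e. the detail information the invention relies on."""
    m, n = len(Y), len(Y[0])
    out = [[0.0] * n for _ in range(m)]
    for p in range(m):
        for q in range(n):
            neigh = [Y[i][j]
                     for i in range(max(0, p - 1), min(m, p + 2))
                     for j in range(max(0, q - 1), min(n, q + 2))]
            out[p][q] = Y[p][q] - sum(neigh) / len(neigh)
    return out

flat = [[5.0] * 3 for _ in range(3)]          # no detail anywhere
edge = [[0.0, 0.0, 10.0] for _ in range(3)]   # vertical brightness jump
hf_flat = high_frequency(flat)
hf_edge = high_frequency(edge)
```

On the flat patch every output is essentially zero, while near the brightness jump the magnitudes are large, which is exactly the behaviour the later comparison and weighting steps exploit.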
S102: and determining the RGB value of the pixel point at the corresponding position in the synthesized image according to the high-frequency component of each pixel point in each original image.
The above S102 can be implemented in three ways: a comparison method, a weighting method, and an iterative method based on either of the two. The three implementations of S102 are described in detail below.
Please refer to fig. 2, which is a flowchart illustrating image synthesis using a comparison method according to the present invention, including:
S200: Take the absolute value of the high-frequency component of each pixel point of each original image to obtain the high-frequency component intensity value of that pixel point; the size of this intensity value represents the degree of definition of the original image at the corresponding pixel point. Here, the high-frequency component intensity value of the pixel point of an original image at position (p, q) is denoted abs_edge(p, q).
S201: and comparing the high-frequency component intensity values of the pixel points at the same positions in different original images, and determining the original image with the maximum high-frequency component intensity value of the pixel point at the position aiming at each position.
S202: and determining the RGB value of the pixel point at the corresponding position in the synthesized image according to the original image with the maximum high-frequency component intensity value of each pixel point at the position.
The above S202 has two implementation manners, and the first implementation manner is: and directly taking the RGB value of the pixel point at the corresponding position of the original image with the maximum high-frequency component intensity value of the pixel point at each position as the RGB value of the pixel point at the corresponding position of the synthetic image.
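The first implementation can be sketched as follows (a hypothetical helper; the pixel grids and high-frequency maps are passed in explicitly):

```python
def synthesize_by_comparison(images, hf_maps):
    """First implementation of S202: for every position, copy the pixel value
    from the original image whose high-frequency component intensity value
    (the absolute value of its high-frequency component) is largest there."""
    m, n = len(hf_maps[0]), len(hf_maps[0][0])
    result = [[None] * n for _ in range(m)]
    for p in range(m):
        for q in range(n):
            best = max(range(len(images)), key=lambda i: abs(hf_maps[i][p][q]))
            result[p][q] = images[best][p][q]
    return result

# Image 1 is sharp at (0, 0); image 2 is sharp at (0, 1).
img1 = [[(10, 10, 10), (20, 20, 20)]]
img2 = [[(30, 30, 30), (40, 40, 40)]]
hf1 = [[5.0, 0.1]]
hf2 = [[0.2, 6.0]]
composite = synthesize_by_comparison([img1, img2], [hf1, hf2])
```

Each position of the composite takes its pixel from whichever source is in focus there: (0, 0) from image 1 and (0, 1) from image 2.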
Fig. 3 is a flowchart of a second implementation manner of S202, including:
S301: Set a mask plane of the same size as the original image, denoted mask; for each specific position, set the value of the point at the corresponding position on the mask according to the original image whose pixel point at that position has the largest high-frequency component intensity value.
For example, suppose there are N original images and abs_edge(p, q) represents the high-frequency component intensity value of the pixel point at position (p, q) of any one of them; if abs_edge(p, q) is largest for the ith original image, the value mask(p, q) of the point at position (p, q) in the mask may be set to i, with different original images corresponding to different values of mask(p, q).
S302: after abs _ edge (p, q) corresponding to all position pixel points in the original image are compared, smooth filtering is carried out on mask to obtain a smooth plane, the smooth plane is represented by mask _ smooth, a threshold value is set according to values of all mask _ smooth (p, q), and N-1 threshold values are required to be set for N original images.
Wherein the smoothing filtering may be implemented by a normalized smoothing filter. The reason why the smoothing filtering is performed is to eliminate the noise left in the original image when the high frequency components of the pixel points are extracted, because the noise is also extracted as one of the high frequency components in S101, which results in inaccurate results in the subsequent S201. In the process of smoothing filtering, the value of a point on the mask is determined by comprehensively using the values of the mask (p, q) of the current point and the points around the current point, so that the reliability of the final result is improved. For example, for a 5 × 5 smoothing window, there are 24 1 s, and only the center point is 2, we consider that 2 of the center point is not reliable, and the output result of the center point after smoothing filtering is close to 1, e.g., the output value of the center point of the smoothing window is 26/25 ═ 1.04.
S303: and comparing the value of each mask _ smooth (p, q) in the smooth plane with the set threshold value, and taking the RGB value of the pixel point corresponding to the corresponding position (p, q) in the original image as the RGB value of the pixel point corresponding to the corresponding position in the synthesized image according to the comparison result.
The second implementation of the comparison method is described in detail below, taking the synthesis of two original images and of three original images as examples.
Referring to FIG. 4, which is a flowchart of the comparative synthesis of two images in the present invention. The specific steps are as follows:
S400: Judge whether an uncompared abs_edge1(p, q) is larger than abs_edge2(p, q); if so, execute S401, otherwise execute S402.
Here abs_edge1(p, q) and abs_edge2(p, q) represent the high-frequency component intensity values of the pixel points at the same position in original image 1 and original image 2, respectively.
S401: the value of the corresponding position point of the mask is set, for example, mask (p, q) ═ 1 is set.
S402: the value of the corresponding position point of the mask is set, for example, mask (p, q) ═ 2 is set.
S403: and judging whether the high-frequency component intensity values of the pixel points at the same positions in the original image are compared completely, namely judging whether abs _ edge1(p, q) and abs _ edge2(p, q) are compared completely, if so, continuing to execute S404, and otherwise, executing S420.
S404: carrying out smooth filtering on the mask to obtain a mask _ smooth; the threshold Th is set according to the values of all mask _ smooth (p, q), and then S405 is executed. For example, when mask (p, q) is set to 1 or 2, since the values of all points on mask _ smooth after smoothing filtering are close to either 1 or 2, the threshold Th may be set to 1.5.
S420: selecting abs _ edge1(p, q) and abs _ edge2(p, q) corresponding to the pixel point at the same position (p, q) where no comparison is performed, and returning to execute S400.
S405: and judging whether the mask _ smooth (p, q) which is not compared with the threshold value is smaller than Th, if so, executing S406, and otherwise, executing S407.
S406: C(p, q) = C1(p, q), namely the RGB value of the pixel point at the corresponding position in original image 1 is taken as the RGB value of the pixel point at the corresponding position in the synthesized image; then execute S408.
S407: C(p, q) = C2(p, q), namely the RGB value of the pixel point at the corresponding position in original image 2 is taken as the RGB value of the pixel point at the corresponding position in the synthesized image.
Here C(p, q), C1(p, q), and C2(p, q) respectively represent the RGB values of the pixel points at the same position in the synthesized image, original image 1, and original image 2.
S408: and judging whether the comparison of the values of all the points on the mask _ smooth with the threshold value is finished, if so, finishing the operation, and otherwise, executing S409.
S409: and selecting the mask _ smooth (p, q) which is not compared, and returning to execute the step S405.
Referring now to FIG. 5, which is a flowchart of the comparative synthesis of three images in the present invention. The steps are as follows:
S500: Compare the high-frequency component intensity values of the pixel points at the same position in the three original images, namely compare the abs_edge1(p, q), abs_edge2(p, q), and abs_edge3(p, q) corresponding to the same (p, q), and find the original image with the largest high-frequency component intensity value at that position.
S501: Set the values of the corresponding position points on the mask according to the comparison results; for example, if S500 determines that the largest high-frequency component intensity value at a certain position belongs to the ith original image, where i is 1, 2, or 3, then mask(p, q) may be set to i.
S502: and judging whether the comparison of the high-frequency component intensity values of the pixel points at the same positions in the three original images is finished, if so, continuing to execute S503, otherwise, executing S520.
S503: and carrying out smooth filtering on the mask to obtain the mask _ smooth. 2 thresholds Th1 and Th2 are set according to the values of all mask _ smooth (p, q), and then S504 is executed. For example, when mask (p, q) ═ i is set, since the values of all points on mask _ smooth after smoothing filtering are close to either 1 or 2, or 3, Th1 can be set to 1.5 and Th2 can be set to 2.5.
S520: selecting abs _ edge1(p, q), abs _ edge2(p, q) and abs _ edge3(p, q) corresponding to the pixel point at the same position (p, q) where no comparison is performed, and returning to perform S500.
S504: and judging whether the mask _ smooth (p, q) which is not compared with the threshold value is smaller than Th1, if so, executing S505, otherwise, executing S506.
S505: C(p, q) = C1(p, q), namely the RGB value of the pixel point at the corresponding position in original image 1 is taken as the RGB value of the pixel point at the corresponding position in the synthesized image; then execute S509.
S506: Judge whether the mask_smooth(p, q) is larger than Th2; if so, execute S507, otherwise execute S508.
S507: C(p, q) = C3(p, q), namely the RGB value of the pixel point at the corresponding position in original image 3 is taken as the RGB value of the pixel point at the corresponding position in the synthesized image; then execute S509.
S508: C(p, q) = C2(p, q), namely the RGB value of the pixel point at the corresponding position in original image 2 is taken as the RGB value of the pixel point at the corresponding position in the synthesized image.
S509: Judge whether the values of all points on mask_smooth have been compared with the thresholds; if so, the operation ends, otherwise execute S510.
S510: Select a mask_smooth(p, q) that has not yet been compared, and return to S504.
Fig. 6 is a flowchart of image synthesis using a weighting method, including:
S600: Calculate the high-frequency component intensity value of each pixel point in each original image; for example, for N original images of size m × n, the high-frequency component intensity value of the pixel point at position (p, q) in the ith original image is abs_edge_i(p, q).
S601: and calculating the weighting coefficient of each pixel point according to the high-frequency component intensity value of the pixel point at the same position in each original image.
For N original images, the weighting coefficient k_i(p, q) of the pixel point at position (p, q) in the ith original image is:

k_i(p, q) = abs_edge_i(p, q) / Σ_{j=1}^{N} abs_edge_j(p, q)

where i is any positive integer from 1 to N, and the coefficients satisfy Σ_{i=1}^{N} k_i(p, q) = 1. This normalization is performed so that the overall brightness of the final synthesized image stays consistent with that of the original images.
S602: synthesizing the original image into a synthesized image according to the weighting coefficient, wherein pixel points at each position of the synthesized image satisfy the following conditions:
C(p, q) = Σ_{i=1}^{N} k_i(p, q) * C_i(p, q)

where C(p, q) represents the RGB value of the pixel point at position (p, q) in the synthesized image and C_i(p, q) represents the RGB value of the pixel point at position (p, q) in the ith original image; the RGB value of each pixel point of the synthesized image is thus the weighted average of the RGB values of the pixel points at the corresponding position of all the original images. Since the weighting coefficient k_i(p, q) represents the proportion of the high-frequency component, i.e. the amount of detail information, of the pixel point at the corresponding position in each original image, the contribution of the pixel point at the corresponding position in each original image to the corresponding pixel point of the synthesized image can be determined in this way, and a clearer synthesized image with a large depth of field is obtained.
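A sketch of the weighting method for scalar pixel values (for RGB images the same weights would be applied to each channel); the equal-weight fallback for positions where no image has any detail is an assumption, since the weighting formula is undefined when the denominator is zero:

```python
def synthesize_weighted(images, hf_maps):
    """Weighting method (Fig. 6): every output pixel is the weighted average
    of the source pixels, with k_i(p, q) = abs_edge_i / sum of abs_edge_j."""
    m, n = len(hf_maps[0]), len(hf_maps[0][0])
    N = len(images)
    out = [[0.0] * n for _ in range(m)]
    for p in range(m):
        for q in range(n):
            strengths = [abs(hf[p][q]) for hf in hf_maps]
            total = sum(strengths)
            # Assumed fallback: equal weights where no image has any detail.
            weights = [s / total for s in strengths] if total else [1.0 / N] * N
            out[p][q] = sum(w * images[i][p][q] for i, w in enumerate(weights))
    return out

imgs = [[[10.0]], [[20.0]]]   # two 1x1 "images" with scalar pixel values
hfs = [[[3.0]], [[1.0]]]      # weights become 0.75 and 0.25
blended = synthesize_weighted(imgs, hfs)
```

With intensity values 3 and 1 the weights are 0.75 and 0.25, so the output pixel is 0.75 × 10 + 0.25 × 20 = 12.5, biased toward the sharper source exactly as the formula prescribes.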
Fig. 7 is a flowchart of image synthesis using an iterative method according to the present invention, and the specific implementation steps are as follows.
S700: two original images are arbitrarily selected.
S701: Synthesize the two images into an intermediate synthesized image according to the comparison method or the weighting method for two images.
S702: Judge whether all the original images have been synthesized; if so, take the obtained intermediate synthesized image as the final synthesized image, otherwise continue with S703 for the next iteration.
S703: Select the intermediate synthesized image obtained in the last iteration and an arbitrary original image that has not yet taken part in the synthesis, and return to S701.
Please refer to FIG. 8, which is a schematic diagram of image synthesis using the iterative method in the present invention. For N original images, the iteration proceeds as follows: original image 1 and original image 2 are synthesized according to the comparison method or the weighting method for two images to obtain an intermediate synthesized image New_1; New_1 and original image 3 are synthesized in the same way to obtain New_2; New_2 and original image 4 are synthesized to obtain New_3; and so on, until the intermediate synthesized image New_N-2 and original image N are synthesized to obtain New_N-1, which is the final synthesized image.
The iteration rule is formulated as:

C_New_1 = C_1 + C_2
C_New_(i-1) = C_New_(i-2) + C_i

where "+" denotes the two-image synthesis operation, C_i represents the ith original image, C_New_(i-1) denotes the intermediate synthesized image obtained through i-1 image synthesis operations, and likewise C_New_(i-2) the one obtained through i-2 operations, with i any integer from 3 to N. N original images thus require N-1 image synthesis operations to obtain the final synthesized image. The beneficial effects of this method are: the iteration logic is simple, very little memory is occupied during calculation, and the whole iteration process needs only a single processing scheme for synthesizing two images, so the method has strong expandability.
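The iteration can be sketched as a fold over the image list. Here `combine_two` stands for either two-image method, and `high_pass` recomputes the high-frequency map of each intermediate image, mirroring the second extraction units of FIG. 13; the toy high-pass below (identity on 1 × 1 images) is purely illustrative.

```python
def synthesize_iterative(images, high_pass, combine_two):
    """Iterative method (Figs. 7-8): New_1 = combine(image 1, image 2), then
    each further step combines the latest intermediate result with the next
    original image; N-1 two-image operations in total for N images."""
    result = images[0]
    for nxt in images[1:]:
        result = combine_two(result, high_pass(result), nxt, high_pass(nxt))
    return result

def toy_high_pass(img):
    # Illustrative stand-in: treat the pixel value itself as its detail level.
    return img

def pick_sharper(a, hf_a, b, hf_b):
    # Two-image comparison method without the mask-smoothing refinement.
    return [[a[p][q] if abs(hf_a[p][q]) >= abs(hf_b[p][q]) else b[p][q]
             for q in range(len(a[0]))] for p in range(len(a))]

final = synthesize_iterative([[[1.0]], [[5.0]], [[3.0]]], toy_high_pass, pick_sharper)
```

Because only one intermediate image and one original image are in play at a time, the memory footprint stays small regardless of N, which is the scalability benefit described above.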
It should be noted that, to express the method of the invention more clearly, all the above formulas are written in terms of a single pixel point. In a concrete calculation, for example for original images of size m × n, the scheme only requires simple magnitude comparisons or the four arithmetic operations on several m × n matrices in a computer, without complicated computations. Compared with the prior art, the method is therefore simpler and easier to implement, and greatly reduces the amount of calculation in the image synthesis process.
The above method synthesizes based on the high-frequency components of the pixel points; it may happen, however, that at some position none of the original images has a high-frequency component, i.e. the pixel points of all original images at that position contain only low-frequency components. These low-frequency components can be obtained as a by-product when the high-frequency components of the pixel points of the original images are extracted. For example, when extracting the high-frequency components, the frequency components below the threshold of the high-pass filter, which would normally be filtered out, may additionally be stored; if, at the same position of all the original images, only such filtered-out components exist, the components stored for that position are the low-frequency components.
For a position at which all original images have only low-frequency components, the value of the pixel point at the corresponding position in the synthesized image is determined from the values of the pixel points of all the original images at that position. Two processing methods may be used: (1) directly take the weighted average of the RGB values of the pixel points at that position in the original images, where for N original images the weighting coefficients are each 1/N, i.e. the RGB value of the pixel point at the corresponding position in the final synthesized image is the average of the RGB values of the pixel points at the corresponding position in all the original images; or (2) take the RGB value of the pixel point of any one original image at that position as the RGB value of the pixel point at the corresponding position in the synthesized image. Finally, the low-frequency synthesized result is combined with the synthesized image obtained in the previous steps to give a synthesized image containing both high-frequency and low-frequency components.
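The two low-frequency options reduce to a simple per-position rule, sketched below with a hypothetical helper name:

```python
def merge_low_frequency(values, mode="average"):
    """Combine the pixel values of one all-low-frequency position across the
    N original images: mode "average" weights each value by 1/N (option 1);
    mode "any" simply keeps the value from one original image (option 2)."""
    if mode == "any":
        return values[0]
    return sum(values) / len(values)

low_vals = [10.0, 20.0, 30.0]   # same position in three original images
averaged = merge_low_frequency(low_vals)            # option 1
picked = merge_low_frequency(low_vals, mode="any")  # option 2
```

Option 1 smooths residual differences between the originals at the cost of a little extra arithmetic; option 2 is cheaper and is adequate when the defocused regions of the originals are nearly identical.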
Fig. 9 is a schematic structural diagram of a first image synthesis apparatus according to the present invention, as shown in fig. 9, the apparatus includes: a luminance information extraction unit, a high-frequency component extraction unit, and a high-frequency synthesis unit. The function of each unit is specifically described below.
The brightness information extraction unit is used for extracting the brightness information of each pixel point in each original image and providing the brightness information of each pixel point in each original image to the high-frequency component extraction unit.
The high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point in each original image according to the brightness information of each pixel point in each original image and providing the high-frequency component to the high-frequency synthesis unit.
The high-frequency synthesis unit is used for determining the value of the pixel point at the corresponding position in the synthetic image according to the high-frequency component of each pixel point in each original image, namely synthesizing the value of the corresponding pixel point in the original image into the value of the corresponding pixel point in the synthetic image.
When the values of the pixels at the corresponding positions in the synthesized image are determined according to the high-frequency components of each pixel in each original image in different manners, the high-frequency synthesizing unit may include different sub-units, and the structure of the high-frequency synthesizing unit is described below with reference to the schematic diagram in sequence.
FIG. 10 is a first schematic structural diagram of the high-frequency synthesizing unit in the first image synthesizing apparatus of the present invention, which includes: a high-frequency component intensity value calculation unit, a comparison unit, and an assignment unit.
The high-frequency component intensity value calculation unit is used for calculating the high-frequency component intensity value of the corresponding pixel point according to the high-frequency component of each pixel point in each original image and providing the high-frequency component intensity value of each pixel point in each original image to the comparison unit.
The comparison unit is used for comparing the high-frequency component intensity values of the pixel points at the same positions in different original images, determining the original image with the maximum high-frequency component intensity value of each pixel point at the position and informing the assignment unit.
And the assignment unit is used for determining the value of the pixel point at the corresponding position in the synthesized image according to the original image with the maximum high-frequency component intensity value of each pixel point at the position. The assignment unit can directly use the value of the pixel point at the corresponding position of the original image with the maximum high-frequency component intensity value of the pixel point at each position as the value of the pixel point at the corresponding position of the synthesized image.
In addition, the assignment unit may also perform further processing based on the original image with the largest high-frequency component intensity value at each position to finally obtain the value of the pixel point at the corresponding position of the synthesized image. As shown in FIG. 11, the assignment unit includes: a mask plane setting unit, a smoothing filtering unit, and a threshold comparison unit. The specific functions of each unit are as follows:
the mask plane setting unit is used for setting a mask plane with the same size as the original image, determining the value of a point at a corresponding position on the mask plane according to the original image with the maximum high-frequency component intensity value of each position pixel point, and providing the mask plane with the assigned values of all the positions to the smoothing filtering unit.
For example, if there are N original images and the comparison unit determines, for a given position, that the ith original image has the largest high-frequency component intensity value at that position, the mask plane setting unit may set the value of the corresponding point of the mask plane to i. For the value of the point at each position, a one-to-one correspondence between the determined original images and the mask plane values is ensured: different determined original images correspond to different values set at the corresponding points of the mask plane.
The smoothing filtering unit is used for setting a threshold according to the values of the points at all positions of the mask plane, performing smoothing filtering on the received mask plane, and providing the set threshold and the smoothing plane obtained after filtering to the threshold comparison unit. The smoothing filter unit may be implemented by a normalized smoothing filter. The smoothing filtering can eliminate the noise left in the original image when the high-frequency components of the pixel points are extracted.
And the threshold comparison unit is used for comparing the value of each position point on the smooth plane with the threshold value, and taking the value of the corresponding position pixel point of the corresponding original image as the value of the corresponding position pixel point in the synthesized image according to the comparison result.
FIG. 12 is a second schematic structural diagram of the high-frequency synthesizing unit in the first image synthesizing apparatus according to the present invention. It includes: a high-frequency component intensity value calculation unit, a weighting coefficient calculation unit, and a weighting unit. Each unit functions as follows:
the high-frequency component intensity value calculation unit is used for calculating the high-frequency component intensity value of the corresponding pixel point according to the high-frequency component of each pixel point in each original image and providing the high-frequency component intensity value of each pixel point in each original image to the weighting coefficient calculation unit.
The weighting coefficient calculation unit is used for calculating the weighting coefficient of each pixel point in each original image according to the high-frequency component intensity value of each pixel point in each original image and providing the weighting coefficient to the weighting unit.
And the weighting unit is used for weighting and synthesizing the values of the pixel points at the same positions of all the original images into the value of the pixel point at the corresponding position in the synthesized image according to the weighting coefficient of each pixel point in each original image.
FIG. 13 is a third schematic structural diagram of the high-frequency synthesizing unit in the first image synthesizing apparatus of the present invention, which includes a first high-frequency component intensity value calculating unit, a first storage unit, a third storage unit, a two-image synthesizing unit, a second storage unit, a second luminance information extracting unit, a second high-frequency component intensity value calculating unit, and a counting unit. The units function as follows:
the first high-frequency component intensity value calculation unit is used for calculating the high-frequency component intensity value of the corresponding pixel point according to the high-frequency component of each pixel point in each original image and providing the high-frequency component intensity value of each pixel point in each original image to the first storage unit.
The first storage unit is used for storing the high-frequency component intensity value of each pixel point in each original image and numbering the high-frequency component intensity values in sequence by taking the original image as a unit; in addition, the first storage unit is further configured to select a high-frequency component intensity value of each pixel point in the original image corresponding to the current count value provided by the counting unit, and provide the high-frequency component intensity value of each pixel point in the original image to the two-image synthesizing unit. For example, if the current count value provided by the counting unit is i, the first storage unit provides the stored high-frequency component intensity value of each pixel point in the ith original image to the two-image synthesizing unit.
The third storage unit is used for storing each original image and sequentially numbering each original image, and is also used for selecting the original image with the number corresponding to the current counting value according to the current counting value provided by the counting unit and providing the original image for the two-image synthesis unit.
The numbering order of the original images stored in the third storage unit is consistent with the numbering order, in the first storage unit, of the high-frequency component intensity values of the pixel points in the same original images. For example, if the current count value provided by the counting unit is i, the first storage unit provides the stored high-frequency component intensity values of the pixel points in the ith original image to the two-image synthesizing unit, and the third storage unit provides the stored ith original image to the two-image synthesizing unit.
The two-image synthesizing unit is used for determining the value of the pixel point at the corresponding position in the intermediate composite image according to the received high-frequency component intensity values of the pixel points in the two images; the specific processing may follow the comparison method or the weighting method for two images. The resulting intermediate composite image is provided to the second storage unit, and after obtaining it the two-image synthesizing unit also sends a count notification message to the counting unit. The first synthesis pass performed by the two-image synthesizing unit operates on two original images; each subsequent pass operates on one original image and one intermediate composite image.
In an initial situation, the first storage unit may directly provide the high-frequency component intensity values of the pixel points in the first original image and the second original image to the two-image synthesizing unit, and the two-image synthesizing unit synthesizes the two corresponding images into the first intermediate synthesized image according to a comparison method or a weighting method for the two images.
The second storage unit is used for storing the intermediate synthetic image and respectively providing the stored intermediate synthetic image to the second brightness information extraction unit and the two-image synthesis unit according to the current counting value provided by the counting unit; further, the second storage unit is also configured to output the intermediate composite image as a final composite image according to the output notification message provided by the counting unit.
The second brightness information extraction unit is used for extracting the brightness information of each pixel point in the intermediate synthetic image and providing the brightness information of each pixel point in the intermediate synthetic image to the second high-frequency component extraction unit.
The second high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point according to the brightness information of each pixel point in the intermediate synthetic image and providing the high-frequency component to the second high-frequency component intensity value calculation unit.
The second high-frequency component intensity value calculation unit is used for calculating the high-frequency component intensity value of each pixel point according to the high-frequency component of each pixel point in the intermediate synthetic image and providing the high-frequency component intensity value of each pixel point in the intermediate synthetic image to the two-image synthesis unit.
And the counting unit is used for adding 1 to the counting value as a current counting value when receiving the counting notification message, judging whether all the original images are synthesized according to the current counting value, if so, sending an output notification message to the second storage unit, and otherwise, respectively providing the current counting value to the first storage unit and the third storage unit.
The counting unit may be implemented by a counter whose initial value is set to 2 and which is incremented by 1 each time a count notification message is received. For example, for N original images, if the current count value is i before a count notification message arrives, the count value becomes i+1 on receipt, and i+1 is taken as the current count value. Judging whether all the original images have been synthesized then amounts to judging whether the current count value is greater than N: if so, all the original images have participated in the image synthesis and an output notification message is sent to the second storage unit; otherwise, the counting unit provides the current count value to the first storage unit and the third storage unit respectively.
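The counter-driven pairwise flow described above can be sketched in minimal form as follows. The Laplacian high-pass filter and all names here are assumptions of this sketch, since the text leaves the concrete high-pass filter open:

```python
import numpy as np

def high_freq_strength(img):
    # Toy high-frequency intensity: absolute 4-neighbour Laplacian response.
    # The patent does not fix the high-pass filter; this choice is an assumption.
    p = np.pad(np.asarray(img, dtype=np.float64), 1, mode="edge")
    lap = (4 * p[1:-1, 1:-1] - p[:-2, 1:-1] - p[2:, 1:-1]
           - p[1:-1, :-2] - p[1:-1, 2:])
    return np.abs(lap)

def fuse_pair(a, b):
    # Comparison method for two images: keep, per pixel, the value from the
    # image whose high-frequency intensity is larger at that position.
    mask = high_freq_strength(a) >= high_freq_strength(b)
    return np.where(mask, a, b)

def fuse_all(images):
    # Counter-driven scheme: fuse the first two originals, then fold each
    # remaining original into the running intermediate composite
    # (the counter in the text runs from 2 up to N).
    composite = fuse_pair(images[0], images[1])
    for i in range(2, len(images)):
        composite = fuse_pair(composite, images[i])
    return composite
```

Only one intermediate composite is alive at a time, which is why the apparatus needs just the second storage unit to hold it between passes.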
As shown in fig. 14, a schematic configuration diagram of a second image synthesizing apparatus according to the present invention includes: a first storage unit, a second storage unit, a brightness information extraction unit, a high-frequency component extraction unit, a high-frequency component intensity value calculation unit, a two-image synthesizing unit, and a counting unit. Wherein each unit functions as follows:
the first storage unit is used for storing each original image and numbering each original image in sequence; in addition, the first storage unit is used for selecting an original image corresponding to the current count value according to the current count value provided by the counting unit and providing the original image to the two-image synthesizing unit. For example, if the current count value provided by the counting unit is i, the first storage unit provides the stored ith original image to the two-image synthesizing unit.
The second storage unit is used for storing the intermediate synthetic image currently provided by the two-image synthesis unit and directly providing the intermediate synthetic image to the brightness information extraction unit when receiving the current counting value provided by the counting unit; further, the second storage unit is also configured to output the intermediate composite image as a final composite image according to the output notification message provided by the counting unit.
The brightness information extraction unit is used for extracting the brightness information of each pixel point in the image and providing the brightness information of each pixel point in the image to the high-frequency component extraction unit.
The high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point according to the brightness information of each pixel point in the image and providing the high-frequency component to the high-frequency component intensity value calculation unit.
The high-frequency component intensity value calculation unit is used for calculating the high-frequency component intensity value of each pixel point according to the high-frequency component of each pixel point in the image and providing the high-frequency component intensity value to the two-image synthesis unit.
The images processed in the luminance information extraction unit are the original image and the intermediate composite image, respectively, so that the images processed in the high-frequency component extraction unit and the high-frequency component intensity value calculation unit are also the original image and the intermediate composite image, respectively.
The two-image synthesizing unit is used for determining the value of the pixel point at the corresponding position in the intermediate composite image according to the received high-frequency component intensity values of the pixel points in the two images; the specific processing may follow the comparison method or the weighting method for two images. The intermediate composite image is provided to the second storage unit, and after obtaining it the two-image synthesizing unit also sends a count notification message to the counting unit.
Initially, the first storage unit may directly provide the first original image and the second original image to the two-image synthesizing unit, which synthesizes these two images into the first intermediate composite image according to the comparison method or the weighting method for two images.
And the counting unit is used for adding 1 to the counting value as a current counting value when receiving the counting notification message, judging whether all the original images are synthesized according to the current counting value, if so, sending an output notification message to the second storage unit, and otherwise, respectively providing the current counting value to the first storage unit and the second storage unit.
The counting unit may be implemented by a counter whose initial value is set to 2 and which is incremented by 1 each time a count notification message is received. For example, for N original images, if the current count value is i before a count notification message arrives, the count value becomes i+1 on receipt, and i+1 is taken as the current count value. Judging whether all the original images have been synthesized then amounts to judging whether the current count value is greater than N: if so, all the original images have participated in the image synthesis and the counting unit sends an output notification message to the second storage unit; otherwise, the counting unit provides the current count value to the first storage unit and the second storage unit respectively.
Each of the above-described image synthesizing apparatuses provided by the present invention may further include a low-frequency component extraction unit, a low-frequency synthesis unit, and a synthesis unit. Referring to fig. 15, the high-frequency image synthesizing apparatus is the apparatus shown in fig. 9 or fig. 14, and each additional unit functions as follows:
the low-frequency component extraction unit is used for acquiring the low-frequency components of the pixel points in each original image, selecting the pixel points that have low-frequency components at the same position in the different original images, and providing the values of these pixel points to the low-frequency synthesis unit. The low-frequency component of each pixel point in each original image can be obtained in roughly two ways. In the first, the brightness information extraction unit further provides the brightness information of each pixel point in each original image to the low-frequency component extraction unit, and the low-frequency component extraction unit obtains the low-frequency components of the pixel points below a set threshold according to that brightness information; the threshold may be set as needed, for example, to the threshold of the high-pass filter used when extracting the high-frequency component. In the second, the high-frequency component extraction unit further provides the filtered-out frequency components to the low-frequency component extraction unit; these components are the low-frequency components of the pixel points in each original image finally obtained by the low-frequency component extraction unit.
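A minimal sketch of the second mode, where the low-frequency component is simply what the high-pass filter removed (luminance minus high-frequency component), might look like this. The 3x3 box blur standing in for the low-pass filter, and all names, are assumptions of the sketch:

```python
import numpy as np

def split_frequency(luma):
    # Low frequency via a 3x3 box blur (an assumed filter); the high-frequency
    # component is then just luminance minus the low-frequency component,
    # matching the "filtered-out components" mode described in the text.
    luma = np.asarray(luma, dtype=np.float64)
    p = np.pad(luma, 1, mode="edge")
    h, w = luma.shape
    low = sum(p[dy:dy + h, dx:dx + w]
              for dy in range(3) for dx in range(3)) / 9.0
    high = luma - low
    return low, high
```

By construction low + high reproduces the luminance exactly, so no information is lost in the split.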
The low-frequency synthesis unit is used for determining the value of the pixel point at the corresponding position in the low-frequency composite image according to the values of the pixel points that have low-frequency components at the same position in all the original images, and providing the low-frequency composite image to the synthesis unit.
The low-frequency synthesis unit can directly take the value of the pixel point at the corresponding position in any original image as the value of the pixel point at the corresponding position in the low-frequency synthesis image; or taking the weighted average value of the values of the pixel points at the corresponding positions of all the original images as the value of the pixel point at the corresponding position of the low-frequency synthetic image.
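The weighted-average option for the low-frequency synthesis unit can be sketched as follows; equal weights are used by default, and the function name and NumPy usage are assumptions of this sketch:

```python
import numpy as np

def synthesize_low_freq(low_components, weights=None):
    # Per-pixel weighted average of the originals' low-frequency components;
    # equal weights by default (the patent leaves the weights unspecified).
    stack = np.stack([np.asarray(c, dtype=np.float64) for c in low_components])
    if weights is None:
        weights = np.full(len(low_components), 1.0 / len(low_components))
    return np.tensordot(np.asarray(weights, dtype=np.float64), stack, axes=1)
```

Passing a one-hot weight vector degenerates to the other option in the text: taking the low-frequency values from a single chosen original.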
The synthesis unit is used for processing the received synthesis image and the low-frequency synthesis image to obtain a synthesis image containing a high-frequency component and a low-frequency component. The synthesized image received by the synthesizing unit can be from the high-frequency synthesizing unit in FIG. 9 or the two-image synthesizing unit in FIG. 14; the synthesized image received by the synthesizing unit may specifically come from the assigning unit in fig. 10, the threshold comparing unit in fig. 11, the weighting unit in fig. 12, and the two-map synthesizing unit in fig. 13, corresponding to the specific structure of the high-frequency synthesizing unit.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (7)

1. An image synthesis method, characterized in that, for a plurality of original images, the method comprises the following steps: A. obtaining the brightness information of each pixel point in each original image; B. extracting the high-frequency component of the corresponding pixel point from the brightness information of the pixel points; C. determining the value of the pixel point at the corresponding position in the composite image according to the high-frequency components of the pixel points in each original image; wherein step C comprises the following steps: C11. obtaining the high-frequency component intensity value of each pixel point in each original image; C12. comparing the high-frequency component intensity values of the pixel points at each same position in the different original images, and determining, for each position, the original image in which the pixel point at that position has the largest high-frequency component intensity value; C13. determining the value of the pixel point at the corresponding position in the composite image according to the original image in which the pixel point at each position has the largest high-frequency component intensity value. 2. 
The image synthesis method according to claim 1, characterized in that step C13 comprises: directly taking the value of the pixel point at the corresponding position in the original image in which the pixel point at each position has the largest high-frequency component intensity value as the value of the pixel point at the corresponding position in the composite image; or, setting a mask plane of the same size as the original images, and setting the value of the point at each corresponding position in the mask plane according to the original image in which the pixel point at that position has the largest high-frequency component intensity value; after the high-frequency component intensity values of all pixel points in all original images have been compared, smoothing the mask plane to obtain a smoothed plane; setting a threshold according to the values of all points on the smoothed plane; comparing the value of each point in the smoothed plane with the threshold; and, according to the comparison result, taking the value of the pixel point at the corresponding position in an original image as the value of the pixel point at the corresponding position in the composite image. 3. 
The image synthesis method according to claim 1 or 2, characterized in that: in step B, when extracting the high-frequency component of the corresponding pixel point from the brightness information of the pixel points, the method further comprises extracting the low-frequency component of each pixel point in the original images; and after step C, the method further comprises: determining the value of the pixel point at the corresponding position in a low-frequency composite image according to the values of the pixel points that have low-frequency components at the same position in all original images, and processing the low-frequency composite image and the composite image to obtain a composite image containing both high-frequency components and low-frequency components. 4. The image synthesis method according to claim 3, characterized in that determining the value of the pixel point at the corresponding position in the low-frequency composite image comprises: taking the weighted average of the values of the pixel points that have low-frequency components at the corresponding position in all original images as the value of the pixel point at the corresponding position in the low-frequency composite image; or, taking the value of the pixel point at the corresponding position in one of the original images that have low-frequency components at that position as the value of the pixel point at the corresponding position in the low-frequency composite image. 5. 
An image synthesis apparatus, characterized in that the apparatus comprises a brightness information extraction unit, a high-frequency component extraction unit, and a high-frequency synthesis unit, wherein: the brightness information extraction unit is used for extracting the brightness information of each pixel point in each original image; the high-frequency component extraction unit is used for obtaining the high-frequency component of each pixel point in each original image according to the brightness information of each pixel point in each original image; the high-frequency synthesis unit is used for determining the value of the pixel point at the corresponding position in the composite image according to the high-frequency components of the pixel points in the original images; the high-frequency synthesis unit comprises a high-frequency component intensity value calculation unit, a comparison unit, and an assignment unit, wherein: the high-frequency component intensity value calculation unit is used for calculating the high-frequency component intensity value of the corresponding pixel point according to the high-frequency component of each pixel point in each original image; the comparison unit is used for comparing the high-frequency component intensity values of the pixel points at each same position in the different original images and determining the original image in which the pixel point at each position has the largest high-frequency component intensity value; the assignment unit is used for determining the value of the pixel point at the corresponding position in the composite image according to the original image in which the pixel point at each position 
has the largest high-frequency component intensity value. 6. The image synthesis apparatus according to claim 5, characterized in that the assignment unit comprises a mask plane setting unit, a smoothing filter unit, and a threshold comparison unit, wherein: the mask plane setting unit is used for setting a mask plane of the same size as the original images and determining the value of the point at each corresponding position on the mask plane according to the original image in which the pixel point at that position has the largest high-frequency component intensity value; the smoothing filter unit is used for smoothing the mask plane to obtain a smoothed plane after the high-frequency component intensity values of all pixel points in all original images have been compared; the threshold comparison unit is used for setting a threshold according to the values of all points on the smoothed plane, comparing the value of the point at each position on the smoothed plane with the threshold, and, according to the comparison result, taking the value of the pixel point at the corresponding position in the corresponding original image as the value of the pixel point at the corresponding position in the composite image. 7. 
The image synthesis apparatus according to claim 5 or 6, characterized in that the apparatus further comprises a low-frequency component extraction unit, a low-frequency synthesis unit, and a synthesis unit, wherein: the low-frequency component extraction unit is connected with the brightness information extraction unit or the high-frequency component extraction unit, and is used for obtaining the low-frequency components of the pixel points in each original image and selecting the pixel points that have low-frequency components at the same position in the different original images; the low-frequency synthesis unit is connected with the low-frequency component extraction unit, and is used for determining the value of the pixel point at the corresponding position in the low-frequency composite image according to the values of the pixel points that have low-frequency components at the same position in all original images; the synthesis unit is connected with the low-frequency synthesis unit and the high-frequency synthesis unit, and is used for processing the composite image and the low-frequency composite image to obtain a composite image containing both high-frequency components and low-frequency components.
CNB2007100647399A 2007-03-23 2007-03-23 An image synthesis method and device Active CN100562894C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2007100647399A CN100562894C (en) 2007-03-23 2007-03-23 An image synthesis method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2007100647399A CN100562894C (en) 2007-03-23 2007-03-23 An image synthesis method and device

Publications (2)

Publication Number Publication Date
CN101021945A CN101021945A (en) 2007-08-22
CN100562894C true CN100562894C (en) 2009-11-25

Family

ID=38709701

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2007100647399A Active CN100562894C (en) 2007-03-23 2007-03-23 An image synthesis method and device

Country Status (1)

Country Link
CN (1) CN100562894C (en)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4508279B2 (en) * 2008-07-17 2010-07-21 ソニー株式会社 Image processing apparatus, image processing method, and program
CN101930606A (en) * 2010-05-14 2010-12-29 深圳市海量精密仪器设备有限公司 Field depth extending method for image edge detection
JP5901614B6 (en) * 2011-04-08 2018-06-27 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Image processing apparatus and image processing method
JP5358039B1 (en) * 2011-11-30 2013-12-04 パナソニック株式会社 Imaging device
CN102609931B (en) * 2012-02-01 2014-04-09 广州市明美光电技术有限公司 Field depth expanding method and device of microscopic image
CN104169970B (en) * 2012-04-18 2018-01-05 索尼公司 For the method and optical system of the depth map for determining image
US20130335594A1 (en) * 2012-06-18 2013-12-19 Microsoft Corporation Enhancing captured data
JP5664626B2 (en) * 2012-10-12 2015-02-04 カシオ計算機株式会社 Image processing apparatus, image processing method, and program
CN104270560B (en) * 2014-07-31 2018-01-12 三星电子(中国)研发中心 A kind of multi-spot method and apparatus
CN104394308B (en) * 2014-11-28 2017-11-07 广东欧珀移动通信有限公司 Method and terminal that dual camera is taken pictures with different visual angles
JP6453694B2 (en) * 2015-03-31 2019-01-16 株式会社モルフォ Image composition apparatus, image composition method, image composition program, and recording medium
CN104882097B (en) * 2015-06-08 2017-11-21 西安电子科技大学 Method for displaying image and system based on ambient light
CN105608716B (en) * 2015-12-21 2020-12-18 联想(北京)有限公司 Information processing method and electronic equipment
CN105844606A (en) * 2016-03-22 2016-08-10 博康智能网络科技股份有限公司 Wavelet transform-based image fusion method and system thereof
CN107506361A (en) * 2016-11-07 2017-12-22 北京辰安科技股份有限公司 Raster data polymerization and device, raster data decoupling method and apparatus and system
CN107465777A (en) * 2017-08-07 2017-12-12 京东方科技集团股份有限公司 Mobile terminal and its imaging method
CN107993218B (en) * 2018-01-30 2021-09-07 重庆邮电大学 Image fusion method based on algebraic multigrid and watershed segmentation
CN109345493A (en) * 2018-09-05 2019-02-15 上海工程技术大学 A method of multifocal image fusion of non-woven fabrics
CN109949258B (en) * 2019-03-06 2020-11-27 北京科技大学 An Image Restoration Method Based on NSCT Transform Domain
CN111083386B (en) * 2019-12-24 2021-01-22 维沃移动通信有限公司 Image processing method and electronic device
CN110913144B (en) * 2019-12-27 2021-04-27 维沃移动通信有限公司 Image processing method and imaging device
CN111861959A (en) * 2020-07-15 2020-10-30 广东欧谱曼迪科技有限公司 An ultra-long depth of field and ultra-wide dynamic image synthesis algorithm
CN114326090B (en) * 2022-02-28 2023-12-15 山东威高手术机器人有限公司 Binocular endoscope with extended depth of field, binocular endoscope system and binocular imaging method
CN115578301A (en) * 2022-10-26 2023-01-06 中国农业银行股份有限公司 An image processing method, device, equipment and medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wavelet local high-frequency substitution fusion method. Hasibagan, Ma Jianwen, Li Qiqing, Liu Zhili, Han Xiuzhen. Journal of Image and Graphics, Vol. 7, No. 10. 2002 *
Research on automatic classification of land use status using different fusion methods. Zhao Luyan, Yin Jun, Yang Min. Journal of the Hebei Academy of Sciences, Vol. 22, No. 2. 2005 *

Also Published As

Publication number Publication date
CN101021945A (en) 2007-08-22

Similar Documents

Publication Publication Date Title
CN100562894C (en) An image synthesis method and device
KR102266649B1 (en) Image processing method and device
CN108055452B (en) Image processing method, device and equipment
JP5460173B2 (en) Image processing method, image processing apparatus, image processing program, and imaging apparatus
JP7285791B2 (en) Image processing device, output information control method, and program
CN101821772B (en) Method for processing digital object and related system
JP5760727B2 (en) Image processing apparatus and image processing method
JP4415188B2 (en) Image shooting device
JP5831033B2 (en) Imaging apparatus and distance information acquisition method
TW201029443A (en) Method and device for generating a depth map
JP2010011441A (en) Imaging apparatus and image playback device
US11282176B2 (en) Image refocusing
CN108154514A (en) Image processing method, device and equipment
JP2013242658A (en) Image processing apparatus, imaging device, image processing method, and image processing program
KR20120027712A (en) Method and apparatus for processing image
JP2013041117A (en) Imaging apparatus and distance information acquisition method
JP2022179514A (en) Control device, imaging device, control method and program
JP5765893B2 (en) Image processing apparatus, imaging apparatus, and image processing program
JP2010220207A (en) Image processing apparatus and image processing program
JP7374582B2 (en) Image processing device, image generation method and program
JP6624785B2 (en) Image processing method, image processing device, imaging device, program, and storage medium
CN112529773B (en) QPD image post-processing method and QPD camera
JP2016059051A (en) Imaging device and distance information acquisition method
KR101437898B1 (en) Apparatus and method for generating a High Dynamic Range image using single image
JP6938282B2 (en) Image processing equipment, image processing methods and programs

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20171221

Address after: 100083 Haidian District, Xueyuan Road, No. 35, the world building, the second floor of the building on the ground floor, No. 16

Patentee after: Zhongxing Technology Co.,Ltd.

Address before: 100083, Haidian District, Xueyuan Road, Beijing No. 35, Nanjing Ning building, 15 Floor

Patentee before: VIMICRO Corp.

TR01 Transfer of patent right
CP01 Change in the name or title of a patent holder

Address after: 100083 Haidian District, Xueyuan Road, No. 35, the world building, the second floor of the building on the ground floor, No. 16

Patentee after: Zhongxing Technology Co.,Ltd.

Address before: 100083 Haidian District, Xueyuan Road, No. 35, the world building, the second floor of the building on the ground floor, No. 16

Patentee before: Zhongxing Technology Co.,Ltd.

CP01 Change in the name or title of a patent holder
CP03 Change of name, title or address

Address after: 519031 Guangdong Province, Hengqin New Area, Zhuhai City, Huajin Street No. 58, Unit 1201-2

Patentee after: Zhongxing Micro Technology Co.,Ltd.

Country or region after: China

Address before: 100083 Haidian District, Xueyuan Road, No. 35, the world building, the second floor of the building on the ground floor, No. 16

Patentee before: Zhongxing Technology Co.,Ltd.

Country or region before: China

CP03 Change of name, title or address