
CN116167956B - ISAR and VIS image fusion method based on asymmetric multi-layer decomposition - Google Patents


Info

Publication number
CN116167956B
CN116167956B
Authority
CN
China
Prior art keywords: layer, image, fusion, spatial frequency, sigma
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310313924.6A
Other languages
Chinese (zh)
Other versions
CN116167956A (en)
Inventor
赵东
严伟明
张嘉嘉
徐星臣
王青
迟荣华
张昊睿
周磊
张黎可
黄瑞
胡剑凌
胡斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yuanchi (Jiangsu) Information Technology Co., Ltd.
Original Assignee
Wuxi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi University filed Critical Wuxi University
Priority to CN202310313924.6A priority Critical patent/CN116167956B/en
Publication of CN116167956A publication Critical patent/CN116167956A/en
Application granted granted Critical
Publication of CN116167956B publication Critical patent/CN116167956B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition. The method loads an inverse synthetic aperture radar image and a visible light image with the same spatial resolution, compares their weighted spatial frequency variances, and labels the two images as a detail image I_a and a coarse image I_b; decomposes I_a and I_b separately with a multi-layer Gaussian side window filter decomposition framework to obtain the detail-preserving layer S_da, edge-preserving layer S_ea and basic energy layer S_ga of I_a, and the detail-preserving layer S_db, edge-preserving layer S_eb and basic energy layer S_gb of I_b; applies a guided fusion strategy in which S_da guides S_db, obtaining the final asymmetric detail-preserving fusion layer S_fb of I_b; constructs a discriminant from local variance and spatial frequency to fuse S_da and S_fb into the final detail-preserving fusion layer S_fd; fuses S_ea and S_eb with a weight ω to obtain the final edge-preserving fusion layer S_fe; fuses S_ga and S_gb to obtain the final basic energy layer S_fg; and adds S_fd, S_fe and S_fg to obtain the final fused image I_f.

Description

ISAR and VIS image fusion method based on asymmetric multi-layer decomposition
Technical Field
The invention belongs to the field of image processing, and particularly relates to an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition.
Background
Image fusion is an important branch of image processing and is widely applied in civil and military fields such as medical diagnosis, multi-source data decision-making, and multi-depth-of-field imaging. Because of limitations in sensor physics and environmental factors, an image from a single source captures only part of the information in a scene. A visible light sensor cannot capture the motion state or material information of an object within one frame. An inverse synthetic aperture radar (ISAR) image, by its imaging principle, can capture motion information of a target, but it images relatively stationary objects poorly, which leaves large dark areas in the image. It is therefore necessary to fuse key information from different types of sensors to better understand the scene.
Common image fusion methods fall into three main categories: multi-scale transform methods, sparse representation methods, and hybrid models. These approaches fuse two source images by converting them into a transform space with independent features. Sparse representation methods represent different components of the source images with sparse coefficients, hybrid models re-represent image features in other ways, and multi-scale transform methods decompose the images into layers containing different kinds of detail. The amount of information usually differs between source images, yet conventional decomposition methods apply the same decomposition to both, which can cause feature-layer mismatch and degrade the final fusion performance.
Disclosure of Invention
Accordingly, the present invention is directed to an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition.
In order to achieve the above purpose, the technical scheme of the invention is realized as follows:
the embodiment of the invention provides an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition, which comprises the following steps:
Step one, load an inverse synthetic aperture radar (ISAR) image I_1 and a visible light (VIS) image I_2 with the same spatial resolution.
Step two, determine the spatial frequency F_1 of I_1 and, through F_1, the weighted spatial frequency variance σ_1 of I_1; determine the spatial frequency F_2 of I_2 and, through F_2, the weighted spatial frequency variance σ_2 of I_2.
Step three, compare σ_1 and σ_2: if σ_1 ≥ σ_2, record I_1 as the detail image I_a and I_2 as the coarse image I_b; if σ_1 < σ_2, record I_2 as the detail image I_a and I_1 as the coarse image I_b.
Step four, perform multi-layer decomposition on I_a and I_b separately with the multi-layer Gaussian side window filter decomposition framework to obtain the detail-preserving layer S_da, edge-preserving layer S_ea and basic energy layer S_ga of I_a, and the detail-preserving layer S_db, edge-preserving layer S_eb and basic energy layer S_gb of I_b.
Step five, perform guided fusion in which S_da guides S_db to obtain the detail-preserving fusion layer S_fb of I_b.
Step six, determine the local variance L_da(k,o) and spatial frequency F_da of S_da and the local variance L_fb(k,o) and spatial frequency F_fb of S_fb; construct a discriminant from L_da(k,o), L_fb(k,o), F_da and F_fb and fuse S_da and S_fb into the final detail-preserving fusion layer S_fd.
Step seven, determine the weighted spatial frequency variance σ_ea of S_ea and the weighted spatial frequency variance σ_eb of S_eb; determine a weight coefficient ω through σ_ea and σ_eb, and fuse S_ea and S_eb with the weight ω to obtain the final edge-preserving fusion layer S_fe.
Step eight, fuse S_ga and S_gb to obtain the final basic energy fusion layer S_fg.
Step nine, determine the final fused image I_f as the sum of S_fd, S_fe and S_fg.
In the above scheme, the second step is specifically implemented by the following steps:
(201) Determine the row frequency R_1 and the column frequency C_1 of I_1 as
R_1 = sqrt( (1/(M×N)) Σ_{x=1..M} Σ_{y=2..N} [I_1(x,y) - I_1(x,y-1)]² )
C_1 = sqrt( (1/(M×N)) Σ_{x=2..M} Σ_{y=1..N} [I_1(x,y) - I_1(x-1,y)]² )   (1)
where M and N respectively denote the length and width of I_1, I_1(x,y) denotes the gray value of the pixel in row x and column y, x ∈ {1,...,M}, y ∈ {1,...,N}, and Σ(·) denotes summation;
(202) Determine the row frequency R_2 and the column frequency C_2 of I_2 as
R_2 = sqrt( (1/(M×N)) Σ_{x=1..M} Σ_{y=2..N} [I_2(x,y) - I_2(x,y-1)]² )
C_2 = sqrt( (1/(M×N)) Σ_{x=2..M} Σ_{y=1..N} [I_2(x,y) - I_2(x-1,y)]² )   (2)
where M and N respectively denote the length and width of I_2 and I_2(x,y) denotes the gray value of the pixel in row x and column y, x ∈ {1,...,M}, y ∈ {1,...,N};
(203) Determine the spatial frequency F_1 of I_1 and the spatial frequency F_2 of I_2 as
F_1 = sqrt(R_1² + C_1²),  F_2 = sqrt(R_2² + C_2²)   (3)
Steps (201)-(203) together compute the spatial frequency of a single image;
(204) Determine the gray-level mean A_1 of I_1 and the gray-level mean A_2 of I_2 as
A_1 = (1/(M×N)) Σ_{x=1..M} Σ_{y=1..N} I_1(x,y),  A_2 = (1/(M×N)) Σ_{x=1..M} Σ_{y=1..N} I_2(x,y)   (4)
(205) Determine the gray-value variance V_1 of I_1 and the gray-value variance V_2 of I_2 as
V_1 = (1/(M×N)) Σ_{x=1..M} Σ_{y=1..N} [I_1(x,y) - A_1]²,  V_2 = (1/(M×N)) Σ_{x=1..M} Σ_{y=1..N} [I_2(x,y) - A_2]²   (5)
(206) Determine the weighted spatial frequency variance σ_1 of I_1 from F_1 and V_1, and the weighted spatial frequency variance σ_2 of I_2 from F_2 and V_2; a NumPy sketch of steps (201)-(206) follows.
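For illustration, the quantities of step two can be computed with a short NumPy sketch. The row/column frequency, spatial frequency, mean, and variance follow Eqs. (1)-(5); the formula for the weighted spatial frequency variance of step (206) is not reproduced in the text above, so the form σ = V×(1 - F²) below is an assumption, chosen only because it reproduces the worked example of the embodiment to within rounding.

```python
import numpy as np

def spatial_frequency(img):
    """Row frequency, column frequency and spatial frequency (Eqs. (1)-(3))."""
    img = img.astype(np.float64)
    M, N = img.shape
    R = np.sqrt(np.sum((img[:, 1:] - img[:, :-1]) ** 2) / (M * N))  # row frequency
    C = np.sqrt(np.sum((img[1:, :] - img[:-1, :]) ** 2) / (M * N))  # column frequency
    return R, C, np.sqrt(R ** 2 + C ** 2)                           # Eq. (3)

def weighted_sf_variance(img):
    """Weighted spatial frequency variance of step (206).

    The defining formula is not reproduced in the text; sigma = V * (1 - F**2)
    is an assumption that matches the embodiment's numbers (sigma_1 = 0.000107,
    sigma_2 close to 0.000268) to within rounding.
    """
    img = img.astype(np.float64)
    _, _, F = spatial_frequency(img)
    V = np.var(img)    # gray-value variance, Eq. (5) (mean as in Eq. (4))
    return V * (1.0 - F ** 2)
```

With σ_1 and σ_2 computed this way, step three reduces to a single comparison that labels the richer image I_a and the other I_b.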
In the above scheme, the fourth step is specifically implemented by the following steps:
(301) The Gaussian high-frequency extraction step of the multi-layer Gaussian side window filter decomposition framework decomposes I_a and I_b separately; determine the i-th Gaussian-filtered low-frequency information I_gi of I_a and the m-th Gaussian-filtered low-frequency information I_fm of I_b as
I_gi = GF(I_g(i-1)),  I_fm = GF(I_f(m-1))   (7)
where i ∈ {1,2,3}, m ∈ {1,2,3}, I_g0 = I_a, I_f0 = I_b, and GF(·) denotes performing a Gaussian filtering operation;
(302) The side-window high-frequency extraction step of the multi-layer Gaussian side window filter decomposition framework decomposes I_a and I_b separately; determine the j-th side-window-filtered low-frequency information I_dj of I_a and the n-th side-window-filtered low-frequency information I_hn of I_b as
I_dj = SWF(I_d(j-1), r, e),  I_hn = SWF(I_h(n-1), r, e)   (8)
where j ∈ {1,2,3}, n ∈ {1,2,3}, I_d0 = I_a, I_h0 = I_b, SWF(·) denotes performing a side window filtering operation, r denotes the radius of the filtering window, and e denotes the number of filter iterations;
(303) Determine the detail-preserving layer S_da of I_a and the detail-preserving layer S_db of I_b from the filtering results of steps (301) and (302);
(304) Determine the edge-preserving layer S_ea of I_a and the edge-preserving layer S_eb of I_b from the same filtering results;
(305) Determine the basic energy layer S_ga of I_a and the basic energy layer S_gb of I_b from the Gaussian filtering results; a decomposition sketch follows.
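The layer-composition formulas of steps (303)-(305) are not reproduced above, so the sketch below is one plausible reading, consistent with the chain definitions of steps (301)-(302) and with the reconstruction I_f = S_fd + S_fe + S_fg of step nine: detail = image minus the last side-window output, edge = last side-window output minus the last Gaussian output, base = last Gaussian output. The Gaussian `sigma` and the `swf` callable are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def decompose(img, swf, levels=3, sigma=2.0):
    """Three-layer split per steps (301)-(305); layer composition is assumed.

    `swf` is a side window filter callable (a box-filter sketch is given with
    the embodiment below); the Gaussian `sigma` is an assumed parameter.
    """
    img = img.astype(np.float64)
    g = img.copy()
    d = img.copy()
    for _ in range(levels):        # Eq. (7): chained Gaussian low-pass (I_g1..I_g3)
        g = gaussian_filter(g, sigma)
    for _ in range(levels):        # Eq. (8): chained side window low-pass (I_d1..I_d3)
        d = swf(d)
    S_d = img - d                  # detail-preserving layer (assumed composition)
    S_e = d - g                    # edge-preserving layer (assumed composition)
    S_g = g                        # basic energy layer (assumed composition)
    return S_d, S_e, S_g           # S_d + S_e + S_g == img by construction
```

Because the side window filter preserves edges that the Gaussian smooths away, the difference of the two chains isolates an edge layer, and the three layers sum back to the input by construction.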
In the above scheme, the fifth step is specifically implemented by the following steps:
(401) Using a guided filtering operation, S_da guides S_db to obtain the asymmetric guiding reinforcement layer G as
G = GUF(S_da, S_db)   (12)
where GUF(·) denotes performing a guided filtering operation;
(402) Determine the asymmetric detail-preserving fusion layer S_fb of I_b as
S_fb = S_db + G   (13).
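GUF(·) is a standard guided filter (He et al.); the patent does not state its window radius or regularization, so `radius` and `eps` below are assumptions. A minimal self-contained sketch:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=4, eps=1e-4):
    """Minimal guided filter GUF(guide, src); `radius` and `eps` are assumptions."""
    I = guide.astype(np.float64)
    p = src.astype(np.float64)
    size = 2 * radius + 1
    mean = lambda x: uniform_filter(x, size)
    m_I, m_p = mean(I), mean(p)
    cov_Ip = mean(I * p) - m_I * m_p
    var_I = mean(I * I) - m_I * m_I
    a = cov_Ip / (var_I + eps)     # per-pixel linear coefficients
    b = m_p - a * m_I
    return mean(a) * I + mean(b)   # filtered output q

# Step five then reads: G = guided_filter(S_da, S_db); S_fb = S_db + G
```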
In the above scheme, the sixth step is specifically implemented by the following steps:
(501) Determine the local variance L_da(k,o) of S_da and the local variance L_fb(k,o) of S_fb as
L_da(k,o) = (1/(P×Q)) Σ_{w} Σ_{z} [S_da(w,z) - μ_da]²
L_fb(k,o) = (1/(P×Q)) Σ_{w} Σ_{z} [S_fb(w,z) - μ_fb]²   (14)
where a region of length P and width Q is randomly generated in S_da, a region of the same size P×Q is generated in S_fb, and the sums run over the region; k denotes the abscissa of the region's center, o the ordinate of the region's center, w the abscissa pixel index within the region, z the ordinate pixel index within the region, μ_da the mean pixel gray value of S_da in this region, and μ_fb the mean pixel gray value of S_fb in this region;
(502) Determine the spatial frequency F_da of S_da and the spatial frequency F_fb of S_fb as
F_da = sqrt(R_da² + C_da²),  F_fb = sqrt(R_fb² + C_fb²)   (15)
where C_da denotes the column frequency of S_da, R_da the row frequency of S_da, C_fb the column frequency of S_fb, and R_fb the row frequency of S_fb;
(503) Determine the final detail-preserving fusion layer S_fd by the discriminant built from L_da(k,o), L_fb(k,o), F_da and F_fb; in particular, when F_da > F_fb and L_da(k,o) > L_fb(k,o), S_fd = S_da. A sketch of one possible discriminant follows.
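The discriminant formula of step (503) is only partially visible above; the embodiment shows just the branch where S_da wins on both spatial frequency and local variance. The sketch below therefore extends that rule symmetrically and averages when the two measures disagree, which is an assumption beyond the stated case. It reuses spatial_frequency() from the step-two sketch.

```python
import numpy as np

def detail_fusion(S_da, S_fb, k, o, P=3, Q=3):
    """Detail-layer selection per step (503); the full rule is assumed.

    The local variance of a P x Q window centred at (k, o) follows step (501).
    """
    def local_var(S):
        win = S[k - P // 2 : k + P // 2 + 1, o - Q // 2 : o + Q // 2 + 1]
        return np.var(win)

    _, _, F_da = spatial_frequency(S_da)
    _, _, F_fb = spatial_frequency(S_fb)
    L_da, L_fb = local_var(S_da), local_var(S_fb)
    if F_da > F_fb and L_da > L_fb:    # the branch shown in the embodiment
        return S_da
    if F_fb > F_da and L_fb > L_da:    # assumed symmetric branch
        return S_fb
    return 0.5 * (S_da + S_fb)         # assumed tie-break
```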
In the above scheme, the seventh step is specifically implemented by the following steps:
(601) Determine the weighted spatial frequency variance σ_ea of S_ea from F_ea and V_ea, and the weighted spatial frequency variance σ_eb of S_eb from F_eb and V_eb,
where F_ea denotes the spatial frequency of S_ea, V_ea the gray-value variance of S_ea, F_eb the spatial frequency of S_eb, and V_eb the gray-value variance of S_eb;
(602) Determine the weight coefficient ω as
ω = σ_ea / (σ_ea + σ_eb)   (18)
a normalized form consistent with the worked example below, where σ_ea = 0.000245 and σ_eb = 0.0001 give ω = 0.71;
(603) Determine the final edge-preserving fusion layer S_fe as
S_fe = ω×S_ea + (1-ω)×S_eb   (19).
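A sketch of steps (601)-(603), reusing weighted_sf_variance() from the step-two sketch (and therefore inheriting its assumed form for the weighted variance); the weight of Eq. (18) reproduces the embodiment's ω = 0.71.

```python
def edge_fusion(S_ea, S_eb):
    """Weighted edge-layer fusion per steps (601)-(603).

    weighted_sf_variance() carries the assumed step-(206) form; the weight
    omega = sigma_ea / (sigma_ea + sigma_eb) is Eq. (18).
    """
    sigma_ea = weighted_sf_variance(S_ea)
    sigma_eb = weighted_sf_variance(S_eb)
    omega = sigma_ea / (sigma_ea + sigma_eb)        # Eq. (18)
    return omega * S_ea + (1.0 - omega) * S_eb      # Eq. (19)
```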
In the above scheme, step eight specifically comprises: determine the final basic energy fusion layer S_fg by fusing S_ga and S_gb.
In the above scheme, step nine specifically comprises: determine the final fused image I_f as
I_f = S_fd + S_fe + S_fg   (21).
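The base-energy fusion rule of step eight is not reproduced above; a plain average is assumed in the end-to-end sketch below, a common default for base layers in multi-scale fusion. The evaluation point (k, o) for the detail discriminant is also an assumption (the embodiment uses a randomly generated region).

```python
import numpy as np

def fuse(I_a, I_b, swf):
    """End-to-end sketch of steps four to nine; layer and base rules as assumed above."""
    S_da, S_ea, S_ga = decompose(I_a, swf)       # step four (detail image)
    S_db, S_eb, S_gb = decompose(I_b, swf)       # step four (coarse image)
    S_fb = S_db + guided_filter(S_da, S_db)      # step five, Eqs. (12)-(13)
    k, o = (s // 2 for s in S_da.shape)          # assumed evaluation point (k, o)
    S_fd = detail_fusion(S_da, S_fb, k, o)       # step six
    S_fe = edge_fusion(S_ea, S_eb)               # step seven
    S_fg = 0.5 * (S_ga + S_gb)                   # step eight: assumed plain average
    return S_fd + S_fe + S_fg                    # step nine, Eq. (21)

# Usage sketch: I_1, I_2 are equally sized float arrays in [0, 1].
# sigma_1, sigma_2 = weighted_sf_variance(I_1), weighted_sf_variance(I_2)
# I_a, I_b = (I_1, I_2) if sigma_1 >= sigma_2 else (I_2, I_1)   # step three
# I_f = fuse(I_a, I_b, swf=side_window_box_filter)              # sketch in the embodiment
```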
Compared with the prior art, the method uses the weighted spatial frequency variance as a criterion of image information richness to divide the two images into a detail image and a coarse image; it decomposes the two images with an asymmetric decomposition method to prevent loss of image information, and uses the detail-preserving layer of the detail image to guide the fusion of the detail-preserving layer of the coarse image, strengthening the latter's details. Three different fusion strategies then fuse the different types of layers, and the fusion results are finally added to obtain the final fusion result.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is the input inverse synthetic aperture radar image in the present invention;
FIG. 3 is the input visible light image in the present invention;
FIG. 4 is the detail-preserving layer of the coarse image in the present invention;
FIG. 5 is the guiding reinforcement layer in the present invention;
FIG. 6 is the asymmetric detail-preserving fusion layer of the coarse image in the present invention;
FIG. 7 is the final detail-preserving fusion layer in the present invention;
FIG. 8 is the final edge-preserving fusion layer in the present invention;
FIG. 9 is the final basic energy fusion layer in the present invention;
FIG. 10 is the final fusion result in the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The embodiment of the invention provides an ISAR and VIS image fusion method based on asymmetric multi-layer decomposition which, as shown in FIG. 1, comprises the following steps:
Step one, load an inverse synthetic aperture radar (ISAR) image I_1 and a visible light (VIS) image I_2 with the same spatial resolution.
Specifically, FIG. 2 is the loaded ISAR image I_1, which contains speed information of an aircraft, and FIG. 3 is the loaded visible light (VIS) image I_2, which contains scene information.
Step two, determine the spatial frequency F_1 of the ISAR image I_1 and, through F_1, its weighted spatial frequency variance σ_1; determine the spatial frequency F_2 of the VIS image I_2 and, through F_2, its weighted spatial frequency variance σ_2.
(201) Determine the row frequency R_1 and the column frequency C_1 of I_1 according to Eq. (1), where M and N respectively denote the length and width of I_1, I_1(x,y) denotes the gray value of the pixel in row x and column y, x ∈ {1,...,M}, y ∈ {1,...,N}, and Σ(·) denotes summation.
Specifically, the length M of I_1 is 100, the width N of I_1 is 100, the row frequency R_1 of I_1 is 0.0456, and the column frequency C_1 of I_1 is 0.0275.
(202) Determine the row frequency R_2 and the column frequency C_2 of I_2 according to Eq. (2), where M and N respectively denote the length and width of I_2 and I_2(x,y) denotes the gray value of the pixel in row x and column y, x ∈ {1,...,M}, y ∈ {1,...,N}.
Specifically, the length M of I_2 is 100, the width N of I_2 is 100, the row frequency R_2 of I_2 is 0.0572, and the column frequency C_2 of I_2 is 0.0684.
(203) Determine the spatial frequency F_1 of I_1 and the spatial frequency F_2 of I_2 according to Eq. (3); steps (201)-(203) together compute the spatial frequency of a single image.
Specifically, the spatial frequency F_1 of I_1 is 0.0533 and the spatial frequency F_2 of I_2 is 0.0892.
(204) Determine the gray-level mean A_1 of I_1 and the gray-level mean A_2 of I_2 according to Eq. (4).
Specifically, A_1 is 0.0463 and A_2 is 0.0459.
(205) Determine the gray-value variance V_1 of I_1 and the gray-value variance V_2 of I_2 according to Eq. (5).
Specifically, V_1 is 0.000107 and V_2 is 0.000269.
(206) Determine the weighted spatial frequency variance σ_1 of I_1 and the weighted spatial frequency variance σ_2 of I_2 from F_1, V_1 and F_2, V_2, respectively.
Specifically, σ_1 is 0.000107 and σ_2 is 0.000268; the weighted spatial frequency variance is a global evaluation parameter, and the larger it is, the richer the information carried by the image.
Step three, compare σ_1 and σ_2: if σ_1 ≥ σ_2, record I_1 as the detail image I_a and I_2 as the coarse image I_b; if σ_1 < σ_2, record I_2 as the detail image I_a and I_1 as the coarse image I_b.
Specifically, since σ_1 = 0.000107 and σ_2 = 0.000268, σ_1 is less than σ_2, so I_2 is recorded as the detail image I_a and I_1 as the coarse image I_b.
Step four, perform multi-layer decomposition on I_a and I_b separately with the multi-layer Gaussian side window filter decomposition framework to obtain the detail-preserving layer S_da, edge-preserving layer S_ea and basic energy layer S_ga of I_a, and the detail-preserving layer S_db, edge-preserving layer S_eb and basic energy layer S_gb of I_b.
(301) The Gaussian high-frequency extraction step decomposes I_a and I_b separately; determine the i-th Gaussian-filtered low-frequency information I_gi of I_a and the m-th Gaussian-filtered low-frequency information I_fm of I_b according to Eq. (7), where i ∈ {1,2,3}, m ∈ {1,2,3}, I_g0 = I_a, I_f0 = I_b, and GF(·) denotes performing a Gaussian filtering operation.
(302) The side-window high-frequency extraction step decomposes I_a and I_b separately; determine the j-th side-window-filtered low-frequency information I_dj of I_a and the n-th side-window-filtered low-frequency information I_hn of I_b according to Eq. (8), where j ∈ {1,2,3}, n ∈ {1,2,3}, I_d0 = I_a, I_h0 = I_b, SWF(·) denotes performing a side window filtering operation, r denotes the radius of the filtering window, and e denotes the number of filter iterations.
Specifically, the radius r of the filtering window is 1 and the number of filter iterations e is 7; a sketch of such a side window filter follows.
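The patent does not spell out the SWF(·) operator beyond its radius r and iteration count e; the sketch below is a box-filter variant of the side window filter of Yin et al. (CVPR 2019), offered as an assumption: each pixel is replaced by the mean of whichever of eight side windows (four halves, four quarters, all containing the pixel) best matches its value.

```python
import numpy as np
from scipy.ndimage import correlate

def side_window_box_filter(img, r=1, e=7):
    """Box-filter side window filter sketch (after Yin et al., CVPR 2019).

    Eight side windows (four halves, four quarters, all containing the centre
    pixel) are realised as masked box kernels; each pixel takes the window
    mean closest to its own value. r = 1 and e = 7 follow the embodiment.
    """
    img = img.astype(np.float64)
    n = 2 * r + 1
    rows = {"U": slice(0, r + 1), "D": slice(r, n), "A": slice(0, n)}
    cols = {"L": slice(0, r + 1), "R": slice(r, n), "A": slice(0, n)}
    kernels = []
    for rk, ck in [("A", "L"), ("A", "R"), ("U", "A"), ("D", "A"),
                   ("U", "L"), ("U", "R"), ("D", "L"), ("D", "R")]:
        m = np.zeros((n, n))
        m[rows[rk], cols[ck]] = 1.0
        kernels.append(m / m.sum())          # normalised window mean
    out = img
    for _ in range(e):                       # e filtering iterations
        cands = np.stack([correlate(out, kern, mode="nearest") for kern in kernels])
        best = np.argmin(np.abs(cands - out), axis=0)
        out = np.take_along_axis(cands, best[None], axis=0)[0]
    return out
```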
(303) Determine the detail-preserving layer S_da of I_a and the detail-preserving layer S_db of I_b;
(304) Determine the edge-preserving layer S_ea of I_a and the edge-preserving layer S_eb of I_b;
(305) Determine the basic energy layer S_ga of I_a and the basic energy layer S_gb of I_b.
Step five, using a guided filter, S_da guides S_db in a guided fusion to obtain the detail-preserving fusion layer S_fb of I_b.
(401) Using a guided filtering operation, S_da guides S_db to obtain the asymmetric guiding reinforcement layer G as
G = GUF(S_da, S_db)   (12)
where GUF(·) denotes performing a guided filtering operation;
(402) Determine the asymmetric detail-preserving fusion layer S_fb of I_b as
S_fb = S_db + G   (13).
Specifically, FIG. 4 is the detail-preserving layer S_db of I_b, FIG. 5 is the guiding reinforcement layer G obtained when S_da guides S_db through the guided filtering operation, and FIG. 6 is the final detail-preserving fusion layer S_fb of I_b.
Step six, determine the local variance L_da(k,o) and spatial frequency F_da of S_da and the local variance L_fb(k,o) and spatial frequency F_fb of S_fb; construct a discriminant from L_da(k,o), L_fb(k,o), F_da and F_fb and fuse S_da and S_fb into the final detail-preserving fusion layer S_fd.
(501) Determine the local variance L_da(k,o) of S_da and the local variance L_fb(k,o) of S_fb according to Eq. (14), where a region of length P and width Q is randomly generated in S_da and a region of the same size is generated in S_fb; k denotes the abscissa of the region's center, o its ordinate, w the abscissa pixel index within the region, z the ordinate pixel index within the region, μ_da the mean pixel gray value of S_da in this region, and μ_fb the mean pixel gray value of S_fb in this region.
Specifically, the randomly generated regions have length P = 3 and width Q = 3, k = 25, o = 37, μ_da = 0.0215, μ_fb = 0.0201, L_da(k,o) = 0.000175, and L_fb(k,o) = 0.000154.
(502) Determine the spatial frequency F_da of S_da and the spatial frequency F_fb of S_fb according to Eq. (15), where C_da denotes the column frequency of S_da, R_da the row frequency of S_da, C_fb the column frequency of S_fb, and R_fb the row frequency of S_fb.
Specifically, C_da = 0.0472, R_da = 0.0602, C_fb = 0.0432, R_fb = 0.0253, F_da = 0.0765, and F_fb = 0.0501.
(503) Determine the final detail-preserving fusion layer S_fd by the discriminant built from L_da(k,o), L_fb(k,o), F_da and F_fb.
Specifically, since F_fb is less than F_da and L_fb(k,o) is less than L_da(k,o), S_fd equals S_da; FIG. 7 shows the final detail-preserving fusion layer S_fd.
Step seven, determine the weighted spatial frequency variance σ_ea of S_ea and the weighted spatial frequency variance σ_eb of S_eb; determine a weight coefficient ω using σ_ea and σ_eb, and fuse S_ea and S_eb with the weight ω to obtain the final edge-preserving fusion layer S_fe.
(601) Determine the weighted spatial frequency variance σ_ea of S_ea and the weighted spatial frequency variance σ_eb of S_eb, where F_ea denotes the spatial frequency of S_ea, V_ea the gray-value variance of S_ea, F_eb the spatial frequency of S_eb, and V_eb the gray-value variance of S_eb.
Specifically, F_ea = 0.0762, V_ea = 0.000246, F_eb = 0.0489, V_eb = 0.000101, σ_ea = 0.000245, and σ_eb = 0.0001.
(602) Determine the weight coefficient ω according to Eq. (18).
Specifically, ω has a value of 0.71, as the check below confirms.
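The value ω = 0.71 can be checked directly against the normalized weight of Eq. (18):

```python
sigma_ea, sigma_eb = 0.000245, 0.0001
omega = sigma_ea / (sigma_ea + sigma_eb)   # Eq. (18)
print(round(omega, 2))                     # prints 0.71
```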
(603) Determine the final edge-preserving fusion layer S_fe as
S_fe = ω×S_ea + (1-ω)×S_eb   (19).
Specifically, FIG. 8 shows the final edge-preserving fusion layer S_fe.
Step eight, fuse S_ga and S_gb to obtain the final basic energy fusion layer S_fg.
Specifically, FIG. 9 is the final basic energy fusion layer.
Step nine, determine the final fused image I_f as the sum of S_fd, S_fe and S_fg:
I_f = S_fd + S_fe + S_fg   (21)
Specifically, FIG. 10 is the final fusion result; the red box marks the fused detail of the ISAR and VIS images, and the blue box marks the visible background area. The clearer the red-box region and the brighter the blue-box region, the better the fusion result.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the present invention.

Claims (5)

1. An ISAR and VIS image fusion method based on asymmetric multi-layer decomposition, characterized in that the method comprises the following steps:
step one, load an inverse synthetic aperture radar (ISAR) image I_1 and a visible light (VIS) image I_2 with the same spatial resolution;
step two, determine the spatial frequency F_1 of I_1 and, through F_1, the weighted spatial frequency variance σ_1 of I_1; determine the spatial frequency F_2 of I_2 and, through F_2, the weighted spatial frequency variance σ_2 of I_2;
The method is realized by the following steps:
(201) determine the row frequency R_1 and the column frequency C_1 of I_1 according to Eq. (1), where M and N respectively denote the length and width of I_1, I_1(x,y) denotes the gray value of the pixel in row x and column y, x ∈ {1,...,M}, y ∈ {1,...,N}, and Σ(·) denotes summation;
(202) determine the row frequency R_2 and the column frequency C_2 of I_2 according to Eq. (2), where M and N respectively denote the length and width of I_2 and I_2(x,y) denotes the gray value of the pixel in row x and column y;
(203) determine the spatial frequency F_1 of I_1 and the spatial frequency F_2 of I_2 according to Eq. (3); steps (201)-(203) together compute the spatial frequency of a single image;
(204) determine the gray-level mean A_1 of I_1 and the gray-level mean A_2 of I_2 according to Eq. (4);
(205) determine the gray-value variance V_1 of I_1 and the gray-value variance V_2 of I_2 according to Eq. (5);
(206) determine the weighted spatial frequency variance σ_1 of I_1 from F_1 and V_1, and the weighted spatial frequency variance σ_2 of I_2 from F_2 and V_2;
step three, compare σ_1 and σ_2: if σ_1 ≥ σ_2, record I_1 as the detail image I_a and I_2 as the coarse image I_b; if σ_1 < σ_2, record I_2 as the detail image I_a and I_1 as the coarse image I_b;
Step four, decomposing the frame pair I through a plurality of layers of Gaussian edge window filters a And I b Respectively carrying out multi-layer decomposition to obtain I a Detail-preserving layer S of (2) da 、I a Edge preserving layer S of (2) ea 、I a Is a basic energy layer S of (1) ga 、I b Detail-preserving layer S of (2) db 、I b Edge preserving layer S of (2) eb And I b Is a basic energy layer S of (1) gb
The method is realized by the following steps:
(301) Gauss high frequency extraction method in decomposition framework of multilayer Gauss side window filter a And I b Respectively decomposing, and determining I according to the following formula a Ith low frequency information I through gaussian filter gi ,I b Mth low frequency information I through gaussian filter fm Is that
Wherein I is {1,2,3}, m is {1,2,3}, I g0 =I a ,I f0 =I b GF (·) represents performing a gaussian filtering operation;
(302) Edge window high frequency extraction method in decomposition frame of multi-layer Gaussian edge window filter a And I b Respectively decomposing, and determining I according to the following formula a Jth low frequency information I through side window filter dj And I b Nth low frequency information I through side window filter hn Is that
Where j is {1,2,3}, n is {1,2,3}, I d0 =I a ,I h0 =I b SWF (·) represents performing a side window filtering operation, r represents the radius of the filtering window, e represents the number of filter iterations;
(303) I is determined according to a Detail-preserving layer S of (2) da And I b Detail-preserving layer S of (2) db Is that
(304) I is determined according to a Edge preserving layer S of (2) ea And I b Edge preserving layer S of (2) eb Is that
(305) I is determined according to a Is a basic energy layer S of (1) ga And I b Is a basic energy layer S of (1) gb Is that
Step five, through S da For S db Performing guided fusion to obtain I b Detail preserving fusion layer S of (2) fb
Step six, determining S da Local variance L of (2) da (k, o) and spatial frequency F da Determining S fb Local variance L of (2) fb (k, o) and spatial frequency F fb Through L da (k,o)、L fb (k,o)、F da And F fb Structural discriminant pair S da And S is fb Fusing to obtain a final detail preserving fusion layer S fd
The method is realized by the following steps:
(501) S is determined according to the following da Local variance L of (2) da (k, o) and S fb Local variance L of (2) fb (k, o) is
Wherein at S da Randomly generates a region with the length of P and the width of Q, and at S fb A region of the size P and Q is also generated, k represents the abscissa of the region center position, o represents the ordinate of the region center position, w represents the abscissa pixel index in the region, z represents the ordinate pixel index in the region, μ da Represent S da Average value of pixel gray values in this region, mu fb Represent S fb An average value of pixel gradation values in this region;
(502) S is determined according to the following da Spatial frequency F of (2) da And S is fb Spatial frequency F of (2) fb Is that
Wherein C is da Represent S da R is the column frequency of da Represent S da Line frequency of C fb Represent S fb R is the column frequency of fb Represent S fb Is a line frequency of (2);
(503) Determining a final detail preserving fusion layer S according to fd Is that
Step seven, determining S ea Is of weighted spatial frequency variance sigma ea And S is eb Is of weighted spatial frequency variance sigma eb Through sigma ea Sum sigma eb Determining a weight coefficient omega, and comparing S with omega ea And S is eb Weighted fusion is carried out to obtain a final edge preserving fusion layer S fe
Step eight, S is carried out ga And S is equal to gb Fusion to obtain the final basic energy fusion layer S fg
Step nine, through S fd 、S fe And S is fg The sum of the additions determines the final fusion image I f
2. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 1, characterized in that step five is specifically realized by the following steps:
(201) using a guided filtering operation, S_da guides S_db to obtain the asymmetric guiding reinforcement layer G as
G = GUF(S_da, S_db)   (12)
where GUF(·) denotes performing a guided filtering operation;
(202) determine the asymmetric detail-preserving fusion layer S_fb of I_b as
S_fb = S_db + G   (13).
3. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 2, characterized in that step seven is specifically realized by the following steps:
(301) determine the weighted spatial frequency variance σ_ea of S_ea and the weighted spatial frequency variance σ_eb of S_eb, where F_ea denotes the spatial frequency of S_ea, V_ea the gray-value variance of S_ea, F_eb the spatial frequency of S_eb, and V_eb the gray-value variance of S_eb;
(302) determine the weight coefficient ω according to Eq. (18);
(303) determine the final edge-preserving fusion layer S_fe as
S_fe = ω×S_ea + (1-ω)×S_eb   (19).
4. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 3, characterized in that step eight specifically comprises: determining the final basic energy fusion layer S_fg by fusing S_ga and S_gb.
5. The ISAR and VIS image fusion method based on asymmetric multi-layer decomposition according to claim 4, characterized in that step nine specifically comprises: determining the final fused image I_f as
I_f = S_fd + S_fe + S_fg   (21).
CN202310313924.6A 2023-03-28 2023-03-28 ISAR and VIS image fusion method based on asymmetric multi-layer decomposition Active CN116167956B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310313924.6A CN116167956B (en) 2023-03-28 2023-03-28 ISAR and VIS image fusion method based on asymmetric multi-layer decomposition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310313924.6A CN116167956B (en) 2023-03-28 2023-03-28 ISAR and VIS image fusion method based on asymmetric multi-layer decomposition

Publications (2)

Publication Number Publication Date
CN116167956A CN116167956A (en) 2023-05-26
CN116167956B true CN116167956B (en) 2023-11-17

Family

ID=86416492

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310313924.6A Active CN116167956B (en) 2023-03-28 2023-03-28 ISAR and VIS image fusion method based on asymmetric multi-layer decomposition

Country Status (1)

Country Link
CN (1) CN116167956B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008130907A1 (en) * 2007-04-17 2008-10-30 Mikos, Ltd. System and method for using three dimensional infrared imaging to identify individuals
FR3048800B1 (en) * 2016-03-11 2018-04-06 Bertin Technologies IMAGE PROCESSING METHOD
CN111062905B (en) * 2019-12-17 2022-01-04 大连理工大学 An infrared and visible light fusion method based on saliency map enhancement

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010051207A1 (en) * 2010-11-12 2012-05-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for three-dimensional imaging e.g. weapon hidden under cloth of people, involves producing three-dimensional radar image of object from radar data for image representation in e.g. correct position on three-dimensional surface model
CN109063729A (en) * 2018-06-20 2018-12-21 上海电力学院 A kind of Multisensor Image Fusion Scheme based on PSO-NSCT
CN109035188A (en) * 2018-07-16 2018-12-18 西北工业大学 A kind of intelligent image fusion method based on target signature driving
CN109816618A (en) * 2019-01-25 2019-05-28 山东理工大学 An Image Fusion Algorithm for Area Energy Photon Counting Based on Adaptive Threshold
CN110175970A (en) * 2019-05-20 2019-08-27 桂林电子科技大学 Based on the infrared and visible light image fusion method for improving FPDE and PCA
AU2020100199A4 (en) * 2020-02-08 2020-03-19 Cao, Sihua MR A medical image fusion method based on two-layer decomposition and improved spatial frequency
KR102388831B1 (en) * 2021-02-09 2022-04-21 인천대학교 산학협력단 Apparatus and Method for Fusing Intelligent Multi Focus Image
CN113920047A (en) * 2021-09-30 2022-01-11 广东双电科技有限公司 A Fusion Method of Infrared and Visible Light Images Based on Hybrid Curvature Filter
CN114418913A (en) * 2022-01-03 2022-04-29 中国电子科技集团公司第二十研究所 A pixel-level fusion method of ISAR and infrared images based on wavelet transform
CN115330653A (en) * 2022-08-16 2022-11-11 西安电子科技大学 A Multi-source Image Fusion Method Based on Side Window Filtering
CN115345909A (en) * 2022-10-18 2022-11-15 无锡学院 Hyperspectral target tracking method based on depth space spectrum convolution fusion characteristics

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
An ISAR and Visible Image Fusion Algorithm Based on Adaptive Guided Multi-Layer Side Window Box Filter Decomposition; Jiajia Zhang et al.; Advances and Challenges on Multisource Remote Sensing Image Fusion: Datasets, New Technologies, and Applications; 1-29 *
Infrared and visible image fusion based on bilateral filtering and NSST; Xu Danping, Wang Haimei; Computer Measurement & Control (04); 201-204 *
Research on image fusion technology based on multi-layer wavelet analysis; Luo Yirong; Computer Applications and Software (12); 108-112 *
Multi-focus image fusion method based on neighborhood-variance weighted averaging; Zheng Rui, Pang Quan; Machinery Manufacturing (09); 33-36 *
A review of multi-sensor image fusion applications; Xia Mingge, He You, Huang Xiaodong, Xia Shichang; Shipboard Electronic Countermeasure (05); 38-44 *

Also Published As

Publication number Publication date
CN116167956A (en) 2023-05-26

Similar Documents

Publication Publication Date Title
Chen et al. Median filtering forensics based on convolutional neural networks
CN117079139B (en) Remote sensing image target detection method and system based on multi-scale semantic features
CN108447041B (en) Multi-source image fusion method based on reinforcement learning
CN105657402B (en) A kind of depth map restoration methods
CN114821580B (en) A noisy image segmentation method with phased integration of denoising modules
Xu et al. COCO-Net: A dual-supervised network with unified ROI-loss for low-resolution ship detection from optical satellite image sequences
CN108376259A (en) In conjunction with the image denoising method of Bayes's Layered Learning and empty spectrum joint priori
Shit et al. An encoder‐decoder based CNN architecture using end to end dehaze and detection network for proper image visualization and detection
CN115018748A (en) A Fusion Method of Aerospace Remote Sensing Image Combining Model Structure Reconstruction and Attention Mechanism
CN116486203B (en) Single-target tracking method based on twin network and online template updating
CN113159158B (en) License plate correction and reconstruction method and system based on generation countermeasure network
CN118334512B (en) SAR image target recognition method and system based on SSIM and cascade deep neural network
Kumar et al. Underwater image enhancement using deep learning
CN118154937A (en) Camouflage target detection method based on visual large model
CN118521767A (en) Infrared small target detection method based on learning guided filtering
Cahill et al. Exploring the viability of bypassing the image signal processor for cnn-based object detection in autonomous vehicles
Liu et al. Joint dehazing and denoising for single nighttime image via multi-scale decomposition
CN116167956B (en) ISAR and VIS image fusion method based on asymmetric multi-layer decomposition
Bistroń et al. Optimization of Imaging Reconnaissance Systems Using Super-Resolution: Efficiency Analysis in Interference Conditions
Zhang et al. Iterative multi‐scale residual network for deblurring
CN119399045A (en) Infrared-guided low-light image enhancement method based on global and local multi-scale fusion
CN110489584B (en) Image classification method and system based on densely connected MobileNets model
Hu et al. Pyramid feature boosted network for single image dehazing
CN115358962B (en) End-to-end visual odometer method and device
CN117934814A (en) Infrared small target identification method based on distraction mining network

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240411

Address after: Room 1007, Building 1, No. 688, Zhenze Road, the Taihu Lake Street, Wuxi Economic Development Zone, Jiangsu Province, 214000

Patentee after: Yuanchi (Jiangsu) Information Technology Co.,Ltd.

Country or region after: China

Address before: No.333 Xishan Avenue, Wuxi City, Jiangsu Province

Patentee before: Wuxi University

Country or region before: China

TR01 Transfer of patent right