
CN108564532B - Mosaic method of large-scale ground-distance spaceborne SAR images - Google Patents


Info

Publication number
CN108564532B
Authority
CN
China
Prior art keywords
image
processed
images
sar
spliced
Prior art date
Legal status
Active
Application number
CN201810299312.5A
Other languages
Chinese (zh)
Other versions
CN108564532A (en)
Inventor
郎文辉
余不凡
石聪聪
赵子航
Current Assignee
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date
Filing date
Publication date
Application filed by Hefei University of Technology
Priority to CN201810299312.5A
Publication of CN108564532A
Application granted
Publication of CN108564532B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/70 Denoising; Smoothing
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/32 Determination of transform parameters for the alignment of images, i.e. image registration using correlation-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/10044 Radar image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20024 Filtering details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a mosaicking method for large-scale ground-range spaceborne SAR images, relating to the field of satellite SAR (synthetic aperture radar) image mosaicking. The original images are first given the necessary preprocessing, and geolocation information is then extracted from the preprocessed images. Using the geolocation information as prior knowledge, the images are corrected by cross-correlation to obtain the mosaic error information, which is then used to compensate the image mosaic error. After error compensation, a Wallis filter is applied to the input SAR images in a search-and-update manner for color homogenization, and the mosaicked image is finally generated. By adapting either the cross-correlation method or the zero-crossing-based binary matching statistical method to different terrain, the invention offers alternative ways of determining the matching offset; the proposed search-and-update strategy guarantees the color-homogenization effect over multiple overlapping regions and multiple directions while preserving the texture details of the images.

Description

Large-scale ground distance satellite-borne SAR image mosaic method
Technical Field
The invention relates to the field of satellite SAR (synthetic aperture radar) image mosaic, in particular to a large-scale ground distance satellite-borne SAR image mosaic method.
Background
Space-borne synthetic aperture radar (SAR) has become an important tool for mapping and geoscience research thanks to its wide-area, all-weather, day-and-night imaging capability. Mosaicking multiple SAR images from the same and adjacent imaging tracks is particularly useful for monitoring and analyzing the spatial and temporal variations of global processes. Spaceborne SAR mosaic products are widely applicable to land use, land-cover change, mapping, and the monitoring of coastlines, desertification, wetlands and glaciers. The key to ground-range spaceborne SAR mosaicking is to eliminate the geometric dislocation and the radiometric differences between the images being mosaicked without being constrained by large-capacity memory.
At present, digital mosaicking techniques for spaceborne SAR images have been developed for special purposes: for example, Shimada et al. (2010) generated a PALSAR mosaic data set from slant-range images, and Grandi et al. (2004) generated a JERS-1 SAR mosaic data set from ground-range images. Such products remain few in number, however, and most radar mosaicking experiments or products are based on data collected by airborne radar systems. These methods usually require registering the overlapping image data using homonymous points located visually or visual features extracted by some algorithm, which makes the whole mosaicking process laborious and time-consuming. On the other hand, simple fade-in/fade-out color homogenization can only achieve a gradual transition across an overlap in a single direction and can hardly remove the color differences caused by regions overlapping in multiple directions. In addition, some methods place high demands on computer memory capacity.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and to better realize large-scale mosaicking of spaceborne ground-range SAR images, thereby solving the problems described above.
The invention is realized by the following technical scheme:
the invention provides a large-scale ground-range spaceborne SAR image mosaic method that operates on a plurality of numbered ground-range spaceborne SAR images and comprises the following steps:
step 1, image preprocessing
Performing parameter correction on each numbered ground range satellite-borne SAR image to obtain a to-be-processed SAR image containing geographic positioning information;
step 2, calculating longitude and latitude coordinates of each pixel in each SAR image to be processed;
step 3, judging whether all SAR images to be processed are overlapped;
judging whether all the SAR images to be processed are overlapped or not according to the longitude and latitude coordinates of each pixel in each SAR image to be processed obtained in the step 2;
if the SAR images are overlapped, recording the numbers corresponding to all the SAR images to be processed with the overlapped SAR images and the overlapped region range between every two SAR images, and generating a number group containing the overlapped region range information for storage;
step 4, calculating the matching offset between the corresponding overlapping area ranges of each numbering group;
for the area with low peak-to-average ratio, calculating the matching offset between the corresponding overlapping area ranges of each numbering group by adopting a cross-correlation method;
for the region with high peak-to-average ratio, calculating the matching offset between the corresponding overlapping region ranges of each numbering group by adopting a binary matching statistical method based on zero crossing points;
step 5, creating all-zero pixel images and global structure variables
Obtaining the corrected geographic coordinates of all SAR images to be processed according to the longitude and latitude coordinates of each pixel in the SAR images to be processed obtained in the step (2) and the matching offset between the corresponding overlapping area ranges of each numbering group obtained in the step (4), and obtaining the images to be spliced corresponding to each corrected SAR image to be processed;
creating an all-zero pixel image with the same size as the finally generated mosaic image, and simultaneously establishing a global structure variable;
step 6, performing color homogenization on the overlapping area range corresponding to each number group obtained in the step 5 by adopting a Wallis filter;
step 7, splicing all images to be spliced after color equalization
The splicing method comprises the following steps:
a. setting a splicing color-homogenization flag for every image to be spliced in the global structure variable and initializing all flags to 0;
b. inputting an image to be spliced and, based on the numbering groups, finding all images to be spliced that overlap it and whose color-homogenization flag is 0;
c. assuming the total number of images to be spliced that overlap the currently input image and whose flag is 0 is Num, taking the currently input image as the reference image, applying Wallis color homogenization to the Num images in turn according to step 6, updating the corresponding parameters of the Wallis filter for each homogenization, setting the flag of each homogenized image to 1, and storing the homogenized images;
d. after the Num rounds of Wallis color homogenization, writing the input image to be spliced into the all-zero pixel image created in step 5 and setting its splicing color-homogenization flag to 1;
e. repeating steps b, c and d for all images to be spliced; the final mosaic result is obtained once every image to be spliced has been processed.
Further, the parameter modification in step 1 specifically includes the following steps:
carrying out speckle filtering on each numbered ground distance satellite-borne SAR image to remove speckle noise and obtain an SAR image without speckle noise;
carrying out radiometric calibration on the SAR image without speckle noise to eliminate system radiation error, and expressing the calibrated SAR image by using a backscattering coefficient to obtain the SAR image expressed by the backscattering coefficient;
and performing ellipsoid correction on the SAR image represented by the backscattering coefficient, and writing the geographical positioning information in the auxiliary information into the SAR image to obtain the SAR image to be processed.
Further, in step 3, a calculation formula of longitude and latitude coordinates of each pixel in each to-be-processed SAR image is as follows:
E_geo=A(0)+X_pix×A(1)+Y_pix×A(2) (1)
N_geo=A(3)+X_pix×A(4)+Y_pix×A(5) (2)
wherein: E_geo and N_geo respectively represent the longitude and latitude coordinates of a pixel in the SAR image to be processed;
X_pix and Y_pix respectively represent the column and row coordinates of the pixel in the SAR image to be processed;
the geographic positioning information consists of a geolocation array made up of six parameters A(0), A(1), ..., A(5), wherein:
A(0) and A(3) respectively represent the longitude and latitude coordinates of the upper-left corner of the image to be processed;
A(1) and A(5) respectively represent the transverse and longitudinal resolution of the image to be processed;
A(2) and A(4) respectively represent the rotation coefficients of the pixels of the image to be processed in the latitude and longitude directions.
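For illustration only (not part of the original patent text), a minimal Python/NumPy sketch of equations (1) and (2) is given below; the geolocation array A and the example pixel coordinates are placeholders that would normally be read from the product metadata.

```python
import numpy as np

def pixel_to_lonlat(A, x_pix, y_pix):
    """Equations (1)-(2): map pixel column/row coordinates to longitude/latitude
    using the six-parameter geolocation array A = [A0, A1, A2, A3, A4, A5]."""
    e_geo = A[0] + x_pix * A[1] + y_pix * A[2]   # longitude, eq. (1)
    n_geo = A[3] + x_pix * A[4] + y_pix * A[5]   # latitude, eq. (2)
    return e_geo, n_geo

# Hypothetical example: upper-left corner at (116.0 E, 40.0 N),
# 1e-4 degree pixel spacing, no rotation terms.
A = np.array([116.0, 1e-4, 0.0, 40.0, 0.0, -1e-4])
lon, lat = pixel_to_lonlat(A, x_pix=100, y_pix=200)
```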
Further, the cross-correlation method in step 4 specifically includes:
selecting, within the overlapping range corresponding to each numbering group, either one of the two images to be processed of that group as the template image t(u, v), the other image to be processed being the search image I(w, h),
wherein: t(u, v) has dimension U × V;
I(w, h) has dimension W × H;
superposing t(u, v) on I(w, h) and performing a translation search; the area of I(w, h) covered by t(u, v) is denoted the sub-image O_mn,
wherein: m and n are the coordinates of the lower-left corner of the template image on I(w, h),
and: 1 ≤ m ≤ W-U, 1 ≤ n ≤ H-V;
calculating the two-dimensional cross-correlation coefficient R(m, n) between O_mn and t(u, v), namely:

R(m, n) = Σ_{u=1..U} Σ_{v=1..V} O_mn(u, v) · t(u, v)

calculating the normalized correlation coefficient ρ(m, n) between O_mn and t(u, v) as the similarity measure of the two, namely:

ρ(m, n) = R(m, n) / sqrt( Σ_{u=1..U} Σ_{v=1..V} O_mn(u, v)² · Σ_{u=1..U} Σ_{v=1..V} t(u, v)² )

after all searches are completed, the sub-image O_{mmax, nmax} corresponding to the maximum value ρ(mmax, nmax) of ρ(m, n) is selected as the matching target;
and taking the offset of the matching target relative to the template image in the horizontal and vertical directions as the matching offset between the two images to be processed corresponding to the number group.
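As an illustrative aid only, the following Python sketch implements the exhaustive search described above for small overlap windows; the normalization follows the reconstructed formula for ρ(m, n), and all array names are placeholders.

```python
import numpy as np

def match_offset(template, search):
    """Slide template t(u, v) over search image I(w, h), compute the normalized
    correlation coefficient rho(m, n) for every position and return the shift
    (m, n) of the best-matching sub-image O_mn."""
    U, V = template.shape
    W, H = search.shape
    t = template.astype(np.float64)
    t_energy = np.sum(t ** 2)
    best_rho, best_mn = -np.inf, (0, 0)
    for m in range(W - U + 1):
        for n in range(H - V + 1):
            o_mn = search[m:m + U, n:n + V].astype(np.float64)   # sub-image O_mn
            r = np.sum(o_mn * t)                                  # R(m, n)
            rho = r / np.sqrt(np.sum(o_mn ** 2) * t_energy + 1e-12)
            if rho > best_rho:
                best_rho, best_mn = rho, (m, n)
    return best_mn, best_rho
```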
Further, the zero-crossing point-based binary matching statistical method in step 4 specifically includes:
a. obtaining the contour feature M(x, y) of each pixel of each image to be processed:

M(x, y) = ∇²G(x, y) * I(x, y)

wherein: I(x, y) is the image to be processed, x and y are the horizontal and vertical coordinates of each of its pixels, * denotes convolution, and ∇²G(x, y) is the Laplacian-of-Gaussian operator;
b. performing binary matching on the contour features of each pixel in each image to be processed, assigning a value of 1 when M (x, y) > 0, and assigning a value of 0 when M (x, y) ≦ 0 to obtain a binary image of each image to be processed;
c. selecting one of any two-valued image from two images to be processed corresponding to each number group as a two-valued template image Ma, wherein the other image to be processed corresponding to the number group is Mb; the binary template image Ma is located in the range of the overlapping area corresponding to the number group;
d. superposing Ma on Mb and performing a translation search, the area covered on Mb being the sub-binary image Mc; counting the number S of positions at which Ma and the covered sub-binary image Mc take the same value;
e. after all searches are finished, selecting a sub-binary image corresponding to the maximum value Smax of S as a matching object;
f. taking the offset of the matching object relative to the binary template image in the horizontal and vertical directions as the matching offset between the two images to be processed corresponding to the number group;
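For illustration only, a Python sketch of the zero-crossing-based binary matching statistic follows; the Laplacian-of-Gaussian scale sigma is an assumed parameter, as the patent does not specify it.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def binary_match_offset(template_img, search_img, sigma=1.0):
    """Binarize the LoG response of both images (1 where M(x, y) > 0, else 0),
    slide the binary template Ma over Mb and count, for every shift, the number
    S of positions with identical binary values; return the best shift."""
    Ma = (gaussian_laplace(template_img.astype(np.float64), sigma) > 0).astype(np.uint8)
    Mb = (gaussian_laplace(search_img.astype(np.float64), sigma) > 0).astype(np.uint8)
    u, v = Ma.shape
    w, h = Mb.shape
    best_s, best_mn = -1, (0, 0)
    for m in range(w - u + 1):
        for n in range(h - v + 1):
            Mc = Mb[m:m + u, n:n + v]                 # covered sub-binary image Mc
            s = int(np.count_nonzero(Mc == Ma))       # matching statistic S
            if s > best_s:
                best_s, best_mn = s, (m, n)
    return best_mn, best_s
```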
Further, the expression of the Wallis filter is:

Î(i, j) = (I(i, j) - μ_I) · σ_T / σ_I + μ_T

wherein: I and T respectively denote the two images to be spliced corresponding to a numbering group, Î(i, j) is the gray level of image I after color homogenization, μ_T, σ_T and μ_I, σ_I respectively denote the gray-level intensity mean and standard deviation of T and I within the overlapping area, the dimension of I is W × H, 1 ≤ i ≤ W, and 1 ≤ j ≤ H.
Further, the corresponding parameters updated in step 7 are specifically μ_T, σ_T and μ_I, σ_I.
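As a non-authoritative sketch of the mean/standard-deviation matching form given above, the following Python function adjusts image I towards the radiometry of reference image T; the overlap arrays from which the statistics are taken are assumed to have been extracted beforehand.

```python
import numpy as np

def wallis_adjust(img_i, overlap_i, overlap_t):
    """Simplified Wallis color homogenization: map the gray levels of image I so
    that its mean and standard deviation in the overlap match those of T."""
    mu_i, sigma_i = float(overlap_i.mean()), float(overlap_i.std())
    mu_t, sigma_t = float(overlap_t.mean()), float(overlap_t.std())
    return (img_i.astype(np.float64) - mu_i) * sigma_t / (sigma_i + 1e-12) + mu_t
```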
Compared with the prior art, the invention has the following advantages:
(1) by establishing a global structure variable, the corrected geographic coordinates of all the images to be spliced are stored, so that at most two images are stored in a computer memory at the same time, and the requirement on a large-capacity storage is reduced;
(2) because radar backscattering is sensitive to changes in imaging geometry and the radiometric characteristics of SAR images tend to decorrelate images of the same region acquired at different times, a cross-correlation method can be used to determine the registration error between images over relatively flat regions, while a zero-crossing-based binary matching statistical method is used over regions with complex terrain, thereby providing alternative ways of determining the registration error;
(3) because the phenomenon that one SAR image and a plurality of SAR images are overlapped simultaneously possibly exists in the actual embedding process, when the global filter is used for color homogenizing treatment, a search updating strategy is designed, so that the color homogenizing treatment effect of multiple overlapped areas and multiple directions can be ensured, and the texture details in the SAR images can be well protected;
(4) the invention provides a complete large-scale embedding process of the satellite-borne ground distance SAR image, which is not only suitable for a satellite-borne ground distance strip product, but also suitable for a satellite-borne ground distance wide mode product.
Drawings
FIG. 1 is a flow chart of a large-scale ground-distance spaceborne SAR image mosaic method of the invention;
FIG. 2 is a schematic diagram of template matching in the correlation method provided by the present invention;
FIG. 3 is original image a of the three original SAR images used in the embodiment;
FIG. 4 is original image b of the three original SAR images used in the embodiment;
FIG. 5 is original image c of the three original SAR images used in the embodiment;
FIG. 6 is image a1 obtained by preprocessing original image a;
FIG. 7 is image b1 obtained by preprocessing original image b;
FIG. 8 is image c1 obtained by preprocessing original image c;
FIG. 9 shows the mosaicking result obtained after a1, b1 and c1 have been processed.
Detailed Description
The following examples are given for the detailed implementation and specific operation of the present invention, but the scope of the present invention is not limited to the following examples.
The large-scale ground-distance satellite-borne SAR image mosaic method comprises the following steps:
(1) Image preprocessing: to guarantee the quality of the mosaic result, the spaceborne ground-range wide-swath mode SAR images are preprocessed first. The three SAR images selected in this step are all SAR images of areas within China acquired by the Sentinel-1 satellite launched by the European Space Agency; the three original SAR images are shown in figures 3, 4 and 5.
The images lie within east longitude 73° to 135° and north latitude 3° to 53°, and the whole preprocessing procedure was completed on version 3.0 of the SNAP (Sentinel Application Platform) processing platform provided by the European Space Agency. SAR images usually contain speckle noise, so speckle filtering is applied to each numbered spaceborne SAR image with a suitable filter; here a Lee-Sigma filter is chosen, with a window size of 7 × 7, a target window size of 3 × 3 and a sigma value of 0.9. Compared with other common filters (such as the median filter and the mean filter), it has clear advantages in suppressing speckle noise while preserving detail information such as edges and texture; in the SNAP 3.0 platform this operation only requires selecting the Speckle Filtering option under the Radar tab. In addition, the radar sensor carries a systematic radiation error, and radiometric calibration yields the SAR image represented by the backscattering coefficient value(i). Taking the Sentinel-1 products of the European Space Agency as an example, the radiometric calibration can be expressed by the following formula:

value(i) = DN_i² / A_i²

wherein DN_i (the digital number) and A_i are obtained from the four radiometric calibration look-up tables provided with Sentinel-1 products. The geolocation information in the auxiliary data is then written into the SAR image through ellipsoid correction; taking SNAP 3.0 as an example, ellipsoid correction is performed by selecting Geometric under the Radar tab, which finally yields the SAR image to be processed. The three preprocessed SAR images to be processed are shown in figures 6, 7 and 8;
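For illustration only, a sketch of this Sentinel-1 style calibration step in Python; the calibration coefficients A_i are assumed to have already been interpolated from the product look-up tables onto the pixel grid.

```python
import numpy as np

def radiometric_calibration(dn, a):
    """Convert digital numbers DN_i to calibrated backscatter values using
    value(i) = DN_i**2 / A_i**2, where A_i comes from the chosen calibration
    look-up table (sigma0, beta0, gamma0 or dn)."""
    dn = dn.astype(np.float64)
    a = a.astype(np.float64)
    return dn ** 2 / a ** 2
```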
(2) calculating longitude and latitude coordinates of each pixel in the SAR image to be processed according to the following calculation formula:
E_geo=A(0)+X_pix×A(1)+Y_pix×A(2) (1)
N_geo=A(3)+X_pix×A(4)+Y_pix×A(5) (2)
wherein: E_geo and N_geo respectively represent the longitude and latitude coordinates of a pixel in the SAR image to be processed;
X_pix and Y_pix respectively represent the column and row coordinates of the pixel in the SAR image to be processed;
the geographic positioning information consists of a geolocation array made up of six parameters A(0), A(1), ..., A(5), wherein:
A(0) and A(3) respectively represent the longitude and latitude coordinates of the upper-left corner of the image to be processed;
A(1) and A(5) respectively represent the transverse and longitudinal resolution of the image to be processed;
A(2) and A(4) respectively represent the rotation coefficients of the pixels of the image to be processed in the latitude and longitude directions;
(3) judging whether all SAR images to be processed are overlapped;
judging whether all the SAR images to be processed are overlapped or not according to the longitude and latitude coordinates of each corner point pixel in each SAR image to be processed obtained in the step 2;
if the SAR images are overlapped, recording the numbers corresponding to all the SAR images to be processed with the overlapped SAR images and the overlapped region range between every two SAR images, and generating a number group containing the overlapped region range information for storage;
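A minimal sketch (not from the patent text) of this overlap test using the corner-pixel longitude/latitude coordinates; image sizes and geolocation arrays are placeholders.

```python
import numpy as np

def lonlat_bbox(A, width, height):
    """Longitude/latitude bounding box of an image from its geolocation array A,
    evaluated at the four corner pixels (equations (1) and (2))."""
    xs = np.array([0, width - 1, 0, width - 1], dtype=np.float64)
    ys = np.array([0, 0, height - 1, height - 1], dtype=np.float64)
    lon = A[0] + xs * A[1] + ys * A[2]
    lat = A[3] + xs * A[4] + ys * A[5]
    return lon.min(), lon.max(), lat.min(), lat.max()

def boxes_overlap(b1, b2):
    """True if two (lon_min, lon_max, lat_min, lat_max) boxes intersect;
    every overlapping pair of images is recorded as a numbering group."""
    return not (b1[1] < b2[0] or b2[1] < b1[0] or b1[3] < b2[2] or b2[3] < b1[2])
```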
(4) calculating the matching offset between the corresponding overlapping area ranges of each numbering group;
the SAR image can be registered through the geographic coordinate information extracted in the step (2), but the registration usually causes matching offset due to insufficient accuracy of the geographic coordinate information; therefore, for the area with low peak-to-average ratio, the matching offset between the corresponding overlapping area ranges of each numbering group is calculated by adopting a cross-correlation method;
and for the region with high peak-to-average ratio, calculating the matching offset between the corresponding overlapping region ranges of each number group by adopting a binary matching statistical method based on zero crossing points. The method comprises the following specific steps:
(a) cross-correlation method
As shown in FIG. 2, within the overlapping range corresponding to each number group, either one of the two images to be processed of that group is selected as the template image t(u, v) and the other image to be processed is the search image I(w, h),
wherein: t(u, v) has dimension U × V;
I(w, h) has dimension W × H;
superposing t(u, v) on I(w, h) and performing a translation search; the area of I(w, h) covered by t(u, v) is denoted the sub-image O_mn,
wherein: m and n are the coordinates of the lower-left corner of the template image on I(w, h),
and: 1 ≤ m ≤ W-U, 1 ≤ n ≤ H-V;
calculating the two-dimensional cross-correlation coefficient R(m, n) between O_mn and t(u, v), namely:

R(m, n) = Σ_{u=1..U} Σ_{v=1..V} O_mn(u, v) · t(u, v)

calculating the normalized correlation coefficient ρ(m, n) between O_mn and t(u, v) as the similarity measure of the two, namely:

ρ(m, n) = R(m, n) / sqrt( Σ_{u=1..U} Σ_{v=1..V} O_mn(u, v)² · Σ_{u=1..U} Σ_{v=1..V} t(u, v)² )

after all searches are completed, the sub-image O_{mmax, nmax} corresponding to the maximum value ρ(mmax, nmax) of ρ(m, n) is selected as the matching target;
taking the offset of the matching target relative to the template image in the horizontal and vertical directions as the matching offset between the two images to be processed corresponding to the number group; since the geographic coordinate information (the longitude and latitude coordinates) is already known, the actual search range can be restricted to 1 ≤ i ≤ 5 and 1 ≤ j ≤ 5;
the frequency-domain implementation of this method can be carried out on a GPU to achieve acceleration, as sketched below.
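As an illustration of that frequency-domain shortcut (an assumption about one possible implementation, not the patent's own code), the numerator R(m, n) can be evaluated for all shifts at once with an FFT-based convolution; the per-shift energy normalization still has to be applied afterwards, and a GPU drop-in such as CuPy could replace NumPy/SciPy.

```python
import numpy as np
from scipy.signal import fftconvolve

def cross_correlation_surface(template, search):
    """R(m, n) for every admissible shift: correlation equals convolution with a
    flipped template, so it can be computed in the frequency domain."""
    t = template.astype(np.float64)
    s = search.astype(np.float64)
    return fftconvolve(s, t[::-1, ::-1], mode='valid')
```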
(b) Binary matching statistical method based on zero crossing point
a. Obtaining the contour feature M(x, y) of each pixel of each image to be processed:

M(x, y) = ∇²G(x, y) * I(x, y)

wherein: I(x, y) is the image to be processed, x and y are the horizontal and vertical coordinates of each of its pixels, * denotes convolution, and ∇²G(x, y) is the Laplacian-of-Gaussian operator;
b. Performing binary matching on the contour features of each pixel in each image to be processed, assigning a value of 1 when M (x, y) > 0, and assigning a value of 0 when M (x, y) ≦ 0 to obtain a binary image of each image to be processed;
c. selecting either binary image of the two images to be processed corresponding to each number group as the binary template image Ma; the binary image of the other image to be processed of that group is Mb,
wherein the binary template image Ma lies within the overlapping range corresponding to the number group;
d. superposing Ma on Mb and performing a translation search, the area covered on Mb being the sub-binary image Mc; counting the number S of positions at which Ma and the covered sub-binary image Mc take the same value;
e. after all searches are finished, selecting the sub-binary image corresponding to the maximum value Smax of S as the matching object;
f. taking the offset of the matching object relative to the binary template image in the horizontal and vertical directions as the matching offset between the two images to be processed corresponding to the number group.
(5) Creating all-zero-pixel images and global structure variables
Obtaining the corrected geographic coordinates of all SAR images to be processed according to the longitude and latitude coordinates of each pixel in the SAR images to be processed obtained in the step (2) and the matching offset between the corresponding overlapping area ranges of each numbering group obtained in the step (4), and obtaining the images to be spliced corresponding to each corrected SAR image to be processed;
and creating an all-zero pixel image with the same size as the mosaic image to be finally generated, and establishing a global structure variable.
(6) Performing global color homogenizing and splicing on the overlapped SAR images by adopting a Wallis filter;
Because intensity differences in multiple directions may exist between the spaceborne ground-range SAR images, the transitions across the overlapping regions of the spliced image would otherwise be unnatural, so global color homogenization is required; here a Wallis filter is used to homogenize the overlapping SAR images.
The expression of the Wallis filter is:

Î(i, j) = (I(i, j) - μ_I) · σ_T / σ_I + μ_T

wherein: I and T respectively denote the two images to be spliced corresponding to a numbering group, Î(i, j) is the gray level of image I after color homogenization, μ_T, σ_T and μ_I, σ_I respectively denote the gray-level intensity mean and standard deviation of T and I within the overlapping area, the dimension of I is W × H, 1 ≤ i ≤ W, and 1 ≤ j ≤ H;
The color homogenization and splicing procedure comprises the following steps:
a. setting a splicing color-homogenization flag for every image to be spliced in the global structure variable and initializing all flags to 0;
b. inputting an image to be spliced and, based on the numbering groups, finding all images to be spliced that overlap it and whose color-homogenization flag is 0;
c. assuming the total number of images to be spliced that overlap the currently input image and whose flag is 0 is Num, taking the currently input image as the reference image and applying the Wallis color homogenization above to the Num images in turn; for each homogenization the parameters μ_T, σ_T and μ_I, σ_I are updated, the flag of the homogenized image is set to 1, and the homogenized image is stored;
d. after the Num rounds of Wallis color homogenization, writing the input image to be spliced into the all-zero pixel image created in step 5 and setting its splicing color-homogenization flag to 1;
e. repeating steps b, c and d for all images to be spliced; the final mosaic result, obtained once every image to be spliced has been processed, is shown in FIG. 9.
As FIG. 9 shows, when several SAR images are spliced with this method the seams in the mosaic are radiometrically smooth and the overall color difference is small.
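To make the search-and-update bookkeeping concrete, a heavily simplified Python sketch is given below; the flag handling, overlap table and placement offsets are assumptions for illustration, and homogenize(img, ref) stands for a wrapper around the Wallis color-homogenization step described above.

```python
import numpy as np

def splice(images, offsets, overlaps, canvas_shape, homogenize):
    """Search-and-update splicing sketch: every image carries a flag in a global
    structure; the current input image is the reference for homogenizing the
    overlapping images whose flag is still 0, after which it is written into
    the all-zero canvas and flagged itself."""
    canvas = np.zeros(canvas_shape, dtype=np.float64)       # all-zero pixel image
    state = [{'flag': 0, 'img': im} for im in images]       # global structure variable
    for k, entry in enumerate(state):
        for j in overlaps.get(k, []):                       # numbering groups of image k
            if state[j]['flag'] == 0:                       # color flag still 0
                state[j]['img'] = homogenize(state[j]['img'], entry['img'])
                state[j]['flag'] = 1                        # mark as homogenized
        r, c = offsets[k]                                   # corrected mosaic position
        h, w = entry['img'].shape
        canvas[r:r + h, c:c + w] = entry['img']             # write into the mosaic
        entry['flag'] = 1                                   # splicing flag set to 1
    return canvas
```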
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (7)

1. The large-scale ground range spaceborne SAR image mosaic method comprises a plurality of numbered ground range spaceborne SAR images and is characterized in that: the method comprises the following steps:
step 1, preprocessing an image;
performing parameter correction on each numbered ground range satellite-borne SAR image to obtain a to-be-processed SAR image containing geographic positioning information;
step 2, calculating longitude and latitude coordinates of each pixel in each SAR image to be processed;
step 3, judging whether all SAR images to be processed are overlapped;
judging whether all the SAR images to be processed are overlapped or not according to the longitude and latitude coordinates of each pixel in each SAR image to be processed obtained in the step 2;
if the SAR images are overlapped, recording the numbers corresponding to all the SAR images to be processed with the overlapped SAR images and the overlapped region range between every two SAR images, and generating a number group containing the overlapped region range information for storage;
step 4, calculating the matching offset between the corresponding overlapping area ranges of each numbering group;
for the area with low peak-to-average ratio, calculating the matching offset between the corresponding overlapping area ranges of each numbering group by adopting a cross-correlation method;
for the region with high peak-to-average ratio, calculating the matching offset between the corresponding overlapping region ranges of each numbering group by adopting a binary matching statistical method based on zero crossing points;
step 5, creating all-zero pixel images and global structure variables
Obtaining the corrected geographic coordinates of all SAR images to be processed according to the longitude and latitude coordinates of each pixel in the SAR images to be processed obtained in the step (2) and the matching offset between the corresponding overlapping area ranges of each numbering group obtained in the step (4), and obtaining the images to be spliced corresponding to each corrected SAR image to be processed;
creating an all-zero pixel image with the same size as the finally generated mosaic image, and simultaneously establishing a global structure variable;
step 6, performing color homogenization on the overlapping area range corresponding to each number group obtained in the step 5 by adopting a Wallis filter;
step 7, splicing all images to be spliced after color homogenization;
the splicing method comprises the following steps:
a. setting splicing color homogenizing marks for all images to be spliced in the global structure variable, and resetting to be 0;
b. inputting an image to be spliced, and finding out all images to be spliced which are overlapped with the image to be spliced and have uniform color marks of 0 based on a numbering group;
c. assuming that the total number of the images to be spliced which are overlapped with the currently input image to be spliced and have the uniform color mark of 0 is Num, taking the currently input image to be spliced as a reference image, sequentially performing Wallis color homogenizing on the Num images to be spliced according to the step 6, updating corresponding parameters in a Wallis filter for each color homogenizing, setting the uniform color mark to be 1, and storing the homogenized images;
d. writing the input image to be spliced into the all-zero pixel image created in step 5 after the Num times of Wallis color homogenization, and setting the splicing color homogenizing mark to 1;
e. and (d) repeating the steps b, c and d on all the images to be spliced, and obtaining a final mosaic result after all the images to be spliced are processed.
2. The large-scale spaceborne SAR image mosaic method according to claim 1, characterized in that the parameter correction in step 1 is specifically the following process:
carrying out speckle filtering on each numbered ground distance satellite-borne SAR image to remove speckle noise and obtain an SAR image without speckle noise;
carrying out radiometric calibration on the SAR image without speckle noise to eliminate system radiation error, and expressing the calibrated SAR image by using a backscattering coefficient to obtain the SAR image expressed by the backscattering coefficient;
and performing ellipsoid correction on the SAR image represented by the backscattering coefficient, and writing the geographical positioning information in the auxiliary information into the SAR image to obtain the SAR image to be processed.
3. The large-scale spaceborne SAR image mosaic method according to claim 2, characterized in that, in step 3, the calculation formula of longitude and latitude coordinates of each pixel in each SAR image to be processed is as follows:
E_geo=A(0)+X_pix×A(1)+Y_pix×A(2) (1)
N_geo=A(3)+X_pix×A(4)+Y_pix×A(5) (2)
wherein: E_geo and N_geo respectively represent the longitude and latitude coordinates of a pixel in the SAR image to be processed;
X_pix and Y_pix respectively represent the column and row coordinates of the pixel in the SAR image to be processed;
the geographic positioning information consists of a geolocation array made up of six parameters A(0), A(1), ..., A(5), wherein:
A(0) and A(3) respectively represent the longitude and latitude coordinates of the upper-left corner of the image to be processed;
A(1) and A(5) respectively represent the transverse and longitudinal resolution of the image to be processed;
A(2) and A(4) respectively represent the rotation coefficients of the pixels of the image to be processed in the latitude and longitude directions.
4. The large-scale spaceborne SAR image mosaic method according to claim 3, characterized in that the cross-correlation method in step 4 specifically comprises:
selecting, within the overlapping range corresponding to each numbering group, either one of the two images to be processed of that group as the template image t(u, v), the other image to be processed being the search image I(w, h),
wherein: t(u, v) has dimension U × V;
I(w, h) has dimension W × H;
superposing t(u, v) on I(w, h) and performing a translation search; the area of I(w, h) covered by t(u, v) is denoted the sub-image O_mn,
wherein: m and n are the coordinates of the lower-left corner of the template image on I(w, h),
and: 1 ≤ m ≤ W-U, 1 ≤ n ≤ H-V;
calculating the two-dimensional cross-correlation coefficient R(m, n) between O_mn and t(u, v), namely:

R(m, n) = Σ_{u=1..U} Σ_{v=1..V} O_mn(u, v) · t(u, v)

calculating the normalized correlation coefficient ρ(m, n) between O_mn and t(u, v) as the similarity measure of the two, namely:

ρ(m, n) = R(m, n) / sqrt( Σ_{u=1..U} Σ_{v=1..V} O_mn(u, v)² · Σ_{u=1..U} Σ_{v=1..V} t(u, v)² )

after all searches are completed, the sub-image O_{mmax, nmax} corresponding to the maximum value ρ(mmax, nmax) of ρ(m, n) is selected as the matching target;
and taking the offset of the matching target relative to the template image in the horizontal and vertical directions as the matching offset between the two images to be processed corresponding to the number group.
5. The large-scale range space-borne SAR image mosaic method according to claim 4, characterized in that the binary matching statistical method based on zero crossing points in step 4 is specifically:
a. obtaining the contour feature M(x, y) of each pixel of each image to be processed:

M(x, y) = ∇²G(x, y) * I(x, y)

wherein: I(x, y) is the image to be processed, x and y are the horizontal and vertical coordinates of each of its pixels, * denotes convolution, and ∇²G(x, y) is the Laplacian-of-Gaussian operator;
b. performing binary matching on the contour features of each pixel in each image to be processed, assigning a value of 1 when M (x, y) > 0, and assigning a value of 0 when M (x, y) ≦ 0 to obtain a binary image of each image to be processed;
c. selecting one of any two-valued image from two images to be processed corresponding to each number group as a two-valued template image Ma, wherein the other image to be processed corresponding to the number group is Mb; the binary template image Ma is located in the range of the overlapping area corresponding to the number group;
d. superposing Ma on Mb and performing a translation search, the area covered on Mb being the sub-binary image Mc; counting the number S of positions at which Ma and the covered sub-binary image Mc take the same value;
e. after all searches are finished, selecting a sub-binary image corresponding to the maximum value Smax of S as a matching object;
f. and taking the offset of the matching object relative to the binary template image in the horizontal and vertical directions as the matching offset between the two images to be processed corresponding to the number group.
6. The large-scale ground-range spaceborne SAR image mosaic method according to claim 5, characterized in that the expression of the Wallis filter is:

Î(i, j) = (I(i, j) - μ_I) · σ_T / σ_I + μ_T

wherein: I and T respectively denote the two images to be spliced corresponding to a numbering group, Î(i, j) is the gray level of image I after color homogenization, μ_T, σ_T and μ_I, σ_I respectively denote the gray-level intensity mean and standard deviation of T and I within the overlapping area, the dimension of I is W × H, 1 ≤ i ≤ W, and 1 ≤ j ≤ H.
7. The large-scale ground-range spaceborne SAR image mosaic method according to claim 6, characterized in that the corresponding parameters updated in step 7 are specifically μ_T, σ_T and μ_I, σ_I.
CN201810299312.5A 2018-03-30 2018-03-30 Mosaic method of large-scale ground-distance spaceborne SAR images Active CN108564532B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810299312.5A CN108564532B (en) 2018-03-30 2018-03-30 Mosaic method of large-scale ground-distance spaceborne SAR images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810299312.5A CN108564532B (en) 2018-03-30 2018-03-30 Mosaic method of large-scale ground-distance spaceborne SAR images

Publications (2)

Publication Number Publication Date
CN108564532A CN108564532A (en) 2018-09-21
CN108564532B (en) 2022-02-15

Family

ID=63534102

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810299312.5A Active CN108564532B (en) 2018-03-30 2018-03-30 Mosaic method of large-scale ground-distance spaceborne SAR images

Country Status (1)

Country Link
CN (1) CN108564532B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111028152B (en) * 2019-12-02 2023-05-05 哈尔滨工程大学 A method for super-resolution reconstruction of sonar images based on terrain matching
CN111738929B (en) * 2020-05-08 2022-08-30 中国科学院空天信息创新研究院 Image processing method and device, electronic equipment and storage medium
CN113096043B (en) * 2021-04-09 2023-02-17 杭州睿胜软件有限公司 Image processing method and device, electronic device and storage medium
CN116503274B (en) * 2023-04-07 2023-12-22 中山大学 Image color homogenizing method and device based on image overlapping area
CN117541469B (en) * 2024-01-10 2024-05-10 中山大学 SAR image stitching method and device based on graph theory

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761709A (en) * 2014-01-07 2014-04-30 西安电子科技大学 Parallel real-time SAR image spot and noise reducing method based on multiple DSPs
CN106127683A (en) * 2016-06-08 2016-11-16 中国电子科技集团公司第三十八研究所 A kind of real-time joining method of unmanned aerial vehicle SAR image

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8401330B2 (en) * 2009-10-09 2013-03-19 At&T Intellectual Property I, L.P. No-reference spatial aliasing measure for digital image resizing

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761709A (en) * 2014-01-07 2014-04-30 西安电子科技大学 Parallel real-time SAR image spot and noise reducing method based on multiple DSPs
CN106127683A (en) * 2016-06-08 2016-11-16 中国电子科技集团公司第三十八研究所 A kind of real-time joining method of unmanned aerial vehicle SAR image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Synthesis of Arctic sea ice SAR images using adaptive block mosaicking; Lang Wenhui et al.; Chinese Journal of Scientific Instrument (仪器仪表学报); 2014-09-30; Vol. 35, No. 9; full text *

Also Published As

Publication number Publication date
CN108564532A (en) 2018-09-21

Similar Documents

Publication Publication Date Title
CN108564532B (en) Mosaic method of large-scale ground-distance spaceborne SAR images
AU2017100064A4 (en) Image Stitching
US8078004B2 (en) Geometric registration of images by similarity transformation using two reference points
Li et al. Rigorous photogrammetric processing of HiRISE stereo imagery for Mars topographic mapping
CN111144350B (en) Remote sensing image positioning accuracy evaluation method based on reference base map
CN109523585B (en) A feature matching method for multi-source remote sensing images based on directional phase consistency
GB2557398A (en) Method and system for creating images
Shang et al. Automatic overlapping area determination and segmentation for multiple side scan sonar images mosaic
CN105139375A (en) Satellite image cloud detection method combined with global DEM and stereo vision
Oh et al. Automated RPCs bias compensation for KOMPSAT imagery using orthoimage GCP chips in Korea
KR101677230B1 (en) Apparatus and method for estimating quality of heterogenic image fusion algorithm
CN114241022B (en) Unmanned aerial vehicle image automatic registration method and system
CN109493298B (en) A kind of airborne sweep type high-spectral data fast geometric bearing calibration
CN113280789B (en) Method for taking laser height measurement points of relief area as image elevation control points
Sohn et al. Rational function model‐based image matching for digital elevation models
CN109886988A (en) Method, system, device and medium for measuring positioning error of microwave imager
CN113298713A (en) On-orbit rapid registration method capable of resisting cloud interference
CN114155287B (en) Laser altimetry data and high-resolution stereo image registration method and device
CN119052662B (en) A method and device for acquiring equivalent low-orbit satellite remote sensing images
CN116521927B (en) Remote sensing image matching method and system based on network map tiles
CN119130800B (en) A method for stitching UAV hyperspectral images based on shallow water bottom feature matching
CN118691510B (en) A method, device and medium for repairing holes in SAR images
CN118691515B (en) SAR image correction method, device, equipment and medium
CN111127525A (en) Incremental farmland boundary precision calibration method and device with constraint point set registration
Kim Eliminating extrapolation using point distribution criteria in scattered data interpolation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant