Article

Thorough Understanding and 3D Super-Resolution Imaging for Forward-Looking Missile-Borne SAR via a Maneuvering Trajectory

by Tong Gu 1, Yifan Guo 2,*, Chen Zhao 3, Jian Zhang 4, Tao Zhang 5 and Guisheng Liao 1

1 Academy of Advanced Interdisciplinary Research, Xidian University, Xi’an 710071, China
2 School of Marine Science and Technology, Northwestern Polytechnical University, Xi’an 710071, China
3 State Key Laboratory of Millimeter Wave, Beijing Institute of Remote Sensing Equipment, Beijing 100080, China
4 Tianjin Jinhang Institute of Technical Physics, Tianjin 300300, China
5 AVIC LEIHUA Electronic Technology Research Institute, Wuxi 214063, China
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(18), 3378; https://doi.org/10.3390/rs16183378
Submission received: 31 July 2024 / Revised: 29 August 2024 / Accepted: 6 September 2024 / Published: 11 September 2024
(This article belongs to the Special Issue Array and Signal Processing for Radar)

Abstract

For missile-borne platforms, traditional SAR technology consistently encounters two significant shortcomings: the geometric distortion of 2D images and the inability to achieve forward-looking imaging. To address these issues, this paper explores the feasibility of using a maneuvering trajectory to enable forward-looking and three-dimensional imaging by analyzing the maneuvering characteristics of an actual missile-borne platform. It also derives the corresponding resolution characterization model, which lays a theoretical foundation for future applications. Building on this, the paper proposes a three-dimensional super-resolution imaging algorithm that combines axis rotation with compressed sensing. The axis rotation not only reduces the data dimensionality but also expands the observation scene in the cross-track dimension. The proposed algorithm first focuses the track-vertical plane to extract 2D position parameters. Then, a compressed sensing-based process is applied to extract the reflection coefficients and super-resolution cross-track position parameters, thereby achieving precise 3D reconstruction. Finally, numerical simulation results confirm the effectiveness and accuracy of the proposed algorithm.

1. Introduction

Thanks to its ability to provide high-resolution microwave imagery of the observed area regardless of weather conditions, synthetic aperture radar (SAR) [1,2,3,4] has become one of the most attractive radar techniques. Meanwhile, with the rapid development of electronic technology and the miniaturization of components in recent years, missile-borne SAR has become feasible. However, unlike traditional side-looking SAR, missile-borne SAR [5,6,7] often requires the antenna to operate at a large squint angle in order to detect targets at longer ranges and ahead of the platform. Various algorithms have been proposed to meet these requirements, such as time-domain algorithms [8,9] (the back-projection algorithm (BPA), the fast factorized back-projection algorithm (FFBPA), etc.) and frequency-domain algorithms [10,11,12,13,14,15,16,17,18] (the Range–Doppler algorithm (RDA), the Chirp Scaling algorithm, the Nonlinear Chirp Scaling algorithm (NCSA), the Frequency Scaling algorithm (FSA), etc.). Nevertheless, both the geometric models and the imaging algorithms of these missile-borne SAR systems perform 2D imaging along a traditional linear trajectory, which has two limitations: 3D objects are geometrically distorted in the 2D images, and even a large squint angle cannot observe the object directly ahead. To solve these issues, the concept of 3D forward-looking imaging has gained the attention of scholars.
To achieve 3D forward-looking imaging detection, the primary task is to solve the issue of the missing aperture compared with traditional SAR. To date, there are two main ways to complete the aperture. The first is multi-channel interferometry [19,20,21,22,23,24,25,26], such as TomoSAR, HoloSAR and linear array SAR, which places no requirements on the flight trajectory of the platform but needs a large-aperture antenna array; due to the limited volume of a missile platform, this approach is difficult to apply. The second is trajectory deviation [27,28,29,30,31,32,33], such as CSAR and CLSAR, whose principle is to form the third-dimensional aperture through the trajectory itself, thereby enabling 3D object detection. From a technical point of view, this is more suitable for a missile platform, but the trajectory deviations in these approaches are confined to a plane and do not involve the analysis of 3D deviations like those of a missile platform. Following the second technical idea and combining it with an actual ballistic study, a novel 3D forward-looking super-resolution imaging method for missile-borne SAR via a maneuvering trajectory is proposed. We first validate the feasibility of using a maneuvering trajectory for 3D forward-looking imaging through multiple actual ballistic data, and then derive the 3D resolution, including an analysis of the influence of different parameters on it. Based on this, an effective 3D forward-looking imaging algorithm is proposed, which consists of two steps: axis rotation and super-resolution extraction through compressed sensing [34,35,36,37,38,39,40]. The axis rotation widens the observation scenario, allowing the position parameters of the vertical and track dimensions to be extracted. Then, compressed sensing is applied to extract the reflection coefficients and cross-track positions. Finally, the 3D image is reconstructed effectively.
This paper is organized as follows: Section 2 assesses the feasibility of using a maneuvering trajectory for the 3D forward-looking approach through multiple actual ballistics and then analyzes its 3D resolution in detail; Section 3 describes a 3D forward-looking super-resolution imaging algorithm combining axis rotation and compressed sensing; Section 4 further tests the proposed algorithm through several simulation experiments. Finally, Section 5 summarizes the whole paper.

2. The Understanding of Maneuvering Trajectory for Three-Dimensional and Forward-Looking Missile-Borne SAR

As illustrated in Figure 1, we simulate multiple actual ballistic trajectories (note that the data source cannot be disclosed). Each trajectory exhibits a significant deviation arc caused by ballistic maneuvers, and these maneuvering segments are typically underutilized during the guidance process. Given this, we explore the potential value of these trajectories. From Figure 1b,c, it is evident that the maneuvering trajectories include pull deviations in the X (cross-track), Y (track), and Z (vertical) dimensions. From a detection standpoint, this implies three-dimensional degrees of freedom/apertures, which could enable three-dimensional target detection. To further clarify this, we also provide the aperture sizes of the maneuvering trajectories in the standard coordinate system, as shown in Table 1. It is clear from these data that utilizing these maneuvering trajectories for three-dimensional imaging detection is indeed feasible.
Using the parameters in Table 1, we analyze how the view angle toward the target changes along the maneuvering trajectory segment. Typically, an angle within plus or minus 5 degrees is considered to fall within the forward-looking category. As shown in Figure 2, the overall angle change caused by the maneuvering trajectory remains within 3 degrees, with a slight reduction in the later stages as the trajectory maneuvers back. In summary, it can be concluded that this maneuvering trajectory enables both forward-looking and 3D imaging simultaneously, offering a distinct advantage over the traditional large-squint SAR mode.
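For readers who want to reproduce this kind of check, the sketch below is a minimal example. It assumes a hypothetical trajectory array and reads the view angle as the angle between the platform velocity vector and the line of sight to a target at the origin; the trajectory values and this angle definition are illustrative assumptions, not the (undisclosed) ballistic data used above.

```python
import numpy as np

def view_angles_deg(traj, target=np.zeros(3)):
    """Angle (deg) between the platform velocity vector and the line of sight
    to the target, evaluated at every sample of the trajectory."""
    vel = np.gradient(traj, axis=0)                 # finite-difference velocity
    los = target[None, :] - traj                    # platform -> target vectors
    cos_a = np.sum(vel * los, axis=1) / (
        np.linalg.norm(vel, axis=1) * np.linalg.norm(los, axis=1))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# hypothetical maneuvering segment: approach along -Y with a lateral pull in X
# and a dive in Z; the target sits at the origin, well beyond the segment end
t = np.linspace(0.0, 1.0, 200)[:, None]
traj = np.hstack([1000.0 * np.sin(np.pi * t),       # cross-track pull (m)
                  100e3 - 60e3 * t,                 # along-track approach (m)
                  22e3 - 15e3 * t])                 # vertical descent (m)

angles = view_angles_deg(traj)
print(angles.min(), angles.max())   # compare against the +/-5 degree criterion
```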
After completing the feasibility analysis of three-dimensional forward-looking imaging, another crucial factor to consider is the resolution capability afforded by the maneuvering trajectory, as this directly determines its practical application potential. For simplicity, we assume the target is located at the origin, and the position coordinates of the missile platform at the $i$-th slow time are $\left(x_i, y_i, z_i\right)$. The pitch and azimuth angles of the missile platform relative to the target are denoted as $\varphi_{\tau i}$ and $\theta_{\tau i}$, respectively. Thus, the vector between the target and the platform can be expressed as
$$\mathbf{r}\left(\varphi_{\tau i}, \theta_{\tau i}\right) = r_i \left[\cos\varphi_{\tau i}\cos\theta_{\tau i},\ \cos\varphi_{\tau i}\sin\theta_{\tau i},\ \sin\varphi_{\tau i}\right]^{T} \quad (1)$$
in which $r_i = \sqrt{x_i^2 + y_i^2 + z_i^2}$. Assuming that the radar system transmits broadband linear frequency modulation signals and processes them with dechirping, the echo signal can be written as
$$S\left(\tau_i, \theta_i, \varphi_i\right) = \mathrm{rect}\left(\frac{t - 2 r_i / c}{T_p}\right) \exp\left(j k r_i\right) \quad (2)$$
where $k = 4\pi\left(f_c + \gamma t\right)/c$ ($f_c$ is the center frequency, $\gamma$ is the frequency modulation rate and $c$ is the speed of light). Define $\Omega\left(\tau_i, \varphi_i, \theta_i\right) = k r_i$ as the phase history of the signal echo propagation; then, its spatial frequencies in the standard wavenumber spectrum geometry kx-ky-kz, as shown in Figure 3, are as follows:
$$k_x = \frac{\partial \Omega}{\partial x} = k \cos\varphi_{\tau i}\cos\theta_{\tau i}, \quad k_y = \frac{\partial \Omega}{\partial y} = k \cos\varphi_{\tau i}\sin\theta_{\tau i}, \quad k_z = \frac{\partial \Omega}{\partial z} = k \sin\varphi_{\tau i} \quad (3)$$
In theory, its 3D resolution corresponds to the wavenumber spectrum widths projected on the kx, ky and kz axes, that is,
$$\delta_{k_x} = \frac{c/2}{\max\left(k_x\right) - \min\left(k_x\right)}, \quad \delta_{k_y} = \frac{c/2}{\max\left(k_y\right) - \min\left(k_y\right)}, \quad \delta_{k_z} = \frac{c/2}{\max\left(k_z\right) - \min\left(k_z\right)} \quad (4)$$
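As a concrete numerical illustration of Formulas (3) and (4), the sketch below samples the transmitted band together with a hypothetical azimuth/pitch history and estimates the axis resolutions from the spectral extents. All values are illustrative and not taken from Table 1, and the common 4π/c wavenumber scaling is omitted so that the c/2-over-width expression of Formula (4) returns metres directly.

```python
import numpy as np

c = 3e8
fc, B = 40e9, 400e6                     # illustrative carrier / bandwidth (Hz)
f = np.linspace(fc, fc + B, 64)         # transmitted frequency samples

# hypothetical azimuth/pitch histories over the maneuvering segment (rad)
theta = np.radians(np.linspace(30.0, 33.0, 200))
phi   = np.radians(np.linspace(50.0, 52.0, 200))

# Formula (3), up to the common 4*pi/c scaling, on a (pulse, frequency) grid
kx = np.outer(np.cos(phi) * np.cos(theta), f)
ky = np.outer(np.cos(phi) * np.sin(theta), f)
kz = np.outer(np.sin(phi), f)

# Formula (4): resolution from the spectral support along each axis (metres)
delta = {name: (c / 2) / (v.max() - v.min())
         for name, v in dict(kx=kx, ky=ky, kz=kz).items()}
print(delta)
```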
However, it is relatively difficult to solve Formula (4) directly due to the irregular, asymmetric and non-straight maneuvering trajectory. Therefore, we give an alternative, Kh-Kv-Ku, as shown in Figure 3, which is established based on the maneuvering trajectory. The advantage of this coordinate system design is that it allows the resolution to be calculated in detail from the actual ballistic trajectory. Here, we assume that the azimuth and pitch angles at the initial time of the maneuvering trajectory are $\theta_{\tau 0}$ and $\varphi_{\tau 0}$. The Kh-Kv-Ku coordinate system is then obtained by rotating by $\theta_{\tau 0}$ about the kz axis and by $\pi/2 - \varphi_{\tau 0}$ about the kx axis, respectively. The red grid area denotes the 3D wavenumber spectrum, which begins at the initial time of the maneuvering trajectory and ends at the end of the maneuvering trajectory on the horizontal plane. Its width is determined by the bandwidth $B = \gamma T_p$ of the transmitted signal. Based on this coordinate system design, we define the range dimension by the line of sight at the initial time, and the normalized Ku axis can be written as ($\Lambda$ denotes the normalization operation)
$$K_u^{\Lambda} = \left[\cos\varphi_{\tau 0}\cos\theta_{\tau 0},\ \cos\varphi_{\tau 0}\sin\theta_{\tau 0},\ \sin\varphi_{\tau 0}\right] \quad (5)$$
At this moment, the projection of the wavenumber spectrum along the Ku axis can be easily calculated, as follows:
$$B_{K_u} = \max\left\{\frac{4\pi\left(f_c + B\right)}{c}\, K_u^{\Lambda}\begin{bmatrix}\cos\varphi_{\tau k}\cos\theta_{\tau k}\\ \cos\varphi_{\tau k}\sin\theta_{\tau k}\\ \sin\varphi_{\tau k}\end{bmatrix}\right\} - \min\left\{\frac{4\pi f_c}{c}\, K_u^{\Lambda}\begin{bmatrix}\cos\varphi_{\tau l}\cos\theta_{\tau l}\\ \cos\varphi_{\tau l}\sin\theta_{\tau l}\\ \sin\varphi_{\tau l}\end{bmatrix}\right\} \quad (6)$$
where $k$ and $l$ ($k, l = 1, 2, \ldots, N$) index the $k$-th/$l$-th sampling positions of the maneuvering trajectory. From Formula (6), the value of $B_{K_u}$ can be inferred in advance once the maneuvering trajectory is determined. Then, the resolution can be written as
$$\delta_{K_u} = \frac{c}{2 B_{K_u}} \quad (7)$$
Furthermore, Figure 4 gives the projection of the wavenumber spectrum along the Kh and Kv axes, and the normalized Kh axis can be defined as
$$K_h^{\Lambda} = \left[-\sin\theta_{\tau 0},\ \cos\theta_{\tau 0},\ 0\right] \quad (8)$$
As shown in Figure 4a, we first project the wavenumber spectrum onto the kx-ky plane, where its values range over $\left[4\pi f_c \cos\varphi_{\tau i}/c,\ 4\pi\left(f_c + B\right)\cos\varphi_{\tau j}/c\right]$, $i, j = 1, 2, \ldots, N$, and then project it vertically onto the Kh axis. The bandwidth on the Kh axis can therefore be written as
$$B_{K_h} = \max\left\{\frac{4\pi\left(f_c + B\right)}{c}\cos\varphi_{\tau k}\sin\left(\theta_{\tau k} - \theta_0\right)\right\} - \min\left\{\frac{4\pi f_c}{c}\cos\varphi_{\tau l}\sin\left(\theta_{\tau l} - \theta_0\right)\right\} \quad (9)$$
where $\theta_0$ is the azimuth angle at the initial time. Similarly, Figure 4b gives the projection onto the Kv axis, whose bandwidth can be written as
$$B_{K_v} = \max\left\{\frac{4\pi\left(f_c + B\right)}{c}\sqrt{1 - \cos^2\varphi_{\tau k}\sin^2\left(\theta_{\tau k} - \theta_0\right)}\,\sin\left(\varphi_0 - \varphi_{new,k}\right)\right\} - \min\left\{\frac{4\pi f_c}{c}\sqrt{1 - \cos^2\varphi_{\tau l}\sin^2\left(\theta_{\tau l} - \theta_0\right)}\,\sin\left(\varphi_0 - \varphi_{new,l}\right)\right\} \quad (10)$$
where $\varphi_0$ is the pitch angle at the initial time and $\varphi_{new}$ is defined as
$$\varphi_{new} = \arcsin\left(\frac{\sin\varphi_{\tau}}{\sqrt{1 - \cos^2\varphi_{\tau}\sin^2\left(\theta_{\tau} - \theta_0\right)}}\right) \quad (11)$$
Based on the above analysis, we can determine the 3D resolution of the rotated Ku-Kh-Kv coordinate system. This resolution can also be used to calculate the 3D resolution in the standard coordinate system, as follows:
$$\begin{bmatrix} B_{k_x} \\ B_{k_y} \\ B_{k_z} \end{bmatrix} = \begin{bmatrix} \cos\varphi_0\cos\theta_0 & \sin\theta_0 & \sin\varphi_0\cos\theta_0 \\ \cos\varphi_0\sin\theta_0 & -\cos\theta_0 & \sin\varphi_0\sin\theta_0 \\ \sin\varphi_0 & 0 & -\cos\varphi_0 \end{bmatrix} \begin{bmatrix} B_{K_u} \\ B_{K_h} \\ B_{K_v} \end{bmatrix} \quad (12)$$
and
$$\begin{aligned}
\delta_{k_x} &\approx \frac{c}{2 B_{k_x}} = \frac{c}{2\left(B_{K_u}\cos\varphi_0\cos\theta_0 + B_{K_v}\sin\varphi_0\cos\theta_0 + B_{K_h}\sin\theta_0\right)} \\
\delta_{k_y} &\approx \frac{c}{2 B_{k_y}} = \frac{c}{2\left(B_{K_u}\cos\varphi_0\sin\theta_0 + B_{K_v}\sin\varphi_0\sin\theta_0 - B_{K_h}\cos\theta_0\right)} \\
\delta_{k_z} &\approx \frac{c}{2 B_{k_z}} = \frac{c}{2\left(B_{K_u}\sin\varphi_0 - B_{K_v}\cos\varphi_0\right)}
\end{aligned} \quad (13)$$
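A compact sketch of the whole chain of Formulas (6)-(13) is given below for a hypothetical angle history. The helper name resolutions_from_trajectory and all numerical values are our own illustrative choices; the 4π/c wavenumber scaling is again omitted so that c/(2B) returns metres, and the absolute value in the last step simply guards against the sign convention of the reconstructed rotation matrix.

```python
import numpy as np

c = 3e8

def resolutions_from_trajectory(fc, B, theta, phi):
    """Estimate (delta_kx, delta_ky, delta_kz) following Formulas (6)-(13).
    theta, phi: azimuth/pitch histories (rad) over the maneuvering segment."""
    th0, ph0 = theta[0], phi[0]
    fmax, fmin = fc + B, fc

    # Formula (6): projection onto the Ku (initial line-of-sight) axis
    u0 = np.array([np.cos(ph0) * np.cos(th0),
                   np.cos(ph0) * np.sin(th0),
                   np.sin(ph0)])
    los = np.stack([np.cos(phi) * np.cos(theta),
                    np.cos(phi) * np.sin(theta),
                    np.sin(phi)], axis=1)
    proj = los @ u0
    B_Ku = (fmax * proj).max() - (fmin * proj).min()

    # Formula (9): projection onto the Kh axis
    h = np.cos(phi) * np.sin(theta - th0)
    B_Kh = (fmax * h).max() - (fmin * h).min()

    # Formulas (10)-(11): projection onto the Kv axis
    rho = np.sqrt(1.0 - np.cos(phi) ** 2 * np.sin(theta - th0) ** 2)
    phi_new = np.arcsin(np.sin(phi) / rho)
    v = rho * np.sin(ph0 - phi_new)
    B_Kv = (fmax * v).max() - (fmin * v).min()

    # Formulas (12)-(13): map back to the standard kx-ky-kz frame
    B_kx = B_Ku * np.cos(ph0) * np.cos(th0) + B_Kv * np.sin(ph0) * np.cos(th0) + B_Kh * np.sin(th0)
    B_ky = B_Ku * np.cos(ph0) * np.sin(th0) + B_Kv * np.sin(ph0) * np.sin(th0) - B_Kh * np.cos(th0)
    B_kz = B_Ku * np.sin(ph0) - B_Kv * np.cos(ph0)
    return tuple(c / (2.0 * abs(b)) for b in (B_kx, B_ky, B_kz))

# illustrative usage with a hypothetical angle history
theta = np.radians(np.linspace(30.0, 33.0, 200))
phi   = np.radians(np.linspace(50.0, 52.0, 200))
print(resolutions_from_trajectory(40e9, 400e6, theta, phi))
```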
To validate the accuracy of the 3D resolution derived above, we use the simulation parameters listed in Table 1. Figure 5 illustrates the 3D envelope at the scenario center, demonstrating the feasibility of 3D imaging using the maneuvering trajectory. For this simulation, we selected maneuvering trajectory 2, with a center frequency of 40 GHz and a transmitting bandwidth of 400 MHz. A comparison between the estimated and actual values of the 3D resolution is presented in Table 2, showing that the estimated values align closely with the true values. However, since the 3D resolution is fundamentally a multi-dimensional function influenced by factors such as center frequency, transmitting bandwidth, pitch angle, and azimuth angle, we further investigate the impacts of these parameters. Here, we define the ranges of center frequency, bandwidth, pitch angle, and azimuth angle as 20–40 GHz, 200–800 MHz, [50°, 80°], and [30°, 60°], respectively (note that while the angle ranges are broad, only a small portion of each range applies in practice). Figure 6 illustrates the variation in 3D resolution. It can be observed that the 3D resolution is significantly influenced by bandwidth, improving as both the bandwidth and the center frequency increase. From Figure 6a,d,g,j, we see that the resolution in the kz dimension is less affected by azimuth angle, though its upper bound increases with a rising pitch angle. In the kx dimension, as shown in Figure 6b,e,h,k, the upper bound of the resolution increases rapidly with the azimuth angle. Finally, Figure 6c,f,i,l indicate that the upper bound of the resolution in the ky dimension is inversely proportional to the azimuth angle but directly proportional to the pitch angle.
In summary, we have demonstrated the feasibility of 3D forward-looking imaging using the maneuvering trajectory and conducted a detailed analysis of its resolution and influencing factors. The subsequent section will focus on the image processing methodology.

3. Three-Dimensional Super-Resolution Imaging Combining Axis Rotation and Compressed Sensing

Before proceeding with imaging processing, it is essential to first analyze the representation of the echo signal. Assuming the 3D scenario is discretized into uniform 3D grids $\Omega_{M \times N \times L}$ and a linear frequency modulation signal is transmitted, the received echo signal in Formula (2) can be rewritten as:
$$\begin{aligned}
S\left(\tau_i, \theta_i, \varphi_i\right) &= \sum_{\Omega} \delta_{\Omega}\, \mathrm{rect}\left(\frac{t - 2 r_i / c}{T_p}\right) \exp\left(j k r_i\right) \\
&= \sum_{\Omega} \delta_{\Omega}\, \mathrm{rect}\left(\frac{t - 2 r_i / c}{T_p}\right) \exp\left(j k \left(x_{\Omega}\cos\varphi_{\tau i}\cos\theta_{\tau i} + y_{\Omega}\cos\varphi_{\tau i}\sin\theta_{\tau i} + z_{\Omega}\sin\varphi_{\tau i}\right)\right)
\end{aligned} \quad (14)$$
where $\delta_{\Omega}$ denotes the reflection coefficient of grid cell $\Omega$. To obtain a more intuitive understanding of Formula (14), we rewrite it in matrix form, as follows:
Define:
$$\begin{aligned}
\alpha_i &= \left[\exp\left(j k x_1 \cos\varphi_{\tau i}\cos\theta_{\tau i}\right),\ \exp\left(j k x_2 \cos\varphi_{\tau i}\cos\theta_{\tau i}\right),\ \ldots,\ \exp\left(j k x_M \cos\varphi_{\tau i}\cos\theta_{\tau i}\right)\right] \\
\beta_i &= \left[\exp\left(j k y_1 \cos\varphi_{\tau i}\sin\theta_{\tau i}\right),\ \exp\left(j k y_2 \cos\varphi_{\tau i}\sin\theta_{\tau i}\right),\ \ldots,\ \exp\left(j k y_N \cos\varphi_{\tau i}\sin\theta_{\tau i}\right)\right] \\
\chi_i &= \left[\exp\left(j k z_1 \sin\varphi_{\tau i}\right),\ \exp\left(j k z_2 \sin\varphi_{\tau i}\right),\ \ldots,\ \exp\left(j k z_L \sin\varphi_{\tau i}\right)\right] \\
\Omega_{\delta} &= \left[\delta\left(x_1, y_1, z_1\right),\ \delta\left(x_2, y_1, z_1\right),\ \ldots,\ \delta\left(x_M, y_1, z_1\right),\ \ldots,\ \delta\left(x_1, y_N, z_L\right),\ \ldots,\ \delta\left(x_M, y_N, z_L\right)\right]^{T} \\
\mathbf{S} &= \mathbf{A}_{K \times MNL}\, \Omega_{\delta} = \begin{bmatrix} \alpha_1 \otimes \beta_1 \otimes \chi_1 \\ \alpha_2 \otimes \beta_2 \otimes \chi_2 \\ \vdots \\ \alpha_K \otimes \beta_K \otimes \chi_K \end{bmatrix} \Omega_{\delta}
\end{aligned} \quad (15)$$
where $\otimes$ denotes the Kronecker product [41]. There are two ways to solve Formula (15). One is the time-domain approach, such as BP, FBP or FFBP, which performs imaging through point-by-point compensation; however, it is time-consuming and inefficient. The other is to solve a linear programming problem efficiently; however, the computational complexity of this method is primarily determined by the sampling ratio of $\Omega$, which sets the size of the measurement matrix $\mathbf{A}$. If $\mathbf{A}$ is too large, even traditional optimization toolboxes become inefficient. So, can we combine these two approaches to reduce the size of $\mathbf{A}$ while also minimizing the 3D compensation required by the time-domain algorithm?
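Before turning to that question, the following sketch shows how the Kronecker-structured rows of A in Formula (15) can be assembled for a toy scene grid. The grid sizes, angle history, and the function name measurement_matrix are illustrative assumptions; the scene-vector ordering simply has to match the chosen Kronecker ordering.

```python
import numpy as np

def measurement_matrix(k, theta, phi, xg, yg, zg):
    """Build A of Formula (15): each slow-time sample contributes one row
    alpha_i (x) beta_i (x) chi_i over the M*N*L scene grid."""
    rows = []
    for ki, th, ph in zip(k, theta, phi):
        alpha = np.exp(1j * ki * xg * np.cos(ph) * np.cos(th))
        beta  = np.exp(1j * ki * yg * np.cos(ph) * np.sin(th))
        chi   = np.exp(1j * ki * zg * np.sin(ph))
        rows.append(np.kron(np.kron(alpha, beta), chi))
    return np.vstack(rows)

# toy example: K = 5 samples, 4 x 3 x 2 scene grid
c = 3e8
k     = 4 * np.pi * np.linspace(40e9, 40.4e9, 5) / c
theta = np.radians(np.linspace(30, 31, 5))
phi   = np.radians(np.linspace(50, 51, 5))
xg, yg, zg = np.arange(4) * 10.0, np.arange(3) * 1.0, np.arange(2) * 1.0

A = measurement_matrix(k, theta, phi, xg, yg, zg)
scene = np.zeros(4 * 3 * 2); scene[7] = 1.0   # one unit scatterer
# note: the scene-vector ordering must match the Kronecker ordering (x slowest)
echo = A @ scene                              # S = A * Omega_delta
print(A.shape, echo.shape)                    # (5, 24) (5,)
```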
To address this, we start by examining the wavenumber spectrum and its corresponding resolution. Figure 5 shows that the resolutions along the ky and kz axes are at the sub-meter level, while that along the kx axis is on the order of ten meters. This discrepancy is also evident from the wavenumber spectrum projection results in Figure 7. Based on these characteristics, we propose an efficient super-resolution imaging algorithm that uses FFBP to image the ky-kz plane and compressed sensing to achieve super-resolution reconstruction along the kx axis. However, this algorithm still faces a significant issue: the imaging area range along the kx axis is equal to its theoretical resolution (i.e., the imaging area range is constrained by the kx-axis resolution, and scatters within one kx-axis resolution unit are focused at the same position in the ky-kz plane). Consequently, once the spacing of any two scatters exceeds the kx-axis resolution, false scatters may appear in the ky-kz imaging plane. So, what can be done to alleviate this issue? In other words, how can the kx-axis resolution cell be enlarged?
To solve this, we further analyze the wavenumber spectrum projection results. Upon closer inspection of Figure 7b, it becomes apparent that the projection bandwidth along the kx axis is not minimal under the current kx-ky axis definition. Therefore, we propose rotating the coordinate system to reduce the kx-axis resolution and broaden the range of the observation area. As illustrated in the inset of Figure 7b, we define new kx′-ky′ axes by rotating the ky axis to align with the initial azimuth position (i.e., by rotating by $\theta_0$ about the kz axis). The projection bandwidths in the new coordinate system can then be expressed as follows:
$$\begin{bmatrix} B_{k_x^{\prime}} \\ B_{k_y^{\prime}} \\ B_{k_z} \end{bmatrix} = \begin{bmatrix} 0 & 1 & 0 \\ \cos\varphi_0 & 0 & \sin\varphi_0 \\ \sin\varphi_0 & 0 & -\cos\varphi_0 \end{bmatrix} \begin{bmatrix} B_{K_u} \\ B_{K_h} \\ B_{K_v} \end{bmatrix} \quad (16)$$
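A minimal sketch of Formula (16) is given below. The input bandwidths are illustrative placeholders (for instance, outputs of the earlier resolution sketch, kept in Hz-equivalent units), and the point is only that the rotated cross-track bandwidth collapses to B_Kh, which is usually much smaller than B_kx of Formula (12), so the kx′ resolution cell, and hence the unambiguous scene extent, becomes wider.

```python
import numpy as np

def rotated_bandwidths(B_Ku, B_Kh, B_Kv, phi0):
    """Formula (16): projection bandwidths in the rotated kx'-ky'-kz frame."""
    R = np.array([[0.0,          1.0, 0.0],
                  [np.cos(phi0), 0.0, np.sin(phi0)],
                  [np.sin(phi0), 0.0, -np.cos(phi0)]])
    return R @ np.array([B_Ku, B_Kh, B_Kv])

# illustrative bandwidths (Hz-equivalent): after rotation the cross-track
# bandwidth B_kx' equals B_Kh, giving a much coarser (wider) kx' cell
B_kx_p, B_ky_p, B_kz_p = rotated_bandwidths(4.0e8, 9.0e6, 2.0e8, np.radians(50.0))
print(3e8 / (2 * B_kx_p))   # kx' resolution cell on the order of ten metres
```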
For verification, we select two scatters separated by 18 m along the kx axis. Figure 8a,b present a comparison of the 3D resolution, where the solid lines correspond to the standard coordinate system and the dashed lines to the rotated coordinate system. Figure 8a indicates that the resolutions along the (ky, kz) and (ky′, kz′) axes are approximately the same, which aligns with the previous theoretical analysis in Formula (16). In contrast, Figure 8b illustrates that the resolution along the kx axis is significantly smaller than that along the kx′ axis. Based on these observations, Figure 8c shows the ky-kz and ky′-kz′ reconstructed planes. The ky-kz reconstructed plane shows a false scatter because the separation between the two scatters exceeds the resolution along the kx axis. In the ky′-kz′ reconstructed plane, however, the false scatter is eliminated, indicating that the kx′-axis resolution is greater than the separation between the two scatters. Therefore, we can conclude that the coordinate system rotation approach is indeed feasible.
After the ky′ and kz′ positions are extracted from the ky′-kz′ reconstructed plane, the next task is to obtain the kx′ positions of all scatters. As the ky′ and kz′ positions are known, the set $\Omega$ reduces to $M \times J$, where $J$ ($J \le N \times L$) denotes the number of extracted positions. Therefore, both the measurement matrix $\mathbf{A}$ and the reflection coefficient vector $\Omega_{\delta}$ achieve dimensionality reduction, and can be rewritten as
$$\begin{aligned}
\mathbf{A}_{K \times \left(M \times J\right)} &= \left[\alpha_1 \otimes \beta_1 \otimes \chi_1,\ \alpha_2 \otimes \beta_2 \otimes \chi_2,\ \ldots,\ \alpha_K \otimes \beta_K \otimes \chi_K\right]^{T} \\
\Omega_{\delta} &= \left[\delta\left(x_1, y_1, z_1\right),\ \delta\left(x_2, y_1, z_1\right),\ \ldots,\ \delta\left(x_M, y_1, z_1\right),\ \ldots,\ \delta\left(x_1, y_J, z_J\right),\ \ldots,\ \delta\left(x_M, y_J, z_J\right)\right]^{T} \\
\mathbf{S} &= \mathbf{A}\, \Omega_{\delta}
\end{aligned} \quad (17)$$
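The sketch below illustrates the dimensionality reduction of Formula (17): once J candidate (y, z) positions have been extracted from the ky′-kz′ focused plane, the dictionary only has to span the M cross-track grid points for each candidate. The function name reduced_dictionary, the grid sizes, and the angle history are illustrative assumptions.

```python
import numpy as np

def reduced_dictionary(k, theta, phi, xg, yz_candidates):
    """Formula (17): dictionary over the M cross-track grid points for each of
    the J (y, z) candidates extracted from the ky'-kz' focused plane."""
    cols = []
    for (y, z) in yz_candidates:                      # J extracted positions
        for x in xg:                                  # M cross-track grid points
            phase = (x * np.cos(phi) * np.cos(theta)
                     + y * np.cos(phi) * np.sin(theta)
                     + z * np.sin(phi))
            cols.append(np.exp(1j * k * phase))       # one column per (x, y, z)
    return np.array(cols).T                           # shape: K x (M * J)

# toy sizes: K pulses, M cross-track grids, J = 3 extracted (y, z) pairs
K, M = 200, 50
k     = 4 * np.pi * np.linspace(40e9, 40.4e9, K) / 3e8
theta = np.radians(np.linspace(30, 33, K))
phi   = np.radians(np.linspace(50, 52, K))
xg = np.linspace(-20, 20, M)
A_red = reduced_dictionary(k, theta, phi, xg, [(0.0, 0.0), (4.0, 4.0), (4.0, -4.0)])
print(A_red.shape)   # (200, 150): far smaller than the full M*N*L dictionary
```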
Formula (17) can then be converted into the following typical convex optimization problem [32]:
$$\underset{\Omega_{\delta}}{\mathrm{minimize}}\ \left\|\mathbf{A}\,\Omega_{\delta} - \mathbf{S}\right\|_2^2 + \lambda\left\|\Omega_{\delta}\right\|_1 \quad (18)$$
where $\left\|\cdot\right\|_1$ denotes the $L_1$ norm and $\lambda > 0$ is the regularization parameter. To solve Formula (18), high-quality implementations of the interior-point method, including l1-magic [42] and PDCO [43], can be used; these employ iterative algorithms such as conjugate gradients (CG) [44] or LSQR [45] to compute the search step. After applying compressed sensing, we obtain the reflection coefficients and positions of all scatters. Summarizing the entire processing procedure, the flowchart of the proposed dimension-reduction super-resolution 3D imaging algorithm is shown in Figure 9.
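The references above use interior-point solvers (l1-magic, PDCO). As a self-contained, lighter-weight illustration of Formula (18), the sketch below instead uses the iterative shrinkage-thresholding algorithm (ISTA), a different but standard solver for the same L1-regularized least-squares problem; the measurement matrix, scatter amplitudes, and noise level are synthetic.

```python
import numpy as np

def ista(A, s, lam, n_iter=500):
    """Minimize ||A x - s||_2^2 + lam * ||x||_1 by iterative
    shrinkage-thresholding (a simple alternative to interior-point solvers)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz-related step-size scale
    x = np.zeros(A.shape[1], dtype=complex)
    for _ in range(n_iter):
        g = x - (A.conj().T @ (A @ x - s)) / L               # gradient step
        x = np.exp(1j * np.angle(g)) * np.maximum(np.abs(g) - lam / (2 * L), 0)
    return x

# synthetic dimension-reduced problem: K echoes, M*J unknowns, 3 scatters
rng = np.random.default_rng(0)
K, MJ = 120, 400
A = (rng.standard_normal((K, MJ)) + 1j * rng.standard_normal((K, MJ))) / np.sqrt(K)
x_true = np.zeros(MJ, dtype=complex)
x_true[[25, 130, 301]] = [1.0, 2.4, 3.6]
s = A @ x_true + 0.01 * (rng.standard_normal(K) + 1j * rng.standard_normal(K))

x_hat = ista(A, s, lam=0.05)
print(np.flatnonzero(np.abs(x_hat) > 0.5))   # indices of recovered scatters
```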

4. Simulation and Results

Here, we conduct the simulation verification in three steps. First, we verify the feasibility of the proposed 3D imaging algorithm using multiple scatters with both identical and varying reflection coefficients. Next, we explore the boundaries of the algorithm and examine the effects of signal-to-noise ratio and sampling rate on imaging performance. Finally, we validate the algorithm using point cloud models of actual complex target objects. It is important to note that the maneuvering trajectory II is used in all the simulations below.

4.1. 3D Imaging for Multi-Scatters with the Same and Varying Reflection Coefficients

To verify the feasibility of the proposed imaging algorithm, we not only present the final imaging results but also demonstrate the corresponding parameter extraction process. First, consider Figure 10, which depicts five scatters with varying reflection coefficients. Figure 10a,b show the ky′-kz′ positions and the kx′ positions, respectively; the vertical axis in Figure 10b corresponds to the reflection coefficients. In Figure 10a, it can be observed that the number of scatters is reduced to three. This reduction occurs because the three scatters distributed along the kx′ axis fall within a single kx′-axis resolution unit, causing them to overlap in the ky′-kz′ focused plane. To separate these three scatters, we apply compressed sensing to achieve super-resolution extraction, as shown in Figure 10b, where the five scatters reappear and their kx′ positions and reflection coefficients are effectively extracted. Finally, the complete 3D imaging result and detailed comparisons are presented in Figure 10c and Table 3. Similarly, Figure 11 presents the 3D imaging process for seven scatters with identical reflection coefficients, with detailed comparisons shown in Table 4. Overall, both the 3D imaging and parameter comparison results demonstrate that the proposed algorithm is a feasible and high-performance 3D imaging processing method for the maneuvering trajectory.

4.2. Effects of the Signal-to-Noise Ratio and Sampling Rate on 3D Imaging Processing

Figure 12 illustrates the effects of signal-to-noise ratio (SNR) and sampling rate on 3D imaging processing, where real scatters are denoted by blue pentagrams and reconstructed scatters by red hexagons. The positions of the multi-scatters are those in Table 4. As shown in Figure 12a,d,g, the shapes of the multi-scatters are well reconstructed at the high sampling rate of M = 12,000, even with an SNR of −10 dB. In the middle column, where M = 8000, the performance declines slightly compared with M = 12,000, with a small offset appearing along the kx axis. Meanwhile, Figure 12c,f show performance similar to that at M = 8000, but in Figure 12i there is a phenomenon of missing scatters, and the shapes of the multi-scatters are not properly reconstructed. To further evaluate the resolution capability in detail, we analyze the reconstruction position error of a multi-point target from two perspectives: the number of kx/cross-track dimension grids and the signal-to-noise ratio of the echoes. The reconstruction error is defined as
$$P_{error} = \frac{\sum_{i=1}^{K}\left|x_i - \tilde{x}_i\right|}{K} \quad (19)$$
where $K$ denotes the number of scatters, and $x_i$ and $\tilde{x}_i$ represent the true and estimated kx positions, respectively. We then conduct 100 Monte Carlo experiments to provide a relatively clear demonstration of the resolution ability, as shown in Figure 13. As for why this approach is used to evaluate resolution capability: the three-dimensional imaging process proposed in this paper is essentially achieved by extracting and reconstructing the positions and reflection coefficients of the target scattering points; therefore, we evaluate the resolution capability of the proposed algorithm through the positioning accuracy of the scattered points in three-dimensional space.
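A minimal sketch of Formula (19) and the Monte Carlo averaging is given below. The "estimated" positions are stand-ins (true positions plus noise) rather than outputs of the full imaging chain, and the kx values are the cross-track coordinates of the five scatters in Table 3.

```python
import numpy as np

def position_error(x_true, x_est):
    """Formula (19): mean absolute kx-position error over the K scatters."""
    x_true, x_est = np.asarray(x_true), np.asarray(x_est)
    return np.mean(np.abs(x_true - x_est))

# Monte Carlo averaging over 100 trials, as described in the text
rng = np.random.default_rng(1)
x_true = np.array([0.0, 0.0, 0.0, 4.0, -4.0])   # kx coordinates from Table 3
trials = [position_error(x_true,
                         x_true + 0.05 * rng.standard_normal(x_true.shape))
          for _ in range(100)]
print(np.mean(trials))
```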
Figure 13 shows three curves, each corresponding to a different number of sampling grids. Overall, when the SNR is negative, the positioning error across all curves is relatively high. However, at an SNR of 0 dB, there is a noticeable drop in positioning error, which continues to decrease as the SNR increases. A further comparison of the curves for different sampling grids reveals that the more sampling grids there are, the smaller the positioning error becomes. This can be explained by the fact that increasing the number of sampling grids results in a finer division of the kx axis space, thereby increasing the likelihood of accurately sampling the true target position. However, this also leads to a corresponding increase in computational load.

4.3. 3D Imaging Verification with the Point Cloud Models of Actual Complex Tank Object

To further validate the practical applicability of the proposed algorithm, we selected a complex tank object for verification. The 3D imaging results are presented in Figure 14, with the sampling rate set to M = 50,000 and SNR = 10 dB. Figure 14b shows the 2D position parameters of the tank object in the ky′-kz′ focused plane, where the shape of the tank is clearly discernible. The image appears clean, with sidelobes effectively suppressed, as indicated by the minimum value on the color bar. Figure 14c,d display the projections of the 3D imaging result onto the kx′-ky′ and kx′-kz′ planes, respectively. Although a slight offset is observed, the overall shape of the tank remains identifiable. Overall, the tank object is accurately and effectively reconstructed, demonstrating the feasibility and practicality of the proposed algorithm for the 3D imaging of complex real-world objects under a maneuvering trajectory.

5. Conclusions

Building on a comprehensive understanding of actual ballistic trajectories, this paper delves into the maneuvering trajectory characteristics of missile platforms, verifying the feasibility of 3D forward-looking imaging. This research represents a significant innovation in the field of missile technology, offering a viable technical solution for achieving forward detection in three-dimensional space.
The main contributions of this study are as follows. First, it introduces and validates the use of maneuvering trajectories for 3D forward-looking imaging—a concept not explored in the existing literature. Second, the paper provides an in-depth analysis of 3D resolution and its influencing factors, offering valuable theoretical insights that can inform future ballistic design. Finally, the study proposes a novel 3D super-resolution imaging algorithm that combines axis rotation with compressed sensing. The effectiveness and accuracy of this algorithm are validated through several rigorous experiments.

Author Contributions

Conceptualization, software and validation, writing—review and editing, T.G. and Y.G.; methodology, T.G. and Y.G.; software and validation, C.Z. and T.Z.; investigation, J.Z.; data curation, T.G.; writing—original draft preparation, T.G.; writing—review and editing, Y.G.; supervision, G.L.; funding acquisition, Y.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by several foundations, including the National Natural Science Foundation of China under Grant No. 62301438 and Grant No. 62301598, the Fundamental Research Funds for the Central Universities under Grant D5000230324, and the Natural Science Basic Research Program of Shaanxi under Grant No. 2023-JC-QN-0638.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy and military confidentiality restrictions.

Conflicts of Interest

The authors declare that they have no known financial interests or personal relationships that could have appeared to influence the work reported in this paper. All authors have contributed to this research without any bias or vested interest from external entities. Furthermore, the research was conducted independently of any commercial or governmental influence. The findings and conclusions presented are solely based on scientific analysis and data, ensuring objectivity and academic integrity. The authors affirm that there are no conflicts related to the publication of this manuscript, including affiliations, financial support, or intellectual property. Any potential conflicts that could arise in the future will be promptly disclosed and addressed according to the best ethical practices.

References

  1. Cumming, I.G.; Wong, F.H. Digital Processing of Synthetic Aperture Radar Data: Algorithm and Implementation; Artech House: Norwood, MA, USA, 2005. [Google Scholar]
  2. Moreira, A.; Prats-Iraola, P.; Younis, M.; Krieger, G.; Hajnsek, I.; Papathanassiou, K.P. A tutorial on synthetic aperture radar. IEEE Geosci. Remote Sens. Mag. 2013, 1, 563–583. [Google Scholar] [CrossRef]
  3. Chen, X.; Sun, G.-C.; Xing, M.; Li, B.; Yang, J.; Bao, Z. Ground Cartesian back-projection algorithm for high squint diving Tops SAR imaging. IEEE Trans. Geosci. Remote Sens. 2021, 59, 5812–5827. [Google Scholar] [CrossRef]
  4. Deng, Y.; Sun, G.-C.; Han, L.; Wang, Y.; Zhang, Y.; Xing, M. 2-D Wavenumber Domain Autofocusing for High-Resolution Highly Squinted SAR Imaging Based on Equivalent Broadside Model. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5220515. [Google Scholar] [CrossRef]
  5. Xu, H.P.; Zhu, Y.D.; Kang, C.H.; Zhou, Y.Q. A new deramp NECS imaging algorithm for missile borne hybrid SAR. Chin. J. Electron. 2011, 20, 769–774. [Google Scholar]
  6. Liu, D.; Shi, H.; Liu, H.; Yang, T.; Guo, J. Enhanced Forward-Looking Missile-Borne Bistatic SAR Imaging with Electromagnetic Vortex. IEEE Sens. J. 2023, 23, 8478–8490. [Google Scholar] [CrossRef]
  7. Qian, G.; Wang, Y. Analysis of Modeling and 2-D Resolution of Satellite–Missile Borne Bistatic Forward-Looking SAR. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5222314. [Google Scholar] [CrossRef]
  8. Li, X.; Zhou, S.; Yang, L. A new fast factorized back-projection algorithm with reduced topography sensibility for missile-borne SAR focusing with diving movement. Remote Sens. 2020, 12, 2616. [Google Scholar] [CrossRef]
  9. Wang, C.; Sun, H.; Zhang, X.-Y.; Zhang, R. A unified back-projection correction algorithm for squint SAR based on SPECAN processing. In Proceedings of the 2019 IEEE International Conference on Signal, Information and Data Processing (ICSIDP), Chongqing, China, 11–13 December 2019; pp. 1–4. [Google Scholar]
  10. Tang, S.; Zhang, L.; Guo, P.; Zhao, Y. An omega-K algorithm for highly squinted missile-borne SAR with constant acceleration. IEEE Geosci. Remote Sens. Lett. 2014, 11, 1569–1573. [Google Scholar] [CrossRef]
  11. Chen, S.; Zhao, H.; Zhang, S.; Chen, Y. An extended nonlinear chirp scaling algorithm for missile borne SAR imaging. Signal Process. 2014, 99, 58–68. [Google Scholar] [CrossRef]
  12. Li, Z.; Xing, M.; Liang, Y.; Gao, Y.; Chen, J.; Huai, Y.; Zeng, L.; Sun, G.C.; Bao, Z. A frequency-domain imaging algorithm for highly squinted SAR mounted on maneuvering platforms with nonlinear trajectory. IEEE Trans. Geosci. Remote Sens. 2016, 54, 4023–4038. [Google Scholar] [CrossRef]
  13. Zhang, Y.; Lu, C.; Zhang, H.; Li, H. A Modified CSA for Missile-Borne SAR with Curved Trajectory. In Proceedings of the 2020 IEEE Radar Conference (RadarConf20), Florence, Italy, 21–25 September 2020; pp. 1–6. [Google Scholar]
  14. Saeedi, J. Feasibility study and conceptual design of missile-borne synthetic aperture radar. IEEE Trans. Syst. Man Cybern. Syst. 2020, 50, 1122–1133. [Google Scholar] [CrossRef]
  15. Zhu, D.; Xiang, T.; Wei, W.; Ren, Z.; Yang, M.; Zhang, Y.; Zhu, Z. An extended two step approach to high-resolution airborne and spaceborne SAR full-aperture processing. IEEE Trans. Geosci. Remote Sens. 2021, 59, 8382–8397. [Google Scholar] [CrossRef]
  16. Tang, S.; Zhang, X.; He, Z.; Chen, Z. Practical Issue Analyses and Imaging Approach for Hypersonic Vehicle-Borne SAR with Near-Vertical Diving Trajectory. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5204316. [Google Scholar] [CrossRef]
  17. Dong, L.; Han, S.; Zhu, D.; Mao, X. A Modified Polar Format Algorithm for Highly Squinted Missile-Borne SAR. IEEE Geosci. Remote Sens. Lett. 2023, 20, 4012905. [Google Scholar] [CrossRef]
  18. Zheng, Y.; Guan, J.; Jiang, G.; Yi, W.; Yang, X.; Yin, H. A Modified Algorithm for Highly Squinted Missile-Borne SAR Imaging with Large Acceleration. IEEE Access 2024, 12, 48640–48653. [Google Scholar] [CrossRef]
  19. Zebker, H.; Goldstein, R. Topographic mapping from interferometric SAR observations. J. Geophys. Res. 1986, 91, 4993–4999. [Google Scholar] [CrossRef]
  20. Zhu, X.; Bamler, R. Tomographic SAR inversion by L1-norm regularization—The compressive sensing approach. IEEE Trans. Geosci. Remote Sens. 2010, 48, 3839–3846. [Google Scholar] [CrossRef]
  21. Bi, H.; Zhang, B.; Hong, W.; Zhou, S. Matrix-Completion-Based Airborne Tomographic SAR Inversion under Missing Data. IEEE Geosci. Remote Sens. Lett. 2015, 12, 2346–2350. [Google Scholar] [CrossRef]
  22. Reale, D.; Fornaro, G.; Pauciullo, A.; Zhu, X.; Bamler, R. Tomographic imaging and monitoring of buildings with very high resolution SAR data. IEEE Geosci. Remote Sens. Lett. 2011, 8, 661–665. [Google Scholar] [CrossRef]
  23. Gu, T.; Liao, G.; Li, Y.; Liu, Y.; Guo, Y. Airborne Downward-Looking Sparse Linear Array 3-D SAR Imaging via 2-D Adaptive Iterative Reweighted Atomic Norm Minimization. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5202513. [Google Scholar] [CrossRef]
  24. Gu, T.; Liao, G.; Li, Y.; Guo, Y.; Liu, Y. DLSLA 3-D SAR Imaging via Sparse Recovery through Combination of Nuclear Norm and Low-Rank Matrix Factorization. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5208213. [Google Scholar] [CrossRef]
  25. Shao, M.; Su, C.; Zhang, Z.; Zhang, B. The application of the alternate descent conditional gradient method in tomographic SAR off-grid imaging. In Proceedings of the IET International Radar Conference (IRC 2023), Chongqing, China, 3–5 December 2023; pp. 3259–3264. [Google Scholar]
  26. Tian, W.; Xie, X.; Deng, Y.; Yang, Z.; Hu, C. An Improved Imaging Method Based on Optimal Topographic Imaging Plane Reconstruction for Nonlinear Trajectory SAR. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5216217. [Google Scholar] [CrossRef]
  27. Meng, D.; Hu, D.; Ding, C. A New Approach to Airborne High Resolution SAR Motion Compensation for Large Trajectory Deviations. Chin. J. Electron. 2012, 21, 764–769. [Google Scholar]
  28. Gorovyi, I.M.; Bezvesilniy, O.O.; Vavriv, D.M. Estimation of uncompensated trajectory deviations and image refocusing for high-resolution SAR. In Proceedings of the 2015 German Microwave Conference, Nuremberg, Germany, 16–18 March 2015; pp. 186–189. [Google Scholar]
  29. Ran, L.; Liu, Z.; Zhang, T.; Li, T. Autofocus for correcting three dimensional trajectory deviations in synthetic aperture radar. In Proceedings of the 2016 CIE International Conference on Radar (RADAR), Guangzhou, China, 10–13 October 2016; pp. 1–4. [Google Scholar]
  30. Ran, L.; Liu, Z.; Zhang, L.; Li, T.; Xie, R. An Autofocus Algorithm for Estimating Residual Trajectory Deviations in Synthetic Aperture Radar. IEEE Trans. Geosci. Remote Sens. 2017, 55, 3408–3425. [Google Scholar] [CrossRef]
  31. Liu, Y.; Wang, W.; Pan, X.; Gu, Z.; Wang, G. Raw Signal Simulator for SAR with Trajectory Deviation Based on Spatial Spectrum Analysis. IEEE Trans. Geosci. Remote Sens. 2017, 55, 6651–6665. [Google Scholar] [CrossRef]
  32. An, Z.; Xiong, F.; Li, C. A Trajectory Tracking Method Using Convex Optimization. In Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, 27–29 July 2020; pp. 3281–3287. [Google Scholar]
  33. Chen, X.; Li, Z.; Yang, Y.; Qi, L.; Ke, R. High-Resolution Vehicle Trajectory Extraction and Denoising from Aerial Videos. IEEE Trans. Intell. Transp. Syst. 2021, 22, 3190–3202. [Google Scholar] [CrossRef]
  34. Donoho, D.L. Compressed sensing. IEEE Trans. Inf. Theory 2006, 52, 1289–1306. [Google Scholar] [CrossRef]
  35. Kim, S.-J.; Koh, K.; Lustig, M.; Boyd, S.; Gorinevsky, D. An Interior-Point Method for Large-Scale ℓ1-Regularized Least Squares. IEEE J. Sel. Top. Signal Process. 2007, 1, 606–617. [Google Scholar] [CrossRef]
  36. Austin, C.D.; Ertin, E.; Moses, R.L. Sparse signal methods for 3D radar imaging. IEEE J. Sel. Topics Signal Process. 2011, 5, 408–423. [Google Scholar] [CrossRef]
  37. Tang, G.; Bhaskar, B.N.; Shah, P.; Recht, B. Compressed sensing off the grid. IEEE Trans. Inf. Theory 2013, 59, 7465–7490. [Google Scholar] [CrossRef]
  38. Qiu, W.; Zhou, J.; Zhao, H.; Fu, Q. Three-Dimensional Sparse Turntable Microwave Imaging Based on Compressive Sensing. IEEE Geosci. Remote Sens. Lett. 2015, 12, 826–830. [Google Scholar] [CrossRef]
  39. Bu, H.; Tao, R.; Bai, X.; Zhao, J. A Novel SAR Imaging Algorithm Based on Compressed Sensing. IEEE Geosci. Remote Sens. Lett 2015, 12, 1003–1007. [Google Scholar] [CrossRef]
  40. Peng, X.; Tan, W.; Hong, W.; Jiang, C.; Bao, Q.; Wang, Y. Airborne DLSLA 3-D SAR image reconstruction by combination of polar formatting and L1 regularization. IEEE Trans. Geosci. Remote Sens. 2016, 54, 213–226. [Google Scholar] [CrossRef]
  41. Weidner, R.J.; Mulholland, R.J. Kronecker product representation for the solution of the general linear matrix equation. IEEE Trans. Autom. Control. 1980, 25, 563–564. [Google Scholar] [CrossRef]
  42. Candès, E.; Romberg, J. ℓ1-Magic: A Collection of MATLAB Routines for Solving the Convex Optimization Programs Central to Compressive Sampling 2006. Available online: www.acm.caltech.edu/l1magic/ (accessed on 7 July 2024).
  43. Saunders, M. PDCO: Primal-Dual Interior Method for Convex Objectives 2002. Available online: https://github.com/mxsaunders/pdco (accessed on 5 September 2024).
  44. Hager, W.W.; Zhang, H. A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2006, 2, 35–58. [Google Scholar] [CrossRef]
  45. Paige, C.; Saunders, M. LSQR: An algorithm for sparse linear equations and sparse least squares. ACM Trans. Math. Softw. 1982, 8, 43–71. [Google Scholar] [CrossRef]
Figure 1. Multiple simulated actual ballistics. (a) The 3D spatial trajectory of ballistics. (b) The 2D profile trajectory of ballistics in the Y–Z plane. (c) The 2D profile trajectory of ballistics in the X–Y plane.
Figure 2. The angle change caused by maneuvering trajectory.
Figure 3. Wavenumber spectrum geometry, in which kx-ky-kz denotes the standard wavenumber spectrum geometry and Kh-Kv-Ku denotes the wavenumber spectrum geometry along the line of sight direction. In addition, the green and blue squares correspond to the Kh-Kv plane and kx-ky plane, respectively.
Figure 4. The wavenumber spectrum projection geometry. (a) The projection along the Kh axis, in which the blue square corresponds to the kx-ky plane; (b) the projection along the Kv axis, in which the green and blue squares correspond to the kz-Kv plane and kx-ky plane, respectively.
Figure 5. 3D envelope of the scenario center. (a) The envelope of the ky axis and kz axis; (b) the envelope of the kx axis.
Figure 6. The influence analysis of center frequency, transmitting bandwidth, azimuth and pitch angle on the 3D resolution: (a) the kz axis resolution with [30°, 50°]; (b) the kx axis resolution with [30°, 50°]; (c) the ky axis resolution with [30°, 50°]; (d) the kz axis resolution with [30°, 80°]; (e) the kx axis resolution with [30°, 80°]; (f) the ky axis resolution with [30°, 80°]; (g) the kz axis resolution with [60°, 50°]; (h) the kx axis resolution with [60°, 50°]; (i) the ky axis resolution with [60°, 50°]; (j) the kz axis resolution with [60°, 80°]; (k) the kx axis resolution with [60°, 80°]; (l) the ky axis resolution with [60°, 80°].
Figure 7. Wavenumber spectrum projection analysis of trajectory II. (a) 3D wavenumber spectrum. (b) kx-ky wavenumber spectrum projection plane. (c) kz-kx and kz-ky wavenumber spectrum projection planes.
Figure 8. The 3D resolution verification after axis rotation. (a) The comparison between the kz/ky axes and the kz′/ky′ axes. (b) The comparison between the kx axis and the kx′ axis. (c) The comparison between the ky-kz and ky′-kz′ focused planes.
Figure 9. The flowchart of the proposed dimension-reduction super-resolution 3D imaging algorithm.
Figure 10. 3D imaging process of five scatters with varying reflection coefficients. (a) ky′-kz′ focused plane; (b) kx′-reflection coefficient extraction; (c) final 3D imaging result.
Figure 11. 3D imaging process of seven scatters with the same reflection coefficients. (a) ky′-kz′ focused plane; (b) kx′-reflection coefficient extraction; (c) final 3D imaging result.
Figure 12. The effect analysis of signal-to-noise ratio and sampling rate. (a) 3D imaging result with SNR = 10 dB, M = 12,000; (b) 3D imaging result with SNR = 10 dB, M = 8000; (c) 3D imaging result with SNR = 10 dB, M = 4000; (d) 3D imaging result with SNR = 0 dB, M = 12,000; (e) 3D imaging result with SNR = 0 dB, M = 8000; (f) 3D imaging result with SNR = 0 dB, M = 4000; (g) 3D imaging result with SNR = −10 dB, M = 12,000; (h) 3D imaging result with SNR = −10 dB, M = 8000; (i) 3D imaging result with SNR = −10 dB, M = 4000.
Figure 13. The kx axis resolution ability analysis with the sampling grid number M and SNR.
Figure 14. The 3D imaging result for an actual complex tank object. (a) 3D imaging result; (b) ky′-kz′ focused plane result; (c) kx′-ky′ focused plane result; (d) kx′-kz′ focused plane result.
Table 1. The 3D trajectory drop of multiple simulated actual ballistics.

Ballistic       Trajectory Drop-X    Trajectory Drop-Y    Trajectory Drop-Z
trajectory 1    32,910 m             54,860 m             22,370 m
trajectory 2    9060 m               79,400 m             2440 m
trajectory 3    16,490 m             93,000 m             15,710 m
trajectory 4    34,272 m             82,530 m             34,070 m
Table 2. Detailed comparison of 3D resolution.

Resolution        kx/m     ky/m     kz/m
True value        16.25    0.588    0.425
Estimate value    16.23    0.574    0.440
Table 3. Detailed parameter comparison of five scatters with varying reflection coefficients.

Parameters       Reflection Coefficients             3D Positions
True             1.00, 2.40, 3.60, 1.60, 3.00        (0, 0, 0), (0, 4, 4), (0, 4, −4), (4, 0, 0), (−4, 0, 0)
Reconstructed    0.948, 2.40, 3.578, 1.558, 2.973    (−0.02, 0, 0), (0.10, 4, 4), (−0.13, 4, −4), (4.07, 0, 0), (−4.05, 0, 0)
Table 4. Detailed parameter comparisons of seven scatters with the same reflection coefficients.

Parameters       Reflection Coefficients                             3D Positions
True             10.00, 10.00, 10.00, 10.00, 10.00, 10.00, 10.00     (0, 0, 0), (0, 4, 4), (0, 4, −4), (4, 0, 0), (−4, 0, 0), (0, 4, 0), (0, −4, 0)
Reconstructed    9.83, 9.94, 9.81, 9.93, 9.93, 9.90, 9.86            (−0.02, 0, 0), (0.10, 4, 4), (−0.13, 4, −4), (4.07, 0, 0), (−4.05, 0, 0), (0.05, 4, 0), (−0.08, −4, 0)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

