
CN106842165A - Centralized asynchronous fusion method for radars with different range and angle resolutions - Google Patents

Centralized asynchronous fusion method for radars with different range and angle resolutions

Info

Publication number
CN106842165A
CN106842165A (application CN201710156262.0A)
Authority
CN
China
Prior art keywords
radar
grid
resolution
measurement
radars
Prior art date
Legal status
Granted
Application number
CN201710156262.0A
Other languages
Chinese (zh)
Other versions
CN106842165B (en)
Inventor
易伟
李洋漾
李雯
孙智
陈璐
徐璐霄
孔令讲
崔国龙
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201710156262.0A priority Critical patent/CN106842165B/en
Publication of CN106842165A publication Critical patent/CN106842165A/en
Application granted granted Critical
Publication of CN106842165B publication Critical patent/CN106842165B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02: Details of systems according to group G01S13/00
    • G01S 7/41: using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/66: Radar-tracking systems; Analogous systems
    • G01S 13/72: Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S 13/87: Combinations of radar systems, e.g. primary radar and secondary radar

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a centralized asynchronous fusion method for radars with different range and angle resolutions, relating to measurement alignment for radars with different range resolutions and to multi-radar data fusion. The invention first establishes the target motion and measurement equations in the radar's polar coordinate system. The measurement space is then divided into grids, and grid alignment is used to bring radar measurements with different range and angle resolutions to a common size. Centralized asynchronous fusion is then applied to the measurement data of the multiple radars, and a dynamic-programming track-before-detect (DP-TBD) method processes the fused data and recovers the target trajectory. The invention effectively solves the problem that, in practice, radars with different range and angle resolutions are difficult to fuse centrally with a DP-TBD algorithm, thereby realizing centralized asynchronous fusion of measurements from radars with different range and angle resolutions and markedly improving the tracking of weak targets.

Description

A centralized asynchronous fusion method for radars with different range and angle resolutions

Technical Field

The invention belongs to the field of radar technology and relates to measurement alignment for radars with different range resolutions and to multi-radar data fusion.

Background

Track-before-detect (TBD) is a technique for detecting and tracking targets at low signal-to-clutter ratios. It differs from conventional detection in one key idea: no detection is declared within a single frame; instead, the raw echo data of each frame are digitized and stored, and only after several frames have been processed jointly are the detection result and the target track declared together. Its essence is to integrate the target signal over multiple frames, emphasizing target information while suppressing clutter and trading integration time for an improved signal-to-noise ratio. This avoids the loss of useful target information that occurs when conventional detect-then-track algorithms process each frame independently, and it effectively improves the radar's ability to detect and track weak targets under strong clutter, strong interference, and low signal-to-noise ratio.

Many algorithms implement TBD, including dynamic programming (DP), particle filtering, the Hough transform, and maximum-likelihood probabilistic data association (ML-PDA).

Among them, the dynamic-programming track-before-detect algorithm (DP-TBD) tolerates a degree of target maneuvering and is easy to implement, and it has become a research focus in the TBD field. In recent years, DP-TBD has found broad application prospects in both civilian and military domains as an efficient way to detect and track targets.

For DP-TBD-based multi-radar fusion, exploiting the measurements of several radars can markedly improve detection and tracking performance and provides spatial gain with respect to the target's radar cross section (RCS). However, multi-sensor fusion based on DP-TBD has received little study, and two problems remain difficult: how to handle the large volume of raw data from multiple sensors, and how to fuse asynchronous measurements. The paper "An Amplitude Association Dynamic Programming TBD Algorithm with Multistatic Radar", 35th Chinese Control Conference (CCC), TCCT, pp. 5076-5079, 2016, realizes multi-radar fusion based on the DP-TBD algorithm, effectively improving the tracking of weak targets and obtaining spatial RCS gain. However, that model only handles the fusion of radars with identical range and angle resolutions and cannot be applied to radars with different range resolutions.

Summary of the Invention

The purpose of the present invention is to address the defects of the prior art by designing a DP-TBD-based centralized fusion method for radars with different range and angle resolutions, solving the problem that existing radar data of different range and angle resolutions cannot be processed with a dynamic-programming (DP) track-before-detect (TBD) algorithm.

The solution of the present invention is to first partition the surveillance space into grids according to each radar's range and angle resolution and then to model the target in the real radar's polar coordinate system. Measurements from radars with different range resolutions are aligned to a common grid size by spatial grid alignment: a reference grid size is selected first, and the detection area is quantized according to that grid size. When a radar's grid cells are smaller than the reference cell, the largest measurement among the neighbouring cells is taken as the measurement of the reference cell. When a radar's grid cells are larger than the reference cell, each of its measurements is mapped to the corresponding position of the reference grid, and all remaining reference cells are filled with Gaussian noise of mean 0 and variance 1. This effectively solves the problem that, in practice, radar data with different range and angle resolutions cannot be processed with the DP-TBD method, and the centralized fusion of data from several radars improves the ability to detect weak targets.
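The two alignment rules above can be illustrated with a short sketch. The following Python/NumPy code is only an illustration under assumed inputs: the function names, array shapes, and the integer ratios n_rh, n_θh, n_rl, n_θl are chosen here for demonstration and are not specified numerically in the patent text.

    import numpy as np

    def align_fine_radar(Z, n_rh, n_th):
        # Fine-resolution radar: each reference cell covers an n_rh x n_th block
        # of fine cells; keep the largest measurement in each block.
        a = Z.shape[0] // n_rh
        b = Z.shape[1] // n_th
        Z_ref = np.empty((a, b))
        for i in range(a):
            for j in range(b):
                block = Z[i * n_rh:(i + 1) * n_rh, j * n_th:(j + 1) * n_th]
                Z_ref[i, j] = block.max()
        return Z_ref

    def align_coarse_radar(Z, n_rl, n_tl, a, b):
        # Coarse-resolution radar: place each coarse measurement at the mapped
        # reference position i = n_rl*m, j = n_tl*n and fill every other
        # reference cell with N(0, 1) noise.
        Z_ref = np.random.normal(0.0, 1.0, size=(a, b))
        for m in range(Z.shape[0]):
            for n in range(Z.shape[1]):
                i, j = n_rl * m, n_tl * n
                if i < a and j < b:
                    Z_ref[i, j] = Z[m, n]
        return Z_ref

    # Example with assumed sizes: reference grid 100 x 60, one radar twice as
    # fine and one radar twice as coarse as the reference in both dimensions.
    fine = np.random.rayleigh(1.0, size=(200, 120))
    coarse = np.random.rayleigh(1.0, size=(50, 30))
    Z1 = align_fine_radar(fine, n_rh=2, n_th=2)          # -> (100, 60)
    Z2 = align_coarse_radar(coarse, 2, 2, a=100, b=60)   # -> (100, 60)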

For convenience in describing the invention, the following terms are explained first:

Term 1: Polar coordinate system (polar coordinates)

A polar coordinate system is a planar coordinate system defined by a pole, a polar axis, and a radial distance. Fix a point O in the plane, called the pole. Draw a ray Ox from O, called the polar axis, and choose a unit of length; angles are usually taken as positive in the counterclockwise direction. The position of any point P in the plane is then determined by the length ρ of the segment OP and the angle θ from Ox to OP. The ordered pair (ρ, θ) is called the polar coordinates of P, ρ the polar radius, and θ the polar angle of P.
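As a small illustration of this definition (not part of the patent), the conversion between polar and Cartesian coordinates can be written as:

    import math

    def polar_to_cartesian(rho, theta):
        # (rho, theta) -> (x, y); theta measured counterclockwise from the polar axis
        return rho * math.cos(theta), rho * math.sin(theta)

    def cartesian_to_polar(x, y):
        # (x, y) -> (rho, theta) with theta in (-pi, pi]
        return math.hypot(x, y), math.atan2(y, x)

    x, y = polar_to_cartesian(100.0, math.radians(30.0))
    rho, theta = cartesian_to_polar(x, y)   # recovers (100.0, 0.5236 rad)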

Term 2: Grid

A grid divides the radar detection area into a number of rectangular cells of specified size; grid division is the first step of the DP-TBD algorithm.

Term 3: Centralized fusion

Centralized fusion is a fusion architecture in which the measurement data of every radar are transmitted, without local processing, directly to a fusion center for data processing.

Term 4: Fusion center

A fusion center is a data-processing center that processes the data transmitted to it according to a specified algorithm and recovers an estimate of the target's motion state.

Term 5: Radar range and bearing resolution

Range resolution is the smallest actual separation at which the echoes produced by two targets can be distinguished on the radar display. Angular resolution is the ability of an imaging system or system component to distinguish the minimum separation between two adjacent objects; resolving power is generally expressed as the angle subtended at the imaging system by the two smallest distinguishable targets.
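For orientation only, the following textbook relations (not stated in the patent) connect the resolutions above to the grid-cell sizes used later: the range-cell size follows from the transmitted bandwidth, and the cross-range cell size from the angular resolution and the range.

    # Standard radar relations, used here only to relate resolution to grid-cell size.
    C = 3.0e8  # speed of light, m/s

    def range_resolution(bandwidth_hz):
        # Delta R = c / (2 * B)
        return C / (2.0 * bandwidth_hz)

    def cross_range_cell(range_m, angular_resolution_rad):
        # Arc length spanned by one angular cell at a given range
        return range_m * angular_resolution_rad

    dr = range_resolution(5.0e6)            # 30 m for a 5 MHz bandwidth
    dc = cross_range_cell(50e3, 0.017)      # ~850 m at 50 km for roughly 1 degree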

Term 6: Radar cross section

Radar cross section (RCS) characterizes how strongly a target reflects radar energy. Radar detection works by transmitting an electromagnetic wave that illuminates the object's surface and is reflected back to the receiving antenna; the less energy returned along the original path, the smaller the radar cross section, the weaker the target's signature to the radar, and the shorter the detection range.

The present invention proposes a centralized asynchronous fusion method for radars with different range and angle resolutions, comprising the following steps:

Step 1: each radar acquires the echo signals of its measurement space;

Step 2: each radar divides its measurement space into grids according to its own angle and range resolution;

Step 3: a grid template of the measurement space is determined, and the measurement space of each radar is mapped onto the selected grid template so that all radar measurement spaces have a uniform size;

Step 4: the mapped measurement-space data of each radar are transmitted to the fusion center, which sorts them in chronological order;

Step 5: measurements taken by the radars at the same time are fused;

Step 6: the target trajectory is determined from the fused data.

Further, the specific method of step 2 is:

Step 2.1: determine the radar whose resolution is in the middle among all radars;

Step 2.2: divide that radar's measurement space into grids and determine the number of range cells contained in each grid;

Step 2.3: divide the measurement spaces of the other radars into grids such that each resulting grid contains the same number of range cells as the radar grid of step 2.2.

Further, the specific method of step 3 is:

Step 3.1: take the grid division of the radar with the middle resolution as the grid template;

Step 3.2: when grid-mapping a high-resolution radar whose resolution is finer than that of the radar of step 3.1, first determine which grids of the high-resolution radar fall within the measurement space of each template grid, then select, among all high-resolution grids corresponding to the same template grid, the one with the largest measurement value and map it onto that template grid;

Step 3.3: when grid-mapping a low-resolution radar whose resolution is coarser than that of the radar of step 3.1, use the following mapping:

Z″(i, j) = Z(m, n), i = n_rl × m, j = n_θl × n;

where Z″(i, j) is the measurement of the grid at position (i, j) after mapping, Z(m, n) is the measurement of the grid at position (m, n) before mapping, n_rl is the ratio of the range resolution of the radar to be mapped to that of the grid-template radar, and n_θl is the ratio of the angle resolution of the radar to be mapped to that of the grid-template radar; grids that receive no mapped measurement are filled with white Gaussian noise of mean 0 and variance 1.

Beneficial effects of the present invention: the method combines spatial grid alignment of radar measurements with different range and angle resolutions with centralized asynchronous fusion. The target is first modeled in range and azimuth, the space is divided into grids, and radar measurements with different range and angle resolutions are aligned to a common grid size; the aligned measurements are then sent to the fusion center, sorted in chronological order, and processed with the DP-TBD algorithm. This solves the problem that radars with different range and angle resolutions cannot achieve data fusion with the DP-TBD algorithm. The advantages of the invention are that it effectively improves target detection and tracking and realizes fusion of radar data with different range and angle resolutions, with a simple solution procedure. The invention can be applied to radar signal processing, early-warning radar target tracking, and related fields.

Brief Description of the Drawings

Fig. 1 is a flowchart of the method provided by the present invention.

Fig. 2 illustrates, for a specific embodiment, the different grid-cell sizes obtained when radars with different range and angle resolutions divide the space into grids.

Fig. 3 illustrates, for a specific embodiment, how the measurements of multiple radars are sorted by acquisition time and fused in a centralized asynchronous manner.

Fig. 4 shows, for a specific embodiment, how the number of accurate tracking results of the first radar varies with the number of frames.

Fig. 5 shows how the number of accurate tracking results of the second radar varies with the number of frames.

Fig. 6 shows how the number of accurate tracking results of the third radar varies with the number of frames.

Fig. 7 shows how the number of accurate tracking results obtained by centralized fusion varies with the number of frames.

Detailed Description

The present invention is verified mainly by simulation; all steps and conclusions were verified in MATLAB R2013a. The invention is described in further detail below with reference to a specific embodiment.

Step 1: determine the target motion equation.

Assume there are P radars in the detection area and that each radar processes F frames. The target track is the sequence of consecutive states from frame 1 to frame P×F,

X = {x_1, x_2, ..., x_{P×F}},   (1)

where P×F is the total number of frames processed by the fusion center in one dynamic-programming pass.

Step 2: rasterize the measurement space.

The detection area is a region spanned by range and angle. A measurement space Z‴ with a fixed grid size is selected; it is taken from the radar with the intermediate resolution among the radars, because with a higher resolution the computational load of radar tracking grows substantially, while with a lower resolution the detection error increases.

This reference measurement space is divided uniformly into N_r = a grids in the range direction and N_θ = b grids in the angle direction; the numbers of grids are based on the radar's range and angle resolution. For a radar with higher range resolution, assume the number of grids is N_r′ = a × n_rh in the range direction and N_θ′ = b × n_θh in the angle direction, where n_xh (x ∈ {r, θ}) denotes the ratio of the reference grid-cell size to the high-resolution radar's grid-cell size in the range and angle directions. For a radar with lower range resolution, assume the numbers of grids in the range and angle directions are N_r″ = a / n_rl and N_θ″ = b / n_θl, respectively.
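A short sketch of these grid counts, with illustrative numbers that are not taken from the patent (the resolutions, ratios, and the reference size a × b are assumed):

    # Assumed example: reference grid a x b chosen from the middle-resolution radar.
    a, b = 100, 60            # N_r = a, N_theta = b for the reference radar
    n_rh, n_th = 2, 2         # reference cell size / high-resolution cell size
    n_rl, n_tl = 2, 2         # low-resolution cell size / reference cell size

    Nr_high, Nt_high = a * n_rh, b * n_th   # N_r' = a*n_rh, N_theta' = b*n_theta_h
    Nr_low,  Nt_low  = a // n_rl, b // n_tl # N_r'' = a/n_rl, N_theta'' = b/n_theta_l

    print(Nr_high, Nt_high, Nr_low, Nt_low)  # 200 120 50 30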

Step 3: align the grid counts of radars with different range and angle resolutions.

Alignment maps both the high range/angle-resolution radar and the low range/angle-resolution radar onto the measurement space Z‴ with the fixed grid size. For the high-resolution radar, the measurements contain very many useless cells, which only raise the false-alarm probability without improving the detection probability; therefore, for the DP iteration, a division into larger grid cells is adopted, and the most meaningful high-amplitude measurements are retained in the new grid. The conversion is

Z′_k(i, j) = max_{(m, n) ∈ M_{i,j}} Z_k(m, n)   (2)

where Z_k(i, j) is the measurement of the (i, j)-th resolution cell of the k-th frame's measurement space, Z′_k is the measurement data of the k-th frame after processing, 1 ≤ i ≤ a, 1 ≤ j ≤ b, and M_{i,j} is the set of all high-resolution grids corresponding to grid (i, j):

M_{i,j} = {(m, n) : m ∈ (n_p × i - n_p + 1, ..., n_p × i), n ∈ (n_p × j - n_p + 1, ..., n_p × j)}   (3)

For a radar with low range resolution, the DP iteration uses a division into smaller grid cells. The conversion from the original measurement space Z_k(m, n) to the measurement space Z_k″(i, j) is

Z_k″(i, j) = Z_k(m, n)   (4)

where Z_k″ is the measurement of the k-th frame after processing, 1 ≤ i ≤ a, 1 ≤ j ≤ b, 1 ≤ m ≤ N_r″, and i = n_rl × m, j = n_θl × n. In this case each grid of the measurement Z_k corresponds to several grids of Z_k″, so after conversion many grids of Z_k″ contain no measurement information. Since none of these grids contains any target information, their measurements are filled with white Gaussian noise of mean 0 and variance 1.

Step 4: centralized fusion.

After grid matching, the measurement grids of the radars with different resolutions have the same grid size. The measurements from the different radars are then sent to the fusion center. The fusion center receives the measurements Z′_1 ... Z′_k, Z″_1 ... Z″_k, Z‴_1 ... Z‴_k from the asynchronous radars with different resolutions and different scan periods, and sorts them in the order in which they were obtained. After sorting, the measurements are Z_1, Z_2, ..., Z_{P×F} (P is the total number of radars and F is the total number of frames processed by each radar's DP), where Z_i is the i-th frame of measurement data in time order. The DP algorithm is then applied to these measurements.
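The chronological sorting at the fusion center can be sketched as follows; the timestamps, radar periods, and frame counts below are illustrative assumptions, not values from the patent:

    import numpy as np

    # Assumed setup: P = 3 radars, F = 4 frames each, every frame already aligned
    # to the common a x b grid. Each radar has its own scan period and start time.
    a, b = 100, 60
    P, F = 3, 4
    periods = [1.0, 1.5, 2.0]      # seconds between frames, per radar (assumed)
    offsets = [0.0, 0.3, 0.7]      # first-frame times, per radar (assumed)

    frames = []                    # (timestamp, aligned measurement) pairs
    for p in range(P):
        for k in range(F):
            t = offsets[p] + k * periods[p]
            Z = np.random.rayleigh(1.0, size=(a, b))   # stand-in aligned frame
            frames.append((t, Z))

    # Sort by acquisition time: the result is the sequence Z_1, ..., Z_{P*F}
    frames.sort(key=lambda tz: tz[0])
    Z_sequence = [Z for _, Z in frames]
    assert len(Z_sequence) == P * F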

Step 5: initialize the value function and the state-transition function.

At the initial time i = 1, for each target state x_1 in the polar-coordinate measurement space, the value function and the state-transition record are initialized as

I_1(x_1) = Z_1(x_1),  Ψ_1(x_1) = 0,

where I is the value function and Ψ records the state-transition relation; since the current time is the initial time, Ψ is set to zero.

Step 6: dynamic-programming recursion.

For 2 ≤ i ≤ P×F and each state x_i of the i-th frame,

I_i(x_i) = Z_i(x_i) + max_{x_{i-1} ∈ D(x_i)} I_{i-1}(x_{i-1}),  Ψ_i(x_i) = arg max_{x_{i-1} ∈ D(x_i)} I_{i-1}(x_{i-1}),

where the set D is the region of allowed state transitions.

Step 7: track termination.

The maximum of the value function in the last frame is compared with a threshold; if it exceeds the threshold, a target is declared present:

max_{x_{P×F}} I_{P×F}(x_{P×F}) > V_T,

where V_T is the threshold, obtained by Monte Carlo simulation so as to satisfy a constant false-alarm rate.
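One common way to obtain such a threshold is to run the DP recursion on noise-only data many times and take a quantile of the resulting final value-function maxima. The patent only states that V_T comes from Monte Carlo simulation at a constant false-alarm rate, so the following is a sketch under that assumption, with stand-in samples:

    import numpy as np

    def cfar_threshold(noise_only_maxima, p_fa):
        # noise_only_maxima: final-frame value-function maxima collected from
        # Monte Carlo runs on noise-only measurements (assumed precomputed by
        # running the same DP recursion without a target).
        return float(np.quantile(noise_only_maxima, 1.0 - p_fa))

    # Stand-in samples for illustration only
    samples = np.random.gumbel(loc=10.0, scale=1.5, size=10000)
    V_T = cfar_threshold(samples, p_fa=1e-2)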

Step 8: track backtracking.

Starting from the state that maximizes the final-frame value function, for i = P×F-1, P×F-2, ..., 2, 1 let

x̂_i = Ψ_{i+1}(x̂_{i+1}),

which yields the estimated state sequence X̂ = {x̂_1, x̂_2, ..., x̂_{P×F}}.

Through the above steps the target's motion trajectory is recovered, and comparing it with the target's true trajectory gives the tracking performance of the recovered track.
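Steps 5 through 8 together form a standard DP-TBD pass over the time-sorted frames. The following Python sketch shows one possible implementation under assumptions: it uses the recursion as reconstructed above, a state consisting only of the grid-cell index, and a transition region D given by a small neighbourhood of cells; the neighbourhood half-width q and all numerical values are illustrative, not taken from the patent.

    import numpy as np

    def dp_tbd(frames, q=1, v_t=None):
        # frames: list of K aligned measurement arrays of identical shape (a, b).
        # q: half-width of the assumed transition neighbourhood D.
        # Returns (declared, track), where track is a list of (i, j) cells, one per frame.
        K = len(frames)
        a, b = frames[0].shape

        # Step 5: initialization, I_1 = Z_1, Psi_1 = 0
        I = frames[0].astype(float)
        Psi = np.zeros((K, a, b, 2), dtype=int)

        # Step 6: recursion I_k = Z_k + max over the neighbourhood D of I_{k-1}
        for k in range(1, K):
            I_new = np.empty((a, b))
            for i in range(a):
                for j in range(b):
                    i0, i1 = max(0, i - q), min(a, i + q + 1)
                    j0, j1 = max(0, j - q), min(b, j + q + 1)
                    window = I[i0:i1, j0:j1]
                    m, n = np.unravel_index(np.argmax(window), window.shape)
                    Psi[k, i, j] = (i0 + m, j0 + n)
                    I_new[i, j] = frames[k][i, j] + window[m, n]
            I = I_new

        # Step 7: threshold test on the final-frame value function
        end = np.unravel_index(np.argmax(I), I.shape)
        declared = True if v_t is None else bool(I[end] > v_t)

        # Step 8: backtracking through Psi
        track = [end]
        for k in range(K - 1, 0, -1):
            track.append(tuple(Psi[k][track[-1]]))
        track.reverse()
        return declared, track

    # Usage with stand-in, noise-only frames (no real target):
    frames = [np.random.rayleigh(1.0, size=(40, 30)) for _ in range(6)]
    ok, track = dp_tbd(frames, q=1, v_t=None)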

Figures 4 to 7 show, respectively, the accurate-tracking results of the three radars with different range resolutions used in the embodiment and the accurate-tracking result after centralized fusion; the corresponding parameters are listed in Table 1. The selected reference grid size is the same as the grid size of radar 2.

Table 1

The specific embodiment shows that the present invention achieves centralized asynchronous fusion of radars with different range and angle resolutions and uses the DP-TBD algorithm to improve the tracking of targets.

Those of ordinary skill in the art will appreciate that the embodiments described here are intended to help readers understand the principles of the present invention, and that the scope of protection is not limited to these specific statements and embodiments. Those of ordinary skill in the art can make various other specific modifications and combinations based on the technical teachings disclosed herein without departing from the essence of the invention, and such modifications and combinations remain within the scope of protection of the invention.

Claims (3)

1. A centralized asynchronous fusion method for radars with different range and angle resolutions, the method comprising:

Step 1: each radar acquires the echo signals of its measurement space;

Step 2: each radar divides its measurement space into grids according to its own angle and range resolution;

Step 3: a grid template of the measurement space is determined, and the measurement space of each radar is mapped onto the selected grid template so that all radar measurement spaces have a uniform size;

Step 4: the mapped measurement-space data of each radar are transmitted to a fusion center, which sorts them in chronological order;

Step 5: measurements taken by the radars at the same time are fused;

Step 6: the target trajectory is determined from the fused data.

2. The centralized asynchronous fusion method for radars with different range and angle resolutions according to claim 1, wherein the specific method of step 2 is:

Step 2.1: determine the radar whose resolution is in the middle among all radars;

Step 2.2: divide that radar's measurement space into grids and determine the number of range cells contained in each grid;

Step 2.3: divide the measurement spaces of the other radars into grids such that each resulting grid contains the same number of range cells as the radar grid of step 2.2.

3. The centralized asynchronous fusion method for radars with different range and angle resolutions according to claim 1, wherein the specific method of step 3 is:

Step 3.1: take the grid division of the radar with the middle resolution as the grid template;

Step 3.2: when grid-mapping a high-resolution radar whose resolution is finer than that of the radar of step 3.1, first determine which grids of the high-resolution radar fall within the measurement space of each template grid, then select, among all high-resolution grids corresponding to the same template grid, the one with the largest measurement value and map it onto that template grid;

Step 3.3: when grid-mapping a low-resolution radar whose resolution is coarser than that of the radar of step 3.1, use the following mapping:

Z″(i, j) = Z(m, n), i = n_rl × m, j = n_θl × n;

where Z″(i, j) is the measurement of the grid at position (i, j) after mapping, Z(m, n) is the measurement of the grid at position (m, n) before mapping, n_rl is the ratio of the range resolution of the radar to be mapped to that of the grid-template radar, and n_θl is the ratio of the angle resolution of the radar to be mapped to that of the grid-template radar; grids that receive no mapped measurement are filled with white Gaussian noise of mean 0 and variance 1.
CN201710156262.0A 2017-03-16 2017-03-16 A Centralized Asynchronous Fusion Method of Radar Based on Different Range and Angle Resolutions Expired - Fee Related CN106842165B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710156262.0A CN106842165B (en) 2017-03-16 2017-03-16 A Centralized Asynchronous Fusion Method of Radar Based on Different Range and Angle Resolutions

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710156262.0A CN106842165B (en) 2017-03-16 2017-03-16 A Centralized Asynchronous Fusion Method of Radar Based on Different Range and Angle Resolutions

Publications (2)

Publication Number Publication Date
CN106842165A true CN106842165A (en) 2017-06-13
CN106842165B CN106842165B (en) 2020-02-18

Family

ID=59144574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710156262.0A Expired - Fee Related CN106842165B (en) 2017-03-16 2017-03-16 A Centralized Asynchronous Fusion Method of Radar Based on Different Range and Angle Resolutions

Country Status (1)

Country Link
CN (1) CN106842165B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101447072A (en) * 2009-01-06 2009-06-03 覃征 pyramidal empirical modal analyze image merge method
CN102254311A (en) * 2011-06-10 2011-11-23 中国科学院深圳先进技术研究院 Method and system for fusing remote sensing images
CN103886566A (en) * 2014-03-18 2014-06-25 河海大学常州校区 Urban traffic dispatching system and method based on image fusion in severe weather
CN103955701A (en) * 2014-04-15 2014-07-30 浙江工业大学 Multi-level-combined multi-look synthetic aperture radar image target recognition method
CN104715467A (en) * 2015-03-06 2015-06-17 中国科学院遥感与数字地球研究所 Improved multi-source remote sensing data space-time fusion method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
杜小勇 et al., "Resolution matching processing technology in multi-radar fusion detection", Journal of Signal Processing *
黄大羽, "Research on weak-target detection and tracking algorithms in complex environments", China Doctoral Dissertations Full-text Database, Information Science and Technology *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107340517A (en) * 2017-07-04 2017-11-10 电子科技大学 Tracking before a kind of multisensor multi frame detection
CN107346020A (en) * 2017-07-05 2017-11-14 电子科技大学 A kind of distribution for asynchronous multi-static radar system batch estimation fusion method
CN107346020B (en) * 2017-07-05 2020-02-18 电子科技大学 A Distributed Batch Estimation Fusion Method for Asynchronous Multistatic Radar Systems
CN109256145A (en) * 2017-07-14 2019-01-22 北京搜狗科技发展有限公司 Audio-frequency processing method, device, terminal and readable storage medium storing program for executing based on terminal
CN107783104A (en) * 2017-10-17 2018-03-09 杭州电子科技大学 Tracking before a kind of more asynchronous sensor single goals detection based on particle filter
CN108375761A (en) * 2018-02-08 2018-08-07 电子科技大学 For the single goal asynchronous signal detection method of multiple-input multiple-output radar system
CN108375761B (en) * 2018-02-08 2020-04-07 电子科技大学 Single-target asynchronous signal detection method for multi-transmitting and multi-receiving radar system
CN108845299A (en) * 2018-06-27 2018-11-20 电子科技大学 A kind of multisensor multi-frame joint detection algorithm based on posterior information fusion
CN109839633A (en) * 2019-03-08 2019-06-04 电子科技大学 Tracking before the multi frame detection of airborne early warning radar based on minimum vertex-covering airspace
CN109917373B (en) * 2019-04-04 2020-09-18 电子科技大学 A tracking-before-detection method for dynamic programming of moving-platform radar with motion-compensated search
CN109917373A (en) * 2019-04-04 2019-06-21 电子科技大学 A tracking-before-detection method for dynamic programming of moving-platform radar with motion-compensated search
CN110687512A (en) * 2019-07-02 2020-01-14 中国航空工业集团公司雷华电子技术研究所 Multi-machine heterogeneous radar cooperative TBD processing method based on probability matrix
CN111277765A (en) * 2020-03-11 2020-06-12 甘肃省科学院 Matrix type system for acquiring digitization of oversized picture by utilizing WiFi link
CN111427036A (en) * 2020-04-14 2020-07-17 南京莱斯电子设备有限公司 Short-baseline multi-radar signal level fusion detection method
CN111474528A (en) * 2020-05-14 2020-07-31 中国电子科技集团公司第二十八研究所 Accurate grid locking method for target composite tracking system in terminal area
CN113376612A (en) * 2021-08-12 2021-09-10 成都众享天地网络科技有限公司 Radar clutter generation method based on terrain matrixing and detection

Also Published As

Publication number Publication date
CN106842165B (en) 2020-02-18

Similar Documents

Publication Publication Date Title
CN106842165B (en) A Centralized Asynchronous Fusion Method of Radar Based on Different Range and Angle Resolutions
CN110659591B (en) SAR image change detection method based on twin network
CN104751185B (en) SAR image change detection based on average drifting genetic cluster
Klęsk et al. Fast analysis of C-scans from ground penetrating radar via 3-D Haar-like features with application to landmine detection
CN104062651B (en) A kind of based on tracking before the detection of G0 clutter background and constant target amplitude
CN106443598A (en) Convolutional neural network based cooperative radar network track deception jamming discrimination method
CN112462355B (en) An intelligent detection method for sea targets based on time-frequency three-feature extraction
CN105869146A (en) Saliency fusion-based SAR image change detection method
CN110333480B (en) A multi-target AOA positioning method for single UAV based on clustering
Chen et al. A graph-based track-before-detect algorithm for automotive radar target detection
CN109085572A (en) The motion target tracking method of millimetre-wave radar is utilized in tunnel based on multipath
CN107340514A (en) Hypersonic weak signal target RAE HT TBD integration detection methods in three dimensions
CN104569923B (en) Velocity restraint-based Hough transformation fast track starting method
CN104268574A (en) A SAR Image Change Detection Method Based on Genetic Kernel Fuzzy Clustering
CN109557533A (en) A Model-Based Joint Tracking and Recognition Method
CN106680783A (en) Method for withstanding false targets on basis of station's position error fusion algorithm
CN102129559A (en) SAR Image Target Detection Method Based on Primal Sketch Algorithm
CN107436434A (en) Track initiation method based on two-way Doppler estimation
Long et al. Object detection research of SAR image using improved faster region-based convolutional neural network
CN113466819B (en) A high-resolution three-dimensional point-trace aggregation method based on prior data
CN104198998A (en) Clustering treatment based CFAR (Constant False Alarm Rate) detection method under non-uniform background
CN103714542B (en) Extraction method for target highlight in low-resolution high-frequency sonar image
Hou et al. Dual-task GPR method: Improved generative adversarial clutter suppression network and adaptive target localization algorithm in GPR image
CN110988867B (en) Elliptical cross target positioning method for one-transmitting and double-receiving through-wall radar
Wang et al. Improved SSD framework for automatic subsurface object indentification for gpr data processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200218