
CN115406437A - Cooperative navigation method for UAV formation based on auxiliary constraints of line-of-sight starlight angle distance - Google Patents


Info

Publication number
CN115406437A
CN115406437A (application CN202211015630.7A)
Authority
CN
China
Prior art keywords
unmanned aerial
formation
aerial vehicle
uav
navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211015630.7A
Other languages
Chinese (zh)
Inventor
陈少华
刘瑞
施航
郑毅
郭磊磊
胡慧珠
刘承
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lab filed Critical Zhejiang Lab
Priority to CN202211015630.7A priority Critical patent/CN115406437A/en
Publication of CN115406437A publication Critical patent/CN115406437A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/02 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means
    • G01C21/025 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by astronomical means with the use of startrackers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Navigation (AREA)

Abstract



The invention discloses a cooperative navigation method for a UAV formation based on the auxiliary constraint of line-of-sight starlight angular distance. Each UAV in the formation carries its own inertial navigation equipment, a barometer, and communication-ranging data-link equipment, and one UAV in the formation is equipped with a servo-controlled optical sensor and serves as the observing UAV. The observing UAV tracks the observed UAV and extracts its centroid coordinates, then performs star-map matching on the background stars in the field of view and identifies and extracts their centroid coordinates. The line-of-sight starlight angular distance observation between the observing UAV, the observed UAV, and the background stars is computed in the optical sensor coordinate frame and used to help improve the overall navigation and positioning accuracy of the formation. The invention can effectively correct the tendency of the navigation and positioning errors of formation UAVs, aided solely by inter-UAV communication ranging, to rotate and diverge around a point in space, thereby further improving the overall autonomous navigation and positioning accuracy of the formation in a GNSS-denied environment.


Description

Cooperative Navigation Method for a UAV Formation Based on the Auxiliary Constraint of Line-of-Sight Starlight Angular Distance

Technical Field

The present invention relates to autonomous navigation technology for unmanned aerial vehicles (UAVs), and in particular to a cooperative navigation method for a UAV formation based on the auxiliary constraint of line-of-sight starlight angular distance.

Background

When the Global Navigation Satellite System (GNSS) signal is jammed and unavailable, the traditional cooperative navigation scheme has the cooperatively networked UAVs in the formation exchange data-link ranging, direction-finding, and other information, and use it to correct, in real time, the position, velocity, attitude, and other navigation information of each UAV in the formation, thereby improving the navigation accuracy of the formation as a whole. In this scheme, the UAVs observe and correct the overall navigation and positioning information of the formation using only inter-UAV communication ranging. Although this curbs the relative position errors of the formation UAVs and improves the formation's overall absolute positioning accuracy to some extent, it still cannot curb the tendency of the formation's overall absolute positioning error to rotate and diverge around a point in space, nor its tendency to drift and diverge along a particular direction.

Summary of the Invention

In a GNSS-denied environment, the traditional cooperative autonomous navigation method for formation UAVs, assisted only by inter-UAV communication ranging, cannot curb the tendency of the formation's overall absolute positioning error to rotate and diverge around a point in space, or to drift and diverge along a particular direction. The present invention therefore proposes a cooperative navigation method for UAV formations based on the auxiliary constraint of line-of-sight starlight angular distance: on top of the overall navigation and positioning of the formation assisted by inter-UAV communication ranging, the line-of-sight starlight angular distance between any two UAVs in the formation is introduced as auxiliary observation information for the overall navigation solution. The introduced starlight information provides an inertial-space reference for the formation. The method can effectively eliminate the tendency of the formation's overall absolute positioning error to rotate and diverge around a point in space, thereby further improving the overall absolute navigation and positioning accuracy of the formation.

In the cooperative navigation method for a UAV formation based on the auxiliary constraint of line-of-sight starlight angular distance, each UAV in the formation carries its own inertial navigation equipment, a barometer, and communication-ranging data-link equipment; one UAV in the formation is equipped with a servo-controlled optical sensor and serves as the observing UAV. The observing UAV tracks the observed UAV and extracts its centroid coordinates, then performs star-map matching on the background stars in the field of view and identifies and extracts their centroid coordinates. The line-of-sight starlight angular distance observation between the observing UAV, the observed UAV, and the background stars is then computed in the optical sensor coordinate frame and used to help improve the overall navigation and positioning accuracy of the formation.

In this method, the navigation parameters of each UAV in the formation, including position, velocity, and attitude, are taken as the states to be estimated. The strapdown navigation solution computed from each UAV's own inertial measurements serves as the state prediction step of the Kalman filter (KF). The communication ranging between any two UAVs in the formation forms the first group of KF observations, and the line-of-sight starlight angular distance observations form the second group; the overall navigation and positioning solution of the formation is completed by the KF algorithm.

The steps of the method are as follows:

Step 1: The onboard inertial navigation module of each UAV in the formation completes the integral recursion of each UAV's position, velocity, and attitude navigation parameters based on gyro and accelerometer measurements. The altitude measured by the onboard barometer is used to correct the error of the UAV position in the vertical direction, and the predicted navigation parameters are sent to the central host. The UAV responsible for fusing and updating the navigation parameters of all UAVs in the formation is defined as the central UAV.

Step 2: Each UAV in the formation completes communication ranging through the data link and sends the inter-UAV ranging information to the central host, which completes the correction of the formation's overall navigation and positioning errors based on the inter-UAV ranging observations.

Step 3: The high-precision servo-controlled optical sensor arranged on the observing UAV tracks a UAV in the formation and extracts its centroid coordinates, then performs star-map matching on the background stars in the field of view and identifies and extracts their centroid coordinates. The line-of-sight starlight angular distance observation between the observing UAV, the observed UAV, and the background stars is computed in the optical sensor coordinate frame and sent to the central host, which completes the correction of the formation's overall navigation and positioning errors constrained by the line-of-sight starlight angular distance observations.

The algorithm model of the line-of-sight starlight angular distance observed by any UAV in the formation of another UAV is as follows:

α_ij = arccos{[s_x(x_i − x_j) + s_y(y_i − y_j) + s_z(z_i − z_j)] / [(x_i − x_j)² + (y_i − y_j)² + (z_i − z_j)²]^(1/2)}

where α_ij is the line-of-sight starlight angular distance obtained when UAV i in the formation observes UAV j and the background stars in the field of view through the onboard optical sensor, x_i, y_i, z_i are the position coordinates of UAV i, and x_j, y_j, z_j are the position coordinates of UAV j.
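As a numerical illustration of this model (a minimal sketch; the function name and example values are assumptions, not part of the patent), the angular distance between the inter-UAV baseline X_B = p_i − p_j and a star unit vector can be computed as:

```python
import numpy as np

def starlight_angular_distance(pos_i, pos_j, star_unit):
    """Angle between the inter-UAV baseline X_B = pos_i - pos_j and a
    background-star unit vector, both resolved in the same frame."""
    x_b = np.asarray(pos_i, dtype=float) - np.asarray(pos_j, dtype=float)
    s = np.asarray(star_unit, dtype=float)
    s = s / np.linalg.norm(s)
    cos_alpha = np.dot(s, x_b) / np.linalg.norm(x_b)
    # Clip guards against round-off pushing the cosine outside [-1, 1].
    return float(np.arccos(np.clip(cos_alpha, -1.0, 1.0)))

# Star along +x, second UAV displaced 100 m along +y: the angle is 90 deg.
alpha = starlight_angular_distance([0, 0, 0], [0, 100, 0], [1, 0, 0])
```

Because the star is effectively at infinity, only the direction of the baseline matters, which is what makes this measurement an inertial-space attitude reference for the formation.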

Beneficial effects and advantages of the present invention: the cooperative autonomous navigation method for UAV formations in GNSS-denied environments, based on the auxiliary constraint of line-of-sight starlight angular distance, can effectively eliminate the tendency of the formation's overall absolute positioning error to rotate and diverge around a point in space, thereby further improving the overall absolute navigation and positioning accuracy of the formation.

Brief Description of the Drawings

Fig. 1 is a schematic diagram of the line-of-sight starlight angular distance.

Fig. 2 is a schematic diagram of cooperative autonomous navigation of formation UAVs based on the auxiliary constraint of line-of-sight starlight angular distance.

Detailed Description

To make the objectives, technical solutions, and advantages of the present invention clearer, the invention is further described in detail below with reference to the drawings and specific embodiments.

The hardware involved in the cooperative navigation method comprises the inertial navigation equipment, barometer, and communication-ranging data-link equipment carried by each UAV in the formation, together with a high-precision servo-controlled optical sensor arranged on one UAV in the formation. This optical sensor tracks another UAV in the formation and extracts its centroid coordinates, then performs star-map matching on the background stars in the field of view and identifies and extracts their centroid coordinates. The line-of-sight starlight angular distance observation between the observing UAV, the observed UAV, and the background stars can then be computed in the optical sensor coordinate frame, as shown in Fig. 1, and used to help improve the overall navigation and positioning accuracy of the formation.

The basic principle of the method is as follows. Within the Kalman filter (KF) framework, the errors of the navigation states (position, velocity, attitude) of each UAV in the formation are taken as the states to be estimated; the error-state propagation of each UAV's own strapdown inertial solution serves as the KF prediction step; the difference between the measured communication range between any two UAVs and the range predicted from inertial navigation forms the first group of KF observations; and the difference between the measured line-of-sight starlight angular distance and the value predicted from inertial navigation forms the second group. The overall navigation and positioning solution of the formation is completed by the KF algorithm. The principle block diagram is shown in Fig. 2, and the formulas involved are as follows:

1) The strapdown solution based on each formation UAV's own inertial navigation data and the KF state equations are as follows:

[Strapdown error-state and KF state equations, rendered as images in the original: Figure BDA0003810459670000031 through Figure BDA00038104596700000311]

where φ_Ei, φ_Ni, φ_Ui are the three-axis attitude angle errors of the i-th UAV; δv_Ei, δv_Ni, δv_Ui are its three-axis velocity errors; δL_i, δλ_i, δh_i are its position errors; ε_xi, ε_yi, ε_zi are the three-axis gyro biases of the i-th UAV, and ε_Ei, ε_Ni, ε_Ui their components in the navigation frame; ∇_xi, ∇_yi, ∇_zi are the three-axis accelerometer biases of the i-th UAV, and ∇_Ei, ∇_Ni, ∇_Ui their components in the navigation frame; ω_Ni, ω_Ui are the north and up components of the Earth rotation rate at the i-th UAV; f_Ei, f_Ni, f_Ui are the accelerometer data of the i-th UAV resolved in the navigation frame; and R_Mhi, R_Nhi are the local radii of curvature, including height, at the i-th UAV.

2) The KF observation equation based on inter-UAV ranging data of the formation is as follows:

Assume the inter-UAV range measurement between UAV i and UAV j in the formation is:

Figure BDA00038104596700000314

The inter-UAV range predicted from the onboard inertial navigation of UAV i and UAV j is:

Figure BDA00038104596700000315

Then the inter-UAV ranging error observation equation is:

Figure BDA0003810459670000041

where,

Figure BDA0003810459670000042

Figure BDA0003810459670000043

Figure BDA0003810459670000044

Figure BDA0003810459670000045

where x_Ii, y_Ii, z_Ii are the position estimates of the i-th UAV recursively propagated by inertial navigation, r_Iij can numerically be taken as ρ_Iij, and D_ij is the transformation matrix between the Earth coordinate frame and latitude-longitude coordinates.
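A minimal numerical sketch of this first observation group (the function name and example values are illustrative assumptions, not from the patent): the innovation is the measured data-link range minus the INS-predicted range, and the standard ranging linearization gives the sensitivity to each position estimate as plus or minus the unit line-of-sight vector:

```python
import numpy as np

def range_residual_and_jacobian(p_i_ins, p_j_ins, rho_measured):
    """First observation group: data-link range minus INS-predicted
    range, with the linearized sensitivity to each position estimate."""
    d = np.asarray(p_i_ins, float) - np.asarray(p_j_ins, float)
    rho_ins = np.linalg.norm(d)
    u = d / rho_ins              # unit line-of-sight, i relative to j
    z = rho_measured - rho_ins   # KF innovation
    # d(rho)/d(p_i) = u and d(rho)/d(p_j) = -u (standard ranging Jacobian)
    return z, u, -u

z, Hi, Hj = range_residual_and_jacobian([0, 0, 0], [300, 400, 0], 505.0)
# rho_ins = 500, so the innovation is 5.0
```

The opposite-signed Jacobians show why pure ranging constrains only relative geometry: a rigid rotation or translation of the whole formation leaves every ρ_ij unchanged, which is the divergence mode the starlight constraint is introduced to remove.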

3) The KF observation equation based on the line-of-sight starlight angular distance data of the formation UAVs is as follows:

Assume that UAV i in the formation observes UAV j and the background stars in the field of view through the onboard optical sensor; the corresponding line-of-sight starlight angular distance measurement is:

Figure BDA0003810459670000046

The line-of-sight starlight angular distance predicted from the onboard inertial navigation of UAV i and UAV j is:

Figure BDA0003810459670000047

Then the line-of-sight starlight angular distance error observation equation is:

Figure BDA0003810459670000048

where,

Figure BDA0003810459670000049

Figure BDA00038104596700000410

Figure BDA00038104596700000411

Figure BDA0003810459670000051

Figure BDA0003810459670000052

where ap_i = s_x·X_Bi + s_y·Y_Bi + s_z·Z_Bi,

Figure BDA0003810459670000053

X_Bi = x_i − x_j, Y_Bi = y_i − y_j, Z_Bi = z_i − z_j, and s_x, s_y, s_z are the components of the star vector.
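As a hedged numerical sketch of this second observation group (the function name and test values are illustrative assumptions), the innovation between a measured angular distance and the value predicted from the INS position estimates can be formed using the X_B and ap notation defined above:

```python
import numpy as np

def starlight_innovation(p_i_ins, p_j_ins, s, alpha_measured):
    """Second observation group: measured angular distance minus the
    value predicted from INS positions, using X_B = p_i - p_j and
    ap = s . X_B as in the symbol definitions."""
    X_B = np.asarray(p_i_ins, float) - np.asarray(p_j_ins, float)
    ap = float(np.dot(s, X_B))
    alpha_ins = np.arccos(np.clip(ap / np.linalg.norm(X_B), -1.0, 1.0))
    return alpha_measured - alpha_ins

# Star along +z, baseline along +x: the predicted angle is 90 deg.
dz = starlight_innovation([100, 0, 0], [0, 0, 0], np.array([0, 0, 1.0]), 1.58)
```

A nonzero innovation here indicates that the INS-estimated baseline direction disagrees with the optically measured one, which is exactly the rotational error mode that inter-UAV ranging alone cannot observe.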

4) The KF algorithm flow is as follows:

In the multi-UAV formation cooperative autonomous navigation scheme, a distributed sequential Kalman filter is adopted as the fusion filtering framework. Taking a single sequential filtering step for the i-th and j-th UAVs as an example, the procedure is as follows.

First, the time update of the navigation state parameters is completed based on the respective inertial measurement unit data.

The time update of the i-th UAV is:

X_i,k+1/k = Φ_i,k+1/k · X_i,k

P_i,k+1/k = Φ_i,k+1/k · P_i,k · Φ_i,k+1/k^T + Γ_i,k · Q_i,k · Γ_i,k^T

The time update of the j-th UAV is:

X_j,k+1/k = Φ_j,k+1/k · X_j,k

P_j,k+1/k = Φ_j,k+1/k · P_j,k · Φ_j,k+1/k^T + Γ_j,k · Q_j,k · Γ_j,k^T

The sequential measurement update corresponding to the observation coupling the i-th and j-th UAVs is:

K_ij,k+1 = P_ij,k+1/k · H_ij,k+1^T · (H_ij,k+1 · P_ij,k+1/k · H_ij,k+1^T + R_ij,k+1)^(-1)

X_ij,k+1 = X_ij,k+1/k + K_ij,k+1 · (Z_ij,k+1 − H_ij,k+1 · X_ij,k+1/k)

P_ij,k+1 = (I − K_ij,k+1 · H_ij,k+1) · P_ij,k+1/k

where,

X_ij,k+1/k = [X_i,k+1/k, X_j,k+1/k]^T

Figure BDA0003810459670000057

and Φ_i,k+1/k, Φ_j,k+1/k are the state transition matrices of the i-th and j-th UAVs, Γ_i,k, Γ_j,k are the noise driving matrices, Q_i,k, Q_j,k are the process noise matrices, Z_ij,k+1 is the observation associating UAV i and UAV j, H_ij,k+1 is the measurement matrix associated with that observation, and R_ij,k+1 is the measurement noise matrix.

The method specifically comprises the following steps:

Step 1: Each UAV in the formation predicts its own navigation parameters based on the onboard inertial navigation equipment and barometer, and sends the predicted navigation parameters to the central host. Specifically, the onboard inertial navigation module of each UAV completes the integral recursion of position, velocity, attitude, and other navigation parameters based on gyro and accelerometer measurements, and the altitude measured by the onboard barometer is used to correct the error of the UAV position in the vertical direction.
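As an illustrative sketch of this step (the gain k_h and the first-order altitude blend are assumptions for illustration, not the patent's mechanization), one integration step with barometric damping of the unstable vertical channel might look like:

```python
import numpy as np

def predict_and_damp_altitude(pos, vel, acc_nav, h_baro, dt, k_h=0.1):
    """One INS integration step with simple barometric damping of the
    vertical channel: blend the predicted height toward the barometer."""
    vel = vel + acc_nav * dt                  # velocity integration
    pos = pos + vel * dt                      # position integration
    # Vertical-channel correction: pull INS height toward the barometer.
    pos[2] += k_h * (h_baro - pos[2])
    return pos, vel

pos = np.array([0.0, 0.0, 100.0])
vel = np.array([10.0, 0.0, 0.0])
pos, vel = predict_and_damp_altitude(pos, vel, np.zeros(3), h_baro=102.0, dt=0.1)
```

Damping the vertical channel matters because pure inertial altitude diverges exponentially; the barometer bounds that error before the inter-UAV and starlight observations are applied.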

Step 2: Each UAV in the formation completes communication ranging through the data link and sends the inter-UAV ranging information to the central host, which completes the correction of the formation's overall navigation and positioning errors based on the inter-UAV ranging observations.

Step 3: The high-precision servo-controlled optical sensor arranged on the lead UAV or any wingman tracks a UAV in the formation and extracts its centroid coordinates, then performs star-map matching on the background stars in the field of view and identifies and extracts their centroid coordinates. The line-of-sight starlight angular distance observation between the observing UAV, the observed UAV, and the background stars is computed in the optical sensor coordinate frame and sent to the central host, which completes the correction of the formation's overall navigation and positioning errors constrained by the line-of-sight starlight angular distance observations.

The cooperative autonomous navigation method of the present invention for UAV formations in GNSS-denied environments, based on the auxiliary constraint of line-of-sight starlight angular distance, can effectively eliminate the tendency of the formation's overall absolute positioning error to rotate and diverge around a point in space, thereby further improving the overall absolute navigation and positioning accuracy of the formation.

The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be pointed out that those of ordinary skill in the art can make several modifications and improvements without departing from the concept of the present invention, and these all fall within the protection scope of the invention. Therefore, the protection scope of the present invention shall be determined by the appended claims.

Claims (4)

1. An unmanned aerial vehicle formation collaborative navigation method based on auxiliary constraint of sight starlight angular distance is characterized by comprising the following steps: each unmanned aerial vehicle in the formation of unmanned aerial vehicles is provided with an inertial navigation device, a barometer and a communication ranging link device, wherein a servo control optical sensor is arranged on one unmanned aerial vehicle in the formation of unmanned aerial vehicles to serve as an observation unmanned aerial vehicle; the observation unmanned aerial vehicle tracks and observes the observed unmanned aerial vehicle and extracts the centroid coordinate of the observed unmanned aerial vehicle, then carries out star map matching on the background fixed star in the visual field and identifies and extracts the centroid coordinate, calculates the observation information of the sight starlight angular distance between the observation unmanned aerial vehicle and the observed unmanned aerial vehicle and between the observation unmanned aerial vehicle and the background fixed star under the coordinate system of the optical sensor, and is used for assisting in improving the overall navigation positioning precision of the formation unmanned aerial vehicle.
2. The unmanned aerial vehicle formation collaborative navigation method based on the auxiliary constraint of sight line starlight angular distance as claimed in claim 1, wherein: with each unmanned aerial vehicle navigation parameter in the formation, including the position, the speed, the gesture, as waiting to estimate the state, navigation parameter strapdown calculation process based on each unmanned aerial vehicle is from taking inertial navigation equipment measured data predicts the process as KF algorithm state, regard as KF first group observation information with two arbitrary unmanned aerial vehicle's in the formation communication range finding information, regard as KF second group observation information with sight starlight angular distance observation information, accomplish formation unmanned aerial vehicle whole navigation positioning through the KF algorithm and solve.
3. The unmanned aerial vehicle formation collaborative navigation method based on sight line starlight angular distance auxiliary constraint is characterized in that: the method comprises the following steps:
step 1, integrating and recursion of position, speed and attitude navigation parameters of each unmanned aerial vehicle is completed by each unmanned aerial vehicle airborne inertial navigation module in a formation based on gyro measurement data and accelerometer measurement data, height data measured by an airborne barometer is used for correcting errors of the position of the unmanned aerial vehicle in the height direction, navigation parameter prediction data is sent to a central host, and the unmanned aerial vehicle in the formation, which is responsible for fusion and update of all unmanned aerial vehicle navigation parameters, is defined as the central unmanned aerial vehicle;
step 2, finishing communication ranging by all unmanned aerial vehicles in the formation through a data chain, sending inter-machine ranging information to a central host, and finishing the correction of the whole navigation positioning error of the formation unmanned aerial vehicles based on the inter-machine ranging observation information by the central host;
and 3, tracking and observing a certain unmanned aerial vehicle in the formation by a high-precision servo control optical sensor arranged on the observing unmanned aerial vehicle, extracting the centroid coordinate of the unmanned aerial vehicle, performing star map matching on a background fixed star in the visual field, identifying and extracting the centroid coordinate of the background fixed star, calculating the sight starlight angular distance observation information between the observing unmanned aerial vehicle, the observed unmanned aerial vehicle and the background fixed star under the coordinate system of the optical sensor, and sending the sight starlight angular distance observation information to the central host to finish the whole navigation and positioning error correction of the unmanned aerial vehicle in the formation based on the sight starlight angular distance observation information constraint.
4. The UAV formation cooperative navigation method based on line-of-sight starlight angular distance auxiliary constraints as claimed in claim 1, wherein the line-of-sight starlight angular distance model for any UAV in the formation observing another UAV is:
α_ij = arccos( [s_x(x_j − x_i) + s_y(y_j − y_i) + s_z(z_j − z_i)] / √((x_j − x_i)² + (y_j − y_i)² + (z_j − z_i)²) ), where (s_x, s_y, s_z) is the unit direction vector of the background star in the same coordinate frame;
wherein α_ij is the line-of-sight starlight angular distance obtained when UAV i in the formation observes UAV j and a background star in the field of view through its onboard optical sensor; (x_i, y_i, z_i) are the position coordinates of UAV i, and (x_j, y_j, z_j) are the position coordinates of UAV j.
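As a numerical sketch of the claim 4 model (an illustration, not the claimed implementation), the angular distance is the angle between the line of sight from UAV i to UAV j and the background star direction; the `star_dir` argument and the function name are assumptions, since the star's unit direction vector comes from star map matching in the claimed method.

```python
import numpy as np

def starlight_angular_distance(p_i, p_j, star_dir):
    """Angle (radians) between the line of sight from UAV i to UAV j
    and the direction to a background star, both expressed in the
    same coordinate frame."""
    los = np.asarray(p_j, dtype=float) - np.asarray(p_i, dtype=float)
    los = los / np.linalg.norm(los)
    s = np.asarray(star_dir, dtype=float)
    s = s / np.linalg.norm(s)  # stars are effectively at infinity: direction only
    # clip guards against |dot| marginally exceeding 1 from rounding
    return float(np.arccos(np.clip(np.dot(los, s), -1.0, 1.0)))
```

Because the star direction is known from the catalog, this observation couples the two UAVs' absolute position errors, which is what lets the central host constrain the formation's overall drift.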
CN202211015630.7A 2022-08-23 2022-08-23 Cooperative navigation method for UAV formation based on auxiliary constraints of line-of-sight starlight angle distance Pending CN115406437A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211015630.7A CN115406437A (en) 2022-08-23 2022-08-23 Cooperative navigation method for UAV formation based on auxiliary constraints of line-of-sight starlight angle distance

Publications (1)

Publication Number Publication Date
CN115406437A true CN115406437A (en) 2022-11-29

Family

ID=84162665

Country Status (1)

Country Link
CN (1) CN115406437A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1869589A (en) * 2006-06-27 2006-11-29 Beihang University Strapdown inertial/celestial integrated navigation semi-physical simulation system
US9217643B1 (en) * 2009-01-08 2015-12-22 Trex Enterprises Corp. Angles only navigation system
CN106382927A (en) * 2016-08-19 2017-02-08 Harbin Institute of Technology Star sensor autonomous navigation method based on satellite identification
CN108225307A (en) * 2017-12-29 2018-06-29 Nanjing University of Aeronautics and Astronautics Star map matching method assisted by inertial measurement information
CN111044075A (en) * 2019-12-10 2020-04-21 Shanghai Aerospace Control Technology Institute SINS error online correction method based on satellite pseudo-range/relative measurement information assistance
CN111238469A (en) * 2019-12-13 2020-06-05 Nanjing University of Aeronautics and Astronautics Relative navigation method for UAV formation based on inertia/data link
US20200386551A1 (en) * 2019-06-10 2020-12-10 Rockwell Collins, Inc. Multi-Aircraft Vision and Datalink Based Navigation System and Method
CN114216454A (en) * 2021-10-27 2022-03-22 Hubei Institute of Aerospace Vehicles UAV autonomous navigation and positioning method based on heterogeneous image matching in a GPS-denied environment

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Ren Hao, "Research on cooperative navigation methods for aircraft in formation", China Master's Theses Full-text Database, 15 January 2021 (2021-01-15) *
Jiang Di, "Star map identification and its application in spacecraft attitude determination", China Doctoral Dissertations Full-text Database, 15 May 2022 (2022-05-15) *
Hu Dongbin et al., "Research on UAV integrated navigation algorithm based on SINS/GPS/CNS", Aeronautical Computing Technique, vol. 51, no. 01, 29 January 2021 (2021-01-29) *
He Zhida et al., "Application and development trends of star sensors in the unmanned combat field", Journal of Ordnance Equipment Engineering, vol. 37, no. 11, 12 December 2016 (2016-12-12) *
Hao Jing et al., "Dynamic relative positioning method based on inertial navigation/data link", Computer Measurement & Control, vol. 26, no. 10, 26 October 2018 (2018-10-26), pages 1-3 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination