[go: up one dir, main page]

CN116182855A - An integrated navigation method for UAV with imitation compound eye polarization vision in weak light environment - Google Patents

An integrated navigation method for UAV with imitation compound eye polarization vision in weak light environment Download PDF

Info

Publication number
CN116182855A
CN116182855A CN202310474059.3A CN202310474059A CN116182855A CN 116182855 A CN116182855 A CN 116182855A CN 202310474059 A CN202310474059 A CN 202310474059A CN 116182855 A CN116182855 A CN 116182855A
Authority
CN
China
Prior art keywords
polarization
image
camera
navigation
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310474059.3A
Other languages
Chinese (zh)
Other versions
CN116182855B (en
Inventor
郭雷
赵岩
杨健
张腾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN202310474059.3A priority Critical patent/CN116182855B/en
Publication of CN116182855A publication Critical patent/CN116182855A/en
Application granted granted Critical
Publication of CN116182855B publication Critical patent/CN116182855B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18Stabilised platforms, e.g. by gyroscope
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

本发明涉及一种弱光强环境下仿复眼偏振视觉无人机组合导航方法,主要包括以下步骤:首先,利用理想瑞利散射模型中偏振矢量与太阳矢量垂直关系,选择惯导误差和相机位姿为状态向量建立偏振量测方程;然后,针对弱光强场景图像特征不明显的问题,利用偏振相机获得的每一帧的偏振度图像和偏振角图像进行图像融合增强图像特征;再次,在针孔模型投影下,根据测量得到的特征点投影位置信息与惯导预测得到的相机投影点位置信息得到的重投影误差建立视觉量测方程;最后,采用偏振传感器与偏振相机得到的量测信息进行多状态约束卡尔曼滤波得到无人机组合导航信息。本发明具有计算量小、精度高和鲁棒性强等优点。

Figure 202310474059

The invention relates to a combined navigation method for an unmanned aerial vehicle imitating compound eye polarization vision in a weak light environment. The polarization measurement equation is established for the state vector; then, to solve the problem that the image characteristics of the scene with weak light and strong light are not obvious, the polarization degree image and the polarization angle image of each frame obtained by the polarization camera are used to perform image fusion to enhance the image characteristics; again, in Under the pinhole model projection, the visual measurement equation is established according to the reprojection error obtained from the measured feature point projection position information and the camera projection point position information predicted by inertial navigation; finally, the measurement information obtained by the polarization sensor and polarization camera is used Multi-state constrained Kalman filtering is performed to obtain integrated navigation information of UAV. The invention has the advantages of small calculation amount, high precision, strong robustness and the like.

Figure 202310474059

Description

一种弱光强环境下仿复眼偏振视觉无人机组合导航方法A method for combined navigation of UAVs with compound eye polarization vision in weak light environment

技术领域Technical Field

本发明属于组合导航领域,具体涉及一种弱光强环境下仿复眼偏振视觉无人机组合导航方法。The invention belongs to the field of combined navigation, and in particular relates to a combined navigation method for a UAV simulating compound eye polarization vision in a weak light environment.

背景技术Background Art

组合导航技术是无人机导航领域的一项关键技术。VINS导航系统短时间内精度较高,但长时间误差会随时间积累逐渐变大且在弱光强场景下精度较低甚至不可用。根据理想瑞利散射模型得到的偏振信息可以为组合导航系统提供航向约束,并且具有自主性强、无误差积累等优点。对于弱光强场景下相机获取图像纹理不清晰导致VINS导航系统精度大幅度下降的问题,偏振相机在弱光强场景下获取的偏振度图像所含目标信息量大、偏振角图像目标边缘和轮廓清晰,融合偏振度图像与偏振角图像后获得偏振图像,可以在弱光强场景下大幅度提高VINS系统的导航精度。为充分发挥组合导航系统各个子系统的优点,设计一种仿复眼偏振视觉无人机组合导航方法,对完成卫星导航失灵、强干扰环境和复杂位置环境的自主导航和定位任务具有重大意义。Integrated navigation technology is a key technology in the field of UAV navigation. The VINS navigation system has high accuracy in the short term, but the long-term error will gradually increase with time accumulation and the accuracy is low or even unavailable in low-light scenes. The polarization information obtained according to the ideal Rayleigh scattering model can provide heading constraints for the integrated navigation system, and has the advantages of strong autonomy and no error accumulation. In order to solve the problem that the texture of the image obtained by the camera in low-light scenes is not clear, which leads to a significant decrease in the accuracy of the VINS navigation system, the polarization image obtained by the polarization camera in low-light scenes contains a large amount of target information, and the target edge and contour of the polarization angle image are clear. The polarization image is obtained by fusing the polarization image with the polarization angle image, which can greatly improve the navigation accuracy of the VINS system in low-light scenes. In order to give full play to the advantages of each subsystem of the integrated navigation system, a compound eye polarization vision UAV integrated navigation method is designed, which is of great significance for completing autonomous navigation and positioning tasks in satellite navigation failure, strong interference environment and complex location environment.

单一导航有优势也有劣势,利用多状态约束卡尔曼滤波对各个子导航系统进行紧组合滤波来达到信息融合的目的,可以提高导航系统精度和可靠性。中国专利申请“一种基于天顶点矢量的偏振/VIO三维姿态确定方法”(申请号:CN202111381893.5),该方法仅考虑了利用偏振修正三维姿态,未考虑位置信息的修正;中国专利申请“一种基于视觉惯性偏振光融合的无人机位姿估计方法”(申请号:CN202010623718.1),该方法利用偏振导航提供航向约束修正惯导逐渐累积的误差;论文《基于偏振成像的双目立体视觉应用技术研究》,利用偏振成像提升双目视觉导航精度但无法进行长航时导航;中国专利申请“一种基于偏振光、光流矢量、双目视觉传感器的仿昆虫视觉组合导航方法”(申请号:CN202011289488.6),该方法通过优化的方法进行信息融合并利用光流失量进行位置修正。Single navigation has advantages and disadvantages. Using multi-state constrained Kalman filtering to perform tight combination filtering on each sub-navigation system to achieve the purpose of information fusion can improve the accuracy and reliability of the navigation system. Chinese patent application "A polarization/VIO three-dimensional attitude determination method based on zenith point vector" (application number: CN202111381893.5), this method only considers the use of polarization to correct the three-dimensional attitude, and does not consider the correction of position information; Chinese patent application "A UAV pose estimation method based on visual inertial polarization light fusion" (application number: CN202010623718.1), this method uses polarization navigation to provide heading constraints to correct the gradually accumulated errors of inertial navigation; the paper "Research on binocular stereo vision application technology based on polarization imaging" uses polarization imaging to improve the accuracy of binocular vision navigation but cannot perform long-duration navigation; Chinese patent application "A method of combined navigation based on insect vision based on polarized light, optical flow vector, and binocular vision sensor" (application number: CN202011289488.6), this method uses an optimized method to fuse information and uses optical flow loss for position correction.

以上发明均使用可见光视觉相机获取视觉图像并进行视觉更新,未考虑可见光视觉系统在弱光强场景下的精度下降问题。本发明使用偏振相机获取偏振图像并进行融合来进行视觉测量更新,偏振相机可在弱光强环境下提供清晰的视觉图像进行视觉测量更新,对比于可见光相机获取的视觉信息可大幅提高导航精度。The above inventions all use visible light vision cameras to obtain visual images and perform visual updates, without considering the problem of reduced accuracy of visible light vision systems in low-light intensity scenes. The present invention uses a polarization camera to obtain polarization images and fuse them for visual measurement updates. The polarization camera can provide clear visual images for visual measurement updates in low-light intensity environments, which can greatly improve navigation accuracy compared to the visual information obtained by visible light cameras.

发明内容Summary of the invention

为解决在GNSS拒止环境弱光强场景下长航时无人机高精度导航问题,克服VINS导航系统长时间误差累积的不足以及弱光强场景下导航精度大幅下降问题,本发明提供一种弱光强环境下仿复眼偏振视觉无人机组合导航方法,采用多状态约束卡尔曼滤波对组合导航信息进行滤波紧组合实现信息融合,保证无人机组合导航系统在面对弱光强环境的长航时精度。本发明为研究弱光强环境下无人机导航提供了新的思路,解决在面对弱光强长航时导航场景时VINS系统导航精度下降的问题。In order to solve the problem of high-precision navigation of long-flight UAVs in weak-light intensity scenes in GNSS-denied environments, overcome the shortcomings of long-term error accumulation of the VINS navigation system and the problem of a significant decrease in navigation accuracy in weak-light intensity scenes, the present invention provides a method for combined navigation of UAVs with simulated compound eye polarization vision in a weak-light intensity environment, and adopts a multi-state constrained Kalman filter to filter and tightly combine the combined navigation information to achieve information fusion, thereby ensuring the long-flight accuracy of the combined navigation system of the UAV in a weak-light intensity environment. The present invention provides a new idea for studying UAV navigation in a weak-light intensity environment, and solves the problem of decreased navigation accuracy of the VINS system in a weak-light intensity long-flight navigation scene.

为达到上述目的,本发明采用的技术方案为:In order to achieve the above object, the technical solution adopted by the present invention is:

一种弱光强环境下仿复眼偏振视觉无人机组合导航方法,包括如下步骤:A method for combined navigation of a UAV using compound eye polarization vision in a weak light environment comprises the following steps:

步骤1、通过点源式偏振传感器获得偏振方位角

Figure SMS_1
,计算求得导航系下的偏振矢量
Figure SMS_2
,通过太阳年历计算得到导航系下太阳矢量
Figure SMS_3
,基于理想瑞利散射模型中偏振矢量
Figure SMS_4
与导航系下太阳矢量
Figure SMS_5
的垂直关系,结合惯导状态与相机位姿为状态向量建立系统状态方程与偏振量测方程;Step 1: Obtain the polarization azimuth through a point source polarization sensor
Figure SMS_1
, calculate and obtain the polarization vector under the navigation system
Figure SMS_2
, the solar vector in the navigation system is obtained by calculating the solar calendar
Figure SMS_3
, based on the polarization vector in the ideal Rayleigh scattering model
Figure SMS_4
The sun vector with the navigation system
Figure SMS_5
The vertical relationship between the inertial navigation state and the camera posture is used as the state vector to establish the system state equation and polarization measurement equation;

步骤2、在弱光强场景下,偏振相机得到偏振度图像

Figure SMS_13
、偏振角图像
Figure SMS_7
,将二者均值滤波后得到偏振度图像
Figure SMS_20
与偏振角图像
Figure SMS_8
,相减分解得到偏振度细节图像
Figure SMS_16
与偏振角细节图像
Figure SMS_11
,将偏振度图像
Figure SMS_22
与偏振角图像
Figure SMS_17
加权融合后得到图像
Figure SMS_19
,将原偏振图像形态开运算得到选通区域
Figure SMS_6
与选通区域
Figure SMS_24
,利用选通区域
Figure SMS_10
与选通区域
Figure SMS_15
对偏振度细节图像
Figure SMS_18
与偏振角细节图像
Figure SMS_21
进行优化后得到图像
Figure SMS_9
,将加权融合后得到图像
Figure SMS_14
与图像
Figure SMS_12
融合后得到融合后的偏振图像
Figure SMS_23
;Step 2: In low light conditions, the polarization camera obtains a polarization image
Figure SMS_13
, polarization angle image
Figure SMS_7
, and then filter the two mean values to get the polarization image
Figure SMS_20
Image with polarization angle
Figure SMS_8
, subtract and decompose to obtain the polarization detail image
Figure SMS_16
Detailed image with polarization angle
Figure SMS_11
, the polarization image
Figure SMS_22
Image with polarization angle
Figure SMS_17
After weighted fusion, the image is obtained
Figure SMS_19
, the original polarization image is opened to obtain the gated area
Figure SMS_6
With gated area
Figure SMS_24
, using the gated region
Figure SMS_10
With gated area
Figure SMS_15
Polarization detail image
Figure SMS_18
Detailed image with polarization angle
Figure SMS_21
After optimization, the image is obtained
Figure SMS_9
, the image is obtained after weighted fusion
Figure SMS_14
With image
Figure SMS_12
After fusion, the fused polarization image is obtained
Figure SMS_23
;

步骤3、结合步骤2中的偏振图像

Figure SMS_25
,在针孔模型下,偏振相机得到视觉整体测量
Figure SMS_26
并提供相机的运动信息,惯导得到预测视觉测量
Figure SMS_27
,二者相减得到
Figure SMS_28
时刻第
Figure SMS_29
个特征点的重投影误差
Figure SMS_30
,通过累积叠加与QR分解得到视觉量测方程;Step 3: Combine the polarization image from step 2
Figure SMS_25
, under the pinhole model, the polarization camera obtains the visual overall measurement
Figure SMS_26
And provide the camera's motion information, inertial navigation to obtain predicted visual measurement
Figure SMS_27
, subtracting the two to get
Figure SMS_28
Moment
Figure SMS_29
The reprojection error of feature points
Figure SMS_30
, the visual measurement equation is obtained through cumulative superposition and QR decomposition;

步骤4、结合步骤1与步骤3建立的偏振量测方程与视觉量测方程,利用多状态约束卡尔曼滤波对各个导航子系统获得的数据进行信息融合,修正惯导逐渐累积的误差,解算无人机组合导航信息。Step 4: Combine the polarization measurement equation and visual measurement equation established in Step 1 and Step 3, use multi-state constrained Kalman filtering to fuse the data obtained by each navigation subsystem, correct the gradually accumulated errors of the inertial navigation, and solve the UAV combined navigation information.

进一步地,所述步骤1包括:Furthermore, the step 1 comprises:

选择导航系到载体系的小角度旋转

Figure SMS_32
,载体在导航系下的估计速度与真实速度差
Figure SMS_35
和估计位置与真实位置差
Figure SMS_38
,陀螺仪测量的角速度与真实角速度零偏差
Figure SMS_31
,加速度计测量的线速度与真实线速度零偏差
Figure SMS_34
以及每个时刻相机的位姿
Figure SMS_39
作为组合导航系统的状态变量
Figure SMS_40
,其中
Figure SMS_33
Figure SMS_36
分别表示相机在导航系下的姿态与位置,
Figure SMS_37
具体形式如下:Select a small angle rotation from the navigation system to the carrier system
Figure SMS_32
, the difference between the estimated speed and the actual speed of the carrier in the navigation system
Figure SMS_35
The difference between the estimated position and the true position
Figure SMS_38
, the angular velocity measured by the gyroscope has zero deviation from the true angular velocity
Figure SMS_31
, the linear velocity measured by the accelerometer has zero deviation from the true linear velocity
Figure SMS_34
And the camera position at each moment
Figure SMS_39
As the state variable of the integrated navigation system
Figure SMS_40
,in
Figure SMS_33
and
Figure SMS_36
Respectively represent the attitude and position of the camera in the navigation system,
Figure SMS_37
The specific form is as follows:

Figure SMS_41
Figure SMS_41
,

Figure SMS_42
Figure SMS_42
,

系统状态方程为:The system state equation is:

Figure SMS_43
Figure SMS_43
,

其中,

Figure SMS_44
为状态转移矩阵,
Figure SMS_45
为系统噪声驱动阵,
Figure SMS_46
为系统噪声;in,
Figure SMS_44
is the state transfer matrix,
Figure SMS_45
is the system noise driving array,
Figure SMS_46
is the system noise;

定义载体系下的偏振矢量为

Figure SMS_47
,太阳矢量为
Figure SMS_48
Figure SMS_49
为从载体系到导航系的坐标转换矩阵;点源式偏振传感器首先测得偏振方位角
Figure SMS_50
之后,得到在载体系下的偏振矢量
Figure SMS_51
:The polarization vector under the carrier system is defined as
Figure SMS_47
, the sun vector is
Figure SMS_48
,
Figure SMS_49
is the coordinate conversion matrix from the carrier system to the navigation system; the point source polarization sensor first measures the polarization azimuth
Figure SMS_50
After that, the polarization vector under the carrier system is obtained
Figure SMS_51
:

Figure SMS_52
Figure SMS_52
,

根据理想瑞利散射模型,导航系中的偏振矢量

Figure SMS_53
垂直于太阳矢量
Figure SMS_54
,得到:According to the ideal Rayleigh scattering model, the polarization vector in the navigation system
Figure SMS_53
Perpendicular to the sun vector
Figure SMS_54
,get:

Figure SMS_55
Figure SMS_55
,

基于此偏振测量方程为:Based on this polarization measurement equation is:

Figure SMS_56
Figure SMS_56
,

其中,

Figure SMS_57
Figure SMS_58
Figure SMS_59
为偏振量测噪声。in,
Figure SMS_57
,
Figure SMS_58
,
Figure SMS_59
is the polarization measurement noise.

进一步地,所述步骤2包括:Furthermore, the step 2 comprises:

偏振相机获得偏振度图像

Figure SMS_60
与偏振角图像
Figure SMS_61
,首先分别对二者进行均值滤波部分消除偏振图像的噪声,滤波后得到偏振度图像
Figure SMS_62
与偏振角图像
Figure SMS_63
,原偏振图像与均值滤波后的偏振图像相减分解获得偏振度细节图像与偏振角细节图像:Polarization camera obtains polarization image
Figure SMS_60
Image with polarization angle
Figure SMS_61
First, the two are respectively subjected to mean filtering to partially eliminate the noise of the polarization image, and the polarization degree image is obtained after filtering.
Figure SMS_62
Image with polarization angle
Figure SMS_63
, the original polarization image and the polarization image after mean filtering are subtracted and decomposed to obtain the polarization degree detail image and the polarization angle detail image:

Figure SMS_64
Figure SMS_64
,

Figure SMS_65
Figure SMS_65
,

其次将均值滤波后的偏振图像加权融合得到图像

Figure SMS_66
,即:Secondly, the polarization images after mean filtering are weighted fused to obtain the image
Figure SMS_66
,Right now:

Figure SMS_67
Figure SMS_67
,

其中,系数

Figure SMS_68
Figure SMS_69
应满足
Figure SMS_70
;Among them, the coefficient
Figure SMS_68
and
Figure SMS_69
Should meet
Figure SMS_70
;

再次将偏振度图像

Figure SMS_71
与偏振角图像
Figure SMS_76
进行形态开运算后得到偏振度图像
Figure SMS_80
与偏振角图像
Figure SMS_73
,计算
Figure SMS_75
Figure SMS_78
的灰度均值,分别为
Figure SMS_82
Figure SMS_74
,将形态开运算后的偏振度图像
Figure SMS_79
中的各像素灰度值
Figure SMS_83
Figure SMS_84
比较,将形态开运算后的偏振角图像
Figure SMS_72
中的各像素灰度值
Figure SMS_77
Figure SMS_81
比较,具体规则如下:Again, the polarization image
Figure SMS_71
Image with polarization angle
Figure SMS_76
After performing morphological opening operation, the polarization image is obtained
Figure SMS_80
Image with polarization angle
Figure SMS_73
,calculate
Figure SMS_75
and
Figure SMS_78
The grayscale mean values are
Figure SMS_82
and
Figure SMS_74
, the polarization image after morphological opening operation
Figure SMS_79
The gray value of each pixel in
Figure SMS_83
and
Figure SMS_84
Compare the polarization angle image after the morphological opening operation
Figure SMS_72
The gray value of each pixel in
Figure SMS_77
and
Figure SMS_81
For comparison, the specific rules are as follows:

Figure SMS_85
Figure SMS_85
,

Figure SMS_86
各个像素点整合得到偏振度图像选通区域
Figure SMS_87
;将
Figure SMS_88
各个像素点整合得到偏振角图像选通区域
Figure SMS_89
;对偏振度细节图像
Figure SMS_90
与偏振角细节图像
Figure SMS_91
进行选通加权融合得到偏振图像
Figure SMS_92
:Will
Figure SMS_86
Each pixel is integrated to obtain the polarization image gating area
Figure SMS_87
;Will
Figure SMS_88
Each pixel point is integrated to obtain the polarization angle image gating area
Figure SMS_89
; For polarization detail image
Figure SMS_90
Detailed image with polarization angle
Figure SMS_91
Perform gated weighted fusion to obtain polarization images
Figure SMS_92
:

Figure SMS_93
Figure SMS_93
,

Figure SMS_94
Figure SMS_95
融合得到最终融合后的偏振图像
Figure SMS_96
,即:Will
Figure SMS_94
and
Figure SMS_95
The final fused polarization image is obtained by fusion
Figure SMS_96
,Right now:

Figure SMS_97
Figure SMS_97
.

进一步地,所述步骤3包括:Furthermore, the step 3 comprises:

将特征点的位置信息通过投影模型在像素平面得到位置坐标,该像素平面的位置坐标对相机的运动信息和特征点的位置信息提供测量,利用特征点三角化得到特征点的三维位置坐标,在针孔模型投影下,视觉整体测量为:The position information of the feature points is projected on the pixel plane to obtain the position coordinates. The position coordinates of the pixel plane provide measurement for the camera's motion information and the position information of the feature points. The three-dimensional position coordinates of the feature points are obtained by triangulating the feature points. Under the pinhole model projection, the overall visual measurement is:

Figure SMS_98
Figure SMS_98
,

其中,

Figure SMS_99
表示第
Figure SMS_100
时刻对第
Figure SMS_101
个特征点的测量,
Figure SMS_102
表示相机在第
Figure SMS_103
时刻对于第
Figure SMS_104
个特征点三角化得到特征点的三维位置坐标,
Figure SMS_105
表示特征点投影在像素平面的横纵坐标;in,
Figure SMS_99
Indicates
Figure SMS_100
Time to
Figure SMS_101
The measurement of feature points,
Figure SMS_102
Indicates that the camera is
Figure SMS_103
Time for the
Figure SMS_104
The feature points are triangulated to obtain the three-dimensional position coordinates of the feature points.
Figure SMS_105
Represents the horizontal and vertical coordinates of the feature point projection on the pixel plane;

根据惯导预测得到相机投影点位置信息视觉测量

Figure SMS_106
,得到重投影误差为:Visual measurement of the camera projection point position information obtained based on inertial navigation prediction
Figure SMS_106
, and the reprojection error is:

Figure SMS_107
Figure SMS_107
,

其中,

Figure SMS_110
表示对应的噪声,
Figure SMS_111
Figure SMS_114
时刻系统的状态,
Figure SMS_109
Figure SMS_113
时刻相机在导航系下观测的第
Figure SMS_115
个特征点的坐标,
Figure SMS_117
Figure SMS_108
分别为与
Figure SMS_112
Figure SMS_116
对应的雅克比矩阵;in,
Figure SMS_110
represents the corresponding noise,
Figure SMS_111
for
Figure SMS_114
The state of the system at the moment,
Figure SMS_109
for
Figure SMS_113
The camera observes the first
Figure SMS_115
The coordinates of the feature points,
Figure SMS_117
,
Figure SMS_108
Respectively
Figure SMS_112
,
Figure SMS_116
The corresponding Jacobian matrix;

在一段累积更新的时刻会观测到

Figure SMS_118
个特征点,将
Figure SMS_119
个特征点的观测叠加得到累积误差为:Over a period of cumulative updates, it is observed
Figure SMS_118
feature points,
Figure SMS_119
The cumulative error obtained by superimposing the observations of feature points is:

Figure SMS_120
Figure SMS_120
,

其中,

Figure SMS_121
表示对应的噪声,
Figure SMS_122
为系统的状态,
Figure SMS_123
为相机在导航系下观测到的特征点的坐标,
Figure SMS_124
Figure SMS_125
分别为与
Figure SMS_126
Figure SMS_127
对应的雅克比矩阵;in,
Figure SMS_121
represents the corresponding noise,
Figure SMS_122
is the state of the system,
Figure SMS_123
is the coordinate of the feature point observed by the camera in the navigation system,
Figure SMS_124
and
Figure SMS_125
Respectively
Figure SMS_126
and
Figure SMS_127
The corresponding Jacobian matrix;

采用QR分解 矩阵,最后得到的视觉量测方程为:Using QR decomposition matrix, the final visual measurement equation is:

Figure SMS_128
Figure SMS_128
,

其中,

Figure SMS_129
为上述视觉量测方程的观测矩阵并由
Figure SMS_130
得到,
Figure SMS_131
Figure SMS_132
为QR分解中间过程变量,对得到
Figure SMS_133
具体形式无影响,
Figure SMS_134
为相对应的噪声。in,
Figure SMS_129
is the observation matrix of the above visual measurement equation and is
Figure SMS_130
get,
Figure SMS_131
and
Figure SMS_132
is the intermediate process variable of QR decomposition, and we get
Figure SMS_133
The specific form has no effect.
Figure SMS_134
is the corresponding noise.

进一步地,所述步骤4包括:Furthermore, the step 4 comprises:

选择多状态约束卡尔曼滤波方法进行无人机组合导航系统的信息组合,其中系统状态预测协方差矩阵为:The multi-state constrained Kalman filtering method is selected for information combination of the UAV integrated navigation system, where the system state prediction covariance matrix is:

Figure SMS_135
Figure SMS_135
,

其中,

Figure SMS_136
为第
Figure SMS_137
时刻惯导状态预测协方差矩阵,
Figure SMS_138
为状态转移矩阵,
Figure SMS_139
为第
Figure SMS_140
时刻相机系与惯导系之间的旋转和平移量的协方差矩阵,
Figure SMS_141
为第
Figure SMS_142
时刻相机位姿的协方差矩阵;in,
Figure SMS_136
For the
Figure SMS_137
The covariance matrix of the inertial navigation state prediction at the moment,
Figure SMS_138
is the state transfer matrix,
Figure SMS_139
For the
Figure SMS_140
The covariance matrix of the rotation and translation between the camera system and the inertial navigation system at the moment,
Figure SMS_141
For the
Figure SMS_142
The covariance matrix of the camera pose at each moment;

随着时间的累积相机位姿的积累完成状态增强,状态增强后的协方差矩阵变为:As time accumulates, the camera poses accumulate and the state enhancement is completed. The covariance matrix after state enhancement becomes:

Figure SMS_143
Figure SMS_143
,

其中,

Figure SMS_144
为新增相机误差状态对原系统误差状态的雅克比矩阵,
Figure SMS_145
Figure SMS_146
维度的单位矩阵,
Figure SMS_147
Figure SMS_148
时刻观测某个特征点时相机的状态个数;in,
Figure SMS_144
is the Jacobian matrix of the newly added camera error state to the original system error state,
Figure SMS_145
for
Figure SMS_146
The identity matrix of dimension ,
Figure SMS_147
for
Figure SMS_148
The number of camera states when observing a certain feature point at any moment;

在无人机组合导航系统中进行导航信息更新时,卡尔曼增益

Figure SMS_149
,新息
Figure SMS_150
和协方差状态更新矩阵分别为:When updating navigation information in the UAV integrated navigation system, the Kalman gain
Figure SMS_149
, new information
Figure SMS_150
And the covariance state update matrix are:

Figure SMS_151
Figure SMS_151
,

Figure SMS_152
Figure SMS_153
Figure SMS_152
,
Figure SMS_153
,

其中,

Figure SMS_155
Figure SMS_159
为视觉量测方程的输出矩阵,
Figure SMS_160
为偏振量测方程的输出矩阵,
Figure SMS_156
Figure SMS_158
时刻的协方差矩阵,
Figure SMS_163
为量测噪声矩阵,
Figure SMS_164
Figure SMS_154
为视觉的量测,
Figure SMS_157
为偏振的量测,
Figure SMS_161
为与
Figure SMS_162
维度相同的单位矩阵。in,
Figure SMS_155
,
Figure SMS_159
is the output matrix of the visual measurement equation,
Figure SMS_160
is the output matrix of the polarization measurement equation,
Figure SMS_156
for
Figure SMS_158
The covariance matrix at time ,
Figure SMS_163
is the measurement noise matrix,
Figure SMS_164
,
Figure SMS_154
For visual measurement,
Figure SMS_157
is the measurement of polarization,
Figure SMS_161
For
Figure SMS_162
Identity matrices of the same dimensions.

本发明与现有技术相比优点在于:Compared with the prior art, the present invention has the following advantages:

偏振相机获得的偏振图像在弱光强环境下图像纹理相较于可见光图像有较大提升。使用融合后的偏振度图像和偏振角图像完成特征点提取进而完成视觉量测的获取,弥补VINS导航系统在弱光强场景下精度较低的缺陷,融合点源式偏振传感器提供的偏振量测,实现无人机VINS系统在弱光强长航时场景下的高精度导航。本发明具有精度高、稳定性好、抗干扰能力强等优点。The polarization image obtained by the polarization camera has a much better image texture than the visible light image in a low-light environment. The fused polarization degree image and polarization angle image are used to complete feature point extraction and then complete the acquisition of visual measurement, making up for the low accuracy of the VINS navigation system in low-light scenes, integrating the polarization measurement provided by the point source polarization sensor, and realizing high-precision navigation of the UAV VINS system in low-light and long-flight scenes. The present invention has the advantages of high accuracy, good stability, and strong anti-interference ability.

附图说明BRIEF DESCRIPTION OF THE DRAWINGS

图1为本发明的一种弱光强环境下仿复眼偏振视觉无人机组合导航方法流程图。FIG1 is a flow chart of a method for combined navigation of unmanned aerial vehicles using compound eye polarization vision in a weak light environment according to the present invention.

具体实施方式DETAILED DESCRIPTION

下面结合图1以及实例对本发明的具体实现步骤说明如下:The specific implementation steps of the present invention are described below in conjunction with FIG. 1 and an example:

如图1所示,本发明的一种弱光强环境下仿复眼偏振视觉无人机组合导航方法包括如下步骤:As shown in FIG1 , a method for combined navigation of a UAV using compound eye polarization vision in a weak light environment of the present invention comprises the following steps:

步骤1、通过点源式偏振传感器获得偏振方位角

Figure SMS_165
,计算求得导航系下的偏振矢量
Figure SMS_166
,通过太阳年历可以计算得到导航系下太阳矢量
Figure SMS_167
,基于理想瑞利散射模型中偏振矢量
Figure SMS_168
与导航系下太阳矢量
Figure SMS_169
的垂直关系,结合惯导状态与相机位姿为状态向量建立系统状态方程与偏振量测方程;Step 1: Obtain the polarization azimuth through a point source polarization sensor
Figure SMS_165
, calculate and obtain the polarization vector under the navigation system
Figure SMS_166
, the solar vector in the navigation system can be calculated through the solar calendar
Figure SMS_167
, based on the polarization vector in the ideal Rayleigh scattering model
Figure SMS_168
The sun vector with the navigation system
Figure SMS_169
The vertical relationship between the inertial navigation state and the camera posture is used as the state vector to establish the system state equation and polarization measurement equation;

步骤2、在弱光强场景下,偏振相机得到偏振度图像

Figure SMS_184
、偏振角图像
Figure SMS_172
,将二者均值滤波后得到偏振度图像
Figure SMS_188
与偏振角图像
Figure SMS_176
,相减分解得到偏振度细节图像
Figure SMS_181
与偏振角细节图像
Figure SMS_182
,将偏振度图像
Figure SMS_183
与偏振角图像
Figure SMS_175
加权融合后得到图像
Figure SMS_178
,将原偏振图像形态开运算得到选通区域
Figure SMS_170
与选通区域
Figure SMS_185
,利用选通区域
Figure SMS_174
与选通区域
Figure SMS_180
对偏振度细节图像
Figure SMS_177
与偏振角细节图像
Figure SMS_186
进行优化后得到图像
Figure SMS_171
,将加权融合后得到图像
Figure SMS_179
与图像
Figure SMS_173
融合后得到融合后的偏振图像
Figure SMS_187
;Step 2: In low light conditions, the polarization camera obtains a polarization image
Figure SMS_184
, polarization angle image
Figure SMS_172
, and then filter the two mean values to get the polarization image
Figure SMS_188
Image with polarization angle
Figure SMS_176
, subtract and decompose to obtain the polarization detail image
Figure SMS_181
Detailed image with polarization angle
Figure SMS_182
, the polarization image
Figure SMS_183
Image with polarization angle
Figure SMS_175
After weighted fusion, the image is obtained
Figure SMS_178
, the original polarization image is opened to obtain the gated area
Figure SMS_170
With gated area
Figure SMS_185
, using the gated region
Figure SMS_174
With gated area
Figure SMS_180
Polarization detail image
Figure SMS_177
Detailed image with polarization angle
Figure SMS_186
After optimization, the image is obtained
Figure SMS_171
, the image is obtained after weighted fusion
Figure SMS_179
With image
Figure SMS_173
After fusion, the fused polarization image is obtained
Figure SMS_187
;

步骤3、结合步骤2中的偏振图像

Figure SMS_189
,在针孔模型下,偏振相机得到视觉整体测量
Figure SMS_190
并提供相机的运动信息,惯导得到预测视觉测量
Figure SMS_191
,二者相减得到
Figure SMS_192
时刻第
Figure SMS_193
个特征点的重投影误差
Figure SMS_194
,通过累积叠加与QR分解得到视觉量测方程;Step 3: Combine the polarization image from step 2
Figure SMS_189
, under the pinhole model, the polarization camera obtains the visual overall measurement
Figure SMS_190
And provide the camera's motion information, inertial navigation to obtain predicted visual measurement
Figure SMS_191
, subtracting the two to get
Figure SMS_192
Moment
Figure SMS_193
The reprojection error of feature points
Figure SMS_194
, the visual measurement equation is obtained through cumulative superposition and QR decomposition;

步骤4、结合步骤1与步骤3建立的偏振量测方程与视觉量测方程,利用多状态约束卡尔曼滤波对各个导航子系统获得的数据进行信息融合,修正惯导逐渐累积的误差,解算无人机组合导航信息。Step 4: Combine the polarization measurement equation and visual measurement equation established in Step 1 and Step 3, use multi-state constrained Kalman filtering to fuse the data obtained by each navigation subsystem, correct the gradually accumulated errors of the inertial navigation, and solve the UAV combined navigation information.

具体实施步骤如下:The specific implementation steps are as follows:

在步骤1中选择导航系到载体系的小角度旋转

Figure SMS_197
,载体在导航系下的估计速度与真实速度差
Figure SMS_199
和估计位置与真实位置差
Figure SMS_203
,陀螺仪测量的角速度与真实角速度零偏差
Figure SMS_196
,加速度计测量的线速度与真实线速度零偏差
Figure SMS_200
以及每个时刻相机的位姿
Figure SMS_202
作为组合导航系统的状态变量
Figure SMS_204
,其中
Figure SMS_195
Figure SMS_198
分别表示相机在导航系下的姿态与位置,
Figure SMS_201
具体形式如下:Select a small rotation angle from the navigation system to the carrier system in step 1
Figure SMS_197
, the difference between the estimated speed and the actual speed of the carrier in the navigation system
Figure SMS_199
The difference between the estimated position and the true position
Figure SMS_203
, the angular velocity measured by the gyroscope has zero deviation from the true angular velocity
Figure SMS_196
, the linear velocity measured by the accelerometer has zero deviation from the true linear velocity
Figure SMS_200
And the camera position at each moment
Figure SMS_202
As the state variable of the integrated navigation system
Figure SMS_204
,in
Figure SMS_195
and
Figure SMS_198
Respectively represent the attitude and position of the camera in the navigation system,
Figure SMS_201
The specific form is as follows:

Figure SMS_205
,
Figure SMS_205
,

Figure SMS_206
,
Figure SMS_206
,

系统状态方程为:The system state equation is:

Figure SMS_207
,
Figure SMS_207
,

其中,

Figure SMS_208
为状态转移矩阵,
Figure SMS_209
为系统噪声驱动阵,
Figure SMS_210
为系统噪声。in,
Figure SMS_208
is the state transfer matrix,
Figure SMS_209
is the system noise driving array,
Figure SMS_210
is the system noise.

定义载体系下的偏振矢量为

Figure SMS_211
,太阳矢量为
Figure SMS_212
Figure SMS_213
为从载体系到导航系的坐标转换矩阵;点源式偏振传感器首先测得偏振方位角
Figure SMS_214
之后,得到在载体系下的偏振矢量
Figure SMS_215
:The polarization vector under the carrier system is defined as
Figure SMS_211
, the sun vector is
Figure SMS_212
,
Figure SMS_213
is the coordinate conversion matrix from the carrier system to the navigation system; the point source polarization sensor first measures the polarization azimuth
Figure SMS_214
After that, the polarization vector under the carrier system is obtained
Figure SMS_215
:

Figure SMS_216
Figure SMS_216
,

根据理想瑞利散射模型,导航系中的偏振矢量

Figure SMS_217
垂直于太阳矢量
Figure SMS_218
,得到:According to the ideal Rayleigh scattering model, the polarization vector in the navigation system
Figure SMS_217
Perpendicular to the sun vector
Figure SMS_218
,get:

Figure SMS_219
Figure SMS_219
,

基于此偏振测量方程为:Based on this polarization measurement equation is:

Figure SMS_220
Figure SMS_220
,

其中,

Figure SMS_221
Figure SMS_222
Figure SMS_223
为偏振量测噪声。in,
Figure SMS_221
,
Figure SMS_222
,
Figure SMS_223
is the polarization measurement noise.

在步骤2中偏振相机获得偏振度图像

Figure SMS_224
与偏振角图像
Figure SMS_225
,首先分别对二者进行均值滤波部分消除偏振图像的噪声,滤波后得到偏振度图像
Figure SMS_226
与偏振角图像
Figure SMS_227
,原偏振图像与均值滤波后的偏振图像相减分解获得偏振度细节图像与偏振角细节图像:In step 2, the polarization camera obtains a polarization image
Figure SMS_224
Image with polarization angle
Figure SMS_225
First, the two are respectively subjected to mean filtering to partially eliminate the noise of the polarization image, and the polarization degree image is obtained after filtering.
Figure SMS_226
Image with polarization angle
Figure SMS_227
, the original polarization image and the polarization image after mean filtering are subtracted and decomposed to obtain the polarization degree detail image and the polarization angle detail image:

Figure SMS_228
Figure SMS_228
,

Figure SMS_229
Figure SMS_229
,

其次将均值滤波后的偏振图像加权融合得到图像

Figure SMS_230
,即:Secondly, the polarization images after mean filtering are weighted fused to obtain the image
Figure SMS_230
,Right now:

Figure SMS_231
Figure SMS_231
,

其中,系数

Figure SMS_232
Figure SMS_233
应满足
Figure SMS_234
;Among them, the coefficient
Figure SMS_232
and
Figure SMS_233
Should meet
Figure SMS_234
;

再次将偏振度图像

Figure SMS_237
与偏振角图像
Figure SMS_239
进行形态开运算后得到偏振度图像
Figure SMS_242
与偏振角图像
Figure SMS_236
,计算
Figure SMS_240
Figure SMS_243
的灰度均值,将形态开运算后的偏振度图像
Figure SMS_246
中的各像素灰度值
Figure SMS_238
Figure SMS_241
比较,将形态开运算后的偏振角图像
Figure SMS_244
中的各像素灰度值
Figure SMS_245
Figure SMS_235
比较,具体规则如下:Again, the polarization image
Figure SMS_237
Image with polarization angle
Figure SMS_239
After performing morphological opening operation, the polarization image is obtained
Figure SMS_242
Image with polarization angle
Figure SMS_236
,calculate
Figure SMS_240
and
Figure SMS_243
The grayscale mean of the image is the polarization image after the morphological opening operation.
Figure SMS_246
The gray value of each pixel in
Figure SMS_238
and
Figure SMS_241
Compare the polarization angle image after the morphological opening operation
Figure SMS_244
The gray value of each pixel in
Figure SMS_245
and
Figure SMS_235
For comparison, the specific rules are as follows:

Figure SMS_247
Figure SMS_247
,

Figure SMS_248
各个像素点整合得到偏振度图像选通区域
Figure SMS_249
;将
Figure SMS_250
各个像素点整合得到偏振角图像选通区域
Figure SMS_251
。对偏振度细节图像
Figure SMS_252
与偏振角细节图像
Figure SMS_253
进行选通加权融合得到偏振图像
Figure SMS_254
:Will
Figure SMS_248
Each pixel is integrated to obtain the polarization image gating area
Figure SMS_249
;Will
Figure SMS_250
Each pixel point is integrated to obtain the polarization angle image gating area
Figure SMS_251
. For polarization detail images
Figure SMS_252
Detailed image with polarization angle
Figure SMS_253
Perform gated weighted fusion to obtain polarization images
Figure SMS_254
:

Figure SMS_255
Figure SMS_255
,

Figure SMS_256
Figure SMS_257
融合得到最终融合后的偏振图像
Figure SMS_258
,即:Will
Figure SMS_256
and
Figure SMS_257
The final fused polarization image is obtained by fusion
Figure SMS_258
,Right now:

Figure SMS_259
Figure SMS_259
,

在步骤3中,将特征点的位置信息通过投影模型在像素平面得到位置坐标,该像素平面位置坐标对相机的运动信息和特征点的位置信息提供测量,利用特征点三角化得到特征点的三维位置坐标,在针孔模型投影下,视觉整体测量为:In step 3, the position information of the feature points is projected on the pixel plane to obtain the position coordinates. The pixel plane position coordinates provide measurement for the camera's motion information and the position information of the feature points. The three-dimensional position coordinates of the feature points are obtained by triangulating the feature points. Under the pinhole model projection, the overall visual measurement is:

Figure SMS_260
Figure SMS_260
,

其中,

Figure SMS_261
表示第
Figure SMS_262
时刻对第
Figure SMS_263
个特征点的测量,
Figure SMS_264
表示相机在第
Figure SMS_265
时刻对于第
Figure SMS_266
个特征点三角化得到特征点的三维位置坐标,
Figure SMS_267
表示特征点投影在像素平面的横纵坐标;in,
Figure SMS_261
Indicates
Figure SMS_262
Time to
Figure SMS_263
The measurement of feature points,
Figure SMS_264
Indicates that the camera is
Figure SMS_265
Time for the
Figure SMS_266
The feature points are triangulated to obtain the three-dimensional position coordinates of the feature points.
Figure SMS_267
Represents the horizontal and vertical coordinates of the feature point projection on the pixel plane;

根据惯导预测得到相机投影点位置信息视觉测量

Figure SMS_269
,得到重投影误差为:
Figure SMS_272
,其中,
Figure SMS_277
表示对应的噪声,
Figure SMS_268
Figure SMS_274
时刻系统的状态,
Figure SMS_276
Figure SMS_278
时刻相机在导航系下观测的第
Figure SMS_271
个特征点的坐标,
Figure SMS_273
Figure SMS_275
分别为与
Figure SMS_279
Figure SMS_270
对应的雅克比矩阵。Visual measurement of the camera projection point position information obtained based on inertial navigation prediction
Figure SMS_269
, and the reprojection error is:
Figure SMS_272
,in,
Figure SMS_277
represents the corresponding noise,
Figure SMS_268
for
Figure SMS_274
The state of the system at the moment,
Figure SMS_276
for
Figure SMS_278
The camera observes the first
Figure SMS_271
The coordinates of the feature points,
Figure SMS_273
,
Figure SMS_275
Respectively
Figure SMS_279
,
Figure SMS_270
The corresponding Jacobian matrix.

在一段累积更新的时刻会观测到

Figure SMS_280
个特征点,将
Figure SMS_281
个特征点的观测叠加得到累积误差为:Over a period of cumulative updates, it is observed
Figure SMS_280
feature points,
Figure SMS_281
The cumulative error obtained by superimposing the observations of feature points is:

Figure SMS_282
Figure SMS_282
,

其中,

Figure SMS_283
表示对应的噪声,
Figure SMS_284
为系统的状态,
Figure SMS_285
为相机在导航系下观测到的特征点的坐标,
Figure SMS_286
Figure SMS_287
分别为与
Figure SMS_288
Figure SMS_289
对应的雅克比矩阵;in,
Figure SMS_283
represents the corresponding noise,
Figure SMS_284
is the state of the system,
Figure SMS_285
is the coordinate of the feature point observed by the camera in the navigation system,
Figure SMS_286
and
Figure SMS_287
Respectively
Figure SMS_288
and
Figure SMS_289
The corresponding Jacobian matrix;

在实际计算过程中,由于

Figure SMS_290
的维度太大,采用QR分解
Figure SMS_291
矩阵,最后得到的视觉量测方程为:In the actual calculation process, due to
Figure SMS_290
The dimension is too large, so QR decomposition is used
Figure SMS_291
Matrix, the final visual measurement equation is:

Figure SMS_292
Figure SMS_292
,

其中,

Figure SMS_293
为上述视觉量测方程的观测矩阵并由
Figure SMS_294
得到,
Figure SMS_295
Figure SMS_296
为QR分解中间过程变量,对得到
Figure SMS_297
具体形式无影响,
Figure SMS_298
为相对应的噪声。in,
Figure SMS_293
is the observation matrix of the above visual measurement equation and is
Figure SMS_294
get,
Figure SMS_295
and
Figure SMS_296
is the intermediate process variable of QR decomposition, and we get
Figure SMS_297
The specific form has no effect.
Figure SMS_298
is the corresponding noise.

在步骤4中选择多状态约束卡尔曼滤波方法进行无人机组合导航系统的信息组合,其中系统状态预测协方差矩阵为:In step 4, the multi-state constrained Kalman filter method is selected to combine the information of the UAV integrated navigation system, where the system state prediction covariance matrix is:

Figure SMS_299
Figure SMS_299
,

其中,

Figure SMS_300
为第
Figure SMS_301
时刻惯导状态预测协方差矩阵,
Figure SMS_302
为状态转移矩阵,
Figure SMS_303
为第
Figure SMS_304
时刻相机系与惯导系之间的旋转和平移量的协方差矩阵,
Figure SMS_305
为第
Figure SMS_306
时刻相机位姿的协方差矩阵;in,
Figure SMS_300
For the
Figure SMS_301
The covariance matrix of the inertial navigation state prediction at the moment,
Figure SMS_302
is the state transfer matrix,
Figure SMS_303
For the
Figure SMS_304
The covariance matrix of the rotation and translation between the camera system and the inertial navigation system at the moment,
Figure SMS_305
For the
Figure SMS_306
The covariance matrix of the camera pose at each moment;

随着时间的累积相机位姿的积累需要完成状态增强,状态增强后的协方差矩阵变为:As time goes by, the camera pose accumulation needs to complete state enhancement, and the covariance matrix after state enhancement becomes:

Figure SMS_307
Figure SMS_307
,

其中,

Figure SMS_308
为新增相机误差状态对原系统误差状态的雅克比矩阵,
Figure SMS_309
Figure SMS_310
维度的单位矩阵,
Figure SMS_311
Figure SMS_312
时刻观测某个特征点时相机的状态个数。in,
Figure SMS_308
is the Jacobian matrix of the newly added camera error state to the original system error state,
Figure SMS_309
for
Figure SMS_310
The identity matrix of dimension ,
Figure SMS_311
for
Figure SMS_312
The number of camera states when observing a feature point at any moment.

在无人机组合导航系统中进行导航信息更新时,卡尔曼增益

Figure SMS_313
,新息
Figure SMS_314
和协方差状态更新矩阵分别为:When updating navigation information in the UAV integrated navigation system, the Kalman gain
Figure SMS_313
, new information
Figure SMS_314
And the covariance state update matrix are:

Figure SMS_315
Figure SMS_315
,

Figure SMS_316
Figure SMS_317
Figure SMS_316
,
Figure SMS_317
,

其中,

Figure SMS_320
Figure SMS_322
为视觉量测方程的输出矩阵,
Figure SMS_325
为偏振量测方程的输出矩阵,
Figure SMS_319
Figure SMS_324
时刻的协方差矩阵,
Figure SMS_327
为量测噪声矩阵,
Figure SMS_328
Figure SMS_318
为视觉的量测,
Figure SMS_321
为偏振的量测,
Figure SMS_323
为与
Figure SMS_326
维度相同的单位矩阵。in,
Figure SMS_320
,
Figure SMS_322
is the output matrix of the visual measurement equation,
Figure SMS_325
is the output matrix of the polarization measurement equation,
Figure SMS_319
for
Figure SMS_324
The covariance matrix at time ,
Figure SMS_327
is the measurement noise matrix,
Figure SMS_328
,
Figure SMS_318
For visual measurement,
Figure SMS_321
is the measurement of polarization,
Figure SMS_323
For
Figure SMS_326
Identity matrices of the same dimensions.

本发明说明书中未作详细描述的内容属于本领域专业技术人员公知的现有技术。以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以所述权利要求的保护范围为准。The contents not described in detail in the specification of the present invention belong to the prior art known to the professional and technical personnel in the field. The above description is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any technician familiar with the technical field can easily think of changes or substitutions within the technical scope disclosed by the present invention, which should be included in the protection scope of the present invention. Therefore, the protection scope of the present invention shall be based on the protection scope of the claims.

Claims (5)

1.一种弱光强环境下仿复眼偏振视觉无人机组合导航方法,其特征在于,包括如下步骤:1. A method for combined navigation of unmanned aerial vehicles using compound eye polarization vision in a weak light environment, characterized in that it comprises the following steps: 步骤1、通过点源式偏振传感器获得偏振方位角
Figure QLYQS_1
,计算求得导航系下的偏振矢量
Figure QLYQS_2
,通过太阳年历计算得到导航系下太阳矢量
Figure QLYQS_3
,基于理想瑞利散射模型中偏振矢量
Figure QLYQS_4
与导航系下太阳矢量
Figure QLYQS_5
的垂直关系,结合惯导状态与相机位姿为状态向量建立系统状态方程与偏振量测方程;
Step 1: Obtain the polarization azimuth through a point source polarization sensor
Figure QLYQS_1
, calculate and obtain the polarization vector under the navigation system
Figure QLYQS_2
, the solar vector in the navigation system is obtained by calculating the solar calendar
Figure QLYQS_3
, based on the polarization vector in the ideal Rayleigh scattering model
Figure QLYQS_4
The sun vector with the navigation system
Figure QLYQS_5
The vertical relationship between the inertial navigation state and the camera posture is used as the state vector to establish the system state equation and polarization measurement equation;
步骤2、在弱光强场景下,偏振相机得到偏振度图像
Figure QLYQS_12
、偏振角图像
Figure QLYQS_8
,将二者均值滤波后得到偏振度图像
Figure QLYQS_17
与偏振角图像
Figure QLYQS_10
,相减分解得到偏振度细节图像
Figure QLYQS_15
与偏振角细节图像
Figure QLYQS_11
,将偏振度图像
Figure QLYQS_20
与偏振角图像
Figure QLYQS_14
加权融合后得到图像
Figure QLYQS_22
,将原偏振图像形态开运算得到选通区域
Figure QLYQS_6
与选通区域
Figure QLYQS_19
,利用选通区域
Figure QLYQS_13
与选通区域
Figure QLYQS_23
对偏振度细节图像
Figure QLYQS_21
与偏振角细节图像
Figure QLYQS_24
进行优化后得到图像
Figure QLYQS_7
,将加权融合后得到图像
Figure QLYQS_18
与图像
Figure QLYQS_9
融合后得到融合后的偏振图像
Figure QLYQS_16
Step 2: In low light conditions, the polarization camera obtains a polarization image
Figure QLYQS_12
, polarization angle image
Figure QLYQS_8
, and then filter the two mean values to get the polarization image
Figure QLYQS_17
Image with polarization angle
Figure QLYQS_10
, subtract and decompose to obtain the polarization detail image
Figure QLYQS_15
Detailed image with polarization angle
Figure QLYQS_11
, the polarization image
Figure QLYQS_20
Image with polarization angle
Figure QLYQS_14
After weighted fusion, the image is obtained
Figure QLYQS_22
, the original polarization image is opened to obtain the gated area
Figure QLYQS_6
With gated area
Figure QLYQS_19
, using the gated region
Figure QLYQS_13
With gated area
Figure QLYQS_23
Polarization detail image
Figure QLYQS_21
Detailed image with polarization angle
Figure QLYQS_24
After optimization, the image is obtained
Figure QLYQS_7
, the image is obtained after weighted fusion
Figure QLYQS_18
With image
Figure QLYQS_9
After fusion, the fused polarization image is obtained
Figure QLYQS_16
;
步骤3、结合步骤2中的偏振图像
Figure QLYQS_25
,在针孔模型下,偏振相机得到视觉整体测量
Figure QLYQS_26
并提供相机的运动信息,惯导得到预测视觉测量
Figure QLYQS_27
,二者相减得到
Figure QLYQS_28
时刻第
Figure QLYQS_29
个特征点的重投影误差
Figure QLYQS_30
,通过累积叠加与QR分解得到视觉量测方程;
Step 3: Combine the polarization image from step 2
Figure QLYQS_25
, under the pinhole model, the polarization camera obtains the visual overall measurement
Figure QLYQS_26
And provide the camera's motion information, inertial navigation to obtain predicted visual measurement
Figure QLYQS_27
, subtracting the two to get
Figure QLYQS_28
Moment
Figure QLYQS_29
The reprojection error of feature points
Figure QLYQS_30
, the visual measurement equation is obtained through cumulative superposition and QR decomposition;
步骤4、结合步骤1与步骤3建立的偏振量测方程与视觉量测方程,利用多状态约束卡尔曼滤波对各个导航子系统获得的数据进行信息融合,修正惯导逐渐累积的误差,解算无人机组合导航信息。Step 4: Combine the polarization measurement equation and visual measurement equation established in Step 1 and Step 3, use multi-state constrained Kalman filtering to fuse the data obtained by each navigation subsystem, correct the gradually accumulated errors of the inertial navigation, and solve the UAV combined navigation information.
2.根据权利要求1所述的一种弱光强环境下仿复眼偏振视觉无人机组合导航方法,其特征在于:所述步骤1包括:2. According to the method for combined navigation of unmanned aerial vehicles with simulated compound eye polarization vision in a weak light environment in claim 1, it is characterized in that: the step 1 comprises: 选择导航系到载体系的小角度旋转
Figure QLYQS_33
,载体在导航系下的估计速度与真实速度差
Figure QLYQS_34
和估计位置与真实位置差
Figure QLYQS_37
,陀螺仪测量的角速度与真实角速度零偏差
Figure QLYQS_32
,加速度计测量的线速度与真实线速度零偏差
Figure QLYQS_36
以及每个时刻相机的位姿
Figure QLYQS_39
作为组合导航系统的状态变量
Figure QLYQS_40
,其中
Figure QLYQS_31
Figure QLYQS_35
分别表示相机在导航系下的姿态与位置,
Figure QLYQS_38
具体形式如下:
Select a small angle rotation from the navigation system to the carrier system
Figure QLYQS_33
, the difference between the estimated speed and the actual speed of the carrier in the navigation system
Figure QLYQS_34
The difference between the estimated position and the true position
Figure QLYQS_37
, the angular velocity measured by the gyroscope has zero deviation from the true angular velocity
Figure QLYQS_32
, the linear velocity measured by the accelerometer has zero deviation from the true linear velocity
Figure QLYQS_36
And the camera position at each moment
Figure QLYQS_39
As the state variable of the integrated navigation system
Figure QLYQS_40
,in
Figure QLYQS_31
and
Figure QLYQS_35
Respectively represent the attitude and position of the camera in the navigation system,
Figure QLYQS_38
The specific form is as follows:
Figure QLYQS_41
,
Figure QLYQS_41
,
Figure QLYQS_42
,
Figure QLYQS_42
,
系统状态方程为:The system state equation is:
Figure QLYQS_43
,
Figure QLYQS_43
,
其中,
Figure QLYQS_44
为状态转移矩阵,
Figure QLYQS_45
为系统噪声驱动阵,
Figure QLYQS_46
为系统噪声;
in,
Figure QLYQS_44
is the state transfer matrix,
Figure QLYQS_45
is the system noise driving array,
Figure QLYQS_46
is the system noise;
定义载体系下的偏振矢量为
Figure QLYQS_47
,太阳矢量为
Figure QLYQS_48
Figure QLYQS_49
为从载体系到导航系的坐标转换矩阵;点源式偏振传感器首先测得偏振方位角
Figure QLYQS_50
之后,得到在载体系下的偏振矢量
Figure QLYQS_51
The polarization vector under the carrier system is defined as
Figure QLYQS_47
, the sun vector is
Figure QLYQS_48
,
Figure QLYQS_49
is the coordinate conversion matrix from the carrier system to the navigation system; the point source polarization sensor first measures the polarization azimuth
Figure QLYQS_50
After that, the polarization vector under the carrier system is obtained
Figure QLYQS_51
:
Figure QLYQS_52
,
Figure QLYQS_52
,
根据理想瑞利散射模型,导航系中的偏振矢量
Figure QLYQS_53
垂直于太阳矢量
Figure QLYQS_54
,得到:
According to the ideal Rayleigh scattering model, the polarization vector in the navigation system
Figure QLYQS_53
Perpendicular to the sun vector
Figure QLYQS_54
,get:
Figure QLYQS_55
,
Figure QLYQS_55
,
基于此偏振测量方程为:Based on this polarization measurement equation is:
Figure QLYQS_56
Figure QLYQS_56
,
其中,
Figure QLYQS_57
Figure QLYQS_58
Figure QLYQS_59
为偏振量测噪声。
in,
Figure QLYQS_57
,
Figure QLYQS_58
,
Figure QLYQS_59
is the polarization measurement noise.
3.根据权利要求2所述的一种弱光强环境下仿复眼偏振视觉无人机组合导航方法,其特征在于:所述步骤2包括:3. The method for combined navigation of unmanned aerial vehicles using compound eye polarization vision in a weak light environment according to claim 2, wherein step 2 comprises: 偏振相机获得偏振度图像
Figure QLYQS_60
与偏振角图像
Figure QLYQS_61
,首先分别对二者进行均值滤波部分消除偏振图像的噪声,滤波后得到偏振度图像
Figure QLYQS_62
与偏振角图像
Figure QLYQS_63
,原偏振图像与均值滤波后的偏振图像相减分解获得偏振度细节图像与偏振角细节图像:
Polarization camera obtains polarization image
Figure QLYQS_60
Image with polarization angle
Figure QLYQS_61
First, the two are respectively subjected to mean filtering to partially eliminate the noise of the polarization image, and the polarization degree image is obtained after filtering.
Figure QLYQS_62
Image with polarization angle
Figure QLYQS_63
, the original polarization image and the polarization image after mean filtering are subtracted and decomposed to obtain the polarization degree detail image and the polarization angle detail image:
Figure QLYQS_64
Figure QLYQS_64
,
Figure QLYQS_65
Figure QLYQS_65
,
其次将均值滤波后的偏振图像加权融合得到图像
Figure QLYQS_66
,即:
Secondly, the polarization images after mean filtering are weighted fused to obtain the image
Figure QLYQS_66
,Right now:
Figure QLYQS_67
Figure QLYQS_67
,
其中,系数
Figure QLYQS_68
Figure QLYQS_69
应满足
Figure QLYQS_70
Among them, the coefficient
Figure QLYQS_68
and
Figure QLYQS_69
Should meet
Figure QLYQS_70
;
再次将偏振度图像
Figure QLYQS_74
与偏振角图像
Figure QLYQS_76
进行形态开运算后得到偏振度图像
Figure QLYQS_79
与偏振角图像
Figure QLYQS_71
,计算
Figure QLYQS_77
Figure QLYQS_81
的灰度均值,分别为
Figure QLYQS_83
Figure QLYQS_73
,将形态开运算后的偏振度图像
Figure QLYQS_75
中的各像素灰度值
Figure QLYQS_80
Figure QLYQS_85
比较,将形态开运算后的偏振角图像
Figure QLYQS_72
中的各像素灰度值
Figure QLYQS_78
Figure QLYQS_82
比较,具体规则如下:
Figure QLYQS_84
Again, the polarization image
Figure QLYQS_74
Image with polarization angle
Figure QLYQS_76
After performing morphological opening operation, the polarization image is obtained
Figure QLYQS_79
Image with polarization angle
Figure QLYQS_71
,calculate
Figure QLYQS_77
and
Figure QLYQS_81
The grayscale mean values are
Figure QLYQS_83
and
Figure QLYQS_73
, the polarization image after morphological opening operation
Figure QLYQS_75
The gray value of each pixel in
Figure QLYQS_80
and
Figure QLYQS_85
Compare the polarization angle image after the morphological opening operation
Figure QLYQS_72
The gray value of each pixel in
Figure QLYQS_78
and
Figure QLYQS_82
For comparison, the specific rules are as follows:
Figure QLYQS_84
,
Figure QLYQS_86
各个像素点整合得到偏振度图像选通区域
Figure QLYQS_87
;将
Figure QLYQS_88
各个像素点整合得到偏振角图像选通区域
Figure QLYQS_89
;对偏振度细节图像
Figure QLYQS_90
与偏振角细节图像
Figure QLYQS_91
进行选通加权融合得到偏振图像
Figure QLYQS_92
Will
Figure QLYQS_86
Each pixel is integrated to obtain the polarization image gating area
Figure QLYQS_87
;Will
Figure QLYQS_88
Each pixel point is integrated to obtain the polarization angle image gating area
Figure QLYQS_89
; For polarization detail image
Figure QLYQS_90
Detailed image with polarization angle
Figure QLYQS_91
Perform gated weighted fusion to obtain polarization images
Figure QLYQS_92
:
Figure QLYQS_93
Figure QLYQS_93
,
Figure QLYQS_94
Figure QLYQS_95
融合得到最终融合后的偏振图像
Figure QLYQS_96
,即:
Will
Figure QLYQS_94
and
Figure QLYQS_95
The final fused polarization image is obtained by fusion
Figure QLYQS_96
,Right now:
Figure QLYQS_97
Figure QLYQS_97
.
4.根据权利要求3所述的一种弱光强环境下仿复眼偏振视觉无人机组合导航方法,其特征在于:所述步骤3包括:4. The method for combined navigation of unmanned aerial vehicles using compound eye polarization vision in a weak light environment according to claim 3, wherein step 3 comprises: 将特征点的位置信息通过投影模型在像素平面得到位置坐标,该像素平面的位置坐标对相机的运动信息和特征点的位置信息提供测量,利用特征点三角化得到特征点的三维位置坐标,在针孔模型投影下,视觉整体测量为:The position information of the feature points is projected on the pixel plane to obtain the position coordinates. The position coordinates of the pixel plane provide measurement for the camera's motion information and the position information of the feature points. The three-dimensional position coordinates of the feature points are obtained by triangulating the feature points. Under the pinhole model projection, the overall visual measurement is:
Figure QLYQS_98
Figure QLYQS_98
,
其中,
Figure QLYQS_99
表示第
Figure QLYQS_100
时刻对第
Figure QLYQS_101
个特征点的测量,
Figure QLYQS_102
表示相机在第
Figure QLYQS_103
时刻对于第
Figure QLYQS_104
个特征点三角化得到特征点的三维位置坐标,
Figure QLYQS_105
表示特征点投影在像素平面的横纵坐标;
in,
Figure QLYQS_99
Indicates
Figure QLYQS_100
Time to
Figure QLYQS_101
The measurement of feature points,
Figure QLYQS_102
Indicates that the camera is
Figure QLYQS_103
Time for the
Figure QLYQS_104
The feature points are triangulated to obtain the three-dimensional position coordinates of the feature points.
Figure QLYQS_105
Represents the horizontal and vertical coordinates of the feature point projection on the pixel plane;
根据惯导预测得到相机投影点位置信息视觉测量
Figure QLYQS_106
,得到重投影误差为:
Figure QLYQS_107
Visual measurement of the camera projection point position information obtained based on inertial navigation prediction
Figure QLYQS_106
, and the reprojection error is:
Figure QLYQS_107
,
其中,
Figure QLYQS_110
表示对应的噪声,
Figure QLYQS_112
Figure QLYQS_115
时刻系统的状态,
Figure QLYQS_109
Figure QLYQS_113
时刻相机在导航系下观测的第
Figure QLYQS_116
个特征点的坐标,
Figure QLYQS_117
Figure QLYQS_108
分别为与
Figure QLYQS_111
Figure QLYQS_114
对应的雅克比矩阵;
in,
Figure QLYQS_110
represents the corresponding noise,
Figure QLYQS_112
for
Figure QLYQS_115
The state of the system at the moment,
Figure QLYQS_109
for
Figure QLYQS_113
The camera observes the first
Figure QLYQS_116
The coordinates of the feature points,
Figure QLYQS_117
,
Figure QLYQS_108
Respectively
Figure QLYQS_111
,
Figure QLYQS_114
The corresponding Jacobian matrix;
在一段累积更新的时刻会观测到
Figure QLYQS_118
个特征点,将
Figure QLYQS_119
个特征点的观测叠加得到累积误差为:
Over a period of cumulative updates, it is observed
Figure QLYQS_118
feature points,
Figure QLYQS_119
The cumulative error obtained by superimposing the observations of feature points is:
Figure QLYQS_120
Figure QLYQS_120
,
其中,
Figure QLYQS_121
表示对应的噪声,
Figure QLYQS_122
为系统的状态,
Figure QLYQS_123
为相机在导航系下观测到的特征点的坐标,
Figure QLYQS_124
Figure QLYQS_125
分别为与
Figure QLYQS_126
Figure QLYQS_127
对应的雅克比矩阵;
in,
Figure QLYQS_121
represents the corresponding noise,
Figure QLYQS_122
is the state of the system,
Figure QLYQS_123
is the coordinate of the feature point observed by the camera in the navigation system,
Figure QLYQS_124
and
Figure QLYQS_125
Respectively
Figure QLYQS_126
and
Figure QLYQS_127
The corresponding Jacobian matrix;
采用QR分解 矩阵,最后得到的视觉量测方程为:Using QR decomposition matrix, the final visual measurement equation is:
Figure QLYQS_128
Figure QLYQS_128
,
其中,
Figure QLYQS_129
为上述视觉量测方程的观测矩阵并由
Figure QLYQS_130
得到,
Figure QLYQS_131
Figure QLYQS_132
为QR分解中间过程变量,对得到
Figure QLYQS_133
具体形式无影响,
Figure QLYQS_134
为相对应的噪声。
in,
Figure QLYQS_129
is the observation matrix of the above visual measurement equation and is
Figure QLYQS_130
get,
Figure QLYQS_131
and
Figure QLYQS_132
is the intermediate process variable of QR decomposition, and we get
Figure QLYQS_133
The specific form has no effect.
Figure QLYQS_134
is the corresponding noise.
5.根据权利要求4所述的一种弱光强环境下仿复眼偏振视觉无人机组合导航方法,其特征在于:所述步骤4包括:5. The method for combined navigation of unmanned aerial vehicles using compound eye polarization vision in a weak light environment according to claim 4, wherein step 4 comprises: 选择多状态约束卡尔曼滤波方法进行无人机组合导航系统的信息组合,其中系统状态预测协方差矩阵为:The multi-state constrained Kalman filtering method is selected for information combination of the UAV integrated navigation system, where the system state prediction covariance matrix is:
Figure QLYQS_135
Figure QLYQS_135
,
其中,
Figure QLYQS_136
为第
Figure QLYQS_137
时刻惯导状态预测协方差矩阵,
Figure QLYQS_138
为状态转移矩阵,
Figure QLYQS_139
为第
Figure QLYQS_140
时刻相机系与惯导系之间的旋转和平移量的协方差矩阵,
Figure QLYQS_141
为第
Figure QLYQS_142
时刻相机位姿的协方差矩阵;
in,
Figure QLYQS_136
For the
Figure QLYQS_137
The covariance matrix of the inertial navigation state prediction at the moment,
Figure QLYQS_138
is the state transfer matrix,
Figure QLYQS_139
For the
Figure QLYQS_140
The covariance matrix of the rotation and translation between the camera system and the inertial navigation system at the moment,
Figure QLYQS_141
For the
Figure QLYQS_142
The covariance matrix of the camera pose at each moment;
随着时间的累积相机位姿的积累完成状态增强,状态增强后的协方差矩阵变为:As time accumulates, the camera poses accumulate and the state enhancement is completed. The covariance matrix after state enhancement becomes:
Figure QLYQS_143
Figure QLYQS_143
,
其中,
Figure QLYQS_144
为新增相机误差状态对原系统误差状态的雅克比矩阵,
Figure QLYQS_145
Figure QLYQS_146
维度的单位矩阵,
Figure QLYQS_147
Figure QLYQS_148
时刻观测某个特征点时相机的状态个数;
in,
Figure QLYQS_144
is the Jacobian matrix of the newly added camera error state to the original system error state,
Figure QLYQS_145
for
Figure QLYQS_146
The identity matrix of dimension ,
Figure QLYQS_147
for
Figure QLYQS_148
The number of camera states when observing a certain feature point at any moment;
在无人机组合导航系统中进行导航信息更新时,卡尔曼增益
Figure QLYQS_149
,新息
Figure QLYQS_150
和协方差状态更新矩阵分别为:
When updating navigation information in the UAV integrated navigation system, the Kalman gain
Figure QLYQS_149
, new information
Figure QLYQS_150
And the covariance state update matrix are:
Figure QLYQS_151
Figure QLYQS_151
,
Figure QLYQS_152
Figure QLYQS_152
,
Figure QLYQS_153
Figure QLYQS_153
,
其中,
Figure QLYQS_155
Figure QLYQS_157
为视觉量测方程的输出矩阵,
Figure QLYQS_158
为偏振量测方程的输出矩阵,
Figure QLYQS_156
Figure QLYQS_159
时刻的协方差矩阵,
Figure QLYQS_163
为量测噪声矩阵,
Figure QLYQS_164
Figure QLYQS_154
为视觉的量测,
Figure QLYQS_160
为偏振的量测,
Figure QLYQS_161
为与
Figure QLYQS_162
维度相同的单位矩阵。
in,
Figure QLYQS_155
,
Figure QLYQS_157
is the output matrix of the visual measurement equation,
Figure QLYQS_158
is the output matrix of the polarization measurement equation,
Figure QLYQS_156
for
Figure QLYQS_159
The covariance matrix at time ,
Figure QLYQS_163
is the measurement noise matrix,
Figure QLYQS_164
,
Figure QLYQS_154
For visual measurement,
Figure QLYQS_160
is the measurement of polarization,
Figure QLYQS_161
For
Figure QLYQS_162
Identity matrices of the same dimensions.
CN202310474059.3A 2023-04-28 2023-04-28 Combined navigation method of compound eye-simulated polarized vision unmanned aerial vehicle under weak light and strong environment Active CN116182855B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310474059.3A CN116182855B (en) 2023-04-28 2023-04-28 Combined navigation method of compound eye-simulated polarized vision unmanned aerial vehicle under weak light and strong environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310474059.3A CN116182855B (en) 2023-04-28 2023-04-28 Combined navigation method of compound eye-simulated polarized vision unmanned aerial vehicle under weak light and strong environment

Publications (2)

Publication Number Publication Date
CN116182855A true CN116182855A (en) 2023-05-30
CN116182855B CN116182855B (en) 2023-07-07

Family

ID=86452727

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310474059.3A Active CN116182855B (en) 2023-04-28 2023-04-28 Combined navigation method of compound eye-simulated polarized vision unmanned aerial vehicle under weak light and strong environment

Country Status (1)

Country Link
CN (1) CN116182855B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117191047A (en) * 2023-11-03 2023-12-08 南京信息工程大学 Unmanned aerial vehicle self-adaptive active visual navigation method and device in low-light environment

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140022539A1 (en) * 2012-07-23 2014-01-23 Trimble Navigation Limited Use of a sky polarization sensor for absolute orientation determination in position determining systems
CN106767752A (en) * 2016-11-25 2017-05-31 北京航空航天大学 A kind of Combinated navigation method based on polarization information
CN108387206A (en) * 2018-01-23 2018-08-10 北京航空航天大学 A kind of carrier three-dimensional attitude acquisition method based on horizon and polarised light
CN111504312A (en) * 2020-07-02 2020-08-07 中国人民解放军国防科技大学 Unmanned aerial vehicle pose estimation method based on visual inertial polarized light fusion
CN112066979A (en) * 2020-08-27 2020-12-11 北京航空航天大学 Polarization pose information coupling iteration autonomous navigation positioning method
CN112444245A (en) * 2020-11-17 2021-03-05 大连理工大学 Insect-imitated vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor
CN113739795A (en) * 2021-06-03 2021-12-03 东北电力大学 Underwater synchronous positioning and mapping method based on polarized light/inertia/vision combined navigation
CN113819907A (en) * 2021-11-22 2021-12-21 北京航空航天大学 An Inertial/Polarization Navigation Method Based on Polarization and Sun Dual Vector Switching
CN113819904A (en) * 2021-11-22 2021-12-21 北京航空航天大学 polarization/VIO three-dimensional attitude determination method based on zenith vector
CN113834483A (en) * 2021-11-23 2021-12-24 北京航空航天大学 An Inertial/Polarization/Geomagnetic Fault Tolerant Navigation Method Based on Observability

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140022539A1 (en) * 2012-07-23 2014-01-23 Trimble Navigation Limited Use of a sky polarization sensor for absolute orientation determination in position determining systems
CN106767752A (en) * 2016-11-25 2017-05-31 北京航空航天大学 A kind of Combinated navigation method based on polarization information
CN108387206A (en) * 2018-01-23 2018-08-10 北京航空航天大学 A kind of carrier three-dimensional attitude acquisition method based on horizon and polarised light
CN111504312A (en) * 2020-07-02 2020-08-07 中国人民解放军国防科技大学 Unmanned aerial vehicle pose estimation method based on visual inertial polarized light fusion
CN112066979A (en) * 2020-08-27 2020-12-11 北京航空航天大学 Polarization pose information coupling iteration autonomous navigation positioning method
CN112444245A (en) * 2020-11-17 2021-03-05 大连理工大学 Insect-imitated vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor
CN113739795A (en) * 2021-06-03 2021-12-03 东北电力大学 Underwater synchronous positioning and mapping method based on polarized light/inertia/vision combined navigation
CN113819907A (en) * 2021-11-22 2021-12-21 北京航空航天大学 An Inertial/Polarization Navigation Method Based on Polarization and Sun Dual Vector Switching
CN113819904A (en) * 2021-11-22 2021-12-21 北京航空航天大学 polarization/VIO three-dimensional attitude determination method based on zenith vector
CN113834483A (en) * 2021-11-23 2021-12-24 北京航空航天大学 An Inertial/Polarization/Geomagnetic Fault Tolerant Navigation Method Based on Observability

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
SHANPENG WANG等: "A Bioinspired Navigation System for Multirotor UAV by Integrating Polarization Compass/Magnetometer/INS/GNSS", 《IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS》, vol. 70, no. 8, pages 8526 - 8536, XP011936621, DOI: 10.1109/TIE.2022.3212421 *
刘杰: "基于偏振成像的双目立体视觉应用技术研究", 《中国博士学位论文全文数据库》, no. 5, pages 005 - 19 *
张霄等: "强干扰环境下的自主导航与控制新技术", 《自动化博览》, no. 4, pages 68 - 72 *
胡瀚珮: "偏振光、IMU和单目视觉的组合导航研究", 《中国优秀硕士学位论文全文数据库》, no. 2, pages 136 - 1957 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117191047A (en) * 2023-11-03 2023-12-08 南京信息工程大学 Unmanned aerial vehicle self-adaptive active visual navigation method and device in low-light environment
CN117191047B (en) * 2023-11-03 2024-02-23 南京信息工程大学 Unmanned aerial vehicle self-adaptive active visual navigation method and device in low-light environment

Also Published As

Publication number Publication date
CN116182855B (en) 2023-07-07

Similar Documents

Publication Publication Date Title
CN109307508B (en) Panoramic inertial navigation SLAM method based on multiple key frames
CN109540126B (en) An Inertial Vision Integrated Navigation Method Based on Optical Flow Method
CN112556719B (en) Visual inertial odometer implementation method based on CNN-EKF
CN110068335B (en) A method and system for real-time positioning of UAV swarms in GPS-denied environment
CN105865454B (en) A kind of Navigation of Pilotless Aircraft method generated based on real-time online map
CN112304307A (en) Positioning method and device based on multi-sensor fusion and storage medium
CN111121767A (en) A GPS-integrated robot vision-inertial navigation combined positioning method
CN112683281B (en) A joint localization method for autonomous vehicles based on vehicle kinematics
CN115574816B (en) Bionic vision multi-source information intelligent perception unmanned platform
CN106989744A (en) A kind of rotor wing unmanned aerial vehicle autonomic positioning method for merging onboard multi-sensor
CN111288989B (en) Visual positioning method for small unmanned aerial vehicle
CN113551665B (en) A highly dynamic motion state perception system and perception method for motion carriers
CN110887486B (en) Unmanned aerial vehicle visual navigation positioning method based on laser line assistance
CN113807435B (en) Multi-sensor-based remote sensing image feature point elevation acquisition method
CN115406447B (en) Autonomous positioning method of quad-rotor unmanned aerial vehicle based on visual inertia in rejection environment
CN115479602A (en) A Visual-Inertial Odometry Method Fused with Event and Distance
CN116989772B (en) An air-ground multi-modal multi-agent collaborative positioning and mapping method
CN115560760A (en) Unmanned aerial vehicle-oriented vision/laser ranging high-altitude navigation method
CN113240597B (en) Three-dimensional software image stabilizing method based on visual inertial information fusion
CN114964276A (en) Dynamic vision SLAM method fusing inertial navigation
CN109785428B (en) A Handheld 3D Reconstruction Method Based on Polymorphic Constrained Kalman Filter
CN118274817A (en) Monocular panoramic vision and inertia combined initialization system for on-line space-time calibration
CN113345032A (en) Wide-angle camera large-distortion image based initial image construction method and system
CN117073720A (en) Method and equipment for quick visual inertia calibration and initialization under weak environment and weak action control
CN113503872A (en) Low-speed unmanned vehicle positioning method based on integration of camera and consumption-level IMU

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant