
CN113739795A - Underwater synchronous positioning and mapping method based on polarized light/inertia/vision combined navigation - Google Patents


Info

Publication number
CN113739795A
CN113739795A (application CN202110620366.9A)
Authority
CN
China
Prior art keywords: underwater, navigation, measurement, vector, polarized light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110620366.9A
Other languages
Chinese (zh)
Other versions
CN113739795B (en)
Inventor
夏琳琳
刘瑞敏
蒙跃
张晶晶
Current Assignee
Northeast Electric Power University
Original Assignee
Northeast Dianli University
Priority date
Filing date
Publication date
Application filed by Northeast Dianli University filed Critical Northeast Dianli University
Priority to CN202110620366.9A
Publication of CN113739795A
Application granted
Publication of CN113739795B
Legal status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C21/32 Structuring or formatting of map data
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30 Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an underwater simultaneous localization and mapping method based on polarized-light/inertial/visual integrated navigation, belonging to the field of underwater navigation. The method first exploits the dependence of the underwater polarization pattern on the sun's position, measuring the underwater polarization angle at different locations, depths and times while a polarized-light sensor accurately measures the velocity, attitude, position and heading-angle information of the underwater vehicle. Second, using the polarization information of the scene, methods for identifying topological nodes and expressing their features are studied for better scene recognition and target detection. Finally, considering the practicality of underwater filters, an adaptive unscented Kalman filter (AUKF) data-fusion strategy is proposed, in which an improved maximum-a-posteriori unbiased noise-statistics estimator estimates unknown system-noise parameters online; applied to simultaneous localization and mapping (SLAM) of the underwater vehicle, this realizes AUKF-SLAM. The invention can significantly improve the positioning and orientation accuracy of the integrated navigation.

Description

Underwater synchronous positioning and mapping method based on polarized light/inertia/vision combined navigation
Technical Field
The invention relates to an underwater simultaneous localization and mapping method based on polarized-light/inertial/visual integrated navigation. It belongs to the field of underwater navigation and addresses the precision, reliability, concealment and sea-area applicability requirements of underwater navigation in different scenarios.
Background
As a new navigation technology, bionic polarized-light navigation offers good navigation performance, strong concealment and low susceptibility to interference, and has attracted many scholars in recent years. Existing research on polarized skylight, whether from sunlight or moonlight, has focused mainly on land; research on sea-surface and underwater navigation is very limited, even though bionic polarized-light navigation also has potential in water. Because the underwater polarization pattern is stable, and the polarization distribution of underwater scattered light contains position information about the sun's direction, many underwater creatures navigate by sensing polarization information in the marine scattered-light polarization pattern. The invention provides an underwater polarized-light navigation technique that assists inertial navigation and realizes autonomous, passive underwater navigation.
Applying simultaneous localization and mapping (SLAM) technology to underwater navigation has important research significance: it allows the underwater vehicle to operate without prior information about the underwater environment, extracting the required information from its on-board sensors and constructing a map from an environment model and environmental features. Current navigation solutions focus on representing the motion environment with images as features and semantic information as the driver, but neglect measurement of the motion space; the present method therefore constructs a navigation topology of the underwater motion environment and improves the positioning and orientation accuracy of navigation.
In conclusion, the method takes inertial navigation as the primary system and bionic polarized-light navigation as the auxiliary system, improves the recognition and expression of node features when constructing the map, and designs a polarization-enhanced visual imaging model using a polarizing film and a camera. Polarization analysis of the scene aids scene recognition and target identification and detection. The AUKF-SLAM data-fusion strategy enhances the stability and adaptive capacity of the filtering algorithm and corrects navigation errors, which is vital for future underwater environment detection and resource exploration.
Disclosure of Invention
For navigation and positioning of an underwater vehicle, and to overcome the shortcomings of a stand-alone inertial navigation system, the invention provides an underwater simultaneous localization and mapping method based on polarized-light/inertial/visual integrated navigation, constructing an underwater integrated navigation system in this combined mode. The polarized-light compass supplies the orientation result; using the image information of the polarized visual scene, with topological-node features as the identification and expression method, self-localization of underwater navigation is realized while an environment map is constructed. An AUKF data-fusion model is proposed and applied to simultaneous localization and mapping of the underwater vehicle, yielding more accurate navigation orientation and positioning precision.
The technical scheme is as follows: the invention provides an underwater synchronous positioning and mapping method based on polarized light/inertia/vision combined navigation, which comprises the following five steps:
Step 1: selecting the attitude, velocity and position of the underwater vehicle as the system state, and establishing a kinematic model of the underwater vehicle;
Step 2: calculating the sun vector from the polarization state by a least-squares method, through the orientation algorithm of the polarization sensor, and determining the position of the sun;
Step 3: constructing a visual-scene environment feature map from image polarization information through a topological-node identification algorithm;
Step 4: according to the velocity output of the inertial navigation system, establishing a velocity inertial-navigation measurement model and a polarized-light nonlinear measurement model;
Step 5: integrating the above steps and establishing the integrated navigation system model through data fusion to construct the environment map.
Step 1: in the SLAM algorithm, constructing a model of the carrier (vehicle) is essential. The attitude, velocity and position of the underwater vehicle are selected as the system state, and a kinematic model of the underwater vehicle is established;
The nonlinear equation of the motion model is as follows:
[equation image]
In the above formula, the four state symbols are respectively the position abscissa, the position ordinate, the zenith coordinate and the heading coordinate of the underwater vehicle, u is the control input vector, and w is a zero-mean Gaussian process-noise vector;
Step 2: calculating the sun vector from the polarization state by a least-squares method, through the orientation algorithm of the polarization sensor;
[equation image]
In the above formula, the first symbol is the heading angle of the probe direction on the vehicle, the second is the element-angle difference, and the third is the estimate of the sun position;
The sun position is thus determined from the estimated value:
[equation image]
In the above formula, the left-hand symbol is the sun position, t is the measurement time, and arcdist(·) is the arc-length distance;
Step 3: information is provided for autonomous navigation with topological nodes as the carrier, the positioning calculation is performed, scenes and detection targets are identified from polarization information, and the visual-scene environment feature map is constructed;
According to the position information of the identified images, target samples (the image sample of the region corresponding to the center of each node) are determined and assigned labels. In the navigation topological graph, n nodes contain m image samples, and the image target corresponding to each node is denoted:
[equation image]
In the above formula, I_i is the feature vector of the i-th image sample and y_i ∈ [1, 2, …, n] is the target label of its node;
Then, target samples in adjacent topological nodes are trained with the large-margin nearest-neighbor algorithm from machine learning to obtain a new metric matrix M, so that target samples with the same label become more compact and target samples with different labels looser;
The distance metric between any two target samples is defined as:
d_M(I_i, I_{i+j}) = (I_i - I_{i+j})^T M (I_i - I_{i+j})
In the above formula, d_M is the Euclidean distance between two nodes in the reconstructed topological-node space, d_M(I_i, I_{i+j}) is linear in the metric matrix, and I_i and I_{i+j} are a target sample and its neighbor;
And finally, the optimal solution of the following objective function is solved using the metric matrix M:
[equation image]
Let y_il ∈ {0, 1} indicate whether the feature vectors of two samples belong to the same node (1 if so, 0 otherwise); λ denotes the weight parameter and ε_ijl the total number of non-target samples. The optimal solution can thus be obtained, yielding the topological graph's expression and measurement of the motion environment;
Step 4: establishing a combined model of velocity inertial-navigation measurement and nonlinear polarized-light measurement;
The state model of the integrated navigation system is:
[equation image]
The measurement equation of the integrated navigation system is:
[equation image]
Step 5: navigation data fusion of the autonomous underwater vehicle is carried out with the AUKF algorithm, using the heading information provided by polarized-light navigation, the velocity information provided by inertial navigation and the position information provided by visual navigation; SLAM is then performed in the following steps:
1) reading speed, position and course observation data of the underwater vehicle respectively provided by an inertial navigation sensor, a visual odometer and a polarized light sensor;
2) select the weighting coefficients
β_i = d_k b^(i-1), i = 1, …, k,
[equation image]
3) calculate the adaptive factor coefficient
The adaptive factor coefficient λ_i is expressed as:
[equation image]
In the above formula, λ_0 is the initial value, S is an adjustable coefficient, tr(·) denotes the matrix trace, and the remaining symbol is the residual sequence;
4) attenuation-memory time-varying noise-statistics estimator
The recursive formulas of the attenuation-memory time-varying noise-statistics estimator are:
[equation images]
In the above formulas, the first symbol is the system-noise covariance and the second the observation-noise covariance, K is the Kalman gain, W_i are the weights of the mean and covariance, and the last symbol satisfies the UKF sequence;
5) after the noise-reduction processing of the filter, the convergence condition of the filter is:
[equation image]
6) perform data association on the acquired attitude, position, velocity and heading using the nearest-neighbor data-association algorithm; if nodes (target samples) in the navigation topological graph are adjacent, they are merged, i.e. association succeeds, and the underwater vehicle performs one time update and one measurement update; otherwise, the new measurement is added to the current state vector as a new feature, i.e. state augmentation is performed, and the merging (data association) of adjacent target samples then continues, as follows:
[equation image]
In the above formula, the symbol denotes the new feature vector.
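The associate-or-augment decision in step 6) can be sketched as a nearest-neighbor association with state augmentation. This is a minimal illustration assuming a squared-Euclidean gate; the gate value and function names are hypothetical, and the patent's own augmentation formula is given only as an equation image.

```python
import numpy as np

def associate(features, z, gate):
    """Nearest-neighbor data association with state augmentation.

    features: list of stored node feature vectors; z: new measurement.
    If the nearest stored feature lies within `gate` (squared Euclidean
    distance), the measurement is merged with that node (association
    succeeds); otherwise z is appended as a new feature (state
    augmentation).
    """
    z = np.asarray(z, dtype=float)
    if features:
        d2 = [float(np.sum((np.asarray(f) - z) ** 2)) for f in features]
        j = int(np.argmin(d2))
        if d2[j] <= gate:
            return "associated", j, list(features)
    return "augmented", len(features), list(features) + [z]

nodes = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
print(associate(nodes, [0.1, 0.1], gate=1.0)[0])  # associated
```

A measurement far from every stored node (e.g. [10, 10] with the same gate) is instead appended, enlarging the state by one feature.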
Time update:
[equation images]
Measurement update:
[equation images]
In the above formulas, the first three symbols are the system's previous state vector, covariance vector and state measurement vector at time k-1; x_{i,k|k-1} and z_{i,k|k-1} are the values obtained by Sigma-point sampling; the next two symbols are the weights of the system mean and covariance, respectively; Q_{k-1} is the system-noise covariance; the two observation covariances are obtained in the system measurement update; R is the observation-noise covariance; K_k is the filter gain matrix; and the final two symbols are the system's current state vector and covariance vector.
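The time- and measurement-update pattern above follows the standard unscented transform. Below is a compact sketch of one UKF time update plus measurement update with conventional sigma-point weights; it is a textbook UKF, not the patent's exact expressions (which appear only as equation images), and the function names and parameters are illustrative.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points x_{i,k|k-1} and the weights W_i
    of the mean and covariance used by the unscented transform."""
    n = len(x)
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)   # matrix square root
    pts = np.vstack([x, x + S.T, x - S.T])
    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + (1.0 - alpha ** 2 + beta)
    return pts, Wm, Wc

def ukf_step(x, P, z, f, h, Q, R):
    """One UKF time update followed by one measurement update."""
    # Time update: propagate sigma points through the motion model f.
    pts, Wm, Wc = sigma_points(x, P)
    Xp = np.array([f(p) for p in pts])
    x_pred = Wm @ Xp
    P_pred = Q + sum(w * np.outer(d, d) for w, d in zip(Wc, Xp - x_pred))
    # Measurement update: propagate through the measurement model h.
    pts, Wm, Wc = sigma_points(x_pred, P_pred)
    Zp = np.array([h(p) for p in pts])
    z_pred = Wm @ Zp
    Pzz = R + sum(w * np.outer(d, d) for w, d in zip(Wc, Zp - z_pred))
    Pxz = sum(w * np.outer(dx, dz)
              for w, dx, dz in zip(Wc, pts - x_pred, Zp - z_pred))
    K = Pxz @ np.linalg.inv(Pzz)            # filter gain matrix K_k
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ Pzz @ K.T
    return x_new, P_new

# A 1-D random-walk state observed directly: the estimate moves toward
# the measurement and the covariance shrinks below its predicted value.
x1, P1 = ukf_step(np.array([0.0]), np.array([[1.0]]), np.array([1.0]),
                  lambda s: s, lambda s: s,
                  np.array([[0.01]]), np.array([[0.1]]))
```

For linear f and h this reduces exactly to the Kalman update, which makes the sketch easy to sanity-check.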
And finally, completing underwater synchronous positioning and mapping.
Compared with the prior art, the invention has the advantages that:
(1) The invention exploits the bionic polarized-light navigation principle and requires no geophysical-field database; by studying the orientation algorithm of the underwater polarization sensor, autonomous navigation can be completed in complex, unstructured underwater environments. This navigation mode offers high concealment and strong robustness.
(2) Compared with land-based SLAM scenarios, the underwater environment is dim and complex, and vision-based SLAM systems face many limitations. The invention therefore fully exploits polarization-vision image information to provide position data for identifying topological nodes of the visual scene, markedly improving the positioning and orientation accuracy of the integrated navigation and effectively suppressing accumulated positioning error during autonomous navigation.
(3) At present, navigation data fusion mostly uses a Kalman filter, whose main drawback is that the estimation accuracy of navigation-error parameters depends heavily on the accuracy of the established model. The typical UKF algorithm is a nonlinear filter that avoids linearizing the state or measurement equations, but its implementation has limitations and is not suitable for underwater SLAM. The proposed AUKF-SLAM algorithm effectively suppresses filter divergence and improves the filter's fast-tracking capability under unknown and time-varying noise, such as in deep-sea areas, thereby improving the performance of the integrated navigation system.
Drawings
FIG. 1 is a schematic diagram of an underwater polarized light/inertial/visual integrated navigation principle;
FIG. 2 is an underwater polarization state scattering model;
FIG. 3 is a schematic diagram of topology node feature identification;
FIG. 4 is a basic schematic diagram of the AUKF-SLAM algorithm.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
Step 1: in the SLAM algorithm, constructing a model of the carrier (vehicle) is essential. The attitude, velocity and position of the underwater vehicle are selected as the system state, and a kinematic model of the underwater vehicle is established;
the motion rule of the underwater vehicle is very complex, the motion needs to be simplified, and only the plane motion of two degrees of freedom of a horizontal plane and a vertical plane is considered. The augmented state of the motion model is defined as follows:
Figure BDA0003099618080000061
of the above formula, XvA state vector matrix, { x, { for the current attitude in the spacecraftiI-0.. k-1} is a node of the state vector in the attitude map, where X isvIs defined as:
Xv=[x y z y]T
in the above formula, x, y and z are position coordinates, and y is the heading angle of the underwater vehicle in the horizontal reference coordinate system.
Since the corresponding error-covariance matrix is extended accordingly, the augmented error-covariance matrix can be expressed as:
[equation image]
In the above formula, P_{v,v} is the covariance of the current attitude, and P_{v,j} and P_{i,j} respectively denote the covariance of the current state with the j-th previous state and of the i-th with the j-th previous state;
the system state equation is:
[equation image]
In the above formula, A is the transition matrix, x_v is the estimate of the system state at time v, B is the input weighting matrix, u is the control input vector, and w is a zero-mean Gaussian process-noise vector with covariance matrix Q_v;
Establishing a relational expression of the covariance of the current attitude error and the current attitude state vector, i.e.
[equation image]
According to the definition of the system state vector, establishing a motion model nonlinear equation:
[equation image]
In the above formula, the four state symbols are respectively the position abscissa, the position ordinate, the zenith coordinate and the heading coordinate of the underwater vehicle, u is the control input vector, and w is a zero-mean Gaussian process-noise vector;
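As a concrete illustration of such a kinematic model, the sketch below propagates a planar [x, y, z, psi] state under a simple unicycle-style motion assumption. The control-input layout [v, vz, psi_dot] and the function name are assumptions for illustration only, since the patent's own nonlinear equation is given as an image.

```python
import numpy as np

def propagate_state(state, u, dt, rng=None, w_std=0.0):
    """Discrete-time kinematic update for a planar underwater vehicle.

    state = [x, y, z, psi]: position coordinates and heading angle.
    u = [v, vz, psi_dot]: forward speed, vertical speed, heading rate
    (the control layout is an assumption, not the patent's equation).
    w_std: std. dev. of the zero-mean Gaussian process noise w.
    """
    x, y, z, psi = state
    v, vz, psi_dot = u
    new = np.array([
        x + v * np.cos(psi) * dt,   # position abscissa
        y + v * np.sin(psi) * dt,   # position ordinate
        z + vz * dt,                # zenith (depth) coordinate
        psi + psi_dot * dt,         # heading coordinate
    ])
    if w_std > 0.0:
        rng = rng or np.random.default_rng(0)
        new += rng.normal(0.0, w_std, size=4)
    return new

# One noiseless step at heading 0 moves the vehicle one metre along +x.
s1 = propagate_state([0.0, 0.0, 0.0, 0.0], [1.0, 0.0, 0.0], dt=1.0)
```

With heading psi = π/2 the same input instead advances the vehicle along +y, which is a quick consistency check on the model.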
step 2: the underwater mode in the E-vector angle of the underwater polarization mode is less sensitive to disturbances (including surface waves) and therefore can rely on the position of the sun and from this determine the heading and pitch angles of the sun to accurately estimate the position of the observer. Therefore, the sun vector is calculated by adopting a least square method of the polarization state through the orientation algorithm of the polarization sensor;
In the Mueller-Stokes description of polarized light, a beam is represented by its wave vector k (the wave number and propagation direction of the polarized light) and its Stokes vector S (the intensity and polarization state of the polarized light). Sunlight (k_i, x_i, S_i) is refracted into the water; the air-water interface is defined by the surface normal n and the actual refractive indices η_i and η_t of air and water. Assuming the incident sunlight is unpolarized, S_i = (1, 0, 0, 0)^T, so by Snell's law of refraction the transmitted light (k_t, x_t, S_t) is expressed as follows:
[equation image]
x_t = x_i = n × k_i / ||n × k_i||
S_t = M_R S_i
[equation image]
In the above formulas, S_i and S_t are the Stokes vectors at the incidence and emergence angles, respectively, where S_i = (1 0 0 0)^T. Using these formulas, the coordinate-transformation matrix M_{R→S} yields S_t; in the underwater transmitted-sunlight model the scattering matrix is denoted M_S, and the wave vector k_s of the scattered light has the same magnitude as k_t and points toward the detector, specifically:
[equation image]
S_s = M_S M_{R→S} S_t
The Stokes vector at the detector is then:
S_d = M_{S→D} M_S M_{R→S} M_R S_i
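The refraction step can be illustrated numerically. Below is a standard vector form of Snell's law together with the x_t = n × k_i / ||n × k_i|| relation from the text; this is a generic optics sketch assuming η_i = 1.0 for air and η_t = 1.33 for water, since the patent's own k_t expression survives only as an equation image.

```python
import numpy as np

def refract(k_i, n, eta_i=1.0, eta_t=1.33):
    """Vector form of Snell's law at the air-water interface.

    k_i: unit propagation direction of the incident ray (pointing
    toward the surface); n: unit surface normal pointing back into the
    incident medium.  Returns the unit transmitted direction k_t.
    """
    k_i = np.asarray(k_i, float)
    n = np.asarray(n, float)
    r = eta_i / eta_t
    cos_i = -np.dot(k_i, n)
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    cos_t = np.sqrt(1.0 - sin2_t)
    return r * k_i + (r * cos_i - cos_t) * n

def plane_of_incidence_normal(k_i, n):
    """x_t = n x k_i / ||n x k_i|| from the text."""
    c = np.cross(n, k_i)
    return c / np.linalg.norm(c)

# 45-degree incidence from air into water.
k_i = np.array([np.sin(np.pi / 4), 0.0, -np.cos(np.pi / 4)])
k_t = refract(k_i, np.array([0.0, 0.0, 1.0]))
```

The transmitted direction stays unit-length and satisfies η_i sin θ_i = η_t sin θ_t, which is the property the polarization model relies on.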
The polarization angle is extracted from the Stokes vector, and the initial estimate of the sun's position in the sky is calculated as:
[equation image]
The L1 norm here serves to eliminate outliers caused by measurement noise. In the underwater scattering model, denoting the heading angle of the probe direction on the vehicle and the element-angle difference as in step 2, the polarization angle τ satisfies:
[equation image]
To minimize the model error, the sun position is estimated by the least-squares method:
[equation image]
the sun position is thus finally determined from the estimated values:
[equation image]
In the above formula, t is the measurement time and arcdist(·) is the arc-length distance;
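The outlier-rejecting role of the L1 cost in the sun-position estimate can be demonstrated with a toy robust fit over candidate azimuths. This is a minimal stand-in for the patent's least-squares step (whose exact cost function is given only as an equation image): it minimizes the sum of arc-length residuals over a grid, and the grid step is an assumption.

```python
import numpy as np

def circ_dist(a, b):
    """Arc-length distance between two angles on the circle (radians)."""
    d = np.abs(a - b) % (2 * np.pi)
    return np.minimum(d, 2 * np.pi - d)

def estimate_sun_azimuth(measured, grid_step=np.pi / 1800):
    """Robust L1 estimate of the sun azimuth from repeated
    polarization-derived azimuth measurements.

    Minimizes the summed arc-length residual over a 0.1-degree grid of
    candidate azimuths; the L1 cost suppresses outliers caused by
    measurement noise, as the text notes.
    """
    measured = np.asarray(measured, float)
    cands = np.arange(0.0, 2 * np.pi, grid_step)
    costs = [np.sum(circ_dist(c, measured)) for c in cands]
    return cands[int(np.argmin(costs))]

# Four consistent readings near 30 degrees plus one gross outlier.
az = estimate_sun_azimuth(np.deg2rad([29.0, 30.0, 31.0, 30.5, 170.0]))
```

With a squared (L2) cost, the 170-degree outlier would drag the estimate far off; the L1 minimizer stays at the circular median of the cluster.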
Step 3: after the solar altitude and azimuth angles are determined from the polarization mode of the scattering model, this step uses topological nodes as the carrier to provide information for autonomous navigation, performs the positioning calculation, and analyzes the construction of the visual-scene environment feature map by identifying scenes and detection targets from polarization information;
A schematic diagram of topological-node feature identification is shown in FIG. 3, comparing target samples before and after training. When the navigation topological map of the underwater environment is constructed, identification of the topological nodes is critical; in a complex underwater environment, the environmental features of some adjacent nodes may show high similarity, so if a frame in the current node is too similar to a frame in the previous node, the two adjacent nodes are merged, which saves map memory;
The specific method is as follows: according to the position information of the identified images, target samples (the image sample of the region corresponding to the center of each node) are determined and assigned labels. In the navigation topological graph, n nodes contain m image samples, and the image target corresponding to each node is denoted:
[equation image]
In the above formula, I_i is the feature vector of the i-th image sample and y_i ∈ [1, 2, …, n] is the target label of its node.
Target samples in adjacent topological nodes are then trained with the large-margin nearest-neighbor algorithm from machine learning to obtain a new metric matrix M, so that target samples with the same label become more compact and target samples with different labels looser. The distance metric between any two target samples is defined as:
d_M(I_i, I_{i+j}) = (I_i - I_{i+j})^T M (I_i - I_{i+j})
In the above formula, d_M is the Euclidean distance between two nodes in the reconstructed topological-node space, d_M(I_i, I_{i+j}) is linear in the metric matrix, and I_i and I_{i+j} are a target sample and its neighbor;
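The learned metric d_M can be computed directly once M is available. A minimal sketch follows; the feature vectors and the example M are placeholders (LMNN training itself, which produces M, is not shown).

```python
import numpy as np

def d_M(Ii, Ij, M):
    """Learned distance d_M(I_i, I_j) = (I_i - I_j)^T M (I_i - I_j).

    M is the metric matrix produced by large-margin nearest-neighbor
    training; any symmetric positive semi-definite M yields a valid
    squared distance.
    """
    d = np.asarray(Ii, float) - np.asarray(Ij, float)
    return float(d @ M @ d)

# With M = I the metric reduces to the squared Euclidean distance.
print(d_M([1.0, 0.0], [0.0, 2.0], np.eye(2)))  # 5.0

# A learned M can stretch directions, e.g. weighting dimension 0 more.
M = np.diag([4.0, 1.0])
```

Stretching one axis this way is exactly how the trained M pulls same-label samples together and pushes different-label samples apart in the reweighted space.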
and finally, solving the optimal solution of the following objective function by using the measurement matrix M:
[equation image]
Let y_il ∈ {0, 1} indicate whether the feature vectors of two samples belong to the same node (1 if so, 0 otherwise); λ represents the weight parameter and ε_ijl the total number of non-target samples. The optimal solution can thus be obtained, yielding the topological graph's expression and measurement of the motion environment;
Step 4: establishing a combined model of velocity inertial-navigation measurement and nonlinear polarized-light measurement;
In the integrated navigation, the attitude-angle errors, velocity errors, position errors and gyro random drifts of the inertial navigation system are taken as the state variables of the system, i.e.:
[equation image]
In the above formula, the first symbols are the inertial-navigation attitude-angle errors; δv_E, δv_N, δv_U are the velocity errors; δL, δλ, δh are the position errors; ε_bx, ε_by, ε_bz are the constant errors of the gyro random drift; ε_rx, ε_ry, ε_rz are first-order Markov processes of the gyro random drift; and the last symbol denotes the random errors of the accelerometer;
establishing state equations of each subsystem of inertial navigation and polarized light, wherein the state equations respectively comprise:
[equation image]
[equation image]
In the above formulas, F_IMU(t) and F_polar(t) represent the state-coefficient matrices, G_IMU(t) and G_polar(t) the system-noise driving matrices, and W_IMU(t) and W_polar(t) the system-noise matrices;
in summary, the state model of the integrated navigation system is:
[equation image]
the measurement model of the system is established as follows:
Y_IMU(t) = H_IMU(t) + U_IMU(t)
Y_polar(t) = H_polar(t) + U_polar(t)
In the above formulas, H_IMU(t) and H_polar(t) represent the nonlinear functions of the measurement equations, and U_IMU(t) and U_polar(t) represent the measurement noise of the inertial sensor and the polarized-light sensor, respectively, where:
[equation image]
[equation image]
in summary, the measurement equation of the integrated navigation system is:
[equation image]
Step 5: integrating the above steps, establishing the integrated navigation system model through data fusion, and constructing the environment map.
The invention proposes the AUKF algorithm for SLAM research (AUKF-SLAM): the above steps are integrated, and navigation data fusion of the autonomous underwater vehicle is performed using the heading information provided by polarized-light navigation, the velocity information provided by inertial navigation and the position information provided by visual navigation. The algorithm solves the accuracy degradation and divergence of the conventional UKF under unknown and time-varying noise conditions by introducing adaptive fading factors to correct the prediction-error covariance and adjust the filter gain matrix K, thereby suppressing filter divergence and improving fast-tracking capability. The basic principle of the AUKF-SLAM algorithm is shown in FIG. 4; the specific steps are as follows:
When time-varying noise statistics are estimated, old data should be gradually forgotten, but the standard UKF algorithm does not track changes in the noise statistics; the algorithm is therefore improved by designing a time-varying noise-statistics estimator with an attenuation-memory weighted-exponent method for the case of unknown, time-varying noise-statistics parameters. The weighting coefficients are selected as:
β_i = β_{i−1}·b, 0 < b < 1, with the normalization Σ_{i=1}^{k} β_i = 1,

which can be written as:

β_i = d_k·b^{i−1}, i = 1, …, k, where d_k = (1 − b)/(1 − b^k).
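The weighting scheme above can be checked numerically with a short sketch:

```python
import numpy as np

def fading_weights(k, b=0.95):
    """beta_i = d_k * b**(i-1), i = 1..k, with d_k = (1-b)/(1-b**k)."""
    d_k = (1.0 - b) / (1.0 - b ** k)
    return d_k * b ** np.arange(k)

w = fading_weights(50)
# the weights sum to one and decay geometrically, realizing the fading memory
```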
Thus, the recursion of the fading-memory time-varying noise statistics estimator {q̂_k, Q̂_k, r̂_k, R̂_k} is expressed as:

q̂_k = (1 − d_k)·q̂_{k−1} + d_k·(x̂_k − Σ_i W_i·x_{i,k|k−1})
Q̂_k = (1 − d_k)·Q̂_{k−1} + d_k·[K_k·v_k·v_k^T·K_k^T + P_k − Σ_i W_i·(x_{i,k|k−1} − x̂_{k|k−1})(x_{i,k|k−1} − x̂_{k|k−1})^T]
r̂_k = (1 − d_k)·r̂_{k−1} + d_k·(z_k − Σ_i W_i·z_{i,k|k−1})
R̂_k = (1 − d_k)·R̂_{k−1} + d_k·[v_k·v_k^T − Σ_i W_i·(z_{i,k|k−1} − ẑ_{k|k−1})(z_{i,k|k−1} − ẑ_{k|k−1})^T]

In the above formulas, Q̂_k is the system noise covariance, R̂_k is the observation noise covariance, K_k is the Kalman gain, W_i are the weights of the mean and covariance, and x_{i,k|k−1}, z_{i,k|k−1} satisfy the UKF Sigma-point sequence;
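As an illustration, here is a minimal sketch of the fading-memory recursion for the observation-noise covariance R̂_k alone (scalar toy example; the q̂, Q̂, and r̂ branches and the safeguards that keep the estimate positive definite are omitted):

```python
import numpy as np

def update_R(R_prev, v_k, S_zz, d_k):
    """R_k = (1 - d_k) R_{k-1} + d_k (v v^T - S_zz), with v_k the residual
    and S_zz the sigma-point measurement covariance term."""
    v = np.atleast_2d(np.asarray(v_k, dtype=float)).reshape(-1, 1)
    return (1.0 - d_k) * R_prev + d_k * (v @ v.T - S_zz)

b, R = 0.95, np.array([[1.0]])
for k in range(1, 6):
    d_k = (1.0 - b) / (1.0 - b ** k)
    R = update_R(R, [0.5], np.zeros((1, 1)), d_k)
# with a constant residual of 0.5 the estimate settles at 0.25
```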
The AUKF algorithm calculates the adaptive factor coefficient λ_i by introducing a fading factor formula and then uses this coefficient to correct the predicted covariance P_{k|k−1}, thereby suppressing divergence of the filter and ensuring its convergence. The convergence condition is:

v_k^T·v_k ≤ S·tr(P_{zz,k|k−1})
The adaptive factor coefficient λ_i is calculated from the fading factor formula:

λ_i = max{1, tr(C_k)/(S·tr(P_{zz,k|k−1}))}, with C_1 = v_1·v_1^T and C_k = (λ_0·C_{k−1} + v_k·v_k^T)/(1 + λ_0) for k > 1

In the above formula, λ_0 is the initial value, tr(·) denotes the matrix trace, S is an adjustable coefficient, and v_k = z_k − ẑ_{k|k−1} is the residual sequence;
In the algorithm, the larger the value of the factor λ_i, the smaller the proportion of old information retained, and the more pronounced the effect of the residual vector v_k. The algorithm therefore not only suppresses divergence and improves the noise-reduction capability of the filter, but also provides strong tracking of sudden state changes;
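The fading-factor correction can be sketched in Python. The patent's exact formula is available only as an image, so the strong-tracking form below (trace of a fading residual covariance over the trace of the predicted innovation covariance, floored at 1) is an assumption chosen to match the named quantities λ_0, S, tr(·), and v_k:

```python
import numpy as np

def adaptive_factor(C_prev, v_k, P_zz, lam0=0.95, S=1.0):
    """lambda_i = max(1, tr(C_k) / (S * tr(P_zz))), with C_k a fading
    average of the residual outer products v v^T (C_prev=None starts it)."""
    v = np.atleast_2d(np.asarray(v_k, dtype=float)).reshape(-1, 1)
    C_k = v @ v.T if C_prev is None else (lam0 * C_prev + v @ v.T) / (1.0 + lam0)
    lam = max(1.0, np.trace(C_k) / (S * np.trace(P_zz)))
    return lam, C_k

# inflating the predicted covariance when residuals grow suppresses divergence
P_pred = np.eye(2)
lam, C = adaptive_factor(None, [3.0, 0.0], P_zz=np.eye(2))
P_pred = lam * P_pred  # corrected P_{k|k-1}
```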
After noise reduction, the nearest-neighbor data association algorithm is used to associate the acquired attitude, position, velocity, heading, and other data. If the nodes (target samples) in the navigation topology map are adjacent, they are merged, i.e., the association succeeds, and the underwater vehicle performs one time update and one measurement update; otherwise, the new measurement is added to the current state vector as a new feature, i.e., state augmentation is performed, and the merging of adjacent target samples (data association) then continues. The operation is as follows:
X̂_k ← [X̂_k^T, m_new^T]^T

In the above formula, m_new is the new feature vector.
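The state augmentation step can be sketched as follows. In a full SLAM implementation the cross-covariance between the new feature and the existing state would be filled in via the measurement Jacobian; here it is initialised to zero for brevity:

```python
import numpy as np

def augment_state(x, P, m_new, P_new):
    """Append the new feature vector m_new to the state and grow the
    covariance block-diagonally."""
    x_aug = np.concatenate([x, m_new])
    n, m = len(x), len(m_new)
    P_aug = np.zeros((n + m, n + m))
    P_aug[:n, :n] = P
    P_aug[n:, n:] = P_new
    return x_aug, P_aug

x, P = np.zeros(3), np.eye(3)
x_aug, P_aug = augment_state(x, P, np.array([1.0, 2.0]), 0.5 * np.eye(2))
# the state grows from 3 to 5 dimensions; the new block holds the feature uncertainty
```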
With the above improvements, the time update and measurement update of the filter are performed:
Time update:

x̂_{k|k−1} = Σ_{i=0}^{2n} W_i^(m)·x_{i,k|k−1}
P_{k|k−1} = Σ_{i=0}^{2n} W_i^(c)·(x_{i,k|k−1} − x̂_{k|k−1})(x_{i,k|k−1} − x̂_{k|k−1})^T + Q_{k−1}
ẑ_{k|k−1} = Σ_{i=0}^{2n} W_i^(m)·z_{i,k|k−1}

Measurement update:

P_{zz} = Σ_{i=0}^{2n} W_i^(c)·(z_{i,k|k−1} − ẑ_{k|k−1})(z_{i,k|k−1} − ẑ_{k|k−1})^T + R
P_{xz} = Σ_{i=0}^{2n} W_i^(c)·(x_{i,k|k−1} − x̂_{k|k−1})(z_{i,k|k−1} − ẑ_{k|k−1})^T
K_k = P_{xz}·P_{zz}^{−1}
x̂_k = x̂_{k|k−1} + K_k·(z_k − ẑ_{k|k−1})
P_k = P_{k|k−1} − K_k·P_{zz}·K_k^T

In the above formulas, x̂_{k|k−1}, P_{k|k−1}, and ẑ_{k|k−1} are the predicted state vector, covariance, and measurement vector obtained from the state at time k−1; x_{i,k|k−1} and z_{i,k|k−1} are the values obtained by Sigma-point sampling; W_i^(m) and W_i^(c) are the weights of the system mean and covariance, respectively; Q_{k−1} is the covariance of the system noise; P_{zz} and P_{xz} are the observation covariances obtained in the system measurement update; R is the covariance of the observation noise; K_k is the filter gain matrix; and x̂_k and P_k are the current state vector and covariance of the system, respectively.
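The time and measurement updates above can be condensed into one Python sketch that starts from already-propagated sigma points (sigma-point generation and the process and measurement models themselves are omitted):

```python
import numpy as np

def ukf_update(Xs, Zs, Wm, Wc, Q, R, z_k):
    """One UKF cycle from propagated sigma points.
    Xs: (2n+1, nx) state sigma points x_{i,k|k-1};
    Zs: (2n+1, nz) measurement sigma points z_{i,k|k-1};
    Wm, Wc: mean and covariance weights; z_k: current measurement."""
    x_pred = Wm @ Xs                       # predicted state mean
    z_pred = Wm @ Zs                       # predicted measurement mean
    dX, dZ = Xs - x_pred, Zs - z_pred
    P_pred = dX.T @ np.diag(Wc) @ dX + Q   # P_{k|k-1}
    P_zz = dZ.T @ np.diag(Wc) @ dZ + R     # innovation covariance
    P_xz = dX.T @ np.diag(Wc) @ dZ         # cross covariance
    K = P_xz @ np.linalg.inv(P_zz)         # filter gain K_k
    x_new = x_pred + K @ (z_k - z_pred)    # updated state
    P_new = P_pred - K @ P_zz @ K.T        # updated covariance
    return x_new, P_new

# toy 1-D example with an identity measurement and three sigma points
Xs = np.array([[0.0], [1.0], [-1.0]])
Wm = Wc = np.full(3, 1.0 / 3.0)
x_new, P_new = ukf_update(Xs, Xs, Wm, Wc,
                          Q=0.1 * np.eye(1), R=0.1 * np.eye(1),
                          z_k=np.array([1.0]))
```

The update pulls the predicted mean toward the measurement in proportion to the gain while shrinking the covariance, exactly as in the equations above.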
In conclusion, the AUKF-SLAM algorithm provides a feasible and novel method for simultaneous localization and mapping of underwater vehicles in unknown environments.

Claims (6)

1. An underwater synchronous positioning and mapping method based on polarized light/inertial/visual integrated navigation, characterized in that the method comprises the following five steps:
Step 1: select the attitude, velocity, and position of the underwater vehicle as the system state, and establish the kinematic model of the vehicle;
Step 2: through the orientation algorithm of the polarization sensor, calculate the sun vector by the least-squares method on the polarization states, and determine the sun position;
Step 3: through the topology node recognition algorithm, use the image polarization information to analyze the visual scene feature extraction and matching problem;
Step 4: according to the velocity output of the inertial navigation system, establish the velocity inertial navigation measurement model and the polarized light nonlinear measurement model;
Step 5: integrate the above steps, establish the integrated navigation system model through data fusion, and construct the environment map.

2. An underwater synchronous positioning and mapping method based on polarized light/inertial/visual integrated navigation, characterized in that the nonlinear equation of the kinematic model of the vehicle established in Step 1 is:
Ẋ = f(X, u) + w

In the above formula, the components x, y, z, and ψ of the state X are the position abscissa, position ordinate, zenith coordinate, and heading coordinate of the vehicle, respectively; u is the control input vector; and w is the process noise vector with a zero-mean Gaussian distribution.
3. An underwater synchronous positioning and mapping method based on polarized light/inertial/visual integrated navigation, characterized in that in Step 2, through the orientation algorithm of the polarization sensor, the sun vector is calculated by the least-squares method on the polarization states and the sun position is determined:

ŝ = arg min_s Σ_j (Δφ_j(s, φ))²

In the above formula, φ denotes the heading angle of the detector direction on the vehicle, Δφ_j denotes the element angle difference, and ŝ is the estimated value of the sun position;

The sun position is then determined from the estimate:

s_sun = arg min_s Σ_t arcdist(s(t), ŝ(t))

In the above formula, s_sun is the sun position, t is the measurement time, and arcdist(·) is the arc length.
4. An underwater synchronous positioning and mapping method based on polarized light/inertial/visual integrated navigation, characterized in that Step 3, constructing the visual scene environment feature map, comprises the following steps:
1) According to the position information of the recognized images, determine the target samples, i.e., the image samples of the region corresponding to each node center, and label them. In the navigation topology map there are n nodes containing m image samples in total, and the image targets corresponding to the nodes are recorded as:

T = {(I_i, y_i) | i = 1, …, m}

In the above formula, I_i is the feature vector of the i-th image sample, and y_i ∈ [1, 2, …, n] is the target label of the node;
2) According to the large margin nearest neighbor (LMNN) algorithm in machine learning, train the target samples in adjacent topological nodes to obtain a new metric matrix M, so that target samples with the same label become more compact and target samples with different labels become looser;
3) Define the distance metric between any two target samples as:

d_M(I_i, I_{i+j}) = (I_i − I_{i+j})^T·M·(I_i − I_{i+j})

In the above formula, d_M is the Euclidean distance between two nodes in the reconstructed topological node space, d_M(I_i, I_{i+j}) has a linear relationship with the metric matrix, and I_i and I_{i+j} are target neighbor samples;
4) Use the metric matrix M to solve for the optimal solution of the following objective function:

min_M Σ_{i,j} d_M(I_i, I_{i+j}) + λ·Σ_{i,j,l} (1 − y_il)·ε_ijl

Let y_il ∈ {0,1} indicate whether the feature vectors of the samples belong to the same node (1 if they do, 0 otherwise); λ denotes the weight parameter, and ε_ijl is the slack variable over the non-target samples;
5) The expression and measurement of the motion environment by the topology map are obtained from the above steps, completing the construction of the visual scene environment feature map.
5. An underwater synchronous positioning and mapping method based on polarized light/inertial/visual integrated navigation, characterized in that in Step 4 the combined model of the velocity inertial navigation measurement and the nonlinear polarized light measurement is established as follows:
The state model of the integrated navigation system is:

Ẋ(t) = diag(F_IMU(t), F_polar(t))·X(t) + diag(G_IMU(t), G_polar(t))·[W_IMU(t); W_polar(t)]

In the above formula, F_IMU(t) and F_polar(t) denote the state coefficient matrices, G_IMU(t) and G_polar(t) denote the system noise driving matrices, and W_IMU(t) and W_polar(t) denote the system noise;
The measurement equation of the integrated navigation system is:

Y(t) = [Y_IMU(t); Y_polar(t)] = [H_IMU(t); H_polar(t)] + [U_IMU(t); U_polar(t)]

In the above formula, H_IMU(t) and H_polar(t) denote the nonlinear functions of the measurement equations, and U_IMU(t) and U_polar(t) denote the measurement noise of the inertial sensor and the polarized light sensor, respectively.
6. An underwater synchronous positioning and mapping method based on polarized light/inertial/visual integrated navigation, characterized in that in Step 5 data fusion is performed by the AUKF-SLAM algorithm, the integrated navigation system model is established, and the environment map is constructed through the following specific steps:
1) When estimating the time-varying noise statistics, old data should be gradually forgotten, but the UKF algorithm does not track changes in the noise statistics; the improved algorithm therefore designs a time-varying noise statistics estimator using the fading-memory weighted exponential method for unknown and time-varying noise statistical parameters, selecting the weighting coefficients:

β_i = β_{i−1}·b, 0 < b < 1, with Σ_{i=1}^{k} β_i = 1,

which can be written as:

β_i = d_k·b^{i−1}, i = 1, …, k, where d_k = (1 − b)/(1 − b^k);

The recursion of the fading-memory time-varying noise statistics estimator {q̂_k, Q̂_k, r̂_k, R̂_k} is therefore expressed as:

q̂_k = (1 − d_k)·q̂_{k−1} + d_k·(x̂_k − Σ_i W_i·x_{i,k|k−1})
Q̂_k = (1 − d_k)·Q̂_{k−1} + d_k·[K_k·v_k·v_k^T·K_k^T + P_k − Σ_i W_i·(x_{i,k|k−1} − x̂_{k|k−1})(x_{i,k|k−1} − x̂_{k|k−1})^T]
r̂_k = (1 − d_k)·r̂_{k−1} + d_k·(z_k − Σ_i W_i·z_{i,k|k−1})
R̂_k = (1 − d_k)·R̂_{k−1} + d_k·[v_k·v_k^T − Σ_i W_i·(z_{i,k|k−1} − ẑ_{k|k−1})(z_{i,k|k−1} − ẑ_{k|k−1})^T]

In the above formulas, Q̂_k is the system noise covariance, R̂_k is the observation noise covariance, K_k is the Kalman gain, W_i are the weights of the mean and covariance, and x_{i,k|k−1}, z_{i,k|k−1} satisfy the UKF Sigma-point sequence;
2) The AUKF algorithm calculates the adaptive factor coefficient λ_i by introducing a fading factor formula and then uses this coefficient to correct the predicted covariance P_{k|k−1}, thereby suppressing divergence of the filter and ensuring its convergence; the convergence condition is:

v_k^T·v_k ≤ S·tr(P_{zz,k|k−1})

The adaptive factor coefficient λ_i is calculated from the fading factor formula:

λ_i = max{1, tr(C_k)/(S·tr(P_{zz,k|k−1}))}, with C_1 = v_1·v_1^T and C_k = (λ_0·C_{k−1} + v_k·v_k^T)/(1 + λ_0) for k > 1

In the above formula, λ_0 is the initial value, tr(·) denotes the matrix trace, S is an adjustable coefficient, and v_k is the residual sequence;
In the algorithm, the larger the value of the factor λ_i, the smaller the proportion of old information retained, and the more pronounced the effect of the residual vector v_k;
3) After noise reduction, the nearest-neighbor data association algorithm is used to associate the acquired attitude, position, velocity, and heading data; if the node target samples in the navigation topology map are adjacent, they are merged, i.e., the association succeeds, and the vehicle performs one time update and one measurement update; otherwise, the new measurement is added to the current state vector as a new feature, i.e., state augmentation is performed, and the merging of adjacent target samples (data association) then continues, the operation being as follows:

X̂_k ← [X̂_k^T, m_new^T]^T

In the above formula, m_new is the new feature vector.
With the above improvements, the time update and measurement update of the filter are performed:
Time update:

x̂_{k|k−1} = Σ_{i=0}^{2n} W_i^(m)·x_{i,k|k−1}
P_{k|k−1} = Σ_{i=0}^{2n} W_i^(c)·(x_{i,k|k−1} − x̂_{k|k−1})(x_{i,k|k−1} − x̂_{k|k−1})^T + Q_{k−1}
ẑ_{k|k−1} = Σ_{i=0}^{2n} W_i^(m)·z_{i,k|k−1}

Measurement update:

P_{zz} = Σ_{i=0}^{2n} W_i^(c)·(z_{i,k|k−1} − ẑ_{k|k−1})(z_{i,k|k−1} − ẑ_{k|k−1})^T + R
P_{xz} = Σ_{i=0}^{2n} W_i^(c)·(x_{i,k|k−1} − x̂_{k|k−1})(z_{i,k|k−1} − ẑ_{k|k−1})^T
K_k = P_{xz}·P_{zz}^{−1}
x̂_k = x̂_{k|k−1} + K_k·(z_k − ẑ_{k|k−1})
P_k = P_{k|k−1} − K_k·P_{zz}·K_k^T

In the above formulas, x̂_{k|k−1}, P_{k|k−1}, and ẑ_{k|k−1} are the predicted state vector, covariance, and measurement vector obtained from the state at time k−1; x_{i,k|k−1} and z_{i,k|k−1} are the values obtained by Sigma-point sampling; W_i^(m) and W_i^(c) are the weights of the system mean and covariance, respectively; Q_{k−1} is the covariance of the system noise; P_{zz} and P_{xz} are the observation covariances obtained in the system measurement update; R is the covariance of the observation noise; K_k is the filter gain matrix; and x̂_k and P_k are the current state vector and covariance of the system, respectively.
CN202110620366.9A 2021-06-03 2021-06-03 Underwater synchronous positioning and mapping method based on polarized light/inertia/vision integrated navigation Active CN113739795B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110620366.9A CN113739795B (en) 2021-06-03 2021-06-03 Underwater synchronous positioning and mapping method based on polarized light/inertia/vision integrated navigation


Publications (2)

Publication Number Publication Date
CN113739795A true CN113739795A (en) 2021-12-03
CN113739795B CN113739795B (en) 2023-10-20

Family

ID=78728422

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110620366.9A Active CN113739795B (en) 2021-06-03 2021-06-03 Underwater synchronous positioning and mapping method based on polarized light/inertia/vision integrated navigation

Country Status (1)

Country Link
CN (1) CN113739795B (en)


Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5270883A (en) * 1991-08-29 1993-12-14 Mitsubishi Denki Kabushiki Kaisha Magnetic read/write circuit
US20130138247A1 (en) * 2005-03-25 2013-05-30 Jens-Steffen Gutmann Re-localization of a robot for slam
CN103712621A (en) * 2013-12-23 2014-04-09 大连理工大学 Method for determining attitude of polarized light and infrared sensor auxiliary inertial navigation system
CN106767752A (en) * 2016-11-25 2017-05-31 北京航空航天大学 A kind of Combinated navigation method based on polarization information
US20170323129A1 (en) * 2016-05-07 2017-11-09 Morgan E. Davidson Navigation Using Self-Describing Fiducials
CN107589748A (en) * 2017-08-21 2018-01-16 江苏科技大学 AUV autonomous navigation methods based on UnscentedFastSLAM algorithms
CN108362288A (en) * 2018-02-08 2018-08-03 北方工业大学 A Polarized SLAM Method Based on Unscented Kalman Filter
CN108387236A (en) * 2018-02-08 2018-08-10 北方工业大学 Polarized light S L AM method based on extended Kalman filtering
CN108827288A (en) * 2018-04-12 2018-11-16 东北电力大学 A kind of dimensionality reduction strapdown inertial navigation system Initial Alignment Method and system based on dual quaterion
US20190331762A1 (en) * 2013-10-22 2019-10-31 Polaris Sensor Technologies, Inc. Celestial Positioning System and Method
CN110672131A (en) * 2019-11-19 2020-01-10 北方工业大学 A UKF alignment method for inertial/polarized light integrated navigation system under large misalignment angle
CN111024076A (en) * 2019-12-30 2020-04-17 北京航空航天大学 An underwater integrated navigation system based on bionic polarization
CN111045454A (en) * 2019-12-30 2020-04-21 北京航空航天大学 A UAV Autopilot Based on Bionic Autonomous Navigation
CN111504312A (en) * 2020-07-02 2020-08-07 中国人民解放军国防科技大学 Unmanned aerial vehicle pose estimation method based on visual inertial polarized light fusion
CN112129288A (en) * 2020-11-24 2020-12-25 中国人民解放军国防科技大学 A method and system for pose estimation based on polarized light/geomagnetic heading constraints
CN112444245A (en) * 2020-11-17 2021-03-05 大连理工大学 Insect-imitated vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor
CN112697138A (en) * 2020-12-07 2021-04-23 北方工业大学 Factor graph optimization-based bionic polarization synchronous positioning and composition method
CN113834480A (en) * 2021-11-26 2021-12-24 北京航空航天大学 An autonomous positioning method for compound eye-like polarization sensor based on scattering angle weight distribution
CN215767102U (en) * 2021-12-20 2022-02-08 中北大学 An airborne inertial/polarized light/optical flow/vision integrated navigation device
CN116255988A (en) * 2023-05-11 2023-06-13 北京航空航天大学 A Composite Anti-jamming Adaptive Filtering Method Based on Ship Dynamics Integrated Navigation


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
XIA LL et al.: "Polarized light-aided visual-inertial navigation system: global heading measurements and graph optimization-based multi-sensor fusion", Measurement Science and Technology, vol. 33, no. 5, p. 055111, XP020457244, DOI: 10.1088/1361-6501/ac4637 *
刘瑞敏: "Binocular visual-inertial navigation aided by sky polarized light", China Masters' Theses Full-text Database, Information Science and Technology, no. 1, pp. 1138-1625 *
孔祥龙 et al.: "Research on inertial/stereo-vision integrated navigation method based on multi-view geometry", China Masters' Theses Full-text Database, Social Sciences I, no. 1, pp. 112-9 *
胡小平 et al.: "A survey of bionic navigation technology", Navigation Positioning and Timing, no. 04, pp. 7-16 *
褚金奎 et al.: "Design and construction of a real-time positioning prototype using multi-directional polarized light", Optics and Precision Engineering, no. 02, pp. 37-43 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114353774A (en) * 2022-01-17 2022-04-15 青岛智海牧洋科技有限公司 Underwater light-pattern compass device
CN114353774B (en) * 2022-01-17 2024-04-30 青岛智海牧洋科技有限公司 Underwater light pattern compass device
CN115014313A (en) * 2022-05-30 2022-09-06 中北大学 A parallel multi-scale-based method for dealing with heading error of polarized light compass
CN115014313B (en) * 2022-05-30 2023-08-22 中北大学 Polarized light compass heading error processing method based on parallel multi-scale
CN114910080A (en) * 2022-07-15 2022-08-16 北京航空航天大学 Three-dimensional attitude determination method based on underwater downlink radiation intensity and polarized light field
CN116164740A (en) * 2023-02-28 2023-05-26 长安大学 Multi-sensor-fused multi-scene unmanned vibratory roller
CN116182855A (en) * 2023-04-28 2023-05-30 北京航空航天大学 An integrated navigation method for UAV with imitation compound eye polarization vision in weak light environment
CN116182855B (en) * 2023-04-28 2023-07-07 北京航空航天大学 Combined navigation method of compound eye-simulated polarized vision unmanned aerial vehicle under weak light and strong environment
CN116222580A (en) * 2023-05-09 2023-06-06 北京航空航天大学 Underwater polarization orientation method based on cross-medium refraction interference compensation correction
CN119469170A (en) * 2025-01-16 2025-02-18 中国人民解放军海军潜艇学院 A navigation method based on underwater polarized light detection

Also Published As

Publication number Publication date
CN113739795B (en) 2023-10-20

Similar Documents

Publication Publication Date Title
CN113739795B (en) Underwater synchronous positioning and mapping method based on polarized light/inertia/vision integrated navigation
CN111780755B (en) Multi-source fusion navigation method based on factor graph and observability analysis
CN111486845B (en) AUV multi-strategy navigation method based on submarine topography matching
CN112639502B (en) Robot pose estimation
CN106772524B (en) A Rank Filter Based Integrated Navigation Information Fusion Method for Agricultural Robots
Zhang et al. NavNet: AUV navigation through deep sequential learning
Peretroukhin et al. Reducing drift in visual odometry by inferring sun direction using a bayesian convolutional neural network
CN103674021A (en) Integrated navigation system and method based on SINS (Strapdown Inertial Navigation System) and star sensor
CN114608568B (en) Multi-sensor information based instant fusion positioning method
CN115143954B (en) Unmanned vehicle navigation method based on multi-source information fusion
CN108387236B (en) A Polarized Light SLAM Method Based on Extended Kalman Filtering
CN116295511B (en) Robust initial alignment method and system for pipeline submerged robot
CN114111818B (en) Universal vision SLAM method
CN112444245B (en) An insect-like vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor
CN112797985A (en) Indoor positioning method and indoor positioning system based on weighted extended Kalman filter
CN116380079A (en) Underwater SLAM method for fusing front-view sonar and ORB-SLAM3
CN110850455A (en) Track recording method based on differential GPS and vehicle kinematics model
CN116659510A (en) Underwater robot positioning and obstacle avoidance method, device and storage medium
CN111595330A (en) Night polarization course calculation method based on probability density function estimation
CN115930977A (en) Localization method, system, electronic device and readable storage medium for feature degradation scene
CN117739972A (en) A UAV approach phase positioning method without global satellite positioning system
Li et al. Di-eme: Deep inertial ego-motion estimation for autonomous underwater vehicle
CN114355409B (en) Surface target motion estimation method
He et al. NINT: Neural inertial navigation based on time interval information in underwater environments
Huang et al. Visual-Inertial-Acoustic Sensor Fusion for Accurate Autonomous Localization of Underwater Vehicles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant