CN116182855A - An integrated navigation method for UAV with imitation compound eye polarization vision in weak light environment - Google Patents
An integrated navigation method for UAV with imitation compound eye polarization vision in weak light environment
- Publication number
- CN116182855A (application No. CN202310474059.3A)
- Authority
- CN
- China
- Prior art keywords
- polarization
- image
- camera
- navigation
- state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G01C21/1656—Navigation by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
- G01C21/005—Navigation with correlation of navigation data from several sources, e.g. map or contour matching
- G01C21/18—Stabilised platforms, e.g. by gyroscope
- G01C21/20—Instruments for performing navigational calculations
- Y02T10/40—Engine management systems
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Navigation (AREA)
Abstract
The invention relates to an integrated navigation method for an unmanned aerial vehicle (UAV) with compound-eye-inspired polarization vision in a low-light environment, comprising the following main steps. First, using the perpendicularity between the polarization vector and the sun vector in the ideal Rayleigh scattering model, a polarization measurement equation is established with the inertial navigation errors and the camera poses as the state vector. Then, to address the weakness of image features in low-light scenes, the degree-of-polarization image and the angle-of-polarization image of each frame obtained by the polarization camera are fused to enhance the image features. Next, under pinhole-model projection, a visual measurement equation is established from the reprojection error between the measured feature-point projection positions and the camera projection positions predicted by inertial navigation. Finally, the measurements from the polarization sensor and the polarization camera are fused by multi-state-constraint Kalman filtering to obtain the UAV integrated navigation solution. The invention has the advantages of low computational cost, high accuracy, and strong robustness.
Description
Technical Field

The invention belongs to the field of integrated navigation, and in particular relates to an integrated navigation method for a UAV with compound-eye-inspired polarization vision in a low-light environment.
Background Art

Integrated navigation is a key technology in the field of UAV navigation. A visual-inertial navigation system (VINS) is accurate over short periods, but its error grows as time accumulates, and in low-light scenes its accuracy degrades severely or the system becomes unusable. Polarization information derived from the ideal Rayleigh scattering model can provide a heading constraint for the integrated navigation system, with strong autonomy and no error accumulation. For the problem that unclear image texture captured by the camera in low-light scenes sharply degrades VINS accuracy: the degree-of-polarization image acquired by a polarization camera in low-light scenes carries rich target information, and the angle-of-polarization image preserves clear target edges and contours; fusing the degree-of-polarization image with the angle-of-polarization image yields a polarization image that can greatly improve VINS navigation accuracy in low-light scenes. To exploit the strengths of each subsystem of the integrated navigation system, a compound-eye-inspired polarization-vision UAV integrated navigation method is designed, which is of great significance for autonomous navigation and positioning tasks under satellite-navigation denial, strong interference, and complex environments.

Each individual navigation technique has strengths and weaknesses; tightly coupling the sub-navigation systems with a multi-state-constraint Kalman filter to fuse their information can improve the accuracy and reliability of the navigation system. Chinese patent application "A polarization/VIO three-dimensional attitude determination method based on zenith vector" (application No. CN202111381893.5) only uses polarization to correct the three-dimensional attitude and does not correct position. Chinese patent application "A UAV pose estimation method based on visual-inertial-polarized-light fusion" (application No. CN202010623718.1) uses polarization navigation to provide a heading constraint that corrects the gradually accumulated inertial navigation error. The thesis "Research on Binocular Stereo Vision Application Technology Based on Polarization Imaging" uses polarization imaging to improve binocular vision navigation accuracy but cannot support long-endurance navigation. Chinese patent application "An insect-vision-inspired integrated navigation method based on polarized light, optical-flow vectors, and binocular vision sensors" (application No. CN202011289488.6) fuses information by optimization and corrects position with optical-flow vectors.

All of the above inventions use visible-light cameras to acquire visual images for the visual update, and none considers the accuracy degradation of visible-light vision in low-light scenes. The present invention uses a polarization camera to acquire polarization images and fuses them for the visual measurement update; the polarization camera provides clear visual images in low-light environments, greatly improving navigation accuracy compared with the visual information obtained by a visible-light camera.
Summary of the Invention

To solve the problem of high-precision navigation for long-endurance UAVs in low-light, GNSS-denied scenarios, and to overcome both the long-term error accumulation of the VINS navigation system and its sharp loss of accuracy in low-light scenes, the invention provides an integrated navigation method for a UAV with compound-eye-inspired polarization vision in a low-light environment. A multi-state-constraint Kalman filter tightly couples and fuses the integrated navigation information, guaranteeing the long-endurance accuracy of the UAV integrated navigation system in low-light environments. The invention offers a new approach to UAV navigation in low-light environments and resolves the degradation of VINS navigation accuracy in low-light, long-endurance navigation scenarios.

To achieve the above object, the invention adopts the following technical solution:

An integrated navigation method for a UAV with compound-eye-inspired polarization vision in a low-light environment comprises the following steps:

Step 1: Obtain the polarization azimuth $\alpha$ with a point-source polarization sensor and compute the polarization vector $p^n$ in the navigation frame; compute the sun vector $s^n$ in the navigation frame from the solar ephemeris; based on the perpendicularity between the polarization vector $p^n$ and the sun vector $s^n$ in the ideal Rayleigh scattering model, and taking the inertial navigation state and the camera poses as the state vector, establish the system state equation and the polarization measurement equation;

Step 2: In a low-light scene, the polarization camera produces a degree-of-polarization image $P$ and an angle-of-polarization image $A$; mean-filter both to obtain $\bar{P}$ and $\bar{A}$, and subtract to obtain the degree-of-polarization detail image $P_d$ and the angle-of-polarization detail image $A_d$; fuse $\bar{P}$ and $\bar{A}$ by weighting to obtain the base image $I_B$; apply a morphological opening to the original polarization images to obtain the gating regions $M_P$ and $M_A$; use $M_P$ and $M_A$ to optimize the detail images $P_d$ and $A_d$, obtaining the detail image $I_D$; and fuse $I_B$ with $I_D$ to obtain the fused polarization image $I_F$;

Step 3: Using the fused polarization image $I_F$ from Step 2 under the pinhole model, the polarization camera yields the overall visual measurement $z_i^j$ and provides the camera motion information, while inertial navigation yields the predicted visual measurement $\hat{z}_i^j$; subtracting the two gives the reprojection error $r_i^j$ of the $j$-th feature point at time $i$, and the visual measurement equation is obtained by cumulative stacking and QR decomposition;

Step 4: Combining the polarization measurement equation established in Step 1 with the visual measurement equation established in Step 3, fuse the data obtained by each navigation subsystem using multi-state-constraint Kalman filtering, correct the gradually accumulated inertial navigation error, and solve for the UAV integrated navigation information.
Further, Step 1 comprises:

Select as state variables of the integrated navigation system the small-angle rotation $\phi$ from the navigation frame to the body frame, the error $\delta v^n$ between the estimated and true velocity of the vehicle in the navigation frame, the error $\delta p^n$ between the estimated and true position, the gyroscope bias $b_g$ (the deviation of the measured angular rate from the true angular rate), the accelerometer bias $b_a$ (the deviation of the measured specific force from the true value), and the camera pose at each time instant, where $q_{c_i}^n$ and $p_{c_i}^n$ denote the attitude and position of the camera in the navigation frame. The state vector $X$ has the specific form:

$$X = \begin{bmatrix} X_{INS}^T & X_{C_1}^T & \cdots & X_{C_N}^T \end{bmatrix}^T,$$

$$X_{INS} = \begin{bmatrix} \phi^T & (\delta v^n)^T & (\delta p^n)^T & b_g^T & b_a^T \end{bmatrix}^T, \qquad X_{C_i} = \begin{bmatrix} (q_{c_i}^n)^T & (p_{c_i}^n)^T \end{bmatrix}^T,$$

The system state equation is:

$$\dot{X} = FX + Gw,$$

where $F$ is the state transition matrix, $G$ is the system noise driving matrix, and $w$ is the system noise;

Define the polarization vector in the body frame as $p^b$ and the sun vector as $s^n$, and let $C_b^n$ be the coordinate transformation matrix from the body frame to the navigation frame. The point-source polarization sensor first measures the polarization azimuth $\alpha$, after which the polarization vector in the body frame is obtained:

$$p^b = \begin{bmatrix} \cos\alpha & \sin\alpha & 0 \end{bmatrix}^T,$$

According to the ideal Rayleigh scattering model, the polarization vector in the navigation frame, $p^n = C_b^n p^b$, is perpendicular to the sun vector $s^n$, which gives:

$$(s^n)^T C_b^n p^b = 0,$$

Based on this, the polarization measurement equation is:

$$z_{pol} = H_{pol} X + v_{pol},$$

where $z_{pol} = (s^n)^T \hat{C}_b^n p^b$, $H_{pol} = \begin{bmatrix} -(s^n)^T [(\hat{C}_b^n p^b)\times] & 0_{1\times(12+6N)} \end{bmatrix}$, and $v_{pol}$ is the polarization measurement noise.
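By way of illustration, a minimal Python sketch of how the polarization residual and its Jacobian row could be assembled is given below. It assumes the sensor module frame coincides with the body frame, the attitude error occupies the first three components of the state (matching the state vector above), and the sign convention used in $H_{pol}$; the function names are hypothetical, not from the patent:

```python
import numpy as np

def skew(v):
    """Skew-symmetric cross-product matrix [v x] of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def polarization_measurement(alpha, C_bn_hat, s_n, n_cam_states):
    """Residual and Jacobian row of the Rayleigh perpendicularity constraint.

    alpha        : polarization azimuth from the point-source sensor [rad]
    C_bn_hat     : estimated body-to-navigation rotation matrix, shape (3, 3)
    s_n          : unit sun vector in the navigation frame (solar ephemeris)
    n_cam_states : number of camera poses kept in the filter window
    """
    p_b = np.array([np.cos(alpha), np.sin(alpha), 0.0])  # body-frame polarization vector
    p_n_hat = C_bn_hat @ p_b                             # predicted nav-frame polarization vector
    z_pol = s_n @ p_n_hat                                # zero under ideal Rayleigh scattering
    H_pol = np.zeros(15 + 6 * n_cam_states)              # one measurement row
    H_pol[0:3] = -s_n @ skew(p_n_hat)                    # sensitivity to the attitude error only
    return z_pol, H_pol
```

In a full filter this row is stacked with the visual measurement equation of Step 3 before the update of Step 4.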
Further, Step 2 comprises:

The polarization camera acquires the degree-of-polarization image $P$ and the angle-of-polarization image $A$. First, apply mean filtering to each to partially remove the noise of the polarization images, obtaining the filtered images $\bar{P}$ and $\bar{A}$; subtracting the filtered images from the originals yields the degree-of-polarization detail image and the angle-of-polarization detail image:

$$P_d = P - \bar{P},$$

$$A_d = A - \bar{A},$$

Next, fuse the mean-filtered polarization images by weighting to obtain the base image $I_B$, i.e.:

$$I_B = \omega_1 \bar{P} + \omega_2 \bar{A},$$

where the coefficients $\omega_1$ and $\omega_2$ should satisfy $\omega_1 + \omega_2 = 1$;

Then apply a morphological opening to the degree-of-polarization image and the angle-of-polarization image to obtain the opened images $P_o$ and $A_o$, and compute their gray-level means $\mu_P$ and $\mu_A$. Compare the gray value of each pixel of $P_o$ with $\mu_P$, and the gray value of each pixel of $A_o$ with $\mu_A$, according to the following rule:

$$M_P(x,y) = \begin{cases} 1, & P_o(x,y) \ge \mu_P \\ 0, & P_o(x,y) < \mu_P \end{cases}, \qquad M_A(x,y) = \begin{cases} 1, & A_o(x,y) \ge \mu_A \\ 0, & A_o(x,y) < \mu_A \end{cases},$$

Integrating the pixels of $M_P$ gives the degree-of-polarization gating region, and integrating the pixels of $M_A$ gives the angle-of-polarization gating region; gated weighted fusion of the detail images $P_d$ and $A_d$ then yields the detail image $I_D$:

$$I_D = M_P \odot P_d + M_A \odot A_d,$$

where $\odot$ denotes pixel-wise multiplication. Fusing $I_B$ with $I_D$ gives the final fused polarization image $I_F$, i.e.:

$$I_F = I_B + I_D.$$
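A sketch of the Step 2 fusion pipeline in Python with OpenCV follows. The mean-filter kernel, the structuring-element size, and the equal base-layer weights are illustrative assumptions; the patent does not fix these values:

```python
import cv2
import numpy as np

def fuse_polarization_images(P, A, w1=0.5, w2=0.5, blur=5, open_k=5):
    """Fuse a DoP image P and an AoP image A (uint8 grayscale) as in Step 2."""
    P_f, A_f = P.astype(np.float32), A.astype(np.float32)
    # Mean filtering and detail decomposition: P_d = P - mean(P), A_d = A - mean(A)
    P_bar = cv2.blur(P_f, (blur, blur))
    A_bar = cv2.blur(A_f, (blur, blur))
    P_d, A_d = P_f - P_bar, A_f - A_bar
    # Base layer: weighted fusion of the mean-filtered images, w1 + w2 = 1
    I_B = w1 * P_bar + w2 * A_bar
    # Gating regions from a morphological opening of the original images
    se = cv2.getStructuringElement(cv2.MORPH_RECT, (open_k, open_k))
    P_o = cv2.morphologyEx(P_f, cv2.MORPH_OPEN, se)
    A_o = cv2.morphologyEx(A_f, cv2.MORPH_OPEN, se)
    M_P = (P_o >= P_o.mean()).astype(np.float32)   # 1 where above the gray-level mean
    M_A = (A_o >= A_o.mean()).astype(np.float32)
    # Detail layer: gated weighted fusion of the detail images
    I_D = M_P * P_d + M_A * A_d
    # Final fused polarization image I_F = I_B + I_D
    return np.clip(I_B + I_D, 0, 255).astype(np.uint8)
```

The gating keeps detail only where the opened image exceeds its gray-level mean, which is one plausible reading of the comparison rule above.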
Further, Step 3 comprises:

The position of each feature point is projected through the camera model onto the pixel plane; the resulting pixel-plane coordinates provide a measurement of the camera motion and of the feature-point position. The three-dimensional position of a feature point is obtained by triangulation. Under pinhole-model projection, the overall visual measurement is:

$$z_i^j = \frac{1}{Z_j^{c_i}} \begin{bmatrix} X_j^{c_i} \\ Y_j^{c_i} \end{bmatrix} + n_i^j,$$

where $z_i^j$ denotes the measurement of the $j$-th feature point at time $i$, $(X_j^{c_i}, Y_j^{c_i}, Z_j^{c_i})$ are the three-dimensional coordinates of the $j$-th feature point obtained by triangulation in the camera frame at time $i$, and the two components of $z_i^j$ are the horizontal and vertical coordinates of the feature-point projection on the pixel plane;

From the camera projection-point position predicted by inertial navigation, the visual measurement $\hat{z}_i^j$ is obtained, and the reprojection error is:

$$r_i^j = z_i^j - \hat{z}_i^j = H_{X,i}^j \tilde{X} + H_{f,i}^j \tilde{p}_j^n + n_i^j,$$

where $n_i^j$ denotes the corresponding noise, $\tilde{X}$ is the state of the system at time $i$, $\tilde{p}_j^n$ is the position of the $j$-th feature point observed by the camera in the navigation frame, and $H_{X,i}^j$ and $H_{f,i}^j$ are the Jacobian matrices with respect to $\tilde{X}$ and $\tilde{p}_j^n$;

Over a window of accumulated updates, $m$ feature points are observed; stacking the observations of the $m$ feature points gives the accumulated error:

$$r = H_X \tilde{X} + H_f \tilde{p}^n + n,$$

where $n$ denotes the stacked noise, $\tilde{X}$ is the state of the system, $\tilde{p}^n$ collects the feature-point positions observed by the camera in the navigation frame, and $H_X$ and $H_f$ are the Jacobian matrices corresponding to $\tilde{X}$ and $\tilde{p}^n$;

After the feature-position term is eliminated (by projecting the stacked residual onto the left null space of $H_f$), the remaining observation matrix is decomposed by QR, and the final visual measurement equation is:

$$r_n = T_H \tilde{X} + n_n,$$

where $T_H$ is the observation matrix of the above visual measurement equation and is obtained from $H_X = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix} \begin{bmatrix} T_H \\ 0 \end{bmatrix}$; $Q_1$ and $Q_2$ are intermediate variables of the QR decomposition and have no effect on the final form; and $n_n$ is the corresponding noise.
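A minimal sketch of the stacking, feature-term elimination, and QR compression is given below, assuming the per-feature Jacobians have already been stacked into H_X and H_f; SciPy's null_space is used for the left-null-space projection that removes the feature-position term:

```python
import numpy as np
from scipy.linalg import null_space

def compress_visual_measurement(r, H_X, H_f):
    """MSCKF-style compression of the stacked visual measurement.

    r   : stacked reprojection residuals, shape (2m,)
    H_X : stacked Jacobian w.r.t. the filter state, shape (2m, d)
    H_f : stacked Jacobian w.r.t. the feature position, shape (2m, 3)
    Returns the compressed residual r_n and observation matrix T_H.
    """
    # Project onto the left null space of H_f to drop the feature-position term
    A = null_space(H_f.T)          # columns span the left null space of H_f
    r0 = A.T @ r
    H0 = A.T @ H_X
    # QR-compress the remaining observation matrix: H0 = Q1 @ T_H
    Q1, T_H = np.linalg.qr(H0)
    r_n = Q1.T @ r0                # residual in the compressed measurement space
    return r_n, T_H
```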
Further, Step 4 comprises:

The multi-state-constraint Kalman filtering method is selected for the information combination of the UAV integrated navigation system, where the predicted covariance matrix of the system state is:

$$P_{k|k-1} = \begin{bmatrix} P_{II,k|k-1} & \Phi_{k-1} P_{IC,k-1} \\ P_{IC,k-1}^T \Phi_{k-1}^T & P_{CC,k-1} \end{bmatrix},$$

where $P_{II,k|k-1}$ is the predicted covariance matrix of the inertial navigation state at time $k$, $\Phi_{k-1}$ is the state transition matrix, $P_{IC,k-1}$ is the covariance matrix of the rotation and translation between the camera frame and the inertial frame at time $k-1$, and $P_{CC,k-1}$ is the covariance matrix of the camera poses at time $k-1$;

As camera poses accumulate over time, state augmentation is performed; the covariance matrix after augmentation becomes:

$$P_k \leftarrow \begin{bmatrix} I_{15+6N} \\ J \end{bmatrix} P_k \begin{bmatrix} I_{15+6N} \\ J \end{bmatrix}^T,$$

where $J$ is the Jacobian matrix of the newly added camera error state with respect to the original system error state, $I_{15+6N}$ is the identity matrix of dimension $15+6N$, and $N$ is the number of camera states at the time a given feature point is observed;

When the navigation information is updated in the UAV integrated navigation system, the Kalman gain $K$, the innovation $\tilde{z}$, and the covariance state update matrix are, respectively:

$$K = P_{k|k-1} H^T \left( H P_{k|k-1} H^T + R \right)^{-1},$$

$$\tilde{z} = z - H \hat{X}_{k|k-1}, \qquad P_{k|k} = (I - K H) P_{k|k-1} (I - K H)^T + K R K^T,$$

where $H = \begin{bmatrix} T_H^T & H_{pol}^T \end{bmatrix}^T$, $T_H$ is the output matrix of the visual measurement equation, $H_{pol}$ is the output matrix of the polarization measurement equation, $P_{k|k-1}$ is the covariance matrix at time $k$, $R$ is the measurement noise matrix, $z = \begin{bmatrix} r_n^T & z_{pol}^T \end{bmatrix}^T$ stacks the visual measurement and the polarization measurement, and $I$ is the identity matrix with the same dimension as $KH$.
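A hedged sketch of the joint update with the stacked visual and polarization measurements follows, using the Joseph-form covariance update from the equations above; the dimensions and noise models are illustrative:

```python
import numpy as np

def msckf_update(x, P, r_n, T_H, z_pol, H_pol, R_vis, R_pol):
    """Joint MSCKF update with visual and polarization measurements.

    x, P         : state estimate and covariance
    r_n, T_H     : compressed visual residual and observation matrix
    z_pol, H_pol : polarization residual (scalar) and observation row
    R_vis, R_pol : visual and polarization measurement noise covariances
    """
    z = np.concatenate([r_n, np.atleast_1d(z_pol)])           # stacked measurement
    H = np.vstack([T_H, H_pol.reshape(1, -1)])                # stacked output matrix
    m = R_vis.shape[0]
    R = np.block([[R_vis, np.zeros((m, 1))],
                  [np.zeros((1, m)), np.atleast_2d(R_pol)]])  # block-diagonal noise
    S = H @ P @ H.T + R                                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                            # Kalman gain
    x = x + K @ z                                             # state correction
    IKH = np.eye(P.shape[0]) - K @ H
    P = IKH @ P @ IKH.T + K @ R @ K.T                         # Joseph-form update
    return x, P
```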
Compared with the prior art, the advantages of the invention are as follows:

In a low-light environment, the image texture of polarization images obtained by the polarization camera is substantially better than that of visible-light images. Feature points are extracted from the fused degree-of-polarization and angle-of-polarization images to obtain the visual measurements, compensating for the low accuracy of the VINS navigation system in low-light scenes; fused with the polarization measurements provided by the point-source polarization sensor, the UAV VINS achieves high-precision navigation in low-light, long-endurance scenarios. The invention has the advantages of high accuracy, good stability, and strong anti-interference capability.
Brief Description of the Drawings

FIG. 1 is a flow chart of the integrated navigation method for a UAV with compound-eye-inspired polarization vision in a low-light environment according to the invention.
Detailed Description

The specific implementation of the invention is described below with reference to FIG. 1 and an example.

As shown in FIG. 1, the integrated navigation method for a UAV with compound-eye-inspired polarization vision in a low-light environment of the invention comprises the following steps:

Step 1: Obtain the polarization azimuth $\alpha$ with a point-source polarization sensor and compute the polarization vector $p^n$ in the navigation frame; compute the sun vector $s^n$ in the navigation frame from the solar ephemeris; based on the perpendicularity between the polarization vector $p^n$ and the sun vector $s^n$ in the ideal Rayleigh scattering model, and taking the inertial navigation state and the camera poses as the state vector, establish the system state equation and the polarization measurement equation;

Step 2: In a low-light scene, the polarization camera produces a degree-of-polarization image $P$ and an angle-of-polarization image $A$; mean-filter both to obtain $\bar{P}$ and $\bar{A}$, and subtract to obtain the degree-of-polarization detail image $P_d$ and the angle-of-polarization detail image $A_d$; fuse $\bar{P}$ and $\bar{A}$ by weighting to obtain the base image $I_B$; apply a morphological opening to the original polarization images to obtain the gating regions $M_P$ and $M_A$; use $M_P$ and $M_A$ to optimize the detail images $P_d$ and $A_d$, obtaining the detail image $I_D$; and fuse $I_B$ with $I_D$ to obtain the fused polarization image $I_F$;

Step 3: Using the fused polarization image $I_F$ from Step 2 under the pinhole model, the polarization camera yields the overall visual measurement $z_i^j$ and provides the camera motion information, while inertial navigation yields the predicted visual measurement $\hat{z}_i^j$; subtracting the two gives the reprojection error $r_i^j$ of the $j$-th feature point at time $i$, and the visual measurement equation is obtained by cumulative stacking and QR decomposition;

Step 4: Combining the polarization measurement equation established in Step 1 with the visual measurement equation established in Step 3, fuse the data obtained by each navigation subsystem using multi-state-constraint Kalman filtering, correct the gradually accumulated inertial navigation error, and solve for the UAV integrated navigation information.

The specific implementation steps are as follows:
In Step 1, select as state variables of the integrated navigation system the small-angle rotation $\phi$ from the navigation frame to the body frame, the error $\delta v^n$ between the estimated and true velocity of the vehicle in the navigation frame, the error $\delta p^n$ between the estimated and true position, the gyroscope bias $b_g$, the accelerometer bias $b_a$, and the camera pose at each time instant, where $q_{c_i}^n$ and $p_{c_i}^n$ denote the attitude and position of the camera in the navigation frame. The state vector $X$ has the specific form:

$$X = \begin{bmatrix} X_{INS}^T & X_{C_1}^T & \cdots & X_{C_N}^T \end{bmatrix}^T,$$

$$X_{INS} = \begin{bmatrix} \phi^T & (\delta v^n)^T & (\delta p^n)^T & b_g^T & b_a^T \end{bmatrix}^T, \qquad X_{C_i} = \begin{bmatrix} (q_{c_i}^n)^T & (p_{c_i}^n)^T \end{bmatrix}^T,$$

The system state equation is:

$$\dot{X} = FX + Gw,$$

where $F$ is the state transition matrix, $G$ is the system noise driving matrix, and $w$ is the system noise.

Define the polarization vector in the body frame as $p^b$ and the sun vector as $s^n$, and let $C_b^n$ be the coordinate transformation matrix from the body frame to the navigation frame. The point-source polarization sensor first measures the polarization azimuth $\alpha$, after which the polarization vector in the body frame is obtained:

$$p^b = \begin{bmatrix} \cos\alpha & \sin\alpha & 0 \end{bmatrix}^T,$$

According to the ideal Rayleigh scattering model, the polarization vector in the navigation frame, $p^n = C_b^n p^b$, is perpendicular to the sun vector $s^n$, which gives:

$$(s^n)^T C_b^n p^b = 0,$$

Based on this, the polarization measurement equation is:

$$z_{pol} = H_{pol} X + v_{pol},$$

where $z_{pol} = (s^n)^T \hat{C}_b^n p^b$, $H_{pol} = \begin{bmatrix} -(s^n)^T [(\hat{C}_b^n p^b)\times] & 0_{1\times(12+6N)} \end{bmatrix}$, and $v_{pol}$ is the polarization measurement noise.
In Step 2, the polarization camera acquires the degree-of-polarization image $P$ and the angle-of-polarization image $A$. First, apply mean filtering to each to partially remove the noise of the polarization images, obtaining the filtered images $\bar{P}$ and $\bar{A}$; subtracting the filtered images from the originals yields the degree-of-polarization detail image and the angle-of-polarization detail image:

$$P_d = P - \bar{P},$$

$$A_d = A - \bar{A},$$

Next, fuse the mean-filtered polarization images by weighting to obtain the base image $I_B$, i.e.:

$$I_B = \omega_1 \bar{P} + \omega_2 \bar{A},$$

where the coefficients $\omega_1$ and $\omega_2$ should satisfy $\omega_1 + \omega_2 = 1$;

Then apply a morphological opening to the degree-of-polarization image and the angle-of-polarization image to obtain the opened images $P_o$ and $A_o$, and compute their gray-level means $\mu_P$ and $\mu_A$. Compare the gray value of each pixel of $P_o$ with $\mu_P$, and the gray value of each pixel of $A_o$ with $\mu_A$, according to the following rule:

$$M_P(x,y) = \begin{cases} 1, & P_o(x,y) \ge \mu_P \\ 0, & P_o(x,y) < \mu_P \end{cases}, \qquad M_A(x,y) = \begin{cases} 1, & A_o(x,y) \ge \mu_A \\ 0, & A_o(x,y) < \mu_A \end{cases},$$

Integrating the pixels of $M_P$ gives the degree-of-polarization gating region, and integrating the pixels of $M_A$ gives the angle-of-polarization gating region. Gated weighted fusion of the detail images $P_d$ and $A_d$ then yields the detail image $I_D$:

$$I_D = M_P \odot P_d + M_A \odot A_d,$$

Fusing $I_B$ with $I_D$ gives the final fused polarization image $I_F$, i.e.:

$$I_F = I_B + I_D,$$
In Step 3, the position of each feature point is projected through the camera model onto the pixel plane; the resulting pixel-plane coordinates provide a measurement of the camera motion and of the feature-point position. The three-dimensional position of a feature point is obtained by triangulation. Under pinhole-model projection, the overall visual measurement is:

$$z_i^j = \frac{1}{Z_j^{c_i}} \begin{bmatrix} X_j^{c_i} \\ Y_j^{c_i} \end{bmatrix} + n_i^j,$$

where $z_i^j$ denotes the measurement of the $j$-th feature point at time $i$, $(X_j^{c_i}, Y_j^{c_i}, Z_j^{c_i})$ are the three-dimensional coordinates of the $j$-th feature point obtained by triangulation in the camera frame at time $i$, and the two components of $z_i^j$ are the horizontal and vertical coordinates of the feature-point projection on the pixel plane;

From the camera projection-point position predicted by inertial navigation, the visual measurement $\hat{z}_i^j$ is obtained, and the reprojection error is $r_i^j = z_i^j - \hat{z}_i^j = H_{X,i}^j \tilde{X} + H_{f,i}^j \tilde{p}_j^n + n_i^j$, where $n_i^j$ denotes the corresponding noise, $\tilde{X}$ is the state of the system at time $i$, $\tilde{p}_j^n$ is the position of the $j$-th feature point observed by the camera in the navigation frame, and $H_{X,i}^j$ and $H_{f,i}^j$ are the Jacobian matrices with respect to $\tilde{X}$ and $\tilde{p}_j^n$.

Over a window of accumulated updates, $m$ feature points are observed; stacking the observations of the $m$ feature points gives the accumulated error:

$$r = H_X \tilde{X} + H_f \tilde{p}^n + n,$$

where $n$ denotes the stacked noise, $\tilde{X}$ is the state of the system, $\tilde{p}^n$ collects the feature-point positions observed by the camera in the navigation frame, and $H_X$ and $H_f$ are the Jacobian matrices corresponding to $\tilde{X}$ and $\tilde{p}^n$;

In the actual computation, because the dimension of $r$ is too large, the feature-position term is eliminated (by projecting the stacked residual onto the left null space of $H_f$) and the remaining observation matrix is decomposed by QR; the final visual measurement equation is:

$$r_n = T_H \tilde{X} + n_n,$$

where $T_H$ is the observation matrix of the above visual measurement equation and is obtained from $H_X = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix} \begin{bmatrix} T_H \\ 0 \end{bmatrix}$; $Q_1$ and $Q_2$ are intermediate variables of the QR decomposition and have no effect on the final form; and $n_n$ is the corresponding noise.
In Step 4, the multi-state-constraint Kalman filtering method is selected for the information combination of the UAV integrated navigation system, where the predicted covariance matrix of the system state is:

$$P_{k|k-1} = \begin{bmatrix} P_{II,k|k-1} & \Phi_{k-1} P_{IC,k-1} \\ P_{IC,k-1}^T \Phi_{k-1}^T & P_{CC,k-1} \end{bmatrix},$$

where $P_{II,k|k-1}$ is the predicted covariance matrix of the inertial navigation state at time $k$, $\Phi_{k-1}$ is the state transition matrix, $P_{IC,k-1}$ is the covariance matrix of the rotation and translation between the camera frame and the inertial frame at time $k-1$, and $P_{CC,k-1}$ is the covariance matrix of the camera poses at time $k-1$;

As camera poses accumulate over time, state augmentation needs to be completed; the covariance matrix after augmentation becomes:

$$P_k \leftarrow \begin{bmatrix} I_{15+6N} \\ J \end{bmatrix} P_k \begin{bmatrix} I_{15+6N} \\ J \end{bmatrix}^T,$$

where $J$ is the Jacobian matrix of the newly added camera error state with respect to the original system error state, $I_{15+6N}$ is the identity matrix of dimension $15+6N$, and $N$ is the number of camera states at the time a given feature point is observed.

When the navigation information is updated in the UAV integrated navigation system, the Kalman gain $K$, the innovation $\tilde{z}$, and the covariance state update matrix are, respectively:

$$K = P_{k|k-1} H^T \left( H P_{k|k-1} H^T + R \right)^{-1},$$

$$\tilde{z} = z - H \hat{X}_{k|k-1}, \qquad P_{k|k} = (I - K H) P_{k|k-1} (I - K H)^T + K R K^T,$$

where $H = \begin{bmatrix} T_H^T & H_{pol}^T \end{bmatrix}^T$, $T_H$ is the output matrix of the visual measurement equation, $H_{pol}$ is the output matrix of the polarization measurement equation, $P_{k|k-1}$ is the covariance matrix at time $k$, $R$ is the measurement noise matrix, $z = \begin{bmatrix} r_n^T & z_{pol}^T \end{bmatrix}^T$ stacks the visual measurement and the polarization measurement, and $I$ is the identity matrix with the same dimension as $KH$.
Matters not described in detail in this specification belong to the prior art known to those skilled in the art. The above is only a specific embodiment of the invention, but the protection scope of the invention is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed by the invention shall be covered by the protection scope of the invention. Therefore, the protection scope of the invention shall be subject to the protection scope of the claims.
Claims (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---
CN202310474059.3A | 2023-04-28 | 2023-04-28 | Combined navigation method for a compound-eye-inspired polarized-vision UAV in a weak-light environment
Publications (2)
Publication Number | Publication Date |
---|---
CN116182855A | 2023-05-30
CN116182855B | 2023-07-07
Family
ID=86452727
Family Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202310474059.3A | 2023-04-28 | 2023-04-28 | Combined navigation method for a compound-eye-inspired polarized-vision UAV in a weak-light environment (Active)
Country Status (1)
Country | Link |
---|---
CN (1) | CN116182855B
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---
US20140022539A1 (en) * | 2012-07-23 | 2014-01-23 | Trimble Navigation Limited | Use of a sky polarization sensor for absolute orientation determination in position determining systems |
CN106767752A (en) * | 2016-11-25 | 2017-05-31 | 北京航空航天大学 | A kind of Combinated navigation method based on polarization information |
CN108387206A (en) * | 2018-01-23 | 2018-08-10 | 北京航空航天大学 | A kind of carrier three-dimensional attitude acquisition method based on horizon and polarised light |
CN111504312A (en) * | 2020-07-02 | 2020-08-07 | 中国人民解放军国防科技大学 | Unmanned aerial vehicle pose estimation method based on visual inertial polarized light fusion |
CN112066979A (en) * | 2020-08-27 | 2020-12-11 | 北京航空航天大学 | Polarization pose information coupling iteration autonomous navigation positioning method |
CN112444245A (en) * | 2020-11-17 | 2021-03-05 | 大连理工大学 | Insect-imitated vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor |
CN113739795A (en) * | 2021-06-03 | 2021-12-03 | 东北电力大学 | Underwater synchronous positioning and mapping method based on polarized light/inertia/vision combined navigation |
CN113819907A (en) * | 2021-11-22 | 2021-12-21 | 北京航空航天大学 | An Inertial/Polarization Navigation Method Based on Polarization and Sun Dual Vector Switching |
CN113819904A (en) * | 2021-11-22 | 2021-12-21 | 北京航空航天大学 | polarization/VIO three-dimensional attitude determination method based on zenith vector |
CN113834483A (en) * | 2021-11-23 | 2021-12-24 | 北京航空航天大学 | An Inertial/Polarization/Geomagnetic Fault Tolerant Navigation Method Based on Observability |
Non-Patent Citations (4)
Title
---
SHANPENG WANG et al., "A Bioinspired Navigation System for Multirotor UAV by Integrating Polarization Compass/Magnetometer/INS/GNSS", IEEE Transactions on Industrial Electronics, vol. 70, no. 8, pp. 8526-8536, XP011936621, DOI: 10.1109/TIE.2022.3212421
LIU Jie, "Research on Binocular Stereo Vision Application Technology Based on Polarization Imaging" (基于偏振成像的双目立体视觉应用技术研究), China Doctoral Dissertations Full-text Database, no. 5
ZHANG Xiao et al., "New Technologies for Autonomous Navigation and Control in Strong-Interference Environments" (强干扰环境下的自主导航与控制新技术), Automation Panorama, no. 4, pp. 68-72
HU Hanpei, "Research on Integrated Navigation Combining Polarized Light, IMU and Monocular Vision" (偏振光、IMU和单目视觉的组合导航研究), China Master's Theses Full-text Database, no. 2
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---
CN117191047A (en) * | 2023-11-03 | 2023-12-08 | 南京信息工程大学 | Unmanned aerial vehicle self-adaptive active visual navigation method and device in low-light environment |
CN117191047B (en) * | 2023-11-03 | 2024-02-23 | 南京信息工程大学 | Unmanned aerial vehicle self-adaptive active visual navigation method and device in low-light environment |
Also Published As
Publication number | Publication date |
---|---
CN116182855B (en) | 2023-07-07
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109307508B (en) | Panoramic inertial navigation SLAM method based on multiple key frames | |
CN109540126B (en) | An Inertial Vision Integrated Navigation Method Based on Optical Flow Method | |
CN112556719B (en) | Visual inertial odometer implementation method based on CNN-EKF | |
CN110068335B (en) | A method and system for real-time positioning of UAV swarms in GPS-denied environment | |
CN105865454B (en) | A kind of Navigation of Pilotless Aircraft method generated based on real-time online map | |
CN112304307A (en) | Positioning method and device based on multi-sensor fusion and storage medium | |
CN111121767A (en) | A GPS-integrated robot vision-inertial navigation combined positioning method | |
CN112683281B (en) | A joint localization method for autonomous vehicles based on vehicle kinematics | |
CN115574816B (en) | Bionic vision multi-source information intelligent perception unmanned platform | |
CN106989744A (en) | A kind of rotor wing unmanned aerial vehicle autonomic positioning method for merging onboard multi-sensor | |
CN111288989B (en) | Visual positioning method for small unmanned aerial vehicle | |
CN113551665B (en) | A highly dynamic motion state perception system and perception method for motion carriers | |
CN110887486B (en) | Unmanned aerial vehicle visual navigation positioning method based on laser line assistance | |
CN113807435B (en) | Multi-sensor-based remote sensing image feature point elevation acquisition method | |
CN115406447B (en) | Autonomous positioning method of quad-rotor unmanned aerial vehicle based on visual inertia in rejection environment | |
CN115479602A (en) | A Visual-Inertial Odometry Method Fused with Event and Distance | |
CN116989772B (en) | An air-ground multi-modal multi-agent collaborative positioning and mapping method | |
CN115560760A (en) | Unmanned aerial vehicle-oriented vision/laser ranging high-altitude navigation method | |
CN113240597B (en) | Three-dimensional software image stabilizing method based on visual inertial information fusion | |
CN114964276A (en) | Dynamic vision SLAM method fusing inertial navigation | |
CN109785428B (en) | A Handheld 3D Reconstruction Method Based on Polymorphic Constrained Kalman Filter | |
CN118274817A (en) | Monocular panoramic vision and inertia combined initialization system for on-line space-time calibration | |
CN113345032A (en) | Wide-angle camera large-distortion image based initial image construction method and system | |
CN117073720A (en) | Method and equipment for quick visual inertia calibration and initialization under weak environment and weak action control | |
CN113503872A (en) | Low-speed unmanned vehicle positioning method based on integration of camera and consumption-level IMU |
Legal Events
Date | Code | Title | Description |
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |