
CN104808231B - UAV positioning method based on fusion of GPS and optical flow sensor data - Google Patents

Info

Publication number
CN104808231B
CN104808231B (application CN201510104096.0A)
Authority
CN
China
Prior art keywords
GPS
UAV
information
optical flow
position information
Prior art date
Legal status
Active
Application number
CN201510104096.0A
Other languages
Chinese (zh)
Other versions
CN104808231A (en)
Inventor
鲜斌
曹美会
张旭
Current Assignee
Tianjin University
Original Assignee
Tianjin University
Priority date
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN201510104096.0A priority Critical patent/CN104808231B/en
Publication of CN104808231A publication Critical patent/CN104808231A/en
Application granted granted Critical
Publication of CN104808231B publication Critical patent/CN104808231B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42: Determining position
    • G01S19/45: Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

The present invention relates to a UAV positioning method intended to achieve autonomous UAV positioning. The technical scheme adopted by the present invention, a UAV positioning method based on the fusion of GPS and optical flow sensor data, comprises the following steps: the velocity information of the UAV is obtained using an optical flow sensor mounted on the bottom of a quadrotor UAV; the position information of the UAV is obtained using GPS and the reliability of that position information is judged; and the above information is fused using a complementary filter, for use in the positioning system of the UAV. The present invention is mainly applied to UAV positioning.

Description

UAV positioning method based on fusion of GPS and optical flow sensor data

Technical Field

The invention relates to a UAV positioning method, and in particular to an autonomous UAV positioning method based on the fusion of GPS and optical flow sensor data.

Background Art

The UAV positioning problem mainly refers to determining, with the vehicle's own sensors, the position and attitude of the UAV relative to an inertial coordinate frame in the flight environment. Accurate pose estimation is the premise and basis for complex flight tasks of quadrotor UAVs such as safe flight, trajectory planning, and target tracking.

The UAV navigation systems in wide use today are mainly based on GPS positioning, but their positioning accuracy is low. UAV positioning systems based on the optical flow method, widely adopted in recent years, achieve higher positioning accuracy, but their position estimates drift substantially as time accumulates.

Researchers at the National University of Singapore used an algorithm combining a Kalman filter with a complementary filter, together with an H∞ robust filtering algorithm, to successfully fuse the UAV position and velocity information obtained from GPS. This overcame the discontinuity of the GPS position information and yielded comparatively accurate and smooth UAV position estimates, but, limited by the output accuracy of the GPS data, the positioning accuracy was about ±1 meter (Conference: IEEE International Conference on Control and Automation; Authors: Yun B, Peng K, Chen B M; Year: 2007; Title: Enhancement of GPS Signals for Automatic Control of a UAV Helicopter System; Pages: 1185-1189) (Journal: Transactions of the Institute of Measurement and Control; Authors: Yun B, Cai G, Chen B M; Year: 2011; Title: GPS Signal Enhancement and Attitude Determination for a Mini and Low-cost Unmanned Aerial Vehicle; Pages: 665-682).

Researchers at ETH Zurich used the PX4FLOW optical flow sensor as a position measurement unit in positioning control systems for indoor and outdoor UAVs. Although this group carried out indoor trajectory-tracking experiments based on the optical flow method, their published figures show that, after long-distance integration of the UAV velocity measured by the optical flow sensor, the maximum positioning error exceeded 0.5 meters while tracking two consecutive laps of a rectangular trajectory with sides of about 3 meters, and the position estimate drifts considerably as the errors accumulate over time (Conference: IEEE International Conference on Robotics and Automation; Authors: Honegger D, Meier L, Tanskanen P; Year: 2013; Title: An Open Source and Open Hardware Embedded Metric Optical Flow CMOS Camera for Indoor and Outdoor Applications; Pages: 1736-1741).

In addition, researchers at ETH Zurich also used the optical flow method as an auxiliary position measurement unit in a UAV positioning control system, but because the integration inherent in the optical flow method produces large position drift over long-distance motion, there was a large positioning error between two consecutive laps of the rectangular trajectory (Conference: IEEE/RSJ International Conference on Intelligent Robots and Systems; Authors: Fraundorfer F, Heng L, Honegger D; Year: 2012; Title: Vision-based Autonomous Mapping and Exploration Using a Quadrotor MAV; Pages: 4557-4564).

Summary of the Invention

To overcome the deficiencies of the prior art and achieve autonomous UAV positioning, the technical scheme adopted by the present invention is a UAV positioning method based on the fusion of GPS and optical flow sensor data, comprising the following steps:

The velocity information of the UAV is obtained using an optical flow sensor mounted on the bottom of a quadrotor UAV; the position information of the UAV is obtained using GPS and the reliability of that position information is judged; and the above information is fused using a complementary filter, for use in the positioning system of the UAV.

The acquisition and processing of the UAV velocity information using the optical flow sensor is as follows:

The optical flow method directly measures the horizontal velocity of the UAV, so its position must be obtained by integration. At the same time, owing to other unmeasurable factors, brief erroneous measurements occur during velocity measurement and must be deliberately filtered out.

The position information obtained by the GPS method and the velocity information obtained by the optical flow method are simplified into the following form:

y_p = p + μ_p

y_v = v + μ_v

where p and v are the true position and velocity measured by the GPS method and the optical flow method respectively, and μ_p and μ_v are the measurement noise of the position information and velocity information respectively, both constant.

The acquisition of the UAV position information by GPS and the judgment of its reliability is as follows:

Because GPS information is unstable, it must be checked: if fewer than 5 satellites are used in the positioning process, the UAV position information obtained by this method is considered insufficiently accurate.

The fusion of the above information using the complementary filter is specifically as follows. The frequency-domain expression of the output p̂(s), obtained by passing the measurements through the complementary filter, is written in the following form:

p̂(s) = [C(s)/(s + C(s))]·y_p(s) + [s/(C(s) + s)]·(y_v(s)/s) = T(s)·y_p(s) + S(s)·(y_v(s)/s)

where s is the Laplace operator, T(s) is a first-order low-pass filter, S(s) is a first-order high-pass filter, T(s) + S(s) = 1, and y_p(s) denotes the position information obtained by the sensor in the frequency domain. The complementary filter controller uses proportional feedback, i.e. C(s) = k. In this case, the dynamic equation of the closed-loop system is:

dp̂/dt = y_v + k·(y_p - p̂)

The filter frequency-domain expressions are then T(s) = k/(s + k) and S(s) = s/(s + k), and the designed filter cutoff frequency is f_T = k/(2π). In the high-frequency band above f_T, the optical flow sensor data dominates the fusion result; in the low-frequency band below f_T, the GPS data dominates; k is a constant.

Compared with the prior art, the technical features and effects of the present invention are:

The present invention uses a complementary filtering algorithm to fuse the UAV position information obtained by GPS and by the optical flow method, obtaining high-precision, accurate UAV position information over a long time span and meeting the needs of long-duration autonomous UAV flight control.

Brief Description of the Drawings

Fig. 1 is a structural block diagram of the complementary filter used in the present invention;

Fig. 2 is a schematic diagram of the horizontal trajectory of the UAV, in which:

Fig. 2a is the horizontal trajectory of the UAV obtained using GPS;

Fig. 2b is the horizontal trajectory of the UAV obtained using the optical flow sensor;

Fig. 2c is the horizontal trajectory of the UAV obtained using the data fusion algorithm.

Detailed Description of Embodiments

The technical problem to be solved by the present invention is to provide an autonomous UAV positioning method based on the fusion of GPS and optical flow sensor data, achieving accurate, drift-free positioning of UAVs in outdoor environments.

The technical scheme adopted by the present invention applies the fusion of GPS and optical flow sensor data to the positioning system of a UAV, and comprises the following steps:

The velocity information of the UAV is obtained using an optical flow sensor mounted on the bottom of a quadrotor UAV; the position information of the UAV is obtained using GPS and the reliability of that position information is judged; and the above data are fused using a complementary filter, for use in the positioning system of the UAV.

The acquisition and processing of the UAV velocity information using the optical flow sensor is as follows:

The optical flow method directly measures the horizontal velocity of the UAV, so its position must be obtained by integration. At the same time, owing to other unmeasurable factors, brief erroneous measurements occur during velocity measurement and must be deliberately filtered out.

The acquisition of the UAV position information by GPS and the judgment of its reliability is as follows:

Because GPS information is unstable and its signal strength is affected by solar activity and other factors, the GPS information must be checked: if fewer than 5 satellites are used in the positioning process, the UAV position information obtained by this method is considered insufficiently accurate.

The data fusion using the complementary filter is as follows:

A complementary filtering algorithm is used to fuse the UAV position information obtained by GPS and by the optical flow method, obtaining high-precision, accurate UAV position information over a long time span and meeting the needs of long-duration autonomous UAV flight control. The data fusion algorithm based on the complementary filter is introduced below.

Because positioning algorithms based on GPS and the optical flow method can only directly or indirectly obtain the horizontal position of the UAV, only the fusion of the UAV position information in the horizontal direction is considered in the present invention.

Ideally, the UAV position p = [x y]^T and the corresponding velocity v = [v_x v_y]^T satisfy:

dp/dt = v

where x and y are the UAV position in the X and Y directions respectively, v_x and v_y are the UAV velocity in the X and Y directions respectively, and the left-hand side is the first derivative of p.

In actual measurement, however, owing to the limited accuracy of the sensors themselves and the presence of external disturbances, the measurements often contain substantial noise and interference. The position information obtained by the GPS method and the velocity information obtained by the optical flow method are therefore simplified into the following form:

y_p = p + μ_p

y_v = v + μ_v

where p and v are the true position and velocity measured by the GPS method and the optical flow method respectively, and μ_p and μ_v are the measurement noise of the position information and velocity information respectively, both constant.

The structural block diagram of the complementary filtering algorithm is shown in Fig. 1. From this block diagram, the frequency-domain expression of the output p̂(s), obtained by passing the measurements through the complementary filter, can be written in the following form:

p̂(s) = [C(s)/(s + C(s))]·y_p(s) + [s/(C(s) + s)]·(y_v(s)/s) = T(s)·y_p(s) + S(s)·(y_v(s)/s)

where s is the Laplace operator, T(s) is a first-order low-pass filter, S(s) is a first-order high-pass filter, T(s) + S(s) = 1, and y_p(s) denotes the position information obtained by the sensor in the frequency domain.

In a practical control system, according to the low-pass characteristic of y_p(s) and the high-pass characteristic of y_v(s), the complementary filter controller is designed as proportional feedback, i.e. C(s) = k. In this case, the dynamic equation of the closed-loop system is obtained as:

dp̂/dt = y_v + k·(y_p - p̂)

The filter frequency-domain expressions are then T(s) = k/(s + k) and S(s) = s/(s + k), and the designed filter cutoff frequency is f_T = k/(2π). In the high-frequency band above f_T, the optical flow sensor data dominates the fusion result; in the low-frequency band below f_T, the GPS data dominates; k is a constant.

The autonomous UAV positioning method of the present invention based on the fusion of GPS and optical flow sensor data is described in detail below with reference to the embodiments and the accompanying drawings.

The present invention combines the advantages of GPS-based positioning and optical-flow-based positioning, fully exploiting the long-term, long-distance positioning capability of GPS and the high short-term positioning accuracy of the optical flow method. A complementary filtering method is used to fuse the two positioning algorithms, and corresponding experimental verification was performed in an outdoor environment with a hand-carried quadrotor UAV, achieving accurate positioning of a quadrotor UAV based on multi-sensor data fusion.

The autonomous UAV positioning method of the present invention based on the fusion of GPS and optical flow sensor data comprises the following steps:

1) Obtaining the UAV velocity information using the optical flow sensor and processing it:

The optical flow method directly measures the horizontal velocity of the UAV, so its position must be obtained by integration. At the same time, owing to other unmeasurable factors, brief erroneous measurements occur during velocity measurement and must be deliberately filtered out.
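Step 1) can be sketched as follows. This is a minimal illustration only, not the patented implementation: the patent states that brief erroneous velocity measurements must be filtered out but does not fix a method, so the MAX_SPEED plausibility bound, the hold-last-good strategy, and the sample period are all assumptions.

```python
import math

# Assumed plausibility bound for a hand-carried quadrotor; not from the patent.
MAX_SPEED = 5.0  # m/s

def integrate_flow(samples, dt):
    """samples: list of (vx, vy) optical-flow velocity readings taken every
    dt seconds. Returns the integrated (x, y) position, substituting the
    last valid velocity whenever a reading exceeds the plausibility bound."""
    x = y = 0.0
    last_good = (0.0, 0.0)
    for vx, vy in samples:
        if math.hypot(vx, vy) > MAX_SPEED:   # brief erroneous measurement
            vx, vy = last_good               # reuse the last valid sample
        else:
            last_good = (vx, vy)
        x += vx * dt                         # position by integration
        y += vy * dt
    return x, y
```

Any bias left in the velocity after this gate still accumulates through the integral, which is exactly the drift the fusion step below is meant to remove.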

2) Obtaining the UAV position information using GPS and judging its reliability:

Because GPS information is unstable and its signal strength is affected by solar activity and other factors, the GPS information must be checked: if fewer than 5 satellites are used in the positioning process, the UAV position information obtained by this method is considered insufficiently accurate.
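A minimal sketch of the reliability gate in step 2), assuming the receiver reports the number of satellites used in the fix. The 5-satellite threshold is the one stated above; the GpsFix record itself is a hypothetical container, not an interface from the patent.

```python
from dataclasses import dataclass

MIN_SATELLITES = 5  # threshold stated in the method

@dataclass
class GpsFix:
    x: float            # east position, m
    y: float            # north position, m
    num_satellites: int # satellites used in this fix

def gps_reliable(fix: GpsFix) -> bool:
    """The position is considered accurate enough only when at least
    MIN_SATELLITES satellites contributed to the fix."""
    return fix.num_satellites >= MIN_SATELLITES
```

When the gate rejects a fix, a fusion loop would typically coast on the optical-flow path alone until GPS recovers.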

3) Data fusion using the complementary filter:

A complementary filtering algorithm is used to fuse the UAV position information obtained by GPS and by the optical flow method, obtaining high-precision, accurate UAV position information over a long time span and meeting the needs of long-duration autonomous UAV flight control. The data fusion algorithm based on the complementary filter is introduced below.

Because positioning algorithms based on GPS and the optical flow method can only directly or indirectly obtain the horizontal position of the UAV, only the fusion of the UAV position information in the horizontal direction is considered in this section.

Ideally, the UAV position p = [x y]^T and the corresponding velocity v = [v_x v_y]^T satisfy:

dp/dt = v

where x and y are the UAV position in the X and Y directions respectively, v_x and v_y are the UAV velocity in the X and Y directions respectively, and the left-hand side is the first derivative of p.

In actual measurement, however, owing to the limited accuracy of the sensors themselves and the presence of external disturbances, the measurements often contain substantial noise and interference. The position information obtained by the GPS method and the velocity information obtained by the optical flow method are therefore simplified into the following form:

y_p = p + μ_p

y_v = v + μ_v

where p and v are the true position and velocity measured by the GPS method and the optical flow method respectively, and μ_p and μ_v are the measurement noise of the position information and velocity information respectively, both constant.

The structural block diagram of the complementary filtering algorithm is shown in Fig. 1. From this block diagram, the frequency-domain expression of the output p̂(s), obtained by passing the measurements through the complementary filter, can be written in the following form:

p̂(s) = [C(s)/(s + C(s))]·y_p(s) + [s/(C(s) + s)]·(y_v(s)/s) = T(s)·y_p(s) + S(s)·(y_v(s)/s)

where s is the Laplace operator, T(s) is a first-order low-pass filter, S(s) is a first-order high-pass filter, T(s) + S(s) = 1, and y_p(s) denotes the position information obtained by the sensor in the frequency domain.

In a practical control system, according to the low-pass characteristic of y_p(s) and the high-pass characteristic of y_v(s), the complementary filter controller is designed as proportional feedback, i.e. C(s) = k. In this case, the dynamic equation of the closed-loop system is obtained as:

dp̂/dt = y_v + k·(y_p - p̂)

The filter frequency-domain expressions are then T(s) = k/(s + k) and S(s) = s/(s + k), and the designed filter cutoff frequency is f_T = k/(2π). In the high-frequency band above f_T, the optical flow sensor data dominates the fusion result; in the low-frequency band below f_T, the GPS data dominates.
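The closed-loop fusion law dp̂/dt = y_v + k·(y_p - p̂) can be sketched in discrete time with forward-Euler integration. The gain k, the sample period dt, and the initialization from the first GPS fix are illustrative assumptions; the patent fixes only the filter structure and the cutoff f_T = k/(2π).

```python
def fuse(yp_seq, yv_seq, k=1.0, dt=0.02):
    """yp_seq: GPS position samples y_p; yv_seq: optical-flow velocity
    samples y_v, both taken every dt seconds. Returns the fused position
    estimates p_hat, one per sample."""
    p_hat = yp_seq[0]  # assumed initialization from the first GPS fix
    out = []
    for yp, yv in zip(yp_seq, yv_seq):
        p_dot = yv + k * (yp - p_hat)  # closed-loop dynamic equation
        p_hat += p_dot * dt            # forward-Euler integration
        out.append(p_hat)
    return out
```

With k = 1 the cutoff is f_T = k/(2π) ≈ 0.16 Hz: slow GPS trends pass through the low-pass path and anchor the estimate, while fast motion sensed by the optical flow dominates the transients.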

A specific example is given below:

1. System Hardware Connection and Configuration

The vision-based autonomous flight control method for a quadrotor UAV of the present invention adopts a flight control structure based on an embedded architecture. The experimental platform built comprises the quadrotor UAV body, a ground station, a remote controller, and so on. The quadrotor UAV carries an embedded computer (with an Intel Core i3 dual-core processor at 1.8 GHz), an onboard PX4FLOW optical flow sensor, GPS, and a flight controller (including an inertial navigation unit, a barometer module, etc.). The ground station comprises a laptop computer running the Linux operating system, used to launch the onboard programs and for remote monitoring. The platform can be taken off and landed manually with the remote controller, and can be switched to manual mode in an emergency to ensure experimental safety.

2. Flight Experiment Results

In this embodiment, several groups of flight control experiments were carried out on the above experimental platform in an outdoor campus environment. The control goal was to achieve accurate, drift-free positioning of the quadrotor UAV; the trajectory in the experiment was a rectangle of approximately 85 m by 135 m.

The data curves of the UAV during the outdoor hand-carried experiment are shown in Fig. 2, with the UAV starting at the point (0, 0). Fig. 2a is the horizontal position trajectory obtained using GPS, Fig. 2b is the trajectory obtained using the optical flow sensor, and Fig. 2c is the trajectory obtained using the data fusion algorithm proposed herein. As the figures show, at the point (0, 100) the GPS signal was poor, so the UAV position obtained from GPS fluctuates considerably there, whereas the fused position fluctuates much less, which demonstrates the effectiveness of the fusion algorithm. In addition, the start and end points of the GPS-derived trajectory essentially coincide, with an error within 5 meters; the position obtained by the optical flow method drifts considerably over the long run, with an error of about 29 meters from the starting point; with the fusion algorithm proposed herein, the positioning error is about 11 meters, demonstrating the effectiveness of the fusion algorithm.

Claims (2)

1. A UAV positioning method based on the fusion of GPS and optical flow sensor data, characterized by comprising the following steps: the velocity information of the UAV is obtained using an optical flow sensor mounted on the bottom of a quadrotor UAV; the position information of the UAV is obtained using GPS and the reliability of that position information is judged; and the UAV velocity information obtained by the optical flow sensor and the UAV position information obtained by GPS are fused using a complementary filter, for use in the positioning system of the UAV; the acquisition of the UAV position information by GPS and the judgment of its reliability is: because GPS information is unstable, it must be checked, and if fewer than 5 satellites are used in the positioning process, the UAV position information obtained by this method is considered insufficiently accurate;
the fusion by the complementary filter of the UAV velocity information obtained by the optical flow sensor and the UAV position information obtained by GPS is specifically: the frequency-domain expression of the output p̂(s), obtained by passing the measurements through the complementary filter, is written in the following form:

p̂(s) = [C(s)/(s + C(s))]·y_p(s) + [s/(C(s) + s)]·(y_v(s)/s) = T(s)·y_p(s) + S(s)·(y_v(s)/s),

where s is the Laplace operator, T(s) is a first-order low-pass filter, S(s) is a first-order high-pass filter, T(s) + S(s) = 1, and y_p(s) denotes the position information obtained by the sensor in the frequency domain; the complementary filter controller uses proportional feedback, i.e. C(s) = k; the position information obtained by the GPS method and the velocity information obtained by the optical flow method are simplified into the following form:

y_p = p + μ_p, y_v = v + μ_v,

where p and v are the true position and velocity measured by the GPS method and the optical flow method respectively, and μ_p and μ_v are the measurement noise of the position information and velocity information respectively, both constant; in this case, the dynamic equation of the closed-loop system is obtained as:

dp̂/dt = y_v + k·(y_p - p̂).

The filter frequency-domain expressions are then T(s) = k/(s + k) and S(s) = s/(s + k), and the designed filter cutoff frequency is f_T = k/(2π); in the high-frequency band above f_T the optical flow sensor data dominates the fusion result, and in the low-frequency band below f_T the GPS data dominates; k is a constant.
2. The UAV positioning method based on the fusion of GPS and optical flow sensor data of claim 1, characterized in that the acquisition and processing of the UAV velocity information by the optical flow sensor is: the optical flow method directly measures the horizontal velocity of the UAV, so its position must be obtained by integration; moreover, owing to other unmeasurable factors, brief erroneous measurements occur during velocity measurement and must be deliberately filtered out.
CN201510104096.0A 2015-03-10 2015-03-10 Unmanned plane localization method based on GPS Yu light stream Data Fusion of Sensor Active CN104808231B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510104096.0A CN104808231B (en) 2015-03-10 2015-03-10 Unmanned plane localization method based on GPS Yu light stream Data Fusion of Sensor

Publications (2)

Publication Number Publication Date
CN104808231A CN104808231A (en) 2015-07-29
CN104808231B true CN104808231B (en) 2017-07-11

Family

ID=53693217

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510104096.0A Active CN104808231B (en) 2015-03-10 2015-03-10 Unmanned plane localization method based on GPS Yu light stream Data Fusion of Sensor

Country Status (1)

Country Link
CN (1) CN104808231B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105352495B (en) * 2015-11-17 2018-03-23 天津大学 Acceleration and light stream Data Fusion of Sensor unmanned plane horizontal velocity control method
WO2017143553A1 (en) * 2016-02-25 2017-08-31 汪禹 Dispatching/receiving nest used for delivery of objects by unmanned aerial vehicle
JP6799444B2 (en) * 2016-04-01 2020-12-16 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Autonomous mobile system
CN105974934A (en) * 2016-06-24 2016-09-28 天津理工大学 Air quality intelligent monitoring quadrotor formation system based on pilotage-following method
CN106647784A (en) * 2016-11-15 2017-05-10 天津大学 Miniaturized unmanned aerial vehicle positioning and navigation method based on Beidou navigation system
CN106843257A (en) * 2017-04-07 2017-06-13 深圳蚁石科技有限公司 A kind of autocontrol method of aircraft
CN107389968B (en) * 2017-07-04 2020-01-24 武汉视览科技有限公司 Unmanned aerial vehicle fixed point implementation method and device based on optical flow sensor and acceleration sensor
CN109975844B (en) * 2019-03-25 2020-11-24 浙江大学 An anti-drift method of GPS signal based on optical flow method
CN110825117A (en) * 2019-12-12 2020-02-21 扬州大学 A high-precision UAV system and intelligent control method
CN114018241B (en) * 2021-11-03 2023-12-26 广州昂宝电子有限公司 Positioning method and device for unmanned aerial vehicle

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102506892A (en) * 2011-11-08 2012-06-20 北京航空航天大学 Configuration method for information fusion of a plurality of optical flow sensors and inertial navigation device
CN102829779A (en) * 2012-09-14 2012-12-19 北京航空航天大学 Aircraft multi-optical flow sensor and inertia navigation combination method
CN103365297A (en) * 2013-06-29 2013-10-23 天津大学 Optical flow-based four-rotor unmanned aerial vehicle flight control method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004012258A (en) * 2002-06-06 2004-01-15 Hitachi Industrial Equipment Systems Co Ltd Remote positioning device, remote positioning method, and computer software


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"A low-cost and robust optical flow CMOS camera for velocity estimation"; Ke Sun et al.; Proceedings of the IEEE International Conference on Robotics and Biomimetics (ROBIO); 2013-12-31; pp. 1181-1186 *
"Adding Optical Flow into the GPS/INS Integration for UAV navigation"; Weidong Ding et al.; International Global Navigation Satellite Systems Society, IGNSS Symposium 2009; 2009-12-31; pp. 1-13 *
"Complementary-filter-based attitude estimation for a quadrotor aircraft"; Liang Yande et al.; Transducer and Microsystem Technologies; 2011-12-31; Vol. 30, No. 11; pp. 56-59 *
"Multi-information nonlinear fusion navigation method and implementation for a micro quadrotor aircraft"; Liu Jianye et al.; Journal of Nanjing University of Aeronautics & Astronautics; 2013-10-31; Vol. 45, No. 5; Section 1 and Fig. 1 on p. 576, part (3) of Section 2.2 on p. 578, first paragraph of column 2 on p. 580 *


Similar Documents

Publication Publication Date Title
CN104808231B (en) Unmanned plane localization method based on GPS Yu light stream Data Fusion of Sensor
CN105352495B (en) Acceleration and light stream Data Fusion of Sensor unmanned plane horizontal velocity control method
CN101858748B (en) Fault-tolerance autonomous navigation method of multi-sensor of high-altitude long-endurance unmanned plane
CN111045454B (en) Unmanned aerial vehicle self-driving instrument based on bionic autonomous navigation
CN104062977B (en) Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM
CN102692225B (en) Attitude heading reference system for low-cost small unmanned aerial vehicle
CN103853156B (en) A kind of small-sized four-rotor aircraft control system based on machine set sensor and method
Wang et al. Bearing-only visual SLAM for small unmanned aerial vehicles in GPS-denied environments
CN105094138A (en) Low-altitude autonomous navigation system for rotary-wing unmanned plane
TWI558617B (en) Unmanned flight vehicle autonomous flight computer system and control method
CN104503467A (en) Autonomous take-off and landing flight control system of unmanned aerial vehicle based on dual-core architecture
CN103697889A (en) A UAV autonomous navigation and positioning method based on multi-model distributed filtering
CN106647784A (en) Miniaturized unmanned aerial vehicle positioning and navigation method based on Beidou navigation system
CN101201627A (en) A method for automatic correction of UAV heading based on magnetic heading sensor
Dorobantu et al. An airborne experimental test platform: From theory to flight
CN108444468B (en) A directional compass integrating downward vision and inertial navigation information
CN102175882B (en) Visual speed measurement method for unmanned helicopter based on natural landmarks
Iwaneczko et al. A prototype of unmanned aerial vehicle for image acquisition
CN204390044U (en) A kind of device optimizing unmanned plane during flying record
CN103712598A (en) Attitude determination system and method of small unmanned aerial vehicle
Youn et al. Model-aided synthetic airspeed estimation of UAVs for analytical redundancy
Steed et al. Algebraic dominant pole placement methodology for unmanned aircraft systems with time delay
Meola et al. Flight control system for small-size unmanned aerial vehicles: Design and software-in-the-loop validation
Denuelle et al. Biologically-inspired visual stabilization of a rotorcraft UAV in unknown outdoor environments
CN101430565A (en) Integrated single loop controller for camera optical axis stable tracing

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant