CN115880328B - A method for real-time localization of moving aerial targets using dual-satellite collaborative observation - Google Patents
- Publication number: CN115880328B (application CN202211201760.XA)
- Authority: CN (China)
- Prior art keywords: remote sensing satellite, target, coordinate system, air moving
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a method for real-time positioning of an aerial moving target through dual-satellite collaborative observation, comprising: step 1, using a target detection and tracking algorithm to detect the particle coordinates of the target from the aerial-moving-target sequence images captured simultaneously at the current moment by two optical remote sensing satellites; step 2, constructing an aerial-moving-target trajectory fusion model according to the optical remote sensing satellite imaging mechanism and the motion characteristics of the aerial target; step 3, adopting a sliding trajectory-solving method that uses the target's particle coordinates on the two satellites' sequence images to construct standard error equations and iteratively solve the trajectory parameters of the aerial target; and step 4, computing the spatial position and velocity of the aerial moving target at the current moment from the trajectory parameters. The invention eliminates, at the model level, the influence of the dual-satellite time synchronization error on the positioning accuracy of the aerial moving target, achieving high-precision real-time positioning of aerial moving targets under dual-satellite collaborative observation.
Description
Technical Field
The invention belongs to the technical field of geometric processing of optical remote sensing satellite data, and in particular relates to a method for real-time positioning of aerial moving targets through dual-satellite collaborative observation.
Background
From the perspective of precise positioning, moving targets can be divided into three main types: sea-surface moving targets, ground moving targets, and aerial moving targets. For an aerial moving target, the flight altitude changes continuously and is unknown, so the target's elevation is difficult to know in advance, and a single observation from a single satellite cannot locate the target with high precision. Unlike a static target, an aerial moving target is in motion, so repeated observations from a single satellite also cannot achieve high-precision positioning. Given that an aerial moving target has unknown elevation and is in motion, two or more remote sensing satellites must observe it collaboratively, with time synchronization and from different viewing angles, before high-precision positioning becomes possible.
In practice, two satellites can hardly achieve strict time synchronization: influenced by factors such as the imaging frame rate and timing control of the optical remote sensing satellites, a time synchronization error exists whenever two satellites collaboratively observe an aerial moving target. Conventional models and methods for positioning targets in optical remote sensing satellite images are aimed mainly at static targets and need not consider the dual-satellite time synchronization error. For an aerial moving target, however, applying the positioning model and method of a conventional static target ignores this synchronization error, and the resulting position inevitably deviates from the true position. Existing models and methods therefore have obvious shortcomings in real-time positioning of aerial moving targets and can hardly achieve high-precision positioning.
Disclosure of Invention
The invention overcomes the shortcomings of the prior art by providing a method for real-time positioning of aerial moving targets that uses dual-satellite collaborative observation to construct a target trajectory fusion model, thereby achieving high-precision positioning of aerial moving targets.
The technical scheme of the invention is a method for real-time positioning of an aerial moving target through dual-satellite collaborative observation, comprising the following steps:
1. Using a target detection and tracking algorithm, detect the particle coordinates of the target in the image coordinate system from the aerial-moving-target sequence images captured at the current moment by the first and second remote sensing satellites respectively;
2. Construct an aerial-moving-target trajectory fusion model for each of the first and second remote sensing satellites, according to the optical remote sensing satellite imaging mechanism and the position, velocity, and acceleration information of the aerial target's motion;
3. Adopt a sliding trajectory-solving method: for the aerial target particle coordinates detected on the first remote sensing satellite's image at the current moment, construct standard error equations using the aerial target particle coordinates on the preceding N consecutive frames of the first satellite and the preceding M consecutive frames of the second satellite, and iteratively solve the parameters of the trajectory fusion model at the current moment, where M > 1 and N > 1;
4. Substitute the trajectory fusion model parameters obtained in step 3 into the position equation to solve the spatial position (X, Y, Z) of the aerial moving target at the current moment, and into the velocity equation to solve its velocity (V_X, V_Y, V_Z).
Further, the target detection and tracking algorithm in the first step is a time domain target detection and tracking algorithm or a space domain target detection and tracking algorithm.
Further, in step 2, the aerial-moving-target trajectory fusion model constructed for the first remote sensing satellite is:

$$\begin{bmatrix} X_0+\delta_1 t+\delta_2 t^2 \\ Y_0+\beta_1 t+\beta_2 t^2 \\ Z_0+\theta_1 t+\theta_2 t^2 \end{bmatrix} = \begin{bmatrix} X_{S,1} \\ Y_{S,1} \\ Z_{S,1} \end{bmatrix} + \lambda_1\, R_{J2000}^{WGS84}\, R_{star,1}^{J2000}\, R_{cam,1} \begin{bmatrix} \tan\psi_{x,1} \\ \tan\psi_{y,1} \\ 1 \end{bmatrix}$$

and the model for the second remote sensing satellite is obtained by replacing subscript 1 with subscript 2.
In the above formula, t is the imaging time; X_0, δ_1, δ_2 are respectively the initial value of the aerial target's X-direction position and the 1st-order and 2nd-order coefficients of that position with respect to t; Y_0, β_1, β_2 and Z_0, θ_1, θ_2 are the corresponding parameters for the Y-direction and Z-direction positions; (X_{S,1}, Y_{S,1}, Z_{S,1}) are the spatial coordinates of the first remote sensing satellite in the WGS84 coordinate system, measured by the on-board GNSS equipment, and (X_{S,2}, Y_{S,2}, Z_{S,2}) are those of the second remote sensing satellite; λ_1 is the first scale factor and λ_2 the second scale factor;
R_{J2000}^{WGS84} is the rotation matrix from the J2000 coordinate system to the WGS84 coordinate system; R_{star,1}^{J2000} and R_{star,2}^{J2000} are the rotation matrices from the first and second remote sensing satellites' attitude measurement coordinate systems to the J2000 coordinate system; the cameras on the first and second remote sensing satellites are referred to as the first and second cameras, and R_{cam,1} and R_{cam,2} are the mounting matrices of the first and second cameras in the respective satellite attitude measurement coordinate systems;
(ψ_{x,1}, ψ_{y,1}) are the pointing angles of the first camera's imaging detector element along the orbit direction and the cross-orbit direction in the first satellite's attitude measurement coordinate system, and (ψ_{x,2}, ψ_{y,2}) are those of the second camera's imaging detector element in the second satellite's attitude measurement coordinate system;
The origin of the first remote sensing satellite's attitude measurement coordinate system lies at the centroid of the attitude measurement equipment on that satellite; its X axis points along the satellite's flight direction, its Z axis points toward the Earth's center, and its Y axis is perpendicular to both, forming a right-handed system. The second remote sensing satellite's attitude measurement coordinate system is defined in the same way with respect to the second satellite.
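The trajectory fusion model above can be sketched in code. The following Python fragment is an illustrative sketch only (the function names and the parameter ordering are assumptions, not part of the patent): it evaluates the quadratic trajectory and the right-hand side of the collinearity-style model, i.e. the satellite position plus the scaled line-of-sight vector rotated from the camera frame to WGS84.

```python
import numpy as np

def trajectory_position(params, t):
    """Evaluate the quadratic motion-trajectory model at imaging time t.
    params = [X0, d1, d2, Y0, b1, b2, Z0, th1, th2] (assumed ordering)."""
    X0, d1, d2, Y0, b1, b2, Z0, th1, th2 = params
    return np.array([X0 + d1*t + d2*t**2,
                     Y0 + b1*t + b2*t**2,
                     Z0 + th1*t + th2*t**2])

def collinearity_rhs(sat_pos, R_j2w, R_b2j, R_cam, psi_x, psi_y, lam):
    """Right-hand side of the fusion model: satellite WGS84 position plus
    the scale factor times the line-of-sight vector [tan(psi_x), tan(psi_y), 1]
    rotated camera -> attitude frame -> J2000 -> WGS84."""
    los_cam = np.array([np.tan(psi_x), np.tan(psi_y), 1.0])
    return sat_pos + lam * (R_j2w @ R_b2j @ R_cam @ los_cam)
```

With the nine trajectory parameters solved, equating `trajectory_position(params, t)` to `collinearity_rhs(...)` for each satellite reproduces the two fusion-model equations.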
Further, the pointing angles tan ψ_{x,1} (along-orbit) and tan ψ_{y,1} (cross-orbit) of the first camera's imaging detector element in the first satellite's attitude measurement coordinate system, and tan ψ_{x,2} and tan ψ_{y,2} of the second camera in the second satellite's attitude measurement coordinate system, are calculated as follows:

$$\tan\psi_{x,1} = h_{0,1} + h_{1,1}s_1 + h_{2,1}l_1 + h_{3,1}s_1 l_1 + h_{4,1}s_1^2 + h_{5,1}l_1^2 + h_{6,1}s_1^2 l_1 + h_{7,1}s_1 l_1^2 + h_{8,1}s_1^3 + h_{9,1}l_1^3$$
$$\tan\psi_{y,1} = k_{0,1} + k_{1,1}s_1 + k_{2,1}l_1 + k_{3,1}s_1 l_1 + k_{4,1}s_1^2 + k_{5,1}l_1^2 + k_{6,1}s_1^2 l_1 + k_{7,1}s_1 l_1^2 + k_{8,1}s_1^3 + k_{9,1}l_1^3$$

and analogously for the second satellite with (s_2, l_2) and coefficients h_{·,2}, k_{·,2}.
In the above formulas, (s_1, l_1) are the image-coordinate-system coordinates of the target particle detected on the first remote sensing satellite and (s_2, l_2) those detected on the second; s_1 and s_2 are the column numbers, and l_1 and l_2 the row numbers, of the target particle on the first and second satellites' images respectively;
h_{0,1} is the constant term of the first satellite's along-orbit pointing angle tan ψ_{x,1}, and h_{1,1}, …, h_{9,1} are the coefficients of the terms s_1, l_1, s_1 l_1, s_1², l_1², s_1² l_1, s_1 l_1², s_1³, l_1³; k_{0,1} and k_{1,1}, …, k_{9,1} are the corresponding constant term and coefficients for the cross-orbit pointing angle tan ψ_{y,1};
h_{0,2}, h_{1,2}, …, h_{9,2} and k_{0,2}, k_{1,2}, …, k_{9,2} are defined in the same way for the second remote sensing satellite's along-orbit and cross-orbit pointing angles.
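As a sketch, the pointing-angle polynomial can be evaluated as follows. This is illustrative Python, not the patent's code; the monomial ordering beyond the terms s, l, and s·l explicitly listed in the text is an assumption (the ten coefficients h0…h9 match the ten monomials of a full bivariate cubic).

```python
def tan_pointing_angle(h, s, l):
    """Evaluate tan(psi) = h0 + h1*s + h2*l + h3*s*l + h4*s^2 + h5*l^2
    + h6*s^2*l + h7*s*l^2 + h8*s^3 + h9*l^3 for a detector element at
    image column s and row l; h is the list of 10 calibration coefficients."""
    return (h[0] + h[1]*s + h[2]*l + h[3]*s*l + h[4]*s**2 + h[5]*l**2
            + h[6]*s**2*l + h[7]*s*l**2 + h[8]*s**3 + h[9]*l**3)
```

The same function serves both the along-orbit (h coefficients) and cross-orbit (k coefficients) angles of either satellite.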
Further, define

$$[u_1,\; v_1,\; w_1]^T = R_{J2000}^{WGS84}\, R_{star,1}^{J2000}\, R_{cam,1} \begin{bmatrix} \tan\psi_{x,1} \\ \tan\psi_{y,1} \\ 1 \end{bmatrix}$$

where u_1, v_1, w_1 are the first, second, and third components of the column vector obtained from the matrix product above. The first scale factor λ_1 is calculated from the following ellipsoid equation:

$$\frac{(X_{S,1}+\lambda_1 u_1)^2}{A^2} + \frac{(Y_{S,1}+\lambda_1 v_1)^2}{A^2} + \frac{(Z_{S,1}+\lambda_1 w_1)^2}{B^2} = 1$$

In the above formula, A = a_e + h, B = b_e + h, a_e = 6378137.0 m, b_e = 6356752.3 m, and h is the ellipsoidal height.
Further, define

$$[u_2,\; v_2,\; w_2]^T = R_{J2000}^{WGS84}\, R_{star,2}^{J2000}\, R_{cam,2} \begin{bmatrix} \tan\psi_{x,2} \\ \tan\psi_{y,2} \\ 1 \end{bmatrix}$$

where u_2, v_2, w_2 are the first, second, and third components of the column vector obtained from the matrix product above. The second scale factor λ_2 is calculated from the following ellipsoid equation:

$$\frac{(X_{S,2}+\lambda_2 u_2)^2}{A^2} + \frac{(Y_{S,2}+\lambda_2 v_2)^2}{A^2} + \frac{(Z_{S,2}+\lambda_2 w_2)^2}{B^2} = 1$$

In the above formula, A = a_e + h, B = b_e + h, a_e = 6378137.0 m, b_e = 6356752.3 m, and h is the ellipsoidal height.
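Expanding the ellipsoid equation gives a quadratic in λ, which can be solved in closed form. A minimal Python sketch under stated assumptions (the function name and the choice of the smaller positive root, i.e. the ellipsoid intersection nearer the satellite, are mine, not the patent's):

```python
import math

def scale_factor(sat_pos, uvw, h=0.0):
    """Solve ((Xs + lam*u)/A)^2 + ((Ys + lam*v)/A)^2 + ((Zs + lam*w)/B)^2 = 1
    for the scale factor lam, with A = a_e + h and B = b_e + h."""
    A = 6378137.0 + h
    B = 6356752.3 + h
    Xs, Ys, Zs = sat_pos
    u, v, w = uvw
    # quadratic qa*lam^2 + qb*lam + qc = 0 from expanding the ellipsoid equation
    qa = (u*u + v*v) / A**2 + w*w / B**2
    qb = 2.0 * ((Xs*u + Ys*v) / A**2 + Zs*w / B**2)
    qc = (Xs*Xs + Ys*Ys) / A**2 + Zs*Zs / B**2 - 1.0
    disc = qb*qb - 4.0*qa*qc
    if disc < 0.0:
        raise ValueError("line of sight does not intersect the ellipsoid")
    roots = [(-qb - math.sqrt(disc)) / (2.0*qa),
             (-qb + math.sqrt(disc)) / (2.0*qa)]
    # the intersection nearer the satellite is the smaller positive root
    return min(r for r in roots if r > 0.0)
```

For example, a satellite 1000 km above the equator on the X axis, looking straight down along (-1, 0, 0), gives λ equal to that 1000 km slant range.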
Furthermore, the numbers of preceding images from the first and second remote sensing satellites used in step 3 must satisfy M + N ≥ 9, providing sufficient redundant observations for the nine unknown trajectory parameters.
Further, the sliding trajectory-solving method in step 3 comprises the following steps:
S1. Define:

$$R_1 = R_{J2000}^{WGS84}\, R_{star,1}^{J2000}\, R_{cam,1},\qquad R_2 = R_{J2000}^{WGS84}\, R_{star,2}^{J2000}\, R_{cam,2}$$

R_1 and R_2 are orthogonal matrices.
Rearranging the trajectory fusion models of the first and second remote sensing satellites and multiplying both sides of each by the corresponding R⁻¹ gives:

$$R_i^{-1}\begin{bmatrix} X_0+\delta_1 t+\delta_2 t^2 - X_{S,i} \\ Y_0+\beta_1 t+\beta_2 t^2 - Y_{S,i} \\ Z_0+\theta_1 t+\theta_2 t^2 - Z_{S,i} \end{bmatrix} = \lambda_i \begin{bmatrix} \tan\psi_{x,i} \\ \tan\psi_{y,i} \\ 1 \end{bmatrix},\qquad i = 1, 2$$

Dividing the first and second components of each vector equation by the third eliminates λ_1 and λ_2; through these arithmetic transformations one obtains, writing $\bar X_i$ for the left-hand vector above:

$$F_{x,i} = \tan\psi_{x,i} - \frac{\big(R_i^{-1}\bar X_i\big)_1}{\big(R_i^{-1}\bar X_i\big)_3},\qquad F_{y,i} = \tan\psi_{y,i} - \frac{\big(R_i^{-1}\bar X_i\big)_2}{\big(R_i^{-1}\bar X_i\big)_3},\qquad i = 1, 2$$

Here F_{x,1} and F_{x,2} are the residual function models of the first and second satellites' detector pointing angles along the orbit direction, and F_{y,1} and F_{y,2} are those along the cross-orbit direction.
S2. For each target particle coordinate of the aerial moving target on the first and second remote sensing satellites' sequence images, construct a standard error equation of the form:

V = AX − L, with weight matrix P

In the above formula, V is the residual vector, of the form V = [V_{x,1,i}, V_{y,1,i}, V_{x,2,j}, V_{y,2,j}]ᵀ, i.e. the result of evaluating the right-hand side of the standard error equation;
a is a design matrix formed by unknown partial derivatives, and the representation form is as follows:
Subscripts 1 and 2 respectively represent a first remote sensing satellite and a second remote sensing satellite, wherein a subscript i is a frame number of a first remote sensing satellite sequence image where an air moving target is located, i=1, 2, and N, a subscript j is a frame number of a second remote sensing satellite sequence image where the air moving target is located, j=1, 2, and M;
X is the matrix of unknowns, calculated iteratively according to the least squares adjustment principle by

X = (AᵀPA)⁻¹(AᵀPL)

The form of X is:

X = [X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2]ᵀ

When the standard error equation is solved for the first time, X is given an initial value;
L is the constant-term matrix obtained by substituting the current value of the unknown matrix X into the expressions for F_{x,1}, F_{x,2}, F_{y,1}, F_{y,2}:

L = [F_{x,1,i}, F_{y,1,i}, F_{x,2,j}, F_{y,2,j}]ᵀ

P is the observation weight matrix of the target particle coordinates and is taken as the identity matrix.
S3. When the iteration stops, the aerial target trajectory parameters X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2 are obtained according to the least squares adjustment principle.
Further, the iteration of X is judged to have converged when the difference between the values of X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2 obtained in two successive iterations is within 1×10⁻⁶; the iteration then stops, and the value of the unknown matrix X from the last iteration is taken as the solved trajectory fusion model parameters.
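The iterative least squares solution can be sketched as follows. This is illustrative Python with identity weights (P = I), and `build_A_L` is a hypothetical callback, assumed to return the design matrix A and constant vector L linearised at the current parameter estimate; it is not part of the patent.

```python
import numpy as np

def iterate_least_squares(build_A_L, x0, tol=1e-6, max_iter=50):
    """Iterate X = (A^T P A)^-1 (A^T P L) with P = I, stopping when every
    parameter changes by less than tol between successive iterations."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        A, L = build_A_L(x)
        # normal equations with identity weight matrix
        x_new = np.linalg.solve(A.T @ A, A.T @ L)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x
```

For the nine-parameter trajectory model, `x0` would be the zero vector and each call to `build_A_L` would stack the 2(M+N) rows contributed by the pointing-angle residuals of both satellites' frames.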
Further, the method for solving the spatial position (X, Y, Z) and velocity (V_X, V_Y, V_Z) of the aerial moving target at the current moment in step 4 is to substitute the solved trajectory fusion model parameters X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2 into the following position and velocity equations:

$$X = X_0 + \delta_1 t + \delta_2 t^2,\quad Y = Y_0 + \beta_1 t + \beta_2 t^2,\quad Z = Z_0 + \theta_1 t + \theta_2 t^2$$
$$V_X = \delta_1 + 2\delta_2 t,\quad V_Y = \beta_1 + 2\beta_2 t,\quad V_Z = \theta_1 + 2\theta_2 t$$
On the basis of the conventional static-target positioning model for optical remote sensing satellite images, the invention extends and constructs an aerial-moving-target trajectory fusion model, which eliminates at the model level the influence of the dual-satellite time synchronization error on target positioning accuracy; a sliding trajectory-solving strategy then obtains the spatial position and velocity of the aerial moving target at the current moment in real time, achieving high-precision real-time positioning of aerial moving targets under dual-satellite collaborative observation.
Compared with the prior art, the invention has the advantages that:
(1) The invention extends the conventional static-target positioning model into an aerial-moving-target trajectory fusion model, which eliminates at the model level the influence of the dual-satellite time synchronization error on positioning accuracy and achieves high-precision positioning of aerial moving targets in optical remote sensing satellite images.
(2) The sliding trajectory-solving method uses the target's particle information from several preceding moments to add redundant observations to the trajectory solution at the current moment; this improves positioning accuracy while still allowing real-time positioning, meeting the high timeliness requirements of aerial-moving-target positioning.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention;
Fig. 2 is a schematic diagram of a motion trail sliding solution of an air moving object according to the present invention.
Detailed Description
In order to more clearly illustrate the embodiments of the present invention and the technical solutions in the prior art, specific embodiments of the invention are described below with reference to the accompanying drawings.
The method for real-time positioning of an aerial moving target through dual-satellite collaborative observation provided by this embodiment is shown in Fig. 1 and specifically comprises the following steps:
1. Using a target detection and tracking algorithm, detect the particle coordinates of the target in the image coordinate system from the aerial-moving-target sequence images captured at the current moment by the first and second remote sensing satellites respectively. In this embodiment, a time-domain target detection and tracking algorithm is used to acquire the target's particle coordinates in the image coordinate system.
2. According to the optical remote sensing satellite imaging mechanism, namely the collinearity condition equation of photogrammetry and the optical physical imaging principle, together with the position, velocity, and acceleration motion characteristics of the aerial target, construct the aerial-moving-target trajectory fusion model for the first and second remote sensing satellites:

$$\begin{bmatrix} X_0+\delta_1 t+\delta_2 t^2 \\ Y_0+\beta_1 t+\beta_2 t^2 \\ Z_0+\theta_1 t+\theta_2 t^2 \end{bmatrix} = \begin{bmatrix} X_{S,i} \\ Y_{S,i} \\ Z_{S,i} \end{bmatrix} + \lambda_i\, R_{J2000}^{WGS84}\, R_{star,i}^{J2000}\, R_{cam,i} \begin{bmatrix} \tan\psi_{x,i} \\ \tan\psi_{y,i} \\ 1 \end{bmatrix},\qquad i = 1, 2$$

Here t is the imaging time; X_0, δ_1, δ_2 are the initial value of the aerial target's X-direction position and its 1st- and 2nd-order coefficients with respect to t, and Y_0, β_1, β_2 and Z_0, θ_1, θ_2 the corresponding parameters for the Y and Z directions; (X_{S,1}, Y_{S,1}, Z_{S,1}) and (X_{S,2}, Y_{S,2}, Z_{S,2}) are the spatial coordinates of the first and second remote sensing satellites in the WGS84 coordinate system, measured by the on-board GNSS equipment; λ_1 and λ_2 are the first and second scale factors. Each satellite's attitude measurement coordinate system has its origin at the centroid of the attitude measurement equipment, its X axis along the satellite's flight direction, its Z axis pointing to the Earth's center, and its Y axis perpendicular to both, forming a right-handed system. R_{star,1}^{J2000} and R_{star,2}^{J2000} are the rotation matrices from the first and second satellites' attitude measurement coordinate systems to the J2000 coordinate system; R_{J2000}^{WGS84} is the rotation matrix from the J2000 coordinate system to the WGS84 coordinate system; R_{cam,1} and R_{cam,2} are the mounting matrices of the first and second cameras in the respective attitude measurement coordinate systems; (ψ_{x,1}, ψ_{y,1}) and (ψ_{x,2}, ψ_{y,2}) are the pointing angles of the first and second cameras' imaging detector elements along the orbit and cross-orbit directions in the respective attitude measurement coordinate systems.
The along-orbit pointing angles tan ψ_{x,1}, tan ψ_{x,2} and the cross-orbit pointing angles tan ψ_{y,1}, tan ψ_{y,2} are calculated as follows:

$$\tan\psi_{x,1} = h_{0,1} + h_{1,1}s_1 + h_{2,1}l_1 + h_{3,1}s_1 l_1 + h_{4,1}s_1^2 + h_{5,1}l_1^2 + h_{6,1}s_1^2 l_1 + h_{7,1}s_1 l_1^2 + h_{8,1}s_1^3 + h_{9,1}l_1^3$$
$$\tan\psi_{y,1} = k_{0,1} + k_{1,1}s_1 + k_{2,1}l_1 + k_{3,1}s_1 l_1 + k_{4,1}s_1^2 + k_{5,1}l_1^2 + k_{6,1}s_1^2 l_1 + k_{7,1}s_1 l_1^2 + k_{8,1}s_1^3 + k_{9,1}l_1^3$$

and analogously for the second satellite with (s_2, l_2). In the above formulas, (s_1, l_1) and (s_2, l_2) are the image-coordinate-system coordinates of the target particle detected on the first and second remote sensing satellites; s_1 and s_2 are the column numbers, and l_1 and l_2 the row numbers, of the target particle on the respective images; h_{0,1} and k_{0,1} are the constant terms, and h_{1,1}, …, h_{9,1} and k_{1,1}, …, k_{9,1} the polynomial coefficients, of the first satellite's along-orbit and cross-orbit pointing angles; h_{0,2}, h_{1,2}, …, h_{9,2} and k_{0,2}, k_{1,2}, …, k_{9,2} are defined analogously for the second satellite.
The first and second scale factors λ_1 and λ_2 are calculated as follows. Define

$$[u_1,\; v_1,\; w_1]^T = R_{J2000}^{WGS84}\, R_{star,1}^{J2000}\, R_{cam,1} \begin{bmatrix} \tan\psi_{x,1} \\ \tan\psi_{y,1} \\ 1 \end{bmatrix}$$

where u_1, v_1, w_1 are the first, second, and third components of the column vector given by the matrix product; λ_1 is then obtained from the ellipsoid equation

$$\frac{(X_{S,1}+\lambda_1 u_1)^2}{A^2} + \frac{(Y_{S,1}+\lambda_1 v_1)^2}{A^2} + \frac{(Z_{S,1}+\lambda_1 w_1)^2}{B^2} = 1$$

where A = a_e + h, B = b_e + h, a_e = 6378137.0 m, b_e = 6356752.3 m, and h is the ellipsoidal height. Similarly, defining [u_2, v_2, w_2]ᵀ with the second satellite's rotation matrices and pointing angles, λ_2 is obtained from the same ellipsoid equation with subscript 2.
3. Adopt the sliding trajectory-solving method: for the aerial target particle coordinates detected on one satellite's image at the current moment, construct the standard error equations using the aerial target particle coordinates on the preceding N consecutive frames of that satellite and the preceding M consecutive frames of the other satellite, and iteratively solve the trajectory fusion model parameters X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2, with M + N ≥ 9.
The motion trail sliding solving method comprises the following steps:
S1. From the established trajectory fusion models and the calculated pointing angles, define R_1 = R_{J2000}^{WGS84} R_{star,1}^{J2000} R_{cam,1} and R_2 = R_{J2000}^{WGS84} R_{star,2}^{J2000} R_{cam,2}; R_1 and R_2 are orthogonal matrices.
Rearranging the two satellites' trajectory fusion models and multiplying both sides of each by the corresponding R⁻¹ gives:

$$R_i^{-1}\begin{bmatrix} X_0+\delta_1 t+\delta_2 t^2 - X_{S,i} \\ Y_0+\beta_1 t+\beta_2 t^2 - Y_{S,i} \\ Z_0+\theta_1 t+\theta_2 t^2 - Z_{S,i} \end{bmatrix} = \lambda_i \begin{bmatrix} \tan\psi_{x,i} \\ \tan\psi_{y,i} \\ 1 \end{bmatrix},\qquad i = 1, 2$$

Dividing the first and second components of each equation by the third eliminates λ_1 and λ_2, yielding the residual function models F_{x,1}, F_{x,2} of the first and second satellites' detector pointing angles along the orbit direction and F_{y,1}, F_{y,2} of those along the cross-orbit direction.
S2. For each particle coordinate of the aerial moving target on the first and second remote sensing satellites' sequence images, construct a standard error equation of the form:

V = AX − L, with weight matrix P

In the above formula, V is the residual vector, obtained by evaluating the right-hand side of the standard error equation;
A is the design matrix formed from the partial derivatives of the residual functions F_{x,1,i}, F_{y,1,i}, F_{x,2,j}, F_{y,2,j} with respect to the nine unknowns, where subscripts 1 and 2 denote the first and second remote sensing satellites, subscript i (i = 1, 2, …, N) is the frame number of the first satellite's sequence image containing the aerial moving target, and subscript j (j = 1, 2, …, M) is the frame number of the second satellite's sequence image containing it.
X is the matrix of unknowns, calculated iteratively according to the least squares adjustment principle; its form is

X = [X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2]ᵀ

At the start of the calculation an initial value of X must be given; in this embodiment the initial value of X is [0 0 0 0 0 0 0 0 0]ᵀ. The value of the unknown matrix X changes with each iteration of the standard error equation.
L is the constant-term matrix obtained by substituting the current iteration's value of the unknown matrix X into the expressions for F_{x,1}, F_{x,2}, F_{y,1}, F_{y,2}:

L = [F_{x,1,i}, F_{y,1,i}, F_{x,2,j}, F_{y,2,j}]ᵀ

When the iteration starts, the initial value of X is substituted into F_{x,1}, F_{x,2}, F_{y,1}, F_{y,2} to obtain the initial value of L, after which the next iteration can begin.
P is the observation weight matrix of the target particle coordinates and is taken as the identity matrix.
During the iteration, the unknown matrix X is calculated according to the least squares adjustment principle by

X = (AᵀPA)⁻¹(AᵀPL)
S3. When the iteration stops, the aerial target trajectory parameters X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2 are obtained according to the least squares adjustment principle.
The iteration is judged to have converged when the difference between the values of X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2 obtained in two successive iterations is within 1×10⁻⁶; the iteration then stops, and the value of the unknown matrix X from the last iteration is taken as the solved trajectory fusion model parameters.
4. The spatial position (X, Y, Z) and velocity (V_X, V_Y, V_Z) of the aerial moving target at the current moment are solved from the trajectory fusion model parameters obtained in step 3 as follows:

$$X = X_0 + \delta_1 t + \delta_2 t^2,\quad Y = Y_0 + \beta_1 t + \beta_2 t^2,\quad Z = Z_0 + \theta_1 t + \theta_2 t^2$$
$$V_X = \delta_1 + 2\delta_2 t,\quad V_Y = \beta_1 + 2\beta_2 t,\quad V_Z = \theta_1 + 2\theta_2 t$$

Substituting the imaging time t of each moment yields the motion trajectory of the aerial moving target.
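The step-4 evaluation amounts to evaluating the quadratic model and its time derivative. A minimal Python sketch (the function name and the parameter ordering are assumptions, not from the patent):

```python
def position_velocity(params, t):
    """Evaluate position (X, Y, Z) and velocity (VX, VY, VZ) of the aerial
    target at imaging time t from the nine trajectory parameters.
    params = [X0, d1, d2, Y0, b1, b2, Z0, th1, th2]."""
    X0, d1, d2, Y0, b1, b2, Z0, th1, th2 = params
    pos = (X0 + d1*t + d2*t**2, Y0 + b1*t + b2*t**2, Z0 + th1*t + th2*t**2)
    vel = (d1 + 2*d2*t, b1 + 2*b2*t, th1 + 2*th2*t)  # time derivative of pos
    return pos, vel
```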
As shown in Fig. 2, the principle of real-time positioning of the aerial moving target by the sliding trajectory-solving method is as follows: for the aerial-moving-target particle p_6 detected on the first remote sensing satellite's image at the current time t_6, together with the particle p_5 detected on the second remote sensing satellite's image, the parameters X_0, δ_1, δ_2, Y_0, β_1, β_2, Z_0, θ_1, θ_2 of the target's trajectory L_6 at time t_6 are accurately solved by combining the target particles p_2 and p_4 on consecutive preceding frames of the first satellite (only two frames are listed here) and the target particles p_1 and p_3 on consecutive preceding frames of the second satellite (only two frames are listed here). Similarly, for the aerial target particle p_7 at time t_7, the parameters of the trajectory L_7 at time t_7 can be jointly solved using the target particles p_3, p_5, and p_7 together with p_2, p_4, and p_6.
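The sliding-window bookkeeping behind Fig. 2 can be illustrated as follows. This is an illustrative sketch only; the indexing convention (which frames each satellite contributes per solve) is an assumption, and only the constraint M + N ≥ 9 comes from the patent.

```python
def sliding_windows(n_frames, N, M):
    """For each current frame k, collect the preceding N frame indices from
    satellite 1 and the preceding M frame indices from satellite 2 that feed
    the standard error equations of that solve."""
    assert M + N >= 9, "need at least nine observations for nine unknowns"
    windows = []
    for k in range(max(N, M), n_frames):
        sat1 = list(range(k - N, k))  # N preceding frames on satellite 1
        sat2 = list(range(k - M, k))  # M preceding frames on satellite 2
        windows.append((k, sat1, sat2))
    return windows
```

As the current moment advances from t_6 to t_7, the window slides forward by one frame on each satellite, so each new solve reuses most of the previous observations.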
Although the present invention has been described in terms of preferred embodiments, it is not limited to those embodiments. Any person skilled in the art may make possible variations and modifications to the technical solution of the invention using the methods and technical content disclosed above without departing from its spirit and scope; therefore, any simple modifications, equivalent variations, and modifications of the above embodiments based on the technical substance of the invention fall within the protection scope of the invention.
Claims (10)
1. A real-time positioning method for an air moving target through double-star collaborative observation is characterized by comprising the following steps:
1. Detecting particle coordinates of the target under an image coordinate system from aerial moving target sequence images shot at the current moment of the first remote sensing satellite and the second remote sensing satellite respectively by utilizing a target detection and tracking algorithm;
2. Respectively constructing an aerial moving target motion track fusion model on the first remote sensing satellite and the second remote sensing satellite according to an optical remote sensing satellite imaging mechanism and aerial target motion position, speed and acceleration information;
3. Using the motion trajectory sliding solving method: for the aerial target particle coordinates detected on the first remote sensing satellite image at the current moment, construct standard error equations using the aerial target particle coordinates on the N consecutive frames before the current moment on the first remote sensing satellite and the M consecutive frames before the current moment on the second remote sensing satellite, and iteratively solve the aerial target motion trajectory fusion model parameters at the current moment, where M > 1 and N > 1;
4. Substituting the aerial target motion trajectory fusion model parameters obtained in step three into the position equation to solve the spatial position (X, Y, Z) of the air moving target at the current moment, and into the velocity equation to solve the velocity (VX, VY, VZ) of the air moving target at the current moment.
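As an illustrative sketch (not part of the claimed method), the step-four position and velocity follow directly from the zeroth-, first- and second-order trajectory parameters defined in the claims; the function names are hypothetical.

```python
def position(params, t):
    """Evaluate the quadratic trajectory model at time t (step-four
    position equation): each axis is initial value + 1st-order + 2nd-order term."""
    X0, d1, d2, Y0, b1, b2, Z0, q1, q2 = params
    return (X0 + d1 * t + d2 * t**2,
            Y0 + b1 * t + b2 * t**2,
            Z0 + q1 * t + q2 * t**2)

def velocity(params, t):
    """Time derivative of the trajectory model (step-four velocity equation)."""
    _, d1, d2, _, b1, b2, _, q1, q2 = params
    return (d1 + 2 * d2 * t, b1 + 2 * b2 * t, q1 + 2 * q2 * t)

# Hypothetical solved parameters (X0, δ1, δ2, Y0, β1, β2, Z0, θ1, θ2)
params = (1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 5.0, 0.0, 1.0)
pos = position(params, 2.0)
vel = velocity(params, 2.0)
```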
2. The method for positioning an air moving target in real time through double-star collaborative observation according to claim 1, wherein the target detection and tracking algorithm in the first step is a time domain target detection and tracking algorithm or a space domain target detection and tracking algorithm.
3. The method for positioning the air moving target in real time through double-star collaborative observation according to claim 1, wherein the air moving target motion trajectory fusion model constructed for the first remote sensing satellite in step two is:

[X0 + δ1·t + δ2·t², Y0 + β1·t + β2·t², Z0 + θ1·t + θ2·t²]ᵀ = [XS,1, YS,1, ZS,1]ᵀ + λ1 · R(J2000→WGS84) · R(att,1→J2000) · R(cam,1) · [tan ψ(x,1), tan ψ(y,1), 1]ᵀ

and the air moving target motion trajectory fusion model constructed for the second remote sensing satellite is:

[X0 + δ1·t + δ2·t², Y0 + β1·t + β2·t², Z0 + θ1·t + θ2·t²]ᵀ = [XS,2, YS,2, ZS,2]ᵀ + λ2 · R(J2000→WGS84) · R(att,2→J2000) · R(cam,2) · [tan ψ(x,2), tan ψ(y,2), 1]ᵀ
In the above formulas, t is the imaging time; X0, δ1, δ2 are respectively the initial value of the X-direction position of the aerial target, the first-order term parameter of the X-direction position with respect to t, and the second-order term parameter of the X-direction position with respect to t; Y0, β1, β2 are respectively the initial value, first-order term and second-order term parameters of the Y-direction position of the aerial target; Z0, θ1, θ2 are respectively the initial value, first-order term and second-order term parameters of the Z-direction position of the aerial target; (XS,1, YS,1, ZS,1) are the spatial coordinates of the first remote sensing satellite in the WGS84 coordinate system, measured by the on-board GNSS measuring equipment; (XS,2, YS,2, ZS,2) are the spatial coordinates of the second remote sensing satellite in the WGS84 coordinate system; λ1 is the first scale factor, and λ2 is the second scale factor;
R(J2000→WGS84) is the rotation matrix from the J2000 coordinate system to the WGS84 coordinate system; R(att,1→J2000) is the rotation matrix from the first remote sensing satellite attitude measurement coordinate system to the J2000 coordinate system, and R(att,2→J2000) is the rotation matrix from the second remote sensing satellite attitude measurement coordinate system to the J2000 coordinate system; the camera on the first remote sensing satellite is referred to as the first camera, and the camera on the second remote sensing satellite is referred to as the second camera; R(cam,1) is the placement matrix of the first camera in the first remote sensing satellite attitude measurement coordinate system, and R(cam,2) is the placement matrix of the second camera in the second remote sensing satellite attitude measurement coordinate system;
ψ(x,1) and ψ(y,1) are the pointing angles of the first camera imaging probe element along the orbit direction and the vertical-orbit direction, respectively, in the first remote sensing satellite attitude measurement coordinate system; ψ(x,2) and ψ(y,2) are the pointing angles of the second camera imaging probe element along the orbit direction and the vertical-orbit direction, respectively, in the second remote sensing satellite attitude measurement coordinate system;
The origin of the first remote sensing satellite attitude measurement coordinate system is located at the centroid of the attitude measurement equipment on the first remote sensing satellite, the X axis is the flight direction of the first remote sensing satellite, the Z axis points to the earth center, the Y axis is perpendicular to the Z axis and the X axis to form a right hand system, the origin of the second remote sensing satellite attitude measurement coordinate system is located at the centroid of the attitude measurement equipment on the second remote sensing satellite, the X axis is the flight direction of the second remote sensing satellite, the Z axis points to the earth center, and the Y axis is perpendicular to the Z axis and the X axis to form the right hand system.
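The chained rotations of claim 3 can be sketched as follows; this is an illustrative sketch, not the claimed implementation. The tan() form of the camera-frame line-of-sight vector and all function names are assumptions (the patent's equation images are not reproduced in this text), and the identity matrices stand in for real attitude and ephemeris data.

```python
import math

def matvec(m, v):
    """Multiply a 3x3 matrix (list of rows) by a 3-vector."""
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def los_wgs84(psi_x, psi_y, R_j2000_wgs84, R_att_j2000, R_cam):
    """Chain the camera placement, attitude-to-J2000 and J2000-to-WGS84
    rotations to express the probe-element line of sight in WGS84."""
    u_cam = [math.tan(psi_x), math.tan(psi_y), 1.0]  # assumed camera-frame form
    v = matvec(R_cam, u_cam)        # camera -> attitude measurement frame
    v = matvec(R_att_j2000, v)      # attitude measurement frame -> J2000
    return matvec(R_j2000_wgs84, v)  # J2000 -> WGS84

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# Boresight case: zero pointing angles, all rotations identity
los = los_wgs84(0.0, 0.0, identity, identity, identity)
```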
4. The method for positioning an air moving target through double-satellite collaborative observation according to claim 3, wherein the along-orbit pointing angle ψ(x,1) and the vertical-orbit pointing angle ψ(y,1) of the first camera imaging probe element in the first remote sensing satellite attitude measurement coordinate system, and the along-orbit pointing angle ψ(x,2) and the vertical-orbit pointing angle ψ(y,2) of the second camera imaging probe element in the second remote sensing satellite attitude measurement coordinate system, are calculated according to the following method:
In the above formula, (s1, l1) are the coordinates in the image coordinate system of the target particle detected by the first remote sensing satellite, and (s2, l2) are the coordinates in the image coordinate system of the target particle detected by the second remote sensing satellite; s1 and s2 are the column numbers of the target particle coordinates on the first and second remote sensing satellite images, respectively, and l1 and l2 are the row numbers of the target particle coordinates on the first and second remote sensing satellite images, respectively;
h(0,1) is the constant term of the first remote sensing satellite's along-orbit pointing angle ψ(x,1), and h(1,1), h(2,1), h(3,1), h(4,1), h(5,1), h(6,1), h(7,1), h(8,1), h(9,1) are the coefficients of the terms s1, l1, s1·l1 and the higher-order terms of ψ(x,1); k(0,1) is the constant term of the first remote sensing satellite's vertical-orbit pointing angle ψ(y,1), and k(1,1), k(2,1), k(3,1), k(4,1), k(5,1), k(6,1), k(7,1), k(8,1), k(9,1) are the coefficients of the terms s1, l1, s1·l1 and the higher-order terms of ψ(y,1);
h(0,2) is the constant term of the second remote sensing satellite's along-orbit pointing angle ψ(x,2), and h(1,2), h(2,2), h(3,2), h(4,2), h(5,2), h(6,2), h(7,2), h(8,2), h(9,2) are the coefficients of the terms s2, l2, s2·l2 and the higher-order terms of ψ(x,2); k(0,2) is the constant term of the second remote sensing satellite's vertical-orbit pointing angle ψ(y,2), and k(1,2), k(2,2), k(3,2), k(4,2), k(5,2), k(6,2), k(7,2), k(8,2), k(9,2) are the coefficients of the terms s2, l2, s2·l2 and the higher-order terms of ψ(y,2).
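Claim 4 evaluates each pointing angle as a polynomial in the image column s and row l. A minimal sketch, assuming a third-order basis for the terms beyond s, l and s·l (those higher-order terms are not legible in this text, so the basis order here is an assumption):

```python
def pointing_angle(coeffs, s, l):
    """Evaluate a pointing-angle calibration polynomial with constant term
    coeffs[0] and nine further coefficients.  Basis assumed:
    s, l, s*l, s**2, l**2, s**2*l, s*l**2, s**3, l**3
    (only s, l and s*l are legible in the claim text)."""
    basis = [1.0, s, l, s * l, s**2, l**2, s**2 * l, s * l**2, s**3, l**3]
    return sum(c * b for c, b in zip(coeffs, basis))
```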
5. The method for positioning the air moving target through double-star collaborative observation according to claim 3, characterized by defining

[u1, v1, w1]ᵀ = R(J2000→WGS84) · R(att,1→J2000) · R(cam,1) · [tan ψ(x,1), tan ψ(y,1), 1]ᵀ
u1, v1 and w1 are the first, second and third components of the column vector obtained by the above matrix product, and the first scale factor λ1 is calculated according to the following ellipsoid equation:

(XS,1 + λ1·u1)²/A² + (YS,1 + λ1·v1)²/A² + (ZS,1 + λ1·w1)²/B² = 1
In the above formula, A = ae + h, B = be + h, ae = 6378137.0 m, be = 6356752.3 m, and h is the ellipsoidal height.
6. The method for positioning the air moving target through double-star collaborative observation according to claim 3, characterized by defining

[u2, v2, w2]ᵀ = R(J2000→WGS84) · R(att,2→J2000) · R(cam,2) · [tan ψ(x,2), tan ψ(y,2), 1]ᵀ
u2, v2 and w2 are the first, second and third components of the column vector obtained by the above matrix product, and the second scale factor λ2 is calculated according to the following ellipsoid equation:

(XS,2 + λ2·u2)²/A² + (YS,2 + λ2·v2)²/A² + (ZS,2 + λ2·w2)²/B² = 1
In the above formula, A = ae + h, B = be + h, ae = 6378137.0 m, be = 6356752.3 m, and h is the ellipsoidal height.
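Claims 5 and 6 determine each scale factor from an ellipsoid equation that is quadratic in λ. The sketch below solves that quadratic and takes the smaller (near-side) root; the explicit quadratic form and the root choice are assumptions consistent with the stated ellipsoid constants, and the function name is hypothetical.

```python
import math

def scale_factor(sat, uvw, h, a_e=6378137.0, b_e=6356752.3):
    """Solve ((Xs+lam*u)^2 + (Ys+lam*v)^2)/A^2 + (Zs+lam*w)^2/B^2 = 1
    for lam, returning the smaller root (the near-side intersection of
    the line of sight with the height-h ellipsoid)."""
    Xs, Ys, Zs = sat
    u, v, w = uvw
    A, B = a_e + h, b_e + h
    qa = (u * u + v * v) / A**2 + (w * w) / B**2
    qb = 2.0 * ((Xs * u + Ys * v) / A**2 + Zs * w / B**2)
    qc = (Xs * Xs + Ys * Ys) / A**2 + (Zs * Zs) / B**2 - 1.0
    disc = qb * qb - 4.0 * qa * qc
    return (-qb - math.sqrt(disc)) / (2.0 * qa)

# Example: satellite 500 km above the pole, line of sight straight down
lam = scale_factor((0.0, 0.0, 6356752.3 + 500000.0), (0.0, 0.0, -1.0), 0.0)
```

The smaller root is chosen because the larger one corresponds to the far-side intersection of the line of sight with the ellipsoid.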
7. The method for positioning the air moving target through double-star collaborative observation according to claim 1, wherein the numbers of image frames from the first remote sensing satellite and the second remote sensing satellite required in step three satisfy M + N ≥ 9.
8. The method for positioning the air moving target through double-star collaborative observation according to claim 3, wherein the motion trajectory sliding solving method in step three comprises the following steps:
S1, define:

R1 = R(J2000→WGS84) · R(att,1→J2000) · R(cam,1),  R2 = R(J2000→WGS84) · R(att,2→J2000) · R(cam,2);

R1 and R2 are orthogonal matrices, so R1⁻¹ = R1ᵀ and R2⁻¹ = R2ᵀ;
The motion trajectory fusion models of the first remote sensing satellite and the second remote sensing satellite are rearranged as follows:
Multiplying both sides of the above two equations by R1⁻¹ and R2⁻¹ respectively, the following formulas are obtained through elementary arithmetic transformations:
F(x,1) and F(x,2) are defined as the residual function models of the along-orbit detector pointing angles of the first and second remote sensing satellites, and F(y,1) and F(y,2) as the residual function models of the vertical-orbit detector pointing angles of the first and second remote sensing satellites;
S2, construct a standard error equation for each target particle coordinate of the air moving target on the first remote sensing satellite and second remote sensing satellite sequence images, in the following form:
V = AX − L, with weight matrix P
in the above formula, V is the residual vector of the observations, i.e. the calculation result of the right-hand side of the standard error equation;
a is a design matrix formed by unknown partial derivatives, and the representation form is as follows:
subscripts 1 and 2 denote the first remote sensing satellite and the second remote sensing satellite respectively; subscript i is the frame number of the first remote sensing satellite sequence image in which the air moving target is located, i = 1, 2, ..., N; subscript j is the frame number of the second remote sensing satellite sequence image in which the air moving target is located, j = 1, 2, ..., M;
X is the unknown parameter matrix, calculated iteratively according to the least squares adjustment principle by the formula:
X = (AᵀPA)⁻¹(AᵀPL)
the expression form of X is as follows:
X = [X0, δ1, δ2, Y0, β1, β2, Z0, θ1, θ2]
When a standard error equation is calculated for the first time, an initial value is given to X;
l is a constant term matrix obtained by substituting the current calculated value of the unknown matrix X into the expression of F x,1、Fx,2、Fy,1、Fy,2, and the expression is as follows:
L = [F(x,1,i), F(y,1,i), F(x,2,j), F(y,2,j)]ᵀ
P is the observation weight matrix of the target particle coordinates and is taken as the identity matrix;
S3, the aerial target motion trajectory parameters X0, δ1, δ2, Y0, β1, β2, Z0, θ1, θ2 are obtained according to the least squares adjustment principle when the iteration stops.
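The normal-equation solve and iteration of claims 8 and 9 can be sketched in plain Python. This is a Gauss-Newton reading of the claims with hypothetical helper names; the patent's actual design matrix A and constant vector L come from the residual models F(x), F(y), which are not reproduced here, so a toy linear problem stands in for them.

```python
def transpose(m):
    return [list(col) for col in zip(*m)]

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def solve(a, b):
    """Gaussian elimination with partial pivoting for the normal equations."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (m[i][n] - sum(m[i][c] * x[c] for c in range(i + 1, n))) / m[i][i]
    return x

def lsq_step(A, L):
    """One least-squares step X = (A^T A)^{-1}(A^T L); claim 8 takes the
    weight matrix P as the identity, so it is omitted here."""
    At = transpose(A)
    N = matmul(At, A)
    u = [sum(x * y for x, y in zip(row, L)) for row in At]
    return solve(N, u)

def iterate(build_A_L, x0, tol=1e-6, max_iter=50):
    """Relinearize around the current estimate and stop when every
    parameter update is within the claim-9 style tolerance."""
    x = list(x0)
    for _ in range(max_iter):
        A, L = build_A_L(x)
        dx = lsq_step(A, L)
        x = [xi + di for xi, di in zip(x, dx)]
        if max(abs(d) for d in dx) < tol:
            break
    return x

# Toy linear problem (true parameters 2.0 and 3.0) standing in for the
# patent's trajectory-parameter adjustment:
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [2.0, 3.0, 5.0]
def build(x):
    # Residual L = b - A x (trivially relinearized: the problem is linear)
    return A, [bi - sum(ar * xr for ar, xr in zip(row, x)) for row, bi in zip(A, b)]
est = iterate(build, [0.0, 0.0])
```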
9. The method for positioning the air moving target through double-star collaborative observation according to claim 8, wherein the condition for judging convergence of the X iteration is that the differences between the values of X0, δ1, δ2, Y0, β1, β2, Z0, θ1, θ2 obtained in two successive calculations are all within 1×10⁻⁶; the iteration is then regarded as converged and stopped, and the value of the unknown matrix X obtained in the last iteration is taken as the calculated motion trajectory fusion model parameters of the air moving target.
10. The method for real-time positioning of an air moving target through double-star collaborative observation according to claim 3, wherein the spatial position (X, Y, Z) and the velocity (VX, VY, VZ) of the air moving target at the current moment in step four are obtained by substituting the calculated motion trajectory fusion model parameters X0, δ1, δ2, Y0, β1, β2, Z0, θ1, θ2 into the following position equation and velocity equation:

X = X0 + δ1·t + δ2·t², Y = Y0 + β1·t + β2·t², Z = Z0 + θ1·t + θ2·t²;

VX = δ1 + 2·δ2·t, VY = β1 + 2·β2·t, VZ = θ1 + 2·θ2·t.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202211201760.XA CN115880328B (en) | 2022-09-29 | 2022-09-29 | A method for real-time localization of moving aerial targets using dual-satellite collaborative observation |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN115880328A CN115880328A (en) | 2023-03-31 |
| CN115880328B true CN115880328B (en) | 2025-12-12 |
Family
ID=85770180
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202211201760.XA Active CN115880328B (en) | 2022-09-29 | 2022-09-29 | A method for real-time localization of moving aerial targets using dual-satellite collaborative observation |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN115880328B (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN116124153B (en) * | 2023-04-18 | 2023-06-16 | 中国人民解放军32035部队 | Double-star co-vision positioning method and equipment for space target |
| CN116879968A (en) * | 2023-06-25 | 2023-10-13 | 中国空间技术研究院 | Optical remote sensing satellite target three-dimensional tracking method |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108226978A (en) * | 2018-01-15 | 2018-06-29 | 电子科技大学 | A kind of Double-Star Positioning System method based on WGS-84 models |
| CN109709537A (en) * | 2018-12-19 | 2019-05-03 | 浙江大学 | A kind of noncooperative target position and speed tracking based on satellites formation |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7372400B2 (en) * | 2005-11-07 | 2008-05-13 | The Boeing Company | Methods and apparatus for a navigation system with reduced susceptibility to interference and jamming |
| CN112802118B (en) * | 2021-01-05 | 2022-04-08 | 湖北工业大学 | On-orbit time-sharing geometric calibration method for optical satellite sensor |
| CN114485668B (en) * | 2022-01-17 | 2023-09-22 | 上海卫星工程研究所 | Optical double-star positioning multi-moving-object association method and system |
Also Published As
| Publication number | Publication date |
|---|---|
| CN115880328A (en) | 2023-03-31 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Peng et al. | Pose measurement and motion estimation of space non-cooperative targets based on laser radar and stereo-vision fusion | |
| CN108375382B (en) | Method and device for accuracy calibration of position and attitude measurement system based on monocular vision | |
| Johnson et al. | Precise image-based motion estimation for autonomous small body exploration | |
| CN102753987B (en) | The calibration steps of the surveying instrument of photonics | |
| CN109709801A (en) | A kind of indoor unmanned plane positioning system and method based on laser radar | |
| Kelly et al. | Visual-inertial simultaneous localization, mapping and sensor-to-sensor self-calibration | |
| CN110470304B (en) | High-precision target positioning and speed measuring method based on unmanned aerial vehicle photoelectric platform | |
| CN111649737B (en) | Visual-inertial integrated navigation method for precise approach landing of airplane | |
| CN115880328B (en) | A method for real-time localization of moving aerial targets using dual-satellite collaborative observation | |
| CN110849331B (en) | Monocular vision measurement and ground test method based on three-dimensional point cloud database model | |
| CN114608554B (en) | Handheld SLAM equipment and robot instant positioning and mapping method | |
| CN105953795A (en) | Navigation apparatus and method for surface inspection of spacecraft | |
| CN105043392B (en) | A kind of aircraft pose determines method and device | |
| CN113022898A (en) | State estimation method for flexible attachment system in weak gravity environment | |
| Deschênes et al. | Lidar scan registration robust to extreme motions | |
| Yu et al. | Full-parameter vision navigation based on scene matching for aircrafts | |
| CN106672265B (en) | A small celestial body fixed-point landing guidance control method based on optical flow information | |
| CN108225276A (en) | A kind of list star imageable target kinetic characteristic inversion method and system | |
| Zsedrovits et al. | Performance analysis of camera rotation estimation algorithms in multi-sensor fusion for unmanned aircraft attitude estimation | |
| Briskin et al. | Estimating pose and motion using bundle adjustment and digital elevation model constraints | |
| Zheng et al. | Integrated navigation system with monocular vision and LIDAR for indoor UAVs | |
| CN113566778A (en) | Multipoint perspective imaging unmanned aerial vehicle ground flight pose measurement method | |
| CN114648577B (en) | Equipment detection method and equipment detection system | |
| Li et al. | Geodetic coordinate calculation based on monocular vision on UAV platform | |
| Post et al. | Visual pose estimation system for autonomous rendezvous of spacecraft |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |
| GR01 | Patent grant | | |