CN119594980A - Method, device, equipment and storage medium for estimating motion of aircraft
- Publication number: CN119594980A
- Application number: CN202411767823.7A
- Authority: CN (China)
- Prior art keywords: motion estimation, estimation result, aircraft, weight, point cloud
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G01C21/20—Instruments for performing navigational calculations
- G01C21/1656—Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
- G01C21/183—Compensation of inertial measurements, e.g. for temperature effects

(All under G—PHYSICS; G01—MEASURING; TESTING; G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY; G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00.)
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a motion estimation method, apparatus, device and storage medium for an aircraft, relating to the field of computer technology. The method comprises: analyzing current point cloud data acquired by the aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF) to perform motion estimation on the aircraft and obtain a first motion estimation result of the aircraft; performing optical flow tracking on current image data acquired by the aircraft to obtain a second motion estimation result of the aircraft; determining a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane; determining a second weight of the second motion estimation result according to the first weight; and fusing the first and second motion estimation results based on the first and second weights to obtain a more accurate target motion estimation result of the aircraft, thereby realizing accurate motion estimation of the aircraft.
Description
Technical Field
Embodiments of the present invention relate to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for motion estimation of an aircraft.
Background
An aircraft requires navigation equipment to provide accurate position information during flight, and the traditional navigation equipment is satellite navigation. However, satellite navigation signals are at risk of being lost, and the aircraft cannot fly normally in areas where the signals are jammed.
With the development of sensor technology, simultaneous localization and mapping (SLAM) methods based on lidar and vision can provide accurate position information for an aircraft in outdoor scenes without satellite navigation signals.
However, lidar- and vision-based SLAM methods cannot achieve accurate motion estimation when the aircraft flies at high altitude.
Disclosure of Invention
The invention provides a motion estimation method, apparatus, device and storage medium for an aircraft, in order to improve the accuracy of motion estimation for the aircraft.
In a first aspect, an embodiment of the present invention provides a method for motion estimation of an aircraft, including:
analyzing current point cloud data acquired by an aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF) to perform motion estimation on the aircraft and obtain a first motion estimation result of the aircraft, wherein the observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in a local point cloud map;
obtaining a second motion estimation result of the aircraft by performing optical flow tracking on current image data acquired by the aircraft, wherein the data constraint of the optical flow tracking is determined by the inertial data and the image data;
determining a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane, and determining a second weight of the second motion estimation result according to the first weight;
determining a target motion estimation result based on the first motion estimation result with the first weight and the second motion estimation result with the second weight.
The technical scheme of the embodiments of the invention provides a motion estimation method for an aircraft: current point cloud data acquired by the aircraft and current inertial data of the aircraft are analyzed based on an error-state Kalman filter (ESKF) to obtain a first motion estimation result, where the observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in a local point cloud map; optical flow tracking is performed on current image data acquired by the aircraft to obtain a second motion estimation result, where the data constraint of the optical flow tracking is determined by the inertial data and the image data; a first weight of the first motion estimation result is determined according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane, and a second weight of the second motion estimation result is determined from the first weight; finally, the target motion estimation result is determined based on the two results and their weights. In this technical scheme, motion estimation is first performed from the current point cloud data and the current inertial data to obtain the first motion estimation result, and then from the current image data and the current inertial data to obtain the second motion estimation result. The confidence of the first result is determined from the constraint strength between each point in the current point cloud data and its nearest-neighbor plane; this confidence can also be called the evaluation index of the first motion estimation result, is obtained by degradation detection on the first estimate, and serves as the first weight, from which the second weight follows. Fusing the two results under these weights yields a more accurate target motion estimation result of the aircraft, realizing accurate motion estimation of the aircraft.
Further, based on the error state kalman filter ESKF, analyzing and processing current point cloud data acquired by an aircraft and current inertial data of the aircraft, performing motion estimation on the aircraft to obtain a first motion estimation result of the aircraft, including:
inputting the current point cloud data and the current inertial data into the ESKF, so that the ESKF determines an observation value according to the current point cloud data and the observation equation while analyzing the current inertial data and the previous motion estimation result, the first motion estimation result being obtained when the observation value is determined to be minimal.
Further, determining an observation value according to the current point cloud data and the observation equation includes:
determining the nearest-neighbor point of each point in the current point cloud data within the local point cloud map, wherein the local point cloud map stores a plurality of voxels and the points within each voxel in a hash table, and indexes the subdivision space units within each voxel with a pseudo-Hilbert space-filling curve;
determining the nearest-neighbor plane of each point in the current point cloud data within the local point cloud map according to that point's nearest-neighbor points in the local point cloud map;
substituting each point in the current point cloud data and its corresponding nearest-neighbor plane into the observation equation to obtain the observation value.
Further, by performing optical flow tracking on the current image data acquired by the aircraft, a second motion estimation result of the aircraft is obtained, including:
determining a weight map and a feature map of the current image data, and extracting feature points according to the weight map;
performing optical flow tracking on the feature map based on the feature points to obtain a current tracking result;
and fusing the current tracking result with the previous tracking result based on a factor graph to obtain the second motion estimation result.
Further, determining the first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane includes:
determining the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane according to the angle between a first vector formed by that nearest-neighbor plane and a second vector formed by the heading direction of the aircraft;
and normalizing the sum obtained by accumulating the constraint strengths of all point-plane pairs in the current point cloud data to obtain an evaluation index of the first motion estimation result, the evaluation index being determined as the first weight.
Further, determining a target motion estimation result based on the first motion estimation result and the first weight and the second motion estimation result and the second weight includes:
when it is determined that there is no time error between the first motion estimation result and the second motion estimation result, fusing the first motion estimation result and the second motion estimation result based on the first weight and the second weight to obtain the target motion estimation result;
and when there is a time error between the first motion estimation result and the second motion estimation result, determining the first motion estimation result as the target motion estimation result.
Further, fusing the first motion estimation result and the second motion estimation result based on the first weight and the second weight to obtain the target motion estimation result, including:
taking the position estimate and the attitude estimate in the first motion estimation result as nodes of a pose graph, and the position change and the attitude change in the second motion estimation result as edges of the pose graph, to construct the pose graph, wherein the edge weights are determined according to the first weight and the second weight;
And determining the target motion estimation result based on the pose graph.
In a second aspect, an embodiment of the present invention further provides a motion estimation apparatus for an aircraft, including:
The first estimation module is configured to analyze current point cloud data acquired by an aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF) and perform motion estimation on the aircraft to obtain a first motion estimation result of the aircraft, where the observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in a local point cloud map;
The second estimation module is configured to obtain a second motion estimation result of the aircraft by performing optical flow tracking on current image data acquired by the aircraft, where the data constraint of the optical flow tracking is determined by the inertial data and the image data;
The determining module is configured to determine a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane, and to determine a second weight of the second motion estimation result according to the first weight;
And the execution module is used for determining a target motion estimation result based on the first motion estimation result and the first weight and the second motion estimation result and the second weight.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
at least one processor; and a memory communicatively coupled to the at least one processor;
Wherein the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of motion estimation of an aircraft according to any one of the first aspects.
In a fourth aspect, embodiments of the invention also provide a storage medium containing computer-executable instructions, characterized in that the computer-executable instructions, when executed by a computer processor, are for performing the method of motion estimation of an aircraft according to any of the first aspects.
In a fifth aspect, the application provides a computer program product comprising computer instructions which, when run on a computer, cause the computer to perform the method of motion estimation of an aircraft as provided in the first aspect.
It should be noted that the above-mentioned computer instructions may be stored in whole or in part on a computer-readable storage medium. The computer readable storage medium may be packaged together with the processor of the motion estimation device of the aircraft, or may be packaged separately from the processor of the motion estimation device of the aircraft, which is not limited in this regard.
The descriptions of the second aspect, the third aspect, the fourth aspect and the fifth aspect of the present application may refer to the detailed descriptions of the first aspect, and the advantages of the descriptions of the second aspect, the third aspect, the fourth aspect and the fifth aspect may refer to the analysis of the advantages of the first aspect, which is not repeated herein.
In the present application, the names of the above-mentioned motion estimation devices of the aircraft do not constitute a limitation on the devices or functional modules themselves, which may appear under other names in a practical implementation. Insofar as the function of each device or function module is similar to that of the present application, it falls within the scope of the claims of the present application and the equivalents thereof.
These and other aspects of the application will be more readily apparent from the following description.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for motion estimation of an aircraft according to an embodiment of the present invention;
FIG. 2 is a flow chart of another method for motion estimation of an aircraft according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a motion estimation device of an aircraft according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
The term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean that A exists alone, that A and B exist together, or that B exists alone.
The terms "first" and "second" and the like in the description and in the drawings are used for distinguishing between different objects or between different processes of the same object and not for describing a particular order of objects.
Furthermore, references to the terms "comprising" and "having" and any variations thereof in the description of the present application are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed but may optionally include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Before discussing exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or at the same time. Furthermore, the order of the operations may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like. Furthermore, embodiments of the invention and features of the embodiments may be combined with each other without conflict.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "such as" in the embodiments of the present application should not be construed as being preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more.
When the aircraft flies at high altitude, existing lidar- and vision-based SLAM methods risk failure. Lidar SLAM relies on the registration of point cloud data; at high altitude the point clouds acquired by the lidar are nearly planar, consecutive frames are highly similar, and the geometric information required for registration is missing. Visual SLAM relies on associating feature points between successive image frames and is difficult to operate in poorly textured environments such as deserts and jungles. Moreover, at high altitude the distance between reference objects and the aircraft is large, and with a limited baseline length binocular vision cannot accurately compute depth information, so accurate motion estimation cannot be achieved.
Therefore, the application provides a motion estimation method for an aircraft that can accurately position the aircraft outdoors at high altitude without satellite navigation signals, ensuring that the aircraft flies stably under various complex conditions.
The motion estimation method of the aircraft according to the application will be described in detail below with reference to the drawings and the examples.
Fig. 1 is a flowchart of a motion estimation method of an aircraft according to an embodiment of the present invention, where the embodiment is applicable to a situation where motion estimation is required for the aircraft, the method may be performed by a motion estimation device of the aircraft, as shown in fig. 1, and specifically includes the following steps:
Step 110, analyzing current point cloud data acquired by the aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF), and performing motion estimation on the aircraft to obtain a first motion estimation result of the aircraft.
Aircraft are typically equipped with a lidar for acquiring point cloud data of the area over which the aircraft flies, and with an inertial odometer for acquiring inertial data of the aircraft.
Specifically, during flight the point cloud data can be acquired in real time by the lidar and the inertial data in real time by the inertial odometer. Because an ESKF handles attitude changes well, the current point cloud data and the current inertial data can be input into the ESKF, which determines the first motion estimation result by analyzing the current point cloud data, the current inertial data and the previous motion estimation result; the ESKF yields the first motion estimation result of the aircraft when the observation value is minimal, making the determination of the first motion estimation result more accurate.
In the embodiment of the invention, the motion estimation of the aircraft is realized through the point cloud data and the inertial data, and the first motion estimation result of the aircraft is obtained.
Step 120, obtaining a second motion estimation result of the aircraft by performing optical flow tracking on the current image data acquired by the aircraft.
The aircraft is also equipped with an image acquisition device for acquiring image data of the area over which the aircraft flies; its field of view can cover the field angle of the lidar.
The current image data may be understood as image data that maintains temporal consistency with the current point cloud data.
Specifically, the consistency of image convolution features, rather than the grayscale invariance assumed by conventional optical flow methods, can enhance the stability and reliability of visual feature tracking, so optical flow tracking can be performed on image feature points. The current point cloud data and the current image data are aligned and strictly correspond in time. Because the lidar carried by the aircraft is a solid-state lidar with a single viewing angle and non-repetitive scanning, after the current point cloud data are projected onto the current image data the coverage of the point cloud within the image improves over time, and the positions in three-dimensional space of a large number of pixels in the current image data can be obtained. After the weight map and the feature map are determined from the current image data, feature points can be selected based on the weight map and optical flow tracking performed in the feature map based on those feature points, yielding the current tracking result and allowing the second motion estimation result of the aircraft to be determined.
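As an illustration of this projection step, the following minimal sketch maps lidar points into the image with a pinhole model to attach a depth to each covered pixel. The intrinsics K and the lidar-to-camera extrinsics T_cam_lidar are calibration placeholders assumed for the example; the patent does not give them.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project lidar points into the image so pixels gain 3-D positions.

    points_lidar: (N, 3) points in the lidar frame
    T_cam_lidar:  (4, 4) lidar-to-camera extrinsics (assumed calibration)
    K:            (3, 3) pinhole intrinsics (assumed calibration)
    Returns integer pixel coordinates and depths of points in front of
    the camera.
    """
    n = points_lidar.shape[0]
    homog = np.hstack([points_lidar, np.ones((n, 1))])
    pts_cam = (T_cam_lidar @ homog.T).T[:, :3]
    in_front = pts_cam[:, 2] > 0.1          # drop points behind the camera
    pts_cam = pts_cam[in_front]
    uvw = (K @ pts_cam.T).T
    uv = uvw[:, :2] / uvw[:, 2:3]           # perspective division
    return uv.astype(int), pts_cam[:, 2]
```

Because the solid-state lidar scans non-repetitively, accumulating these projections over successive frames steadily raises the fraction of pixels with known depth, as the paragraph above describes.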
In the embodiment of the invention, the motion estimation of the aircraft is realized through the image data and the inertial data, and the second motion estimation result of the aircraft is obtained.
Step 130, determining a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane, and determining a second weight of the second motion estimation result according to the first weight.
When the aircraft flies in an open, flat outdoor environment with the lidar facing the ground, each frame of point cloud data acquired by the lidar has a very similar, nearly planar shape, and effective point-to-plane constraints are missing in the horizontal direction and in the heading angle. The confidence of a first motion estimation result determined from point cloud and inertial data can therefore be assessed from the strength of each point-plane constraint.
Specifically, the strength of each point-plane constraint can be determined, where the point is a point in the current point cloud data and the plane is that point's nearest-neighbor plane in the local point cloud map. The constraint strength of each point-plane pair can be determined from the angle between a first vector formed by the point's corresponding nearest-neighbor plane and a second vector formed by the heading direction of the aircraft. The confidence of the first motion estimation result can then be determined from the constraint strengths of all point-plane pairs; this confidence can also be called the evaluation index of the first motion estimation result, and is obtained by degradation detection on the first motion estimation result. The evaluation result can further be determined as the first weight. Because the target motion estimation result is determined from both the first and the second motion estimation result, the first and second weights sum to 100%, so once the first weight is determined, the second weight of the second motion estimation result follows.
In the embodiment of the invention, the first weight of the first motion estimation result is determined by detecting the degradation of the first motion estimation result, and the second weight of the second motion estimation result is determined.
Step 140, determining a target motion estimation result based on the first motion estimation result with the first weight and the second motion estimation result with the second weight.
Specifically, the first motion estimation result and the second motion estimation result can be fused based on the first weight of the former and the second weight of the latter to obtain the target motion estimation result. Because the first weight indicates the confidence of the first motion estimation result, the target motion estimation result obtained by this weighted fusion is more accurate.
In the embodiment of the invention, the target motion estimation result is determined from the first motion estimation result with the first weight and the second motion estimation result with the second weight, realizing accurate motion estimation for the aircraft.
The motion estimation method of the aircraft provided by this embodiment comprises: analyzing current point cloud data acquired by the aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF) to perform motion estimation on the aircraft and obtain a first motion estimation result, where the observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in a local point cloud map; performing optical flow tracking on current image data acquired by the aircraft to obtain a second motion estimation result, where the data constraint of the optical flow tracking is determined by the inertial data and the image data; determining a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane, and determining a second weight of the second motion estimation result according to the first weight; and determining a target motion estimation result based on the first motion estimation result with the first weight and the second motion estimation result with the second weight. In this technical scheme, motion estimation is first performed from the current point cloud data and the current inertial data to obtain the first motion estimation result, and then from the current image data and the current inertial data to obtain the second motion estimation result. The confidence of the first result, also called its evaluation index and obtained by degradation detection on the first estimate, is determined from the constraint strength between each point in the current point cloud data and its nearest-neighbor plane and taken as the first weight; the second weight follows from the first. Fusing the two results under these weights yields the target motion estimation result of the aircraft, realizing accurate motion estimation of the aircraft.
Fig. 2 is a flowchart of another method for estimating motion of an aircraft according to an embodiment of the present invention, which is embodied based on the above embodiment. As shown in fig. 2, in this embodiment, the method may further include:
Step 210, analyzing current point cloud data acquired by an aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF), and performing motion estimation on the aircraft to obtain a first motion estimation result of the aircraft.
The point cloud data are obtained by the lidar scanning the area over which the aircraft flies. The scanning period depends on the lidar's performance and may, for example, be 0.1 s. Because the aircraft moves during a scanning period, the points acquired within it do not match the real environment, so they can be corrected to a common time instant using the inertial data within the scanning period, yielding the current point cloud data.
Specifically, the point cloud data currently acquired by the aircraft can be corrected based on the aircraft's inertial data over the scanning period ending at the current moment: the positions of all points in the newly acquired cloud are corrected to the same time instant to obtain the current point cloud data, so that the shape of the current point cloud data is consistent with the shape of objects in the real environment.
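As a concrete illustration of this correction, the following minimal sketch propagates every point of one scan to the scan-end time under a constant body-frame velocity and angular rate. The patent does not state its correction formulas, so the motion model and all names here are illustrative assumptions.

```python
import numpy as np

def rodrigues(axis_angle):
    """Rotation matrix for a rotation vector (Rodrigues' formula)."""
    angle = np.linalg.norm(axis_angle)
    if angle < 1e-12:
        return np.eye(3)
    k = axis_angle / angle
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def deskew_scan(points, timestamps, t_end, velocity, omega):
    """Correct in-scan motion distortion by expressing every point in the
    sensor frame at t_end. Constant body-frame velocity/angular rate over
    the short scan period is an illustrative simplification of the
    patent's inertial-data-based correction.

    points:     (N, 3) raw points, each in the sensor frame at its capture time
    timestamps: (N,) per-point capture times
    velocity:   (3,) body-frame linear velocity from the inertial data
    omega:      (3,) body-frame angular rate from the inertial data
    """
    corrected = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        dt = t_end - t
        # Undo the translation and rotation the sensor accumulated between
        # this point's capture time and the end of the scan.
        corrected[i] = rodrigues(-omega * dt) @ (p - velocity * dt)
    return corrected
```

In a full pipeline the constant-rate assumption would be replaced by integrating the raw inertial stream across the scan period.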
In one embodiment, step 210 may specifically include:
inputting the current point cloud data and the current inertial data into the ESKF, so that the ESKF determines an observation value according to the current point cloud data and the observation equation while analyzing the current inertial data and the previous motion estimation result, the first motion estimation result being obtained when the observation value is determined to be minimal.
The observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in the local point cloud map.
Specifically, the current point cloud data and the current inertial data can be input into the ESKF. The ESKF may define a state vector x = [A, B, C]^T, where A denotes position data, B denotes velocity data and C denotes attitude data; a state equation x_{k+1} = f(x_k, v_k, w_k), where x_{k+1} denotes the state at time k+1, x_k the state at time k, v_k the inertial data at time k and w_k the process noise; an error state δx_{k+1} = x_{k+1} − x̂_{k+1}, where x̂_{k+1} denotes the predicted state; and an observation equation z_k = h(x_k, u_k), where u_k denotes the observation noise. The observation equation is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in the local point cloud map.
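To make the filter's structure concrete, the skeleton below runs a reduced nine-dimensional state [position, velocity, attitude error] through the predict step and a stacked point-to-plane update. This is a hedged sketch of a generic ESKF, not the patent's exact equations; in particular the additive attitude handling is a simplification.

```python
import numpy as np

class MinimalESKF:
    """Skeleton error-state Kalman filter; an illustrative sketch only."""

    def __init__(self):
        self.x = np.zeros(9)        # nominal state [p, v, theta]
        self.P = np.eye(9) * 1e-2   # error-state covariance

    def predict(self, accel, gyro, dt, Q):
        # Nominal propagation x_{k+1} = f(x_k, v_k, w_k), reduced here to a
        # first-order kinematic model driven by the inertial reading.
        self.x[0:3] += self.x[3:6] * dt
        self.x[3:6] += accel * dt
        self.x[6:9] += gyro * dt    # simplification: additive attitude
        F = np.eye(9)
        F[0:3, 3:6] = np.eye(3) * dt
        self.P = F @ self.P @ F.T + Q

    def update(self, H, r, R_meas):
        # H stacks one Jacobian row per point-plane pair, r the signed
        # point-to-plane distances; the correction drives them toward zero.
        S = H @ self.P @ H.T + R_meas
        K = self.P @ H.T @ np.linalg.inv(S)
        dx = K @ (-r)               # innovation: target distances are zero
        self.x += dx                # inject the error state into the nominal
        self.P = (np.eye(9) - K @ H) @ self.P
        return dx
```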
Further, determining an observation value according to the current point cloud data and the observation equation includes:
determining the nearest-neighbor point of each point in the current point cloud data within the local point cloud map, wherein the local point cloud map stores a plurality of voxels and the points within each voxel in a hash table and indexes the subdivision space units within each voxel with a pseudo-Hilbert space-filling curve; determining the nearest-neighbor plane of each point in the current point cloud data within the local point cloud map according to that point's nearest-neighbor points; and substituting each point in the current point cloud data and its corresponding nearest-neighbor plane into the observation equation to obtain the observation value.
Motion estimation based on point cloud data depends on registering the current point cloud data against nearest-neighbor points in the local point cloud map. The local point cloud map uses incremental-voxel data management: a sparse voxel map is maintained in which the voxels and the points they contain are organized in a single hash table, the voxel coordinates of each point serving as the key that generates its unique index; indexes of all subdivision space units within a voxel can be built along a pseudo-Hilbert space-filling curve so that the nearest neighbors of each point can be found quickly in the local point cloud map.
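A minimal sketch of such an incremental voxel map follows: integer voxel coordinates key a hash table, and a nearest-neighbor query scans the query's voxel and its 26 neighbors. The in-voxel pseudo-Hilbert subdivision index is only indicated by a comment, since the patent does not spell out its construction.

```python
import numpy as np
from collections import defaultdict

class VoxelHashMap:
    """Sparse voxel map keyed by integer voxel coordinates, as the text
    describes; an illustrative sketch, not the patent's implementation."""

    def __init__(self, voxel_size=1.0):
        self.voxel_size = voxel_size
        self.voxels = defaultdict(list)   # hash table: voxel key -> points

    def key(self, p):
        # Voxel coordinates of a point act as its unique hash key.
        return tuple(np.floor(np.asarray(p) / self.voxel_size).astype(int))

    def insert(self, points):
        for p in points:
            self.voxels[self.key(p)].append(np.asarray(p))
            # A full implementation would also slot p into the voxel's
            # subdivision cells, ordered along a pseudo-Hilbert curve,
            # to speed up in-voxel neighbor lookups.

    def nearest(self, q):
        # Search the query's voxel and its 26 neighbors.
        kx, ky, kz = self.key(q)
        best, best_d = None, np.inf
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for p in self.voxels.get((kx + dx, ky + dy, kz + dz), ()):
                        d = np.linalg.norm(p - q)
                        if d < best_d:
                            best, best_d = p, d
        return best, best_d
```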
Specifically, the nearest-neighbor point of each point in the current point cloud data is first determined in the local point cloud map, i.e. the map point closest to that point: the closest voxel is located first, and then the closest point within that voxel. Next, the nearest-neighbor plane of each point is determined from those nearest neighbors: a group of nearest-neighbor points is gathered around each point in the local point cloud map and fitted to form that point's nearest-neighbor plane. Each point in the current point cloud data and its corresponding nearest-neighbor plane can then be substituted into the observation equation to obtain the observation value.
While analyzing the current inertial data and the previous motion estimation result, the ESKF substitutes the current point cloud data into the observation equation to determine the observation value, and the first motion estimation result of the aircraft is obtained when the observation value is minimal.
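As an illustration of the observation itself, the sketch below fits a nearest-neighbor plane to a group of map points by SVD and evaluates the signed point-to-plane distance; SVD plane fitting is a standard choice assumed here, not stated in the patent. Stacking these distances over all point-plane pairs gives the observation value the filter drives to a minimum.

```python
import numpy as np

def fit_plane(neighbors):
    """Least-squares plane through a set of nearest-neighbor points:
    the normal is the singular vector of least variance."""
    pts = np.asarray(neighbors, float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                  # direction of least variance
    return normal, centroid

def point_to_plane_distance(p, normal, centroid):
    # Signed distance that the observation equation accumulates; the
    # filter seeks the pose that drives these distances to a minimum.
    return float(np.dot(normal, np.asarray(p) - centroid))
```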
In the embodiment of the invention, the motion estimation of the aircraft is realized through the point cloud data and the inertial data, and the first motion estimation result of the aircraft is obtained.
Step 220, obtaining a second motion estimation result of the aircraft by performing optical flow tracking on the current image data acquired by the aircraft.
Wherein the data constraint of optical flow tracking is determined by the inertial data and the image data.
In one embodiment, step 220 may specifically include:
determining a weight map and a feature map of the current image data and extracting feature points according to the weight map; performing optical flow tracking on the feature map based on the feature points to obtain a current tracking result; and fusing the current tracking result with the historical tracking results based on a factor graph, the second motion estimation result being obtained when the data constraint is minimal.
Specifically, the current image data can first be processed by a neural network to determine the corresponding weight map and feature map; the weight map is used for feature point extraction and the feature map for multi-layer pyramid optical flow tracking. The feature points can be determined from the weight map, specifically by joint evaluation of depth and weight after the current point cloud data are projected onto the weight map. Optical flow tracking can then be performed on those feature points in the feature map, yielding the current tracking result.
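One plausible form of that joint evaluation is sketched below: candidate pixels are those with projected lidar depth, scored by the network's weight map and spread over a coarse grid. The scoring function, the grid-cell spreading and every parameter value are assumptions, since the patent only states that depth and weight are evaluated jointly after projection.

```python
import numpy as np

def select_feature_points(weight_map, uv, depths, max_features=150, cell=32):
    """Pick feature points where projected lidar depth is available and the
    weight map is high, at most one candidate per grid cell so features
    spread over the image. The depth/weight score is a hypothetical choice.

    weight_map: (H, W) per-pixel weights from the network
    uv:         (N, 2) integer pixel coordinates of projected lidar points
    depths:     (N,) depths of those points
    """
    h, w = weight_map.shape
    best = {}
    for (u, v), d in zip(uv, depths):
        if not (0 <= u < w and 0 <= v < h):
            continue
        score = weight_map[v, u] / (1.0 + 0.01 * d)   # hypothetical scoring
        key = (u // cell, v // cell)
        if key not in best or score > best[key][0]:
            best[key] = (score, u, v, d)
    ranked = sorted(best.values(), reverse=True)[:max_features]
    return [(u, v, d) for _, u, v, d in ranked]
```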
To enhance the tracking effect, the second motion estimation result of the aircraft can be determined by fusing the current tracking result with historical tracking results. Specifically, the tracking results corresponding to several consecutive frames of image data are processed by factor graph optimization; the data constraint used to fuse the tracking results of adjacent frames is determined by the inertial data and the image data, specifically by the pre-integration residual of the inertial data and the reprojection error of the image data.
In other words, the tracking results of multiple consecutive image frames are processed by the factor graph optimization method: the reprojection error is determined from the image data, the pre-integration residual from the inertial data, and the optimal estimate is found by minimizing both, thereby determining the second motion estimation result of the aircraft.
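The sketch below only assembles the objective that such a factor graph minimizes: reprojection errors plus pre-integration residuals over a window of frames. A real system would hand these residuals to a nonlinear least-squares solver rather than merely evaluating the cost, and every term layout here is an illustrative assumption.

```python
import numpy as np

def fusion_cost(states, reproj_terms, preint_terms):
    """Evaluate the windowed cost the factor graph minimizes.

    states:       dict frame_id -> (R 3x3, t 3) camera pose
    reproj_terms: list of (frame_id, landmark_xyz, observed_uv, K)
    preint_terms: list of (frame_i, frame_j, predicted_delta_t 3), a
                  translation-only stand-in for the IMU pre-integration
    """
    cost = 0.0
    for fid, X, uv_obs, K in reproj_terms:
        R, t = states[fid]
        x_cam = R @ X + t
        uv = (K @ x_cam)[:2] / x_cam[2]                    # project landmark
        cost += float(np.sum((uv - uv_obs) ** 2))          # reprojection error
    for fi, fj, delta_t in preint_terms:
        _, ti = states[fi]
        _, tj = states[fj]
        cost += float(np.sum(((tj - ti) - delta_t) ** 2))  # preint. residual
    return cost
```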
It should be noted that both the acquisition and the triggering of the point cloud data and the image data run on an edge computing platform. When the lidar finishes scanning the current point cloud frame, a pin on the edge computing platform generates a rising-edge trigger signal; this pin is wired to the trigger interface of the image acquisition device, so an image acquisition command can be issued to the device immediately, and the device captures the current image data in response, achieving strict temporal correspondence between the current point cloud data and the current image data.
In the embodiment of the invention, the motion estimation of the aircraft is realized through the image data and the inertial data, and the second motion estimation result of the aircraft is obtained.
Step 230, determining a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane.
In one embodiment, step 230 may specifically include:
determining the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane according to the angle between a first vector formed by that nearest-neighbor plane and a second vector formed by the heading direction of the aircraft; normalizing the sum obtained by accumulating the constraint strengths of all point-plane pairs in the current point cloud data to obtain an evaluation index of the first motion estimation result; and determining the evaluation index as the first weight.
Specifically, the first vector formed by the nearest-neighbor plane corresponding to each point in the current point cloud data is determined, as is the second vector formed by the heading direction of the aircraft. The constraint strength of each point-plane pair is then determined from the angle between these two vectors. The constraint strengths of all pairs are accumulated, and the accumulated result is normalized to obtain the confidence of the first motion estimation result, which is the first weight.
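A minimal sketch of this weighting follows, reading the "first vector" as the plane normal; the cosine-based intensity and the normalization by point count are assumptions about formulas the patent leaves unspecified. The complement relation at the end anticipates step 240.

```python
import numpy as np

def fusion_weights(plane_normals, travel_dir):
    """First weight from accumulated point-plane constraint strength, and
    the second weight as its complement. A plane whose normal is aligned
    with the direction of travel constrains motion along it strongly; the
    |cos|-based intensity is an illustrative assumption.
    """
    d = np.asarray(travel_dir, float)
    d = d / np.linalg.norm(d)
    n = np.asarray(plane_normals, float)
    n = n / np.linalg.norm(n, axis=1, keepdims=True)
    intensity = np.abs(n @ d)        # constraint strength per point-plane pair
    w1 = float(np.clip(intensity.sum() / len(intensity), 0.0, 1.0))
    w2 = 1.0 - w1                    # second weight: complement of the first
    return w1, w2
```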
In the embodiment of the invention, the first weight of the first motion estimation result is determined according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane.
Step 240, determining a second weight of the second motion estimation result according to the first weight.
Specifically, the difference between 100% and the first weight can be computed, and this difference determined as the second weight of the second motion estimation result.
In the embodiment of the invention, determination of the second weight of the second motion estimation result is realized.
Step 250, determining a target motion estimation result based on the first motion estimation result with the first weight and the second motion estimation result with the second weight.
In one embodiment, step 250 may specifically include:
when it is determined that there is no time error between the first motion estimation result and the second motion estimation result, fusing the two based on the first weight and the second weight to obtain the target motion estimation result; and when there is a time error between them, determining the first motion estimation result as the target motion estimation result.
The first motion estimation result and the second motion estimation result are determined in parallel, and the fusion algorithm is triggered once the first motion estimation result is available. If no time error exists between the two results, they are fused based on the first weight and the second weight to obtain the target motion estimation result; if a time error exists, the fusion step is skipped and the first motion estimation result is directly determined as the target motion estimation result.
Further, fusing the first motion estimation result and the second motion estimation result based on the first weight and the second weight to obtain the target motion estimation result, including:
taking the position estimate and the attitude estimate in the first motion estimation result as nodes of a pose graph and the position change and the attitude change in the second motion estimation result as edges of the pose graph to construct the pose graph, wherein the edge weights are determined according to the first weight and the second weight; and determining the target motion estimation result based on the pose graph.
Specifically, the first and second motion estimation results can be fused by the pose graph optimization method: the position estimate and attitude estimate of the first motion estimation result serve as the nodes of the pose graph, the position change and attitude change of the second motion estimation result serve as its edges, and the edge weights are determined according to the first weight and the second weight. Motion estimation over this pose graph then yields the target motion estimation result of the aircraft.
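The translation-only toy below shows the structure of this fusion: the lidar/IMU position estimates act as node priors, the visual position changes as edges, and the two weights trade the terms off. Attitude, per-edge information matrices and the time-error fallback are omitted, so this is a caricature of the idea rather than the patent's optimizer; the toy problem is linear, so it is solved in closed form.

```python
import numpy as np

def fuse_positions(lidar_positions, visual_deltas, w1, w2):
    """Minimize  w1 * sum ||x_i - p_i||^2
               + w2 * sum ||(x_{i+1} - x_i) - d_i||^2  over node positions x.

    lidar_positions: (N, 3) node priors from the first estimate
    visual_deltas:   (N-1, 3) edge measurements from the second estimate
    Assumes w1 > 0 so the normal-equation matrix is positive definite.
    """
    n = lidar_positions.shape[0]
    A = np.zeros((n, n))
    b = np.zeros((n, 3))
    for i in range(n):                       # prior (node) terms
        A[i, i] += w1
        b[i] += w1 * lidar_positions[i]
    for i in range(n - 1):                   # relative (edge) terms
        A[i, i] += w2; A[i + 1, i + 1] += w2
        A[i, i + 1] -= w2; A[i + 1, i] -= w2
        b[i] -= w2 * visual_deltas[i]
        b[i + 1] += w2 * visual_deltas[i]
    return np.linalg.solve(A, b)             # (N, 3) fused positions
```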
In practical applications, the target motion estimation result can also be used for updating the local point cloud map.
In the embodiment of the invention, the target motion estimation result is obtained by fusing the first motion estimation result and the second motion estimation result of the aircraft, so that accurate motion estimation of the aircraft is realized.
The motion estimation method of the aircraft provided by this embodiment comprises: analyzing current point cloud data acquired by the aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF) to perform motion estimation on the aircraft and obtain a first motion estimation result; performing optical flow tracking on current image data acquired by the aircraft to obtain a second motion estimation result; determining a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane, and determining a second weight of the second motion estimation result according to the first weight; and determining a target motion estimation result based on the first motion estimation result with the first weight and the second motion estimation result with the second weight. In this technical scheme, the first motion estimation result is obtained from the current point cloud data and the current inertial data, and the second from the current image data and the current inertial data. The confidence of the first result, also called its evaluation index and obtained by degradation detection on the first estimate, is determined from the constraint strength between each point in the current point cloud data and its nearest-neighbor plane and taken as the first weight; the second weight follows from the first. The two results are then fused under these weights to obtain the target motion estimation result, realizing accurate motion estimation of the aircraft and, further, accurate positioning of the aircraft outdoors at high altitude without satellite navigation signals, ensuring stable flight under various complex conditions.
Fig. 3 is a schematic structural diagram of a motion estimation device for an aircraft according to an embodiment of the present invention, where the device may be suitable for a situation where motion estimation is required for an aircraft, so as to improve accuracy of motion estimation for the aircraft. The apparatus may be implemented in software and/or hardware and is typically integrated in an electronic device, such as a computer device.
As shown in fig. 3, the apparatus includes:
A first estimation module 310, configured to analyze current point cloud data acquired by an aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF) and perform motion estimation on the aircraft to obtain a first motion estimation result of the aircraft, where the observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in a local point cloud map;
a second estimation module 320, configured to obtain a second motion estimation result of the aircraft by performing optical flow tracking on current image data acquired by the aircraft, where the data constraint of the optical flow tracking is determined by the inertial data and the image data;
a determining module 330, configured to determine a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane, and to determine a second weight of the second motion estimation result according to the first weight;
An execution module 340 is configured to determine a target motion estimation result based on the first motion estimation result and the first weight and the second motion estimation result and the second weight.
The motion estimation apparatus of the aircraft provided by this embodiment analyzes current point cloud data acquired by the aircraft and current inertial data of the aircraft based on an error-state Kalman filter (ESKF) to obtain a first motion estimation result, where the observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in a local point cloud map; obtains a second motion estimation result by performing optical flow tracking on current image data acquired by the aircraft, where the data constraint of the optical flow tracking is determined by the inertial data and the image data; determines a first weight of the first motion estimation result according to the constraint strength between each point in the current point cloud data and its corresponding nearest-neighbor plane, and a second weight of the second motion estimation result according to the first weight; and determines a target motion estimation result based on the two results and their weights. As in the method embodiments, the first weight is the confidence (evaluation index) of the first motion estimation result obtained by degradation detection, the second weight follows from the first, and fusing the two estimates under these weights yields an accurate target motion estimation result of the aircraft, realizing accurate motion estimation of the aircraft.
Based on the above embodiment, the first estimation module 310 is specifically configured to:
inputting the current point cloud data and the current inertial data into the ESKF, so that the ESKF determines an observation value according to the current point cloud data and the observation equation while analyzing the current inertial data and the previous motion estimation result, the first motion estimation result being obtained when the observation value is determined to be minimal.
In one embodiment, determining an observation from the current point cloud data and the observation equation includes:
determining the nearest-neighbor point of each point in the current point cloud data within the local point cloud map, wherein the local point cloud map stores a plurality of voxels and the points within each voxel in a hash table and indexes the subdivision space units within each voxel with a pseudo-Hilbert space-filling curve; determining the nearest-neighbor plane of each point in the current point cloud data within the local point cloud map according to that point's nearest-neighbor points; and substituting each point in the current point cloud data and its corresponding nearest-neighbor plane into the observation equation to obtain the observation value.
Based on the above embodiment, the second estimation module 320 is specifically configured to:
Determining a weight map and a feature map of the current image data, extracting feature points according to the weight map, performing optical flow tracking on the feature map based on the feature points to obtain a current tracking result, fusing the current tracking result and the historical tracking results based on a factor graph, and obtaining the second motion estimation result when the data constraint is minimized.
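For illustration, a minimal sketch of the feature extraction and tracking step using OpenCV's pyramidal Lucas-Kanade optical flow. Approximating the weight map by the corner-response quality threshold is an assumption, and the factor-graph fusion with historical tracking results is omitted here.

```python
import cv2

def track(prev_gray, curr_gray, prev_pts=None):
    """Track feature points from the previous grayscale frame into the
    current one; returns the matched point pairs for the motion estimate."""
    if prev_pts is None:
        # extract feature points where the corner response (weight) is high
        prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                           qualityLevel=0.01, minDistance=10)
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      prev_pts, None)
    ok = status.ravel() == 1               # keep only successfully tracked points
    return prev_pts[ok], curr_pts[ok]
```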
Based on the above embodiment, the determining module 330 is specifically configured to:
Determining the constraint intensity of the nearest-neighbor plane corresponding to each point in the current point cloud data according to the included angle between a first vector formed by that nearest-neighbor plane and a second vector formed by the advancing direction of the aircraft; normalizing the intensity sum obtained by accumulating the constraint intensities of the nearest-neighbor planes corresponding to all points in the current point cloud data to obtain the evaluation index of the first motion estimation result; and determining the evaluation index as the first weight.
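The sketch below illustrates one way of computing such a weight: a plane whose normal is nearly parallel to the advancing direction constrains forward motion strongly, while a nearly perpendicular normal contributes little. The mean-based normalization and the complementary second weight are assumptions, since the exact formulas are not spelled out above.

```python
import numpy as np

def first_weight(plane_normals, heading):
    """plane_normals: (N, 3) unit normals of the nearest-neighbor planes
       heading:       (3,) unit vector along the aircraft's advancing direction"""
    cosines = np.abs(plane_normals @ heading)  # |cos| of each included angle
    strength_sum = cosines.sum()               # accumulated constraint intensity
    return strength_sum / len(plane_normals)   # normalized evaluation index in [0, 1]

def second_weight(w1):
    # the second weight is derived from the first; a complementary split
    # is one natural (assumed) choice
    return 1.0 - w1
```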
Based on the above embodiment, the execution module 340 is specifically configured to:
fusing, when it is determined that there is no time error between the first motion estimation result and the second motion estimation result, the two results based on the first weight and the second weight to obtain the target motion estimation result; and determining the first motion estimation result as the target motion estimation result when there is a time error between the two results.
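A minimal sketch of this dispatch logic, where `MAX_SKEW` and the `fuse` callable (for example, a pose-graph fusion such as the one sketched after the next embodiment) are illustrative assumptions:

```python
MAX_SKEW = 0.05  # seconds of tolerated timestamp difference (assumed)

def target_estimate(est1, est2, w1, w2, t1, t2, fuse):
    """Fuse when the two results are time-aligned; otherwise fall back to the
    LiDAR/inertial result."""
    if abs(t1 - t2) <= MAX_SKEW:           # no time error: weighted fusion
        return fuse(est1, est2, w1, w2)
    return est1                            # time error: keep the first result
```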
In one embodiment, fusing the first motion estimation result and the second motion estimation result based on the first weight and the second weight to obtain the target motion estimation result includes:
Taking the position estimate and the attitude estimate in the first motion estimation result as the nodes of a pose graph, and taking the position change and the attitude change in the second motion estimation result as the edges of the pose graph; constructing the pose graph with the edge weights determined according to the first weight and the second weight; and determining the target motion estimation result based on the pose graph.
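For illustration, a sketch of such a pose-graph fusion restricted to position (attitude would be handled analogously): the nodes carry the first result as priors, the edges carry the relative motion from the second result, and the first and second weights set the relative influence of each. With quadratic costs the fused trajectory has a closed-form solution.

```python
import numpy as np

def fuse_positions(eskf_positions, flow_deltas, w1, w2):
    """eskf_positions: (N, 3) node priors from the first motion estimate
       flow_deltas:    (N-1, 3) relative-motion edges from the second estimate
       Minimizes w1 * sum ||x_i - p_i||^2
               + w2 * sum ||(x_{i+1} - x_i) - d_i||^2."""
    n = len(eskf_positions)
    A = w1 * np.eye(n)                              # normal-equation matrix
    b = w1 * np.asarray(eskf_positions, dtype=float)
    for i, d in enumerate(flow_deltas):
        A[i, i] += w2                               # each edge couples
        A[i + 1, i + 1] += w2                       # consecutive nodes
        A[i, i + 1] -= w2
        A[i + 1, i] -= w2
        b[i] -= w2 * d
        b[i + 1] += w2 * d
    return np.linalg.solve(A, b)                    # fused trajectory, (N, 3)
```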
The motion estimation device of the aircraft provided by the embodiment of the present invention can execute the motion estimation method of the aircraft provided by any embodiment of the present invention, and is provided with the functional modules for executing the method and achieves the corresponding beneficial effects.
It should be noted that, in the above embodiment of the motion estimation device of the aircraft, the included units and modules are divided only according to functional logic; the division is not limited to the above as long as the corresponding functions can be implemented. In addition, the specific names of the functional units are only for ease of mutual distinction and are not used to limit the protection scope of the present invention.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. Fig. 4 shows a block diagram of an exemplary electronic device 4 suitable for use in implementing embodiments of the invention. The electronic device 4 shown in fig. 4 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the invention.
As shown in fig. 4, the electronic device 4 takes the form of a general-purpose computing device. The components of the electronic device 4 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 that connects the various system components, including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic device 4 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by electronic device 4 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. Electronic device 4 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, commonly referred to as a "hard disk drive"). Although not shown in fig. 4, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 18 through one or more data medium interfaces. The system memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored in, for example, system memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 42 generally perform the functions and/or methods of the embodiments described herein.
Electronic device 4 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), one or more devices that enable a user to interact with electronic device 4, and/or any devices (e.g., network card, modem, etc.) that enable electronic device 4 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, the electronic device 4 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through the network adapter 20. As shown in fig. 4, the network adapter 20 communicates with other modules of the electronic device 4 via the bus 18. It should be appreciated that although not shown in FIG. 4, other hardware and/or software modules may be used in connection with the electronic device 4, including, but not limited to, microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, for example implementing the method for motion estimation of an aircraft provided by an embodiment of the present invention, the method comprising:
analyzing and processing, based on an error-state Kalman filter (ESKF), the current point cloud data acquired by an aircraft and the current inertial data of the aircraft, and performing motion estimation on the aircraft to obtain a first motion estimation result of the aircraft, wherein the observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in a local point cloud map;
obtaining a second motion estimation result of the aircraft by performing optical flow tracking on the current image data acquired by the aircraft, wherein the data constraint of the optical flow tracking is determined by the inertial data and the image data;
determining a first weight of the first motion estimation result according to the constraint intensity between each point in the current point cloud data and the nearest-neighbor plane corresponding to each point, and determining a second weight of the second motion estimation result according to the first weight; and
determining a target motion estimation result based on the first motion estimation result, the first weight, the second motion estimation result and the second weight.
Of course, those skilled in the art will understand that the processor may also implement the technical solution of the method for estimating motion of an aircraft provided by any embodiment of the present invention.
An embodiment of the present invention further provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method for motion estimation of an aircraft provided by the embodiments of the present invention, the method comprising:
analyzing and processing, based on an error-state Kalman filter (ESKF), the current point cloud data acquired by an aircraft and the current inertial data of the aircraft, and performing motion estimation on the aircraft to obtain a first motion estimation result of the aircraft, wherein the observation equation in the ESKF is determined by the distance between each point in the current point cloud data and that point's nearest-neighbor plane in a local point cloud map;
obtaining a second motion estimation result of the aircraft by performing optical flow tracking on the current image data acquired by the aircraft, wherein the data constraint of the optical flow tracking is determined by the inertial data and the image data;
determining a first weight of the first motion estimation result according to the constraint intensity between each point in the current point cloud data and the nearest-neighbor plane corresponding to each point, and determining a second weight of the second motion estimation result according to the first weight; and
determining a target motion estimation result based on the first motion estimation result, the first weight, the second motion estimation result and the second weight.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in one or more programming languages, or a combination thereof, including an object-oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it may be connected to an external computer (for example, through the Internet using an Internet service provider).
It will be appreciated by those of ordinary skill in the art that the modules or steps of the invention described above may be implemented on a general-purpose computing device; they may be centralized on a single computing device or distributed over a network of computing devices; and they may alternatively be implemented in program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device, or fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
In addition, in the technical scheme of the present invention, the acquisition, storage, use and processing of data comply with the relevant provisions of national laws and regulations.
It should be noted that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will understand that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made without departing from the protection scope of the present invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to those embodiments and may include other equivalent embodiments without departing from its concept, and its scope is determined by the scope of the appended claims.
Claims (10)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202411767823.7A | 2024-12-04 | 2024-12-04 | Method, device, equipment and storage medium for estimating motion of aircraft |
Publications (1)
| Publication Number | Publication Date |
| --- | --- |
| CN119594980A | 2025-03-11 |
Family
ID=94840508
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202411767823.7A | Method, device, equipment and storage medium for estimating motion of aircraft | 2024-12-04 | 2024-12-04 |
Country Status (1)
| Country | Link |
| --- | --- |
| CN | CN119594980A |
Legal Events
| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |