CN114646991B - A navigation enhancement network system based on "body reference" and its construction method - Google Patents
A navigation enhancement network system based on "body reference" and its construction method
- Publication number
- CN114646991B (application CN202210248804.8A)
- Authority
- CN
- China
- Prior art keywords
- information
- terminal
- reference station
- target
- enhancement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
- G01S19/47—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/35—Constructional details or hardware or software details of the signal processing chain
- G01S19/37—Hardware or software details of the signal processing chain
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Automation & Control Theory (AREA)
- Image Processing (AREA)
- Navigation (AREA)
Abstract
The invention discloses a navigation enhancement network system based on "body reference" and a construction method thereof, comprising a reference station determining system, a reference station model position resolving system, a reference station network data processing system, an information transmission system and a terminal enhancement information processing system. The network system constructs a "reference body" network based on 5G, vision and GNSS multi-source fusion information and provides a "body reference" enhancement service to multi-source fusion terminals, thereby improving the positioning and recognition precision of three-dimensional measurement terminals and panoramic terminals. Compared with traditional navigation enhancement systems, which can only provide differential correction information based on point positions, the network system provides enhancement information based on "body" information and, in addition to position enhancement, also enhances recognition accuracy. Meanwhile, based on the rapid transmission of 5G information, it serves application terminals supporting BeiDou, INS and vision functions, realizes the function of estimating unknown point positions from known points in any scene, and solves the indoor and outdoor seamless positioning problem.
Description
Technical Field
The invention relates to the technical field of navigation and positioning, in particular to a navigation enhancement network system based on "body reference" and a construction method thereof.
Background
Satellite navigation and positioning provide a basic space-time network that gives people accurate position and time, and the coordinates obtained by GNSS are point coordinates (longitude, latitude, height). Surveying and mapping technology is changing: measurement is moving from its original point basis towards holographic mapping, and in the future real-scene three-dimensional maps will replace the existing two-dimensional and image maps as the mainstream of basic geographic information. The positioning concept must also adapt to this technical development. Abstracting a measured object as a single point can no longer meet the requirements of scene-oriented, materialized and refined management; instead of a single particle position, the original geographic entity should be identified by a set of space-time point clouds. There is a clear trend from point coordinates to "body" coordinates, yet no reasonable solution has been proposed for this so far.
Disclosure of Invention
The invention aims to provide a navigation enhancement network system based on a body benchmark and a construction method thereof, so as to solve the problems in the background technology.
In order to achieve the above purpose, the present invention provides the following technical solution: a navigation enhancement network system based on "body reference", comprising a reference station determining system, a reference station model position resolving system, a reference station network data processing system, an information transmission system and a terminal enhancement information processing system;
The reference station determining system selects static ground object points as reference points on a remote sensing image through a visual interpretation method, and extracts and segments a target object to obtain coordinate information of a ground object outline;
The reference station model position resolving system utilizes target identification in a scene to obtain a three-dimensional coordinate sequence of a reference station and ground feature attribute information through a twin scene and a three-dimensional reconstruction technology;
The reference station network data processing system gathers the reference station network data provided by the reference station model position resolving systems within a certain area, extracts the 5G, visual and GNSS multi-source fusion information of each reference station in the reference station network data to obtain a feature matrix and accurate coordinate position information, and builds an enhancement station database in preparation for the enhancement information service of the terminal enhancement information processing system;
The information transmission system transmits information of surrounding reference points to the terminal enhancement information processing system by using 5G transmission;
The terminal enhancement information processing system is distributed among the terminals; it acquires the accurate position of the terminal from the terminal's own sensors such as GNSS and INS, requests information on the peripheral reference stations from the reference station network data processing system, and enhances the positioning function of the terminal after acquiring the high-precision differential information of the peripheral visual reference stations and the GNSS.
According to another aspect of the present invention, a method for constructing a navigation enhancement network system based on "body reference" is provided, including the following steps:
Step 1, on a remote sensing image, a static feature point is selected as a reference point by a visual interpretation method, and a target object is extracted and segmented to obtain coordinate information of a feature contour;
Step 2, the reference station model position resolving system identifies targets in the scene, and obtains the three-dimensional coordinate sequence of the reference station and the ground feature attribute information through the twin scene and three-dimensional reconstruction technology, wherein the ground feature attribute information is expressed in the form of a feature matrix;
Step 3, when the reference station network data of the preset area are uniformly converged to the reference station network data processing system, the reference station network data processing system builds an enhancement station database from the extracted feature matrix and the accurate coordinate position of each reference station, in preparation for the enhancement information service of the terminal;
Step 4, transmitting information of surrounding reference points to the surrounding terminals needing service by using 5G transmission;
Step 5, the terminal acquires its own accurate position from its GNSS sensor, and at the same time requests information on the peripheral reference stations from the reference station network data processing system, and enhances its positioning function after acquiring the peripheral visual reference stations and the GNSS high-precision differential information.
Further, the "body reference" refers to "points" of the ground in a two-dimensional map and are displayed in a "body" manner in a live-action three-dimensional manner based on the theory of "object-oriented".
Furthermore, the model position is defined as follows: the antenna phase center position obtained through BeiDou positioning is a single particle position P (X, Y, Z); a group of such points with spatial topological relations, obtained through BeiDou plus intelligent-recognition fusion positioning, is called the model position, specifically the particle position set P {p1, p2 … pn}, where pn is the n-th particle.
Furthermore, a static target obtained by remote sensing pattern recognition technology is used as a reference station; GNSS provides the accurate space-time reference from which the position of the reference station is obtained, and remote sensing provides the feature points of the reference ground object, which are expressed in the form of a feature matrix.
Further, the reference station network is constructed in a uniformly distributed manner within a certain area; each reference station is characterized by a "body reference", and enhancement information based on the "body reference" serves as the basic constituent unit of the information.
Furthermore, the information transmission module utilizes a 5G transmission network: the data center provides the model positions P {p1, p2 …} of the peripheral "body references" to the multi-source fusion terminal, and the terminal calculates its own position or that of peripheral ground objects from GNSS positioning and the reference objects in the scene obtained by its sensors. Unlike the differential correction information based on point positions provided by a traditional navigation enhancement system, the "body reference" provides enhancement information based on "body" information and, besides enhancing position information, also enhances recognition accuracy.
Furthermore, the terminal enhancement information processing system runs on the terminal; it obtains information from the reference station network data processing system through the 5G transmission network and works in cooperation with the terminal's own sensor information, so that the position of an unknown point can be estimated from known points and the indoor and outdoor seamless positioning problem is solved.
Further, in the reference station model position resolving system of step 2, the model position calculation includes the following steps:
Step 2.1: Instance segmentation
Using ResNet-101+FPN as the backbone, instance segmentation is realized through two branches, namely a prototype generation branch and a mask coefficient branch. The prototype generation branch (Protonet) generates mask prototypes and is implemented mainly on the basis of an FCN. The mask coefficient branch predicts the mask coefficients through an anchor-based target detector. The two branches are computed independently; the masks are synthesized by matrix multiplication followed by a Sigmoid function, and the segmentation result is finally output;
Step 2.2: Target depth acquisition
Depth information is acquired with a binocular camera. For a target at coordinates (i, j) with depth value Z (i, j), the depth value at each pixel position is estimated over a neighborhood window, where S is the size of the neighborhood window and a sign function indicates which pixels in the window contribute to the estimate;
Step 2.3: Extrinsic parameter matrix calculation
Camera pose estimation is carried out from a group of n 3D points in the world coordinate system and their corresponding 2D coordinates in the image, and is realized with the EPnP (Efficient PnP) algorithm;
Step 2.4: Position resolution
By converting through the pixel, image, camera and world coordinate systems, the three-dimensional position of the target in the world coordinate system can be obtained.
The conversion relation between the pixel coordinates (u, v) and the image coordinate system is

u = x/dx + u0, v = y/dy + v0,

where (x, y) is the position of the target in the image coordinate system, (u, v) is the position of the target in the pixel coordinate system, (u0, v0) are the parameters that shift the origin of the image coordinates from the center of the image to the upper left corner, and (dx, dy) is the physical size of each pixel along the x-axis and y-axis.

If fx and fy are the focal lengths of the camera along the x-axis and y-axis, and Xc, Yc, Zc are the coordinate values of the target in the camera coordinate system, then

Zc · [u, v, 1]^T = [[fx, 0, u0], [0, fy, v0], [0, 0, 1]] · [Xc, Yc, Zc]^T.

Knowing the pixel coordinates (u, v) of the target and the intrinsic matrix of the camera, the position information is solved from the extrinsic parameters of the camera and the depth of the target:

[Xc, Yc, Zc]^T = R · [Xw, Yw, Zw]^T + T,

where R is the rotation matrix, i.e. the rotation of the camera relative to the world coordinate system, T is the translation matrix, i.e. the translation of the camera relative to the world coordinate system, and (Xw, Yw, Zw) is the three-dimensional position of the target in the world coordinate system.
The beneficial effects are that:
Compared with the prior art, in which a traditional navigation enhancement system and construction method provide only differential correction information based on point positions, the present method provides enhancement information based on "body" information and, in addition to position enhancement, also enhances recognition accuracy; based on the rapid transmission of 5G information, it serves comprehensive application terminals built on BeiDou/inertial navigation/vision, realizes the function of calculating unknown point positions from known points in any scene, and thoroughly solves the indoor and outdoor seamless positioning problem.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a "body referencing" based navigation enhancement network construction in accordance with an embodiment of the present invention.
FIG. 2 is a diagram of a navigation enhancement network service based on "body benchmarks" according to embodiments of the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, but not all embodiments, and all other embodiments obtained by those skilled in the art without the inventive effort based on the embodiments of the present invention are within the scope of protection of the present invention.
Referring to FIGS. 1-2, in order to achieve the above object, according to an embodiment of the present invention, the following technical solution is provided: a navigation enhancement network system based on "body reference", comprising:
the system comprises a reference station determining system, a reference station model position resolving system, a reference station network data processing system, an information transmission system and a terminal enhancement information processing system;
The reference station determining system selects static ground object points as reference points on a high-precision remote sensing image through a visual interpretation method, and extracts and segments the target object to obtain the coordinate information of the ground object contour; here, high precision means centimeter-level precision.
The reference station model position resolving system utilizes target identification in a scene to obtain a three-dimensional coordinate sequence of a reference station and ground feature attribute information through a twin scene and a three-dimensional reconstruction technology;
The reference station network data processing system gathers the reference station network data provided by the reference station model position resolving system in a certain area, and performs data extraction on multi-source fusion information such as 5G, vision, GNSS and the like of each reference station in the reference station network data to obtain information such as a feature matrix, a precise coordinate position and the like, and builds an enhancement station database so as to prepare enhancement information service of the terminal enhancement information processing system;
The information transmission system utilizes the high-speed transmission characteristic of 5G to send the information of the surrounding datum points to the terminal enhancement information processing system;
The terminal enhancement information processing system is dispersed in each terminal, acquires the accurate position of the terminal according to the GNSS, INS and other sensors of the terminal, requests the information of the peripheral reference stations from the reference station network data processing system, and enhances the positioning function of the terminal after acquiring the information of the peripheral visual reference stations, GNSS high-precision difference and the like.
According to another embodiment of the present invention, a method for constructing a navigation enhancement network system based on "body reference" is provided, which is characterized by comprising the following steps:
Step S1, on a high-precision remote sensing image, selecting a static ground object point as a reference point by a visual interpretation method, and extracting and segmenting a target object to obtain coordinate information of a ground object contour;
Step S2, the reference station model position resolving system identifies targets in the scene, and obtains the three-dimensional coordinate sequence of the reference station and the ground feature attribute information through the twin scene and three-dimensional reconstruction technology, wherein the ground feature attribute information is expressed in the form of a feature matrix;
Step S3, when the reference station network data of a certain area are converged to the reference station network data processing system in a unified way, the reference station network data processing system builds an enhancement station database for the extracted feature matrix, the accurate coordinate position and the like of each reference station, and prepares for enhancement information service of the terminal;
Step S4, transmitting information of surrounding reference points to the surrounding terminals needing service by utilizing the high-speed transmission characteristic of 5G;
Step S5, the terminal acquires its own accurate position from its GNSS sensor, requests information on the peripheral reference stations from the reference station network data processing system, and enhances its positioning function after acquiring the information of the peripheral visual reference stations, the GNSS high-precision differential information and the like.
Further, the "body reference" refers to ground "points" under a certain scale in a two-dimensional map based on the theory of "object-oriented", and are displayed in a "body" manner in a live-action three-dimensional manner.
Furthermore, the model position is defined as follows: the antenna phase center position obtained through BeiDou positioning is a single particle position P (X, Y, Z); a group of such points with spatial topological relations, obtained through BeiDou plus intelligent-recognition fusion positioning, is called the model position, specifically the particle position set P {p1, p2 …}.
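To make the model position concept concrete, the following is a minimal sketch of how one "body reference" record, holding a particle position set, its topology and a feature matrix, might be represented in the enhancement station database; the class name, field layout and the example coordinates are illustrative assumptions and are not prescribed by the invention.

```python
from dataclasses import dataclass
from typing import List, Tuple

import numpy as np

Point3D = Tuple[float, float, float]  # (X, Y, Z) of a single particle position


@dataclass
class ReferenceStation:
    """One 'body reference' record in the enhancement station database (illustrative only)."""
    station_id: str
    anchor: Point3D                   # accurate GNSS (antenna phase center) position
    model_position: List[Point3D]     # particle position set P{p1, p2, ..., pn}
    topology: List[Tuple[int, int]]   # index pairs encoding spatial-topological relations
    feature_matrix: np.ndarray        # ground-object feature matrix from remote sensing

    def centroid(self) -> Point3D:
        """Centroid of the particle set, a single-point summary of the 'body'."""
        pts = np.asarray(self.model_position)
        return tuple(pts.mean(axis=0))


# Example: a building corner represented as a small point set with its topology.
station = ReferenceStation(
    station_id="RS-0001",
    anchor=(-2267749.2, 5009154.6, 3220958.1),
    model_position=[(-2267749.2, 5009154.6, 3220958.1),
                    (-2267747.9, 5009156.0, 3220957.8),
                    (-2267748.5, 5009155.3, 3220961.4)],
    topology=[(0, 1), (1, 2), (2, 0)],
    feature_matrix=np.zeros((16, 16)),
)
print(station.centroid())
```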
Furthermore, a static target obtained by remote sensing pattern recognition technology is used as a reference station; GNSS provides the accurate space-time reference from which the position of the reference station is obtained, and remote sensing provides the feature points of the reference ground object, which are expressed in the form of a feature matrix.
Further, the reference station network is constructed in a uniformly distributed manner within a certain area; each reference station is characterized by a "body reference", and enhancement information based on the "body reference" serves as the basic constituent unit of the technology.
Furthermore, the information transmission system utilizes a high-speed 5G transmission network: the data center provides the model positions P {p1, p2 …} of the peripheral "body references" to the multi-source fusion terminal, and the terminal calculates its own position or that of peripheral ground objects from GNSS positioning and the reference objects in the scene obtained by its sensors.
Furthermore, the terminal enhancement information processing system runs on the terminal; it obtains information from the reference station network data processing system through the 5G transmission network and works in cooperation with the terminal's own sensor information, so that the position of an unknown point can be estimated from known points in any scene and the indoor and outdoor seamless positioning problem is thoroughly solved.
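As an illustration of this cooperation, the sketch below shows a terminal taking its coarse GNSS fix and applying a simple single-station differential correction once a visually recognized reference station with known database coordinates (fetched over 5G) is available; the function name, the correction scheme and the numeric values are assumptions for illustration, not the invention's prescribed interface.

```python
import numpy as np


def differential_correction(coarse_position, station_truth, station_measured):
    """Shift the terminal's coarse GNSS fix by the error observed at a recognized reference station.

    coarse_position  -- the terminal's own coarse position (e.g. single-point GNSS fix)
    station_truth    -- accurate coordinates of the reference station, fetched over 5G
                        from the reference station network data processing system
    station_measured -- the same station's position as measured by the terminal's own
                        GNSS/vision solution at this epoch
    """
    error = np.asarray(station_truth) - np.asarray(station_measured)
    return np.asarray(coarse_position) + error


# Usage with illustrative numbers only (ECEF-like metres):
coarse = np.array([-2267750.0, 5009152.0, 3220960.0])    # terminal's own GNSS fix
truth = np.array([-2267749.2, 5009154.6, 3220958.1])     # database coordinates of the station
measured = np.array([-2267750.1, 5009153.9, 3220958.9])  # station as seen by the terminal
print(differential_correction(coarse, truth, measured))
```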
According to one embodiment of the present invention, the model position calculation in step S2 includes the following steps:
step S2.1: instance partitioning
Using ResNet-101+FPN as the backbone, instance segmentation is realized through two branches, namely a prototype generation branch and a mask coefficient branch. The prototype generation branch (Protonet) generates mask prototypes and is implemented mainly on the basis of an FCN. The mask coefficient branch predicts the mask coefficients through an anchor-based target detector. The two branches are computed independently; the masks are synthesized by matrix multiplication followed by a Sigmoid function, and the segmentation result is finally output;
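The mask synthesis described above (prototype masks combined with per-instance coefficients by matrix multiplication followed by a Sigmoid, as in YOLACT-style detectors) can be sketched as follows; the backbone and detection heads are omitted, and the tensor shapes and threshold are assumptions for illustration only.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def assemble_masks(prototypes, coefficients, threshold=0.5):
    """Combine prototype masks with per-instance mask coefficients.

    prototypes   -- array of shape (H, W, k): k prototype masks from the Protonet branch
    coefficients -- array of shape (n, k): k mask coefficients for each of n detected instances
    Returns a boolean array of shape (n, H, W) with one binary mask per instance.
    """
    h, w, k = prototypes.shape
    # Linear combination of prototypes per instance: (H*W, k) @ (k, n) -> (H*W, n)
    linear = prototypes.reshape(-1, k) @ coefficients.T
    masks = sigmoid(linear).reshape(h, w, -1).transpose(2, 0, 1)
    return masks > threshold


# Illustrative shapes only: 32 prototypes at 138x138, 5 detected instances.
protos = np.random.randn(138, 138, 32)
coeffs = np.random.randn(5, 32)
print(assemble_masks(protos, coeffs).shape)  # (5, 138, 138)
```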
Step S2.2: target depth acquisition
Depth information is acquired with a binocular camera. For a target at coordinates (i, j) with depth value Z (i, j), the depth value at each pixel position is estimated over a neighborhood window, where S is the size of the neighborhood window and a sign function indicates which pixels in the window contribute to the estimate;
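A minimal sketch of the neighborhood-window depth estimate is given below; it assumes the per-pixel depths come from binocular stereo matching and that the sign function simply marks valid (non-zero) depths inside the S×S window, which is an interpretation of the step rather than a verbatim reproduction of the patent's formula.

```python
import numpy as np


def window_depth(depth_map, i, j, s=5):
    """Estimate the depth at pixel (i, j) from an s x s neighborhood window.

    depth_map -- 2-D array of per-pixel depths from binocular stereo matching;
                 invalid pixels are assumed to hold 0.
    s         -- size of the neighborhood window (odd).
    """
    half = s // 2
    h, w = depth_map.shape
    window = depth_map[max(0, i - half):min(h, i + half + 1),
                       max(0, j - half):min(w, j + half + 1)]
    valid = window > 0                  # indicator of valid stereo depths
    if not valid.any():
        return 0.0
    return float(window[valid].mean())  # average of valid depths in the window


# Usage on a toy depth map:
depth = np.zeros((10, 10))
depth[4:7, 4:7] = 2.5
print(window_depth(depth, 5, 5, s=5))
```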
Step S2.3: Extrinsic parameter matrix calculation
Camera pose estimation is carried out from a group of n 3D points in the world coordinate system and their corresponding 2D coordinates in the image, and is realized with the EPnP (Efficient PnP) algorithm;
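In practice the EPnP step can be run with OpenCV's solvePnP using the SOLVEPNP_EPNP flag, as in the sketch below; it assumes at least four 3D-2D correspondences, a known intrinsic matrix and a distortion-free (or pre-rectified) image, and is an illustrative implementation rather than the one mandated by the invention.

```python
import cv2
import numpy as np


def estimate_extrinsics(object_points, image_points, camera_matrix):
    """Estimate camera pose (R, T) from 3D world points and their 2D pixel projections.

    object_points -- (n, 3) float array of points in the world coordinate system
    image_points  -- (n, 2) float array of the corresponding pixel coordinates
    camera_matrix -- 3x3 intrinsic matrix [[fx, 0, u0], [0, fy, v0], [0, 0, 1]]
    """
    dist_coeffs = np.zeros(5)  # assume an undistorted image
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("EPnP pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec              # extrinsic parameters of the camera
```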
Step S2.4: position resolution
By converting through the pixel, image, camera and world coordinate systems, the three-dimensional position of the target in the world coordinate system can be obtained.
The conversion relation between the pixel coordinates (u, v) and the image coordinate system is

u = x/dx + u0, v = y/dy + v0,

where (x, y) is the position of the target in the image coordinate system, (u, v) is the position of the target in the pixel coordinate system, (u0, v0) are the parameters that shift the origin of the image coordinates from the center of the image to the upper left corner, and (dx, dy) is the physical size of each pixel along the x-axis and y-axis.

If fx and fy are the focal lengths of the camera along the x-axis and y-axis, and Xc, Yc, Zc are the coordinate values of the target in the camera coordinate system, then Zc · [u, v, 1]^T = [[fx, 0, u0], [0, fy, v0], [0, 0, 1]] · [Xc, Yc, Zc]^T. Knowing the pixel coordinates (u, v) of the target and the intrinsic matrix of the camera, the position information is solved from the extrinsic parameters of the camera (the rotation matrix R and the translation matrix T) and the depth of the target: [Xc, Yc, Zc]^T = R · [Xw, Yw, Zw]^T + T, where (Xw, Yw, Zw) is the three-dimensional position of the target in the world coordinate system.
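The back-projection chain just described (pixel coordinates plus target depth to camera coordinates, then extrinsics R and T to world coordinates) is sketched below; the variable names mirror the symbols in the text, the convention Pc = R·Pw + T follows the formulas above, and the numeric values are placeholders only.

```python
import numpy as np


def pixel_to_world(u, v, depth_zc, K, R, T):
    """Recover the world coordinates (Xw, Yw, Zw) of a target seen at pixel (u, v).

    depth_zc -- Zc, the target depth in the camera coordinate system (from step S2.2)
    K        -- 3x3 intrinsic matrix [[fx, 0, u0], [0, fy, v0], [0, 0, 1]]
    R, T     -- extrinsic rotation matrix and translation vector (from step S2.3),
                used here with the convention Pc = R @ Pw + T
    """
    # Pixel -> camera coordinates: Zc * [u, v, 1]^T = K @ [Xc, Yc, Zc]^T
    pc = depth_zc * (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    # Camera -> world coordinates: Pw = R^T @ (Pc - T)
    pw = R.T @ (pc - np.asarray(T, dtype=float))
    return pw  # (Xw, Yw, Zw)


# Placeholder example values:
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
T = np.zeros(3)
print(pixel_to_world(700.0, 400.0, 5.0, K, R, T))
```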
Compared with the prior art, in which a traditional navigation enhancement system provides differential correction information based on point positions, the present technology provides enhancement information based on "body" information and, in addition to position enhancement, also enhances recognition accuracy; based on the rapid transmission of 5G information, it serves comprehensive application terminals built on BeiDou/inertial navigation/vision, realizes the function of estimating unknown point positions from known points in any scene, and solves the indoor and outdoor seamless positioning problem.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (6)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210248804.8A CN114646991B (en) | 2022-03-14 | 2022-03-14 | A navigation enhancement network system based on "body reference" and its construction method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210248804.8A CN114646991B (en) | 2022-03-14 | 2022-03-14 | A navigation enhancement network system based on "body reference" and its construction method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114646991A CN114646991A (en) | 2022-06-21 |
CN114646991B (en) | 2024-11-26
Family
ID=81993790
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210248804.8A Active CN114646991B (en) | 2022-03-14 | 2022-03-14 | A navigation enhancement network system based on "body reference" and its construction method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114646991B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108919305A (en) * | 2018-08-07 | 2018-11-30 | 北斗导航位置服务(北京)有限公司 | Beidou ground enhances band-like method of servicing and system in communications and transportation |
CN111045068A (en) * | 2019-12-27 | 2020-04-21 | 武汉大学 | An autonomous orbit and attitude determination method for low-orbit satellites based on non-navigation satellite signals |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111337031B (en) * | 2020-02-24 | 2022-04-15 | 南京航空航天大学 | Spacecraft landmark matching autonomous position determination method based on attitude information |
CN111462241B (en) * | 2020-04-08 | 2023-03-28 | 北京理工大学 | Target positioning method based on monocular vision |
CN111696162B (en) * | 2020-06-11 | 2022-02-22 | 中国科学院地理科学与资源研究所 | Binocular stereo vision fine terrain measurement system and method |
- 2022-03-14: Application CN202210248804.8A filed in China; granted as patent CN114646991B (status: Active)
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108919305A (en) * | 2018-08-07 | 2018-11-30 | 北斗导航位置服务(北京)有限公司 | Beidou ground enhances band-like method of servicing and system in communications and transportation |
CN111045068A (en) * | 2019-12-27 | 2020-04-21 | 武汉大学 | An autonomous orbit and attitude determination method for low-orbit satellites based on non-navigation satellite signals |
Also Published As
Publication number | Publication date |
---|---|
CN114646991A (en) | 2022-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111275750B (en) | Indoor space panoramic image generation method based on multi-sensor fusion | |
JP7273927B2 (en) | Image-based positioning method and system | |
CN105096386B (en) | A wide range of complicated urban environment geometry map automatic generation method | |
Teller et al. | Calibrated, registered images of an extended urban area | |
CN108801274B (en) | A landmark map generation method integrating binocular vision and differential satellite positioning | |
CN111060924B (en) | A SLAM and Object Tracking Method | |
CN116883604A (en) | Three-dimensional modeling technical method based on space, air and ground images | |
CN114063127A (en) | Method for fusing multi-focal-length visual SLAM and GPS and storage medium | |
CN112132950B (en) | Three-dimensional point cloud scene updating method based on crowdsourcing image | |
CN115578539B (en) | Indoor space high-precision visual position positioning method, terminal and storage medium | |
Haala et al. | A multi-sensor system for positioning in urban environments | |
CN113608234A (en) | An urban data collection system | |
Yuan et al. | Fully automatic DOM generation method based on optical flow field dense image matching | |
JP7365385B2 (en) | Map generation method and image-based positioning system using the same | |
CN114646991B (en) | A navigation enhancement network system based on "body reference" and its construction method | |
Deng et al. | Automatic true orthophoto generation based on three-dimensional building model using multiview urban aerial images | |
CN118644554A (en) | Aircraft navigation method based on monocular depth estimation and ground feature point matching | |
CN116894923A (en) | High-resolution remote sensing image mapping conversion dense matching and three-dimensional reconstruction method | |
CN116824067A (en) | Indoor three-dimensional reconstruction method and device thereof | |
Hairuddin et al. | Development of 3D city model using videogrammetry technique | |
CN114387532A (en) | Boundary identification method and device, terminal, electronic equipment and unmanned equipment | |
CN118229755B (en) | Method for estimating urban building height by using street view image under severe shielding condition | |
CN118334263B (en) | High-precision modeling method for fusion laser point cloud based on truncated symbol distance function | |
CN117928519B (en) | Multi-sensor fusion positioning and mapping method and system for service robots | |
Zhou et al. | Object detection and spatial location method for monocular camera based on 3D virtual geographical scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |