Zou et al., 2019 - Google Patents
StructVIO: Visual-inertial odometry with structural regularity of man-made environments
- Document ID: 4905544432024403709
- Authors: Zou D, Wu Y, Pei L, Ling H, Yu W
- Publication year: 2019
- Publication venue: IEEE Transactions on Robotics
Snippet
In this paper, we propose a novel visual-inertial odometry (VIO) approach that adopts structural regularity in man-made environments. Instead of using the Manhattan world assumption, we use the Atlanta world model to describe such regularity. An Atlanta world is a …
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T7/00—Image analysis
        - G06T7/20—Analysis of motion
      - G06T2207/00—Indexing scheme for image analysis or image enhancement
        - G06T2207/30—Subject of image; Context of image processing
      - G06T15/00—3D [Three Dimensional] image rendering
        - G06T15/10—Geometric effects
          - G06T15/20—Perspective computation
      - G06T3/00—Geometric image transformation in the plane of the image, e.g. from bit-mapped to bit-mapped creating a different image
        - G06T3/0068—Geometric image transformation for image registration, e.g. elastic snapping
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
        - G06K9/00624—Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
        - G06K9/00221—Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
          - G06K9/00288—Classification, e.g. identification
  - G01—MEASURING; TESTING
    - G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
      - G01C21/00—Navigation; Navigational instruments not provided for in preceding groups
        - G01C21/10—Navigation by using measurements of speed or acceleration
          - G01C21/12—Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
            - G01C21/16—Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
        - G01C21/20—Instruments for performing navigational calculations
    - G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
      - G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
        - G01S3/78—Direction-finders using electromagnetic waves other than radio waves
          - G01S3/782—Systems for determining direction or deviation from predetermined direction
            - G01S3/785—Systems using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
              - G01S3/786—Systems in which the desired condition is maintained automatically, i.e. tracking systems
                - G01S3/7864—T.V. type tracking systems
Similar Documents
| Publication | Title |
|---|---|
| Zou et al. | StructVIO: Visual-inertial odometry with structural regularity of man-made environments |
| Zhou et al. | StructSLAM: Visual SLAM with building structure lines |
| Yin et al. | Dynam-SLAM: An accurate, robust stereo visual-inertial SLAM method in dynamic environments |
| Qin et al. | VINS-Mono: A robust and versatile monocular visual-inertial state estimator |
| Yang et al. | Pop-up SLAM: Semantic monocular plane SLAM for low-texture environments |
| US11127203B2 (en) | Leveraging crowdsourced data for localization and mapping within an environment |
| Bleser et al. | Advanced tracking through efficient image processing and visual–inertial sensor fusion |
| Panahandeh et al. | Vision-aided inertial navigation based on ground plane feature detection |
| Tang et al. | LE-VINS: A robust solid-state-LiDAR-enhanced visual-inertial navigation system for low-speed robots |
| US20150235367A1 (en) | Method of determining a position and orientation of a device associated with a capturing device for capturing at least one image |
| Deretey et al. | Visual indoor positioning with a single camera using PnP |
| Liu et al. | PLC-VIO: Visual–inertial odometry based on point-line constraints |
| US10977810B2 (en) | Camera motion estimation |
| Chen et al. | VPL-SLAM: A vertical line supported point line monocular SLAM system |
| Feng et al. | VIMOT: A tightly coupled estimator for stereo visual-inertial navigation and multiobject tracking |
| Meilland et al. | A spherical robot-centered representation for urban navigation |
| Lu et al. | Vision-based localization methods under GPS-denied conditions |
| Alliez et al. | Real-time multi-SLAM system for agent localization and 3D mapping in dynamic scenarios |
| Morelli et al. | COLMAP-SLAM: A framework for visual odometry |
| Chen et al. | Stereo visual inertial pose estimation based on feedforward and feedbacks |
| Lin et al. | A sparse visual odometry technique based on pose adjustment with keyframe matching |
| Xu et al. | Bifocal-binocular visual SLAM system for repetitive large-scale environments |
| CN112945233A (en) | Global drift-free autonomous robot simultaneous positioning and map building method |
| Xie et al. | Angular tracking consistency guided fast feature association for visual-inertial SLAM |
| Shewail et al. | Survey of indoor tracking systems using augmented reality |