
CN110992413A - High-precision rapid registration method for airborne remote sensing image - Google Patents


Info

Publication number: CN110992413A
Application number: CN201911279840.5A
Authority: CN (China)
Legal status: Pending
Original language: Chinese (zh)
Inventors: 王涛, 常红伟, 苏延召, 姜柯, 韩德帅, 曹继平
Applicant and current assignee: Rocket Force University of Engineering of PLA
Priority/filing date: 2019-12-13
Publication date: 2020-04-10


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 - Image registration using feature-based methods
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10032 - Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention belongs to the technical field of remote sensing image processing and proposes a high-precision, rapid registration method for airborne remote sensing images, comprising: S1, selecting feature regions according to the image characteristics of the image to be registered, wherein each feature region contains at least one pair of corresponding feature points; S2, obtaining the positions of the feature points within the feature regions; S3, connecting the corresponding feature point pairs in the reference image and the image to be registered; S4, registering the reference image and the image to be registered according to the positional relationship of the feature point pairs. The method is simple, fast and highly accurate, eliminates the dependence on a high-precision POS, adapts well to the registration of various heterogeneous source images, and is of great practical value for the registration of airborne remote sensing images.

Description

High-precision rapid registration method for airborne remote sensing image
Technical Field
The invention belongs to the technical field of remote sensing image processing, and particularly relates to a high-precision rapid registration method for an airborne remote sensing image.
Background
Airborne remote sensing imagery has very important application value in the military field. Owing to the safety of its acquisition mode and the richness of the acquired image data, it plays a vital role in camouflage-effect detection and evaluation, fire reconnaissance and precision strike, battlefield environment perception, and the like. However, owing to factors such as vibration of the airborne platform and airflow disturbance, preprocessing is needed before airborne remote sensing images can be fused for analysis, and achieving high-precision registration of the images is an important prerequisite for image analysis.
The purpose of image registration is to spatially align images of the same region that derive from different times, angles, environments, imaging mechanisms, and so on. Registration of multi-source images has always been a difficult problem in remote sensing image processing, mainly in two respects: first, different sensors have different imaging mechanisms, and although they can reflect the properties of ground objects from different aspects, the same ground object is expressed very differently in images from different sources; second, owing to factors such as flight attitude and position, remote sensing images from different sources differ in shooting angle, degree of distortion, and so on.
Current remote sensing image registration methods fall roughly into three categories: registration based on gray-level correlation, registration based on the transform domain, and registration based on features. Gray-level correlation methods build a similarity measure from the intensity information of the images and register by seeking the transformation model that maximizes (or minimizes) the measure; common algorithms include normalized cross-correlation and sequential similarity detection. These algorithms are generally robust and precise, but computationally heavy. Transform-domain algorithms first transform the data from the spatial domain to the frequency domain and then determine the matching parameters by measuring the similarity between the two images; commonly used transforms include the Fourier and Gabor transforms. Such methods are simple and accurate but place high demands on the images, and estimation errors easily arise and cause registration to fail. Feature-based registration is the most widely studied approach, comprising feature extraction, feature matching and image interpolation. It is robust to image rotation, scale change, brightness change and noise, and feature-based matching has clear advantages when image quality is low or deformation is obvious. However, in practice these methods suffer, to varying degrees, from heavy computation, an excess of extracted feature points, and susceptibility to mismatches.
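For illustration, normalized cross-correlation, named above among the gray-level correlation methods, can be sketched as follows; the exhaustive search and all function names are illustrative, not from any cited source:

```python
import numpy as np

def normalized_cross_correlation(template, window):
    """NCC between a template and an equally sized image window.

    Returns a value in [-1, 1]; 1 means a perfect linear match.
    """
    t = template.astype(np.float64) - template.mean()
    w = window.astype(np.float64) - window.mean()
    denom = np.sqrt((t ** 2).sum() * (w ** 2).sum())
    if denom == 0:
        return 0.0
    return float((t * w).sum() / denom)

def match_template(image, template):
    """Exhaustive search: slide the template over the image and
    return the (row, col) with the highest NCC score."""
    H, W = image.shape
    h, w = template.shape
    best, best_pos = -2.0, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            score = normalized_cross_correlation(template, image[r:r + h, c:c + w])
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best
```

The exhaustive search is what makes this family of methods computationally heavy, as noted above; practical implementations restrict the search window or use pyramid schemes.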
Images captured by different types of sensors represent different characteristic information of the ground objects: for example, visible light and panchromatic images reflect the spatial reflectance of the ground objects, while infrared images reflect their thermal radiation characteristics. Fusing multi-source remote sensing image data therefore exploits the advantages of multi-source images fully and provides more accurate, comprehensive and rich information. High-precision multi-source image registration is a key step and an important premise of data fusion, yet most existing registration algorithms target a specific single type of remote sensing image, generalize poorly, and generally cannot be applied directly to the registration of multi-source images.
Disclosure of Invention
In view of the problems in the background art, the invention provides a high-precision, rapid registration method for airborne remote sensing images that combines geographic coordinates with feature point matching, comprising the following steps:
S1, selecting feature regions according to the image characteristics of the image to be registered, wherein each feature region comprises at least one pair of corresponding feature points;
S2, acquiring the positions of the feature points in the feature regions;
S3, connecting the corresponding feature point pairs in the reference image and the image to be registered;
S4, registering the reference image and the image to be registered according to the positional relationship of the feature point pairs.
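As a toy illustration of steps S1-S4, the sketch below registers two images related by a pure translation; the maximum-response "detector", the region coordinates and all helper names are illustrative assumptions, not part of the patent (the patent itself uses Harris corners and a general mapping f):

```python
import numpy as np

def detect_point(region):
    """Stand-in detector (illustrative): location of the region's maximum."""
    return np.unravel_index(np.argmax(region), region.shape)  # (row, col)

def register_by_translation(reference, moving, ref_regions, mov_regions):
    """S1-S4 for the pure-translation case.

    *_regions: list of (row, col, size) square crops, each containing
    one corresponding feature (this is the S1 region selection).
    """
    shifts = []
    for (rr, rc, s), (mr, mc, t) in zip(ref_regions, mov_regions):
        pr = detect_point(reference[rr:rr + s, rc:rc + s])   # S2, reference
        pm = detect_point(moving[mr:mr + t, mc:mc + t])      # S2, moving image
        # S3: pair the points, expressed in full-image coordinates
        ref_pt = (rr + pr[0], rc + pr[1])
        mov_pt = (mr + pm[0], mc + pm[1])
        shifts.append((ref_pt[0] - mov_pt[0], ref_pt[1] - mov_pt[1]))
    # S4: solve the (translation-only) model from the paired positions
    dy, dx = np.mean(shifts, axis=0).round().astype(int)
    return np.roll(moving, shift=(dy, dx), axis=(0, 1)), (dy, dx)
```

Real airborne imagery needs a richer transform model than a translation, but the division of labor between the four steps is the same.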
Preferably, step S4 further includes: and establishing a mapping relation between the image to be registered and the reference image according to the positions of the matched feature points on the image to be registered and the reference image.
Preferably, the method further comprises: and S5, registering the image to be registered by using the mapping relation between the image to be registered and the reference image.
Preferably, step S5 further includes: and calculating the positions corresponding to the feature points of the image to be registered, and performing registration on the image to be registered by utilizing the mapping relation between the image to be registered and the reference image.
Preferably, step S5 further includes: and converting the coordinates of the feature points in the image to be registered into a reference image coordinate system according to the mapping relation to obtain the positions of the feature points in the reference image.
Preferably, the method further comprises: s6, calculating the matching error and the total registration error of each pair of feature points, if the total error is less than a given threshold, finishing the image registration process, otherwise, returning to the step S3 to continue the feature point matching.
Preferably, in step S1, the feature region includes 4-6 sets of feature point pairs.
Preferably, in step S2, feature point position information in the feature region of the reference image and the image to be registered is calculated by the Harris corner detector.
Preferably, step S3 further includes: and respectively searching and marking the characteristic point pairs in the two images according to the calculation result in the S2, and completing the connection of the characteristic point pairs.
Preferably, in step S1, a region including an edge point, a corner point, or a line intersection of the object is taken as the feature region.
The invention has the beneficial effects that: the method improves the operation efficiency and the accuracy of characteristic point selection by pertinently adopting the characteristic region to extract the characteristic points, and simplifies the calculation process by utilizing the characteristics of high processing speed and powerful functions of ArcGIS software, thereby quickly obtaining the high-accuracy remote sensing image registration image. Moreover, the method is suitable for registering various remote sensing image combinations, and multi-source images such as visible light, infrared and synthetic aperture radars and the like can be registered by the method. The method is simple to operate, reliable, good in applicability, strong in practicability, high in precision of the obtained experimental result and wide in application prospect.
Drawings
In order that the invention may be more readily understood, it will be described in more detail with reference to specific embodiments thereof that are illustrated in the accompanying drawings. These drawings depict only typical embodiments of the invention and are not therefore to be considered to limit the scope of the invention.
FIG. 1 is a flow chart of one embodiment of the method of the present invention.
FIG. 2 is a flow chart of one embodiment of the method of the present invention.
FIG. 3 is a schematic diagram of selected feature areas, wherein (a) is a reference image and (b) is a to-be-registered image.
Fig. 4 is a post-registration effect map.
Fig. 5 shows the matching error and the total registration error for each set of feature points.
Fig. 6 is a graph showing the registration effect of the visible reference image and the infrared image.
Detailed Description
Embodiments of the present invention will be described below with reference to the drawings, and technical features in the following examples and embodiments may be combined with each other without conflict.
Figs. 1-2 are flow charts of the registration method of the invention. Referring to fig. 1, in step S1 a feature region is selected according to the image characteristics of the image to be registered, wherein the feature region comprises at least one pair of corresponding feature points.
Specifically, the reference image and the image to be registered are inspected, their image distribution characteristics are analyzed, corresponding feature regions are selected, and the coordinates and size of each selected feature region are recorded. In one embodiment of the invention, an orthophoto is used as the reference image, and a visible light image and an infrared image are respectively used as the images to be registered. The orthophoto is an airborne image corrected to UTM (Universal Transverse Mercator) coordinates; it is distortion-free and highly accurate, contains the area of the image to be registered, and is therefore suitable as a reference image.
Feature regions are selected so that feature point information can be computed more efficiently; they should therefore be chosen around the more salient feature points of the reference image and the image to be registered, and each group of feature regions should contain at least one pair of corresponding feature points to choose from. The principle for selecting feature point pairs is to choose object edge points, corner points, line intersections, and the like. A feature region should be as small as possible, to exclude image content that would disturb the feature point computation, yet large enough to contain the part where the feature point may lie, ensuring the accuracy and validity of the computed feature point position.
In one embodiment, the two images (the reference image and the image to be registered) are opened, their shapes and distribution characteristics are observed, and regions containing relatively salient road or building intersections or corner points are selected as feature regions, from which the feature points are then extracted. Fig. 3 is a schematic diagram of the selected feature regions, including the feature point distribution, where fig. 3(a) is the reference image and fig. 3(b) is the image to be registered; feature regions with the same number in the two images correspond one to one, and feature region 1 is selected. The coordinates of the upper-left corner of the selected feature region are recorded as (x, y) in the reference image and (x', y') in the image to be registered.
Referring again to fig. 1, in step S2, the positions of the feature points in the feature region are acquired.
Finding the exact coordinates of the feature points within a feature region requires a feature detector. Given the nature of the feature points, Harris corners are chosen for feature point extraction. Thanks to the feature-region selection in step S1, the accuracy with which the detector picks the corresponding feature points in the image is effectively enhanced. Feature point extraction proceeds by Harris corner detection, implemented in the following steps.
S21, calculate the gradients Ix and Iy of the image I(x, y) in the x and y directions, e.g. by convolution with a difference kernel:

Ix = I ⊗ [-1, 0, 1],  Iy = I ⊗ [-1, 0, 1]T

S22, calculate the products of the two directional gradients:

Ixx = Ix·Ix,  Ixy = Ix·Iy,  Iyy = Iy·Iy

S23, weight the three gradient products with a Gaussian window w (taking σ = 1) to generate the elements A, B and C of the matrix M:

A = w ⊗ Ixx,  B = w ⊗ Iyy,  C = w ⊗ Ixy,  M = [A, C; C, B]

S24, calculate the Harris response value r for each pixel, with α taken as 0.05, and set values of r smaller than a threshold t to zero:

r = det M − α(trace M)² = (AB − C²) − α(A + B)²,  r = 0 where r < t
S25, performing non-maximum suppression in a neighborhood of a certain size (e.g., 3 × 3 or 5 × 5), where the local maximum point is a corner point in the image.
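The procedure of steps S21-S25 can be sketched in plain NumPy as follows; apart from σ = 1 and α = 0.05, which the text fixes, the kernel size, threshold value and neighbourhood size here are illustrative assumptions:

```python
import numpy as np

def harris_corners(image, sigma=1.0, alpha=0.05, t=1e-4, nms=3):
    """Harris corner detection following steps S21-S25 (NumPy sketch)."""
    I = image.astype(np.float64)
    # S21: gradients in the row (y) and column (x) directions
    Iy, Ix = np.gradient(I)
    # S22: products of the directional gradients
    Ixx, Ixy, Iyy = Ix * Ix, Ix * Iy, Iy * Iy

    def gauss_smooth(a, sigma):
        # Separable Gaussian weighting (the window w of step S23)
        size = int(3 * sigma) * 2 + 1
        x = np.arange(size) - size // 2
        k = np.exp(-x ** 2 / (2 * sigma ** 2))
        k /= k.sum()
        a = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, a)
        return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, a)

    # S23: A, B, C are the Gaussian-weighted gradient products
    A, B, C = (gauss_smooth(a, sigma) for a in (Ixx, Iyy, Ixy))
    # S24: response r = det(M) - alpha * trace(M)^2, thresholded at t
    r = (A * B - C * C) - alpha * (A + B) ** 2
    r[r < t] = 0.0
    # S25: non-maximum suppression in an nms x nms neighbourhood
    corners, h = [], nms // 2
    for i in range(h, r.shape[0] - h):
        for j in range(h, r.shape[1] - h):
            patch = r[i - h:i + h + 1, j - h:j + h + 1]
            if r[i, j] > 0 and r[i, j] == patch.max():
                corners.append((i, j))
    return corners, r
```

Running the detector on each cropped feature region, rather than on the whole image, is exactly what step S1 buys: fewer spurious responses and less computation.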
The positions of the feature points within the feature regions of the reference image and the image to be registered, computed by the Harris corner detector, are (u, v) and (u', v') respectively, giving the correspondence of the feature point pair. With the upper-left corners of the selected feature regions at (x, y) in the reference image and (x', y') in the image to be registered, the positions of the feature points in the corresponding original images follow by simple addition:

(x0, y0) = (x + u, y + v)
(x1, y1) = (x' + u', y' + v')
referring again to fig. 1, in step S3, pairs of characteristic points are connected.
According to the calculation results of S2, the feature point pairs are located and marked in the two images respectively, completing the connection of one group of feature point pairs.
Then n groups of corresponding feature points are selected for matching (matching means determining the correspondence of the feature point pairs: a matched pair occupies the same position in the registered images). n must not be too small, or accuracy cannot be guaranteed, nor too large, or computational efficiency suffers. In general, n = 4-6 both ensures the accuracy of the result and reduces the amount of calculation, improving registration efficiency.
The other three groups of feature point pairs are connected in the same way, finally giving the correspondences of four groups of feature point pairs, as shown in fig. 3.
Referring to fig. 1 again, in S4, a mapping relationship between the image to be registered and the reference image is established according to the coordinates of the matching feature points.
Establishing a mapping relation between a reference image and an image to be registered by the following formula:
I0(x0,y0)=f(I1(x1,y1))
where I1 denotes the image to be registered and I0 the reference image. By solving from the corresponding relation of the feature point pairs, the mapping relation between the reference image and the image to be registered can be established.
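One concrete choice of f is an affine transform estimated from the n matched pairs by least squares; the patent does not fix the transform family, so the affine model and function names below are assumptions for illustration:

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares affine map with dst ≈ f(src); needs >= 3 pairs.

    Returns a 2x3 matrix T such that dst ≈ T @ [x, y, 1]^T.
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])  # n x 3 design matrix
    # Solve A @ T.T ≈ dst in the least-squares sense
    T, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return T.T

def apply_affine(T, pts):
    """Map points through the estimated transform."""
    pts = np.asarray(pts, dtype=float)
    return pts @ T[:, :2].T + T[:, 2]
```

With n = 4-6 pairs the system is overdetermined, so the least-squares fit also averages out small localization errors in the detected corners.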
Referring to fig. 1 again, in step S5, the mapping relationship between the image to be registered and the reference image is used to perform registration of the image to be registered, and the geographic coordinates corresponding to the feature points of the image to be registered are calculated.
Using the mapping relation between the image to be registered and the reference image, the image to be registered is transformed into the coordinate space of the reference image, completing the registration. At the same time, the coordinates of the feature points in the image to be registered are converted into the coordinate system of the reference image according to the mapping relation, giving the geographic coordinate values of the feature points in the reference image.
Referring to fig. 1 again, in step S6, the matching error and the total registration error of each pair of feature points are calculated, if the total error is smaller than a given threshold, the image registration process is completed, otherwise, the process returns to step S3 to continue the feature point matching.
The feature point matching error is the deviation between the feature point geographic coordinate computed through the mapping relation and the real (or assumed) feature point geographic coordinate. The coordinates of the i-th pair of corresponding feature points in the reference image and the image to be registered, obtained in step S2, are (xi0, yi0) and (xi1, yi1), and the geographic coordinate of the mapped feature point obtained in step S5 is (xi2, yi2). The matching error of this pair of feature points is then

δi = √((xi2 − xi0)² + (yi2 − yi0)²)

The same calculation gives the matching error of every feature point pair. The total error is

Δ = (1/n) Σi δi
if the total error is smaller than the given threshold E, i.e. Δ < E, the image registration process is completed, otherwise, the process returns to step S3 to continue the feature point matching.
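A plausible reading of step S6 is a Euclidean per-pair error with a mean total error; under that assumption the acceptance test can be sketched as:

```python
import numpy as np

def matching_errors(mapped_pts, ref_pts):
    """Per-pair Euclidean matching error delta_i (assumed form)."""
    mapped = np.asarray(mapped_pts, dtype=float)
    ref = np.asarray(ref_pts, dtype=float)
    return np.linalg.norm(mapped - ref, axis=1)

def registration_ok(mapped_pts, ref_pts, threshold):
    """Step S6: accept if the total error is below the given threshold E.

    Returns (accepted, per-pair errors, total error); the total error is
    taken here as the mean per-pair error, which is an assumption.
    """
    errs = matching_errors(mapped_pts, ref_pts)
    total = errs.mean()
    return bool(total < threshold), errs, total
```

If the test fails, the method loops back to step S3 to re-match the feature points, so this check acts as the termination condition of the whole pipeline.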
The method is implemented in ArcGIS software. For the reference image and image to be registered of fig. 3, the registration result is shown in fig. 4: the image edges join the roads, buildings and other ground objects of the base map well, meeting the requirements of subsequent image operations. In addition, the Link Table of the ArcGIS Georeferencing module supports high-precision automatic correction of the feature point matching; fig. 5 lists the matching error of each group of feature points and the total registration error. Every per-group matching error is below 2 and the total registration error is below 2, showing good registration accuracy for the visible image. The method can also register a visible light image with an infrared image; the registration effect, shown in fig. 6, is very good.
The above-described embodiments are merely preferred embodiments of the present invention, and general changes and substitutions by those skilled in the art within the technical scope of the present invention are included in the protection scope of the present invention.

Claims (10)

1. A high-precision rapid registration method for airborne remote sensing images is characterized by comprising the following steps:
S1, selecting feature regions according to the image characteristics of the image to be registered, wherein each feature region comprises at least one pair of corresponding feature points;
S2, acquiring the positions of the feature points in the feature regions;
S3, connecting the corresponding feature point pairs in the reference image and the image to be registered;
S4, registering the reference image and the image to be registered according to the positional relationship of the feature point pairs.
2. The method for high-precision and rapid registration of airborne remote sensing images according to claim 1, wherein the step S4 further comprises:
and establishing a mapping relation between the image to be registered and the reference image according to the positions of the matched feature points on the image to be registered and the reference image.
3. The method for high-precision and rapid registration of airborne remote sensing images according to claim 2, further comprising:
and S5, registering the image to be registered by using the mapping relation between the image to be registered and the reference image.
4. The method for high-precision and rapid registration of airborne remote sensing images according to claim 3, wherein the step S5 further comprises:
and calculating the positions corresponding to the feature points of the image to be registered, and performing registration on the image to be registered by utilizing the mapping relation between the image to be registered and the reference image.
5. The method for high-precision and rapid registration of airborne remote sensing images according to claim 4, wherein the step S5 further comprises:
and converting the coordinates of the feature points in the image to be registered into a reference image coordinate system according to the mapping relation to obtain the positions of the feature points in the reference image.
6. The method for high-precision and rapid registration of airborne remote sensing images according to claim 5, further comprising:
s6, calculating the matching error and the total registration error of each pair of feature points, if the total error is less than a given threshold, finishing the image registration process, otherwise, returning to the step S3 to continue the feature point matching.
7. The method for high-precision and rapid registration of airborne remote sensing images according to claim 1, wherein in step S1, the characteristic region comprises 4-6 groups of characteristic point pairs.
8. The method for high-precision and rapid registration of airborne remote sensing images according to claim 2, wherein in step S2, feature point position information in the feature areas of the reference image and the image to be registered is calculated by a Harris corner detector.
9. The method for high-precision and rapid registration of airborne remote sensing images according to claim 8, wherein the step S3 further comprises: and respectively searching and marking the characteristic point pairs in the two images according to the calculation result in the S2, and completing the connection of the characteristic point pairs.
10. The method for high-precision and rapid registration of airborne remote sensing images according to claim 1, wherein in step S1, regions including object edge points, corner points or line intersections are used as feature regions.
CN201911279840.5A 2019-12-13 2019-12-13 High-precision rapid registration method for airborne remote sensing image Pending CN110992413A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911279840.5A CN110992413A (en) 2019-12-13 2019-12-13 High-precision rapid registration method for airborne remote sensing image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911279840.5A CN110992413A (en) 2019-12-13 2019-12-13 High-precision rapid registration method for airborne remote sensing image

Publications (1)

Publication Number Publication Date
CN110992413A true CN110992413A (en) 2020-04-10

Family

ID=70093279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911279840.5A Pending CN110992413A (en) 2019-12-13 2019-12-13 High-precision rapid registration method for airborne remote sensing image

Country Status (1)

Country Link
CN (1) CN110992413A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112150522A (en) * 2020-09-22 2020-12-29 上海商汤智能科技有限公司 Remote sensing image registration method, device, equipment, storage medium and system
CN115830087A (en) * 2022-12-09 2023-03-21 陕西航天技术应用研究院有限公司 Batch rapid registration technology for translational motion continuous frame image sets

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104318548A (en) * 2014-10-10 2015-01-28 西安电子科技大学 Rapid image registration implementation method based on space sparsity and SIFT feature extraction
CN105354841A (en) * 2015-10-21 2016-02-24 武汉工程大学 Fast matching method and system for remote sensing images
CN107610164A (en) * 2017-09-11 2018-01-19 北京空间飞行器总体设计部 A kind of No. four Image registration methods of high score based on multiple features mixing
CN108335320A (en) * 2017-01-20 2018-07-27 中电科海洋信息技术研究院有限公司 The spatial registration method and spatial registration system of multi-source Remote Sensing Images

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104318548A (en) * 2014-10-10 2015-01-28 西安电子科技大学 Rapid image registration implementation method based on space sparsity and SIFT feature extraction
CN105354841A (en) * 2015-10-21 2016-02-24 武汉工程大学 Fast matching method and system for remote sensing images
CN108335320A (en) * 2017-01-20 2018-07-27 中电科海洋信息技术研究院有限公司 The spatial registration method and spatial registration system of multi-source Remote Sensing Images
CN107610164A (en) * 2017-09-11 2018-01-19 北京空间飞行器总体设计部 A kind of No. four Image registration methods of high score based on multiple features mixing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG ZHONGYUAN ET AL.: "输电线路测量与新技术应用" (Transmission Line Surveying and Application of New Technologies), Zhengzhou: Yellow River Water Conservancy Press, page 85 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112150522A (en) * 2020-09-22 2020-12-29 上海商汤智能科技有限公司 Remote sensing image registration method, device, equipment, storage medium and system
CN115830087A (en) * 2022-12-09 2023-03-21 陕西航天技术应用研究院有限公司 Batch rapid registration technology for translational motion continuous frame image sets
CN115830087B (en) * 2022-12-09 2024-02-20 陕西航天技术应用研究院有限公司 Batch rapid registration method for translational motion continuous frame image set


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200410