CN115760999B - Monocular camera calibration and target geographic position extraction method based on GIS assistance - Google Patents
Monocular camera calibration and target geographic position extraction method based on GIS assistance
- Publication number
- CN115760999B CN115760999B CN202211356471.7A CN202211356471A CN115760999B CN 115760999 B CN115760999 B CN 115760999B CN 202211356471 A CN202211356471 A CN 202211356471A CN 115760999 B CN115760999 B CN 115760999B
- Authority
- CN
- China
- Prior art keywords
- camera
- point
- target
- calibration
- control point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000605 extraction Methods 0.000 title claims abstract description 13
- 238000004364 calculation method Methods 0.000 claims abstract description 13
- 238000000034 method Methods 0.000 claims abstract description 13
- 238000009434 installation Methods 0.000 claims description 10
- 238000004458 analytical method Methods 0.000 claims description 9
- 238000012876 topography Methods 0.000 claims description 3
- 238000012544 monitoring process Methods 0.000 description 6
- 238000010276 construction Methods 0.000 description 4
- 230000003287 optical effect Effects 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- IJGRMHOSHXDMSA-UHFFFAOYSA-N Atomic nitrogen Chemical compound N#N IJGRMHOSHXDMSA-UHFFFAOYSA-N 0.000 description 2
- 230000006978 adaptation Effects 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000003754 machining Methods 0.000 description 1
- 229910052757 nitrogen Inorganic materials 0.000 description 1
- 230000002265 prevention Effects 0.000 description 1
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 1
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a GIS-assisted monocular camera calibration and target geographic position extraction method, which comprises the following steps: acquiring camera parameters; selecting control points; reading the PTZ parameters of the camera pan-tilt; calculating the azimuth angle, pitch angle and straight-line distance of the current control point relative to the camera; generating a calibration control point connection table; calculating the optimal calibration parameters of the camera point position and calibrating the pan-tilt of the monocular camera; selecting any target point in the calibrated camera picture, calculating the azimuth angle and pitch angle of the target point when centered, and then calculating the accurate azimuth angle and pitch angle of the target point relative to the camera; and determining the geographic position of the target point based on the GIS digital elevation model. The method is applicable to a variety of scenes and has more general adaptability; the target positions are more accurate and the results have more practical value; the computational load is small and target positions are calculated quickly.
Description
Technical Field
The invention relates to the technical field of video monitoring and geographic information systems, in particular to a monocular camera calibration and target geographic position extraction method based on GIS assistance.
Background
Currently, video monitoring is widely applied in fields such as security, forest fire prevention and fishery supervision. Most scene-mounted hardware is a monocular optical camera with a pan-tilt function, and its use is limited to watching video or to further target recognition on the video frames. However, there is an urgent business demand for extracting the real geographic position of targets in the monitoring video frames, and the related technology still requires further study.
In the prior art, extracting the geographic position of a target object from an optical camera mostly relies on a binocular optical camera or on lidar assistance. However, geographic position extraction based on binocular optical cameras is generally applied to smaller scenes, suffers from complex computation and high cost, and usually requires the camera pose to be fixed, making it difficult to deploy in dynamically changing large scenes (straight-line distance from camera to target greater than 500 meters). Lidar-assisted geographic position extraction offers high precision and few influencing factors, but lidar is expensive and unsuitable for large-scale deployment. In addition, both of the above technologies place high demands on the installation of the monitoring camera, requiring high precision in the stability, levelness and true-north azimuth of the fixed position, which generally cannot be met in field construction.
Disclosure of Invention
In view of the defects of the prior art, the invention aims to provide a GIS-assisted monocular camera calibration and target geographic position extraction method. Based on a GIS digital elevation model and three-dimensional spatial analysis, a number of control points are first selected and the pan-tilt of the monocular camera is calibrated by computing three calibration parameters: azimuth angle, pitch angle and height. The calibrated parameters are then combined with the PTZ parameters of the camera pan-tilt to back-calculate the geographic position of any target in the camera picture. This solves the problem of extracting geographic position information for target objects in ordinary monocular monitoring video, is economical and fast, and meets certain precision requirements.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a monocular camera calibration and target geographic position extraction method based on GIS assistance is characterized by comprising the following steps:
step 1, acquiring camera parameters;
step 2, selecting a control point and centering the control point to the center of a camera picture;
step 3, reading the PTZ parameters of the camera pan-tilt, and obtaining the azimuth angle α′ and pitch angle β′ of the pan-tilt sensor when the camera observes the current control point;
Step 4, calculating the azimuth angle α of the current control point relative to the camera based on the GIS azimuth rule, calculating the pitch angle β at which the camera observes the current control point based on a trigonometric function, and calculating the straight-line distance D from the camera to the control point based on the GIS spatial distance rule;
step 5, repeating the steps 1-4 to generate a calibration control point connection table;
step 6, calculating the calibration parameters in three dimensions (azimuth angle, pitch angle and height) for each control point, using the least squares method to solve for the three optimal calibration parameters, substituting the optimal calibration parameters back to compute the residual of each control point, removing control points with large residuals from the connection table, and repeating steps 1-6 to complete the calibration of the monocular camera pan-tilt;
step 7, selecting any target point in the calibrated camera picture, and calculating the azimuth angle α″ and pitch angle β″ of the target point when centered;
Step 8, correcting the centered azimuth angle α″ and pitch angle β″ of the target point using the azimuth calibration parameter and the pitch angle calibration parameter to obtain the accurate azimuth angle α₀ and accurate pitch angle β₀ of the target point relative to the camera;
Step 9, determining the geographic position of the target point using the height calibration parameter and three-dimensional spatial analysis based on the GIS digital elevation model.
Further, the camera parameters in step 1 include the installation longitude and latitude [X_C, Y_C] and the absolute height H of the installation location.
Further, the requirements of the control points selected in the step 2 are as follows:
the control point marks are clear, and the camera can clearly observe the target object;
the control point is close to the ground surface, so that no perspective influence is caused;
the control point has relatively stable topography and low elevation change;
the control points are uniformly distributed in four directions of the camera, namely the east, the west, the south and the north;
the distance between the control point and the camera is evenly distributed.
Further, the azimuth calibration parameter, the pitch angle calibration parameter and the height calibration parameter in step 6 are calculated as follows:
azimuth calibration parameter Δα: Δα = α′ − α;
Pitch angle calibration parameter Δβ: Δβ = β′ − β;
Height calibration parameter ΔH:
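As an illustration, the per-control-point offsets can be computed as in the sketch below (Python). The Δα and Δβ lines follow the definitions above; the ΔH expression shown is only a plausible stand-in based on the recorded height, the control-point elevation and the pitch reading, and is an assumption rather than the patent's formula.

```python
import math

def angular_offsets(alpha_ptz, beta_ptz, alpha_gis, beta_gis):
    """Per-control-point offsets as defined above: PTZ sensor reading
    minus the GIS-derived angle (all values in degrees)."""
    d_alpha = alpha_ptz - alpha_gis   # Δα = α′ − α
    d_beta = beta_ptz - beta_gis      # Δβ = β′ − β
    return d_alpha, d_beta

def height_offset(H, cp_elevation, beta_ptz, horiz_dist):
    """Hypothetical ΔH (the patent's expression is not shown in this text):
    recorded absolute height H minus the camera height implied by the
    pitch reading over the horizontal distance to the control point."""
    return H - (cp_elevation + horiz_dist * math.tan(math.radians(beta_ptz)))
```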
Further, in step 7, the azimuth angle α″ and pitch angle β″ of the target point when centered are calculated as follows:
wherein [x_p, y_p] are the coordinates of the target point in the camera picture, w and h are the width and height of the current picture, a is the horizontal angle of view of the current camera, b is the vertical angle of view of the current camera, and α′ and β′ are the azimuth angle and pitch angle of the pan-tilt sensor, respectively.
Further, in step 8, the accurate azimuth angle α₀ and accurate pitch angle β₀ of the target point relative to the camera are obtained as follows:
α₀ = α″ − Δα,
β₀ = β″ − Δβ,
wherein α″ and β″ are respectively the azimuth angle and pitch angle of the target point when centered, Δα is the azimuth calibration parameter, and Δβ is the pitch angle calibration parameter.
The geographic position of the target point in step 9 is determined by the following steps:
step 9.1, starting from the camera position, acquiring the elevation of any point M from the digital elevation model along the sight line direction;
step 9.2, calculating the projection height of the M point in the vertical direction;
step 9.3, calculating the sum of the elevation of the M point and the projection height of the M point;
step 9.4, calculating the real height of the camera based on the height calibration parameters;
step 9.5, comparing the sum with the real height of the camera; if they are equal, the coordinates of point M give the geographic position of the target point; otherwise, return to step 9.1 for loop iteration.
The invention has the remarkable effects that:
1. The method has no strict requirements on the construction and installation of the monitoring camera: only the rough levelness and stable fixing of the camera need to be guaranteed and the approximate installation height recorded. No strict installation environment is required, so the method can be used in a variety of scenes and has more general adaptability.
2. Based on a GIS digital elevation model and three-dimensional spatial analysis, uniformly distributed control points are selected to calibrate the azimuth angle, pitch angle and installation height of the camera and to optimize the calibration parameters, and the target geographic position is then calculated with the calibrated parameters, so the target positions are more accurate and the results have practical value.
3. Compared with binocular camera positioning methods, the method based on a GIS digital elevation model and three-dimensional spatial analysis has a small computational load and calculates target positions quickly; compared with lidar schemes, it needs no additional hardware, is economical and practical, and is easy to popularize across industries.
Drawings
FIG. 1 is a camera calibration flow chart in the present invention;
FIG. 2 is a flow chart of target geographic location identification in the present invention;
FIG. 3 is a three-dimensional schematic of the camera and target.
Detailed Description
The following describes the embodiments and working principles of the present invention in further detail with reference to the drawings.
As shown in FIG. 1 and FIG. 2, in this embodiment control points are selected based on a GIS digital elevation model and three-dimensional spatial analysis, and the pan-tilt of the monocular camera is calibrated by computing three calibration parameters: azimuth angle, pitch angle and height; this execution flow is shown in FIG. 1 and corresponds to steps 1-6. The calibrated parameters are then combined with the PTZ parameters of the camera pan-tilt so that the geographic position of any target in the camera picture can be back-calculated; this execution flow is shown in FIG. 2 and corresponds to steps 7-9. The implementation principles and processes are described in detail below with reference to the drawings:
a monocular camera calibration and target geographic position extraction method based on GIS assistance comprises the following specific steps:
step 1, obtaining camera parameters, including the installation longitude and latitude [X_C, Y_C] and the absolute height H of the installation position (the sum of the elevation of the camera point and the camera installation height, where the elevation can be obtained from the GIS digital elevation model and the installation height from construction records);
step 2, selecting a certain number of control points (at least 4): rotating the camera picture through the pan-tilt control function, finding a suitable control point, and centering it in the picture;
the required control point satisfies the following requirements:
(1) the control point marks are clear, and the camera can clearly observe the target object;
(2) the control point is close to the ground surface, so that no perspective influence is caused;
(3) the control point has relatively stable topography and low elevation change;
(4) the control points are uniformly distributed in four directions of the camera, namely the east, the west, the south and the north;
(5) the distance between the control point and the camera is evenly distributed.
Step 3, reading the PTZ parameters of the camera pan-tilt, and obtaining the azimuth angle α′ and pitch angle β′ of the pan-tilt sensor when the camera observes the current control point;
Step 4, calculating the azimuth angle α of the current control point relative to the camera based on the GIS azimuth rule, calculating the pitch angle β at which the camera observes the current control point based on a trigonometric function, and calculating the straight-line distance D from the camera to the control point based on the GIS spatial distance rule;
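As a concrete illustration of step 4, the sketch below approximates the GIS azimuth rule, the trigonometric pitch calculation and the spatial distance rule on a local flat-earth plane (Python). A real implementation would use the GIS library's own geodesic routines; the flat-earth formulas and the sign convention for the pitch angle (positive when the camera looks down) are assumptions.

```python
import math

R_EARTH = 6371000.0  # mean Earth radius in metres

def control_point_geometry(cam_lon, cam_lat, cam_h, cp_lon, cp_lat, cp_elev):
    """Azimuth α (degrees clockwise from north), pitch β (degrees,
    positive when the camera looks down) and straight-line distance D
    (metres) of a control point relative to the camera."""
    north = math.radians(cp_lat - cam_lat) * R_EARTH
    east = math.radians(cp_lon - cam_lon) * math.cos(math.radians(cam_lat)) * R_EARTH
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    horiz = math.hypot(east, north)
    drop = cam_h - cp_elev                      # camera height above the control point
    pitch = math.degrees(math.atan2(drop, horiz))
    distance = math.hypot(horiz, drop)          # straight-line distance D
    return azimuth, pitch, distance
```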
step 5, repeating the steps 1-4 to generate a calibration control point connection table;
step 6, calculating the calibration parameters in three dimensions (azimuth angle, pitch angle and height) for each control point, using the least squares method to solve for the three optimal calibration parameters, substituting the optimal calibration parameters back to compute the residual of each control point, removing control points with large residuals from the connection table, and repeating steps 1-6 to complete the calibration of the monocular camera pan-tilt;
The azimuth calibration parameter, the pitch angle calibration parameter and the height calibration parameter are calculated as follows:
Azimuth calibration parameter Δα: Δα = α′ − α;
Pitch angle calibration parameter Δβ: Δβ = β′ − β;
Height calibration parameter ΔH:
In step 6, the azimuth, pitch angle and height calibration parameters corresponding to each control point are first computed; the least squares method is then used to solve for the three optimal calibration parameters, which are substituted back to compute the residual of each control point. If every residual satisfies the maximum allowable error, the calibration of the monocular camera pan-tilt is complete; otherwise the control points with large residuals are removed from the connection table and the process returns to step 1 to iteratively update the calibration parameters.
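Since each calibration parameter is a single constant, the least-squares estimate reduces to the mean of the per-control-point values, and the iteration described above can be sketched as follows (Python; the table layout, key names and stopping rule are illustrative assumptions).

```python
def fit_calibration_parameter(table, key, max_error):
    """table: list of dicts, one per control point in the connection table,
    holding the per-point offset under `key` (e.g. 'd_alpha').
    Iteratively drop the worst outlier and re-fit until every remaining
    residual is within max_error, as described for step 6."""
    rows = list(table)
    while True:
        best = sum(r[key] for r in rows) / len(rows)   # least squares ≡ mean for a constant
        residuals = [abs(r[key] - best) for r in rows]
        worst = max(residuals)
        if worst <= max_error or len(rows) <= 2:
            return best, rows                          # optimal parameter and surviving control points
        rows.pop(residuals.index(worst))               # remove the control point with the largest residual
```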
Step 7, selecting any target point in the calibrated camera picture, and reading the coordinates [x_p, y_p] of the target point in the camera picture, the width w and height h of the current picture, the horizontal angle of view a and vertical angle of view b of the current camera, and the azimuth angle α′ and pitch angle β′ of the pan-tilt sensor; then calculating the azimuth angle α″ and pitch angle β″ of the target point when centered;
Specifically:
Calculating the azimuth angle and pitch angle of the target when centered:
Using the angle of view and the proportional position of the target in the picture, the target is converted to the center of the picture, giving the azimuth angle α″ and pitch angle β″ of the target object when centered. The calculation method is as follows:
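As an illustration, the sketch below implements a standard linear pinhole version of this conversion (Python). The exact formula and sign conventions used by the patent may differ, so treat the direction of the offsets (which image direction increases azimuth or pitch) as assumptions.

```python
def centered_angles(x_p, y_p, w, h, fov_h, fov_v, alpha_ptz, beta_ptz):
    """Azimuth α″ and pitch β″ the pan-tilt would report if pixel
    [x_p, y_p] were moved to the picture centre, using the horizontal
    angle of view fov_h and vertical angle of view fov_v (degrees)."""
    alpha_cc = alpha_ptz + (x_p - w / 2.0) / w * fov_h   # horizontal pixel offset → azimuth offset
    beta_cc = beta_ptz + (y_p - h / 2.0) / h * fov_v     # vertical pixel offset → pitch offset
    return alpha_cc, beta_cc
```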
step 8, centering the azimuth angle alpha of the target point by using the azimuth angle calibration parameter and the pitch angle calibration parameter __ And pitch angle beta __ Performing calibration to obtain accurate azimuth angle alpha of target point relative to camera 0 Accurate pitch angle beta 0 ;
Using the azimuth calibration parameter Δα and the azimuth angle α″ when the target is centered, the accurate azimuth angle α₀ of the target relative to the camera is calculated from α₀ = α″ − Δα, which determines the direction of the line of sight.
A distance range is then set, and a series of candidate coordinate positions of the target point are calculated with the GIS unknown-point coordinate rule f(p, α, d) from the known point coordinates, azimuth angle and distance.
Using the pitch angle calibration parameter Δβ and the pitch angle β″ when the target is centered, the accurate pitch angle β₀ of the target relative to the camera is calculated from β₀ = β″ − Δβ, which determines the elevation of the line of sight.
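To make the use of f(p, α, d) concrete, the sketch below applies the azimuth correction and then samples candidate positions along the corrected bearing (Python; the flat-earth destination formula is a stand-in for the GIS rule, and the 5 m step size is an arbitrary choice). The pitch is corrected the same way (β₀ = β″ − Δβ) and used in step 9.

```python
import math

R_EARTH = 6371000.0  # mean Earth radius in metres

def destination(lon, lat, azimuth, dist):
    """Flat-earth stand-in for the GIS rule f(p, α, d): the point at
    bearing `azimuth` (degrees) and distance `dist` (metres) from (lon, lat)."""
    north = dist * math.cos(math.radians(azimuth))
    east = dist * math.sin(math.radians(azimuth))
    lat2 = lat + math.degrees(north / R_EARTH)
    lon2 = lon + math.degrees(east / (R_EARTH * math.cos(math.radians(lat))))
    return lon2, lat2

def candidate_positions(cam_lon, cam_lat, alpha_cc, d_alpha, d_max, step=5.0):
    """Correct the centred azimuth (α₀ = α″ − Δα), then yield a candidate
    target position every `step` metres along that bearing."""
    alpha0 = alpha_cc - d_alpha
    d = step
    while d <= d_max:
        yield (d,) + destination(cam_lon, cam_lat, alpha0, d)
        d += step
```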
Step 9, determining the geographic position of the target point using the height calibration parameter and three-dimensional spatial analysis based on the GIS digital elevation model.
For any point M along the line-of-sight direction, the terrain elevation and the sight-line height are calculated and their sum is compared with the real camera height OC to determine the position of the target point. The specific process is as follows:
step 9.1, starting from the camera position, acquiring an elevation MN of any point M from the digital elevation model along the sight line direction;
step 9.2, calculating the projection height EF of the point in the vertical direction by combining the pitch angle;
The calculation formula is EF = tan(β₀)·d, where d is the projected horizontal distance from point M to the camera position;
step 9.3, calculating the sum Z=EF+MN of the elevation of the M point and the projection height thereof;
step 9.4, calculating the real height OC of the camera based on the height calibration parameter ΔH;
The real camera height OC is the absolute height H of the camera (the sum of the elevation of the camera mounting point and the camera mounting height, where the elevation can be obtained from the GIS digital elevation model and the mounting height from construction records), corrected by the height calibration parameter, so the calculation formula is OC = H − ΔH;
step 9.5, comparing the sum Z with the real camera height OC; if they are equal or approximately equal, i.e. Z = OC, the coordinates of point M give the geographic position of the target point; otherwise, return to step 9.1 for loop iteration.
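Putting steps 9.1-9.5 together, the search along the line of sight can be sketched as a simple ray march over the DEM (Python). Here `dem(lon, lat)` is an assumed elevation lookup, `destination()` is the flat-earth helper from the sketch after step 8, and the step size and tolerance are arbitrary illustrative values.

```python
import math

def locate_target(dem, cam_lon, cam_lat, oc, alpha0, beta0, d_max,
                  step=5.0, tol=1.0):
    """Walk outward along bearing α₀; at each horizontal distance d compare
    Z = MN + EF with the real camera height OC (step 9.5) and stop at the
    first point where they agree within `tol` metres."""
    d = step
    while d <= d_max:
        lon, lat = destination(cam_lon, cam_lat, alpha0, d)  # point M on the sight line
        mn = dem(lon, lat)                                   # terrain elevation MN at M
        ef = d * math.tan(math.radians(beta0))               # projection height EF = d·tan(β₀)
        if abs((mn + ef) - oc) <= tol:                       # Z = MN + EF ≈ OC → M is the target
            return lon, lat, mn
        d += step
    return None                                              # no intersection within d_max
```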
In this embodiment, based on the digital elevation model and three-dimensional spatial analysis in the GIS, a number of control points are selected and the pan-tilt of the monocular camera is calibrated by computing three calibration parameters (azimuth angle, pitch angle and height), yielding accurate PTZ attitude data for the camera. When the camera is aimed at a new target, the corrected PTZ parameters are used to back-calculate the geographic position of any target in the camera picture.
The technical scheme provided by the invention is described in detail. The principles and embodiments of the present invention have been described herein with reference to specific examples, the description of which is intended only to facilitate an understanding of the method of the present invention and its core ideas. It should be noted that it will be apparent to those skilled in the art that various modifications and adaptations of the invention can be made without departing from the principles of the invention and these modifications and adaptations are intended to be within the scope of the invention as defined in the following claims.
Claims (3)
1. A monocular camera calibration and target geographic position extraction method based on GIS assistance is characterized by comprising the following steps:
step 1, acquiring camera parameters;
step 2, selecting a control point and centering the control point to the center of a camera picture;
step 3, reading the PTZ parameters of the camera pan-tilt, and obtaining the azimuth angle α′ and pitch angle β′ of the pan-tilt sensor when the camera observes the current control point;
Step 4, calculating the azimuth angle α of the current control point relative to the camera based on the GIS azimuth rule, calculating the pitch angle β at which the camera observes the current control point based on a trigonometric function, and calculating the straight-line distance D from the camera to the control point based on the GIS spatial distance rule;
step 5, repeating the steps 1-4 to generate a calibration control point connection table;
step 6, calculating the calibration parameters in three dimensions (azimuth angle, pitch angle and height) for each control point,
azimuth calibration parameter Δα: Δα = α′ − α;
Pitch angle calibration parameter Δβ: Δβ = β′ − β;
Height calibration parameter ΔH:
solving for the three optimal calibration parameters by the least squares method, substituting the optimal calibration parameters back to compute the residual of each control point, removing control points with large residuals from the connection table, and repeating steps 1-6 to complete the calibration of the monocular camera pan-tilt;
step 7, selecting any target point in the calibrated camera picture, and calculating the azimuth angle α″ and pitch angle β″ of the target point when centered:
Wherein [x_p, y_p] are the coordinates of the target point in the camera frame, w and h are the width and height of the current frame, a is the horizontal angle of view of the current camera, b is the vertical angle of view of the current camera, and α′ and β′ are respectively the azimuth angle and pitch angle of the pan-tilt sensor;
step 8, correcting the centered azimuth angle α″ and pitch angle β″ of the target point using the azimuth calibration parameter and the pitch angle calibration parameter to obtain the accurate azimuth angle α₀ and accurate pitch angle β₀ of the target point relative to the camera;
Step 9, determining the geographic position of the target point using the height calibration parameter and three-dimensional spatial analysis based on the GIS digital elevation model;
the camera parameters in step 1 include the installation longitude and latitude [X_C, Y_C] and the absolute height H of the installation location;
the geographic position of the target point in step 9 is determined by the following steps:
step 9.1, starting from the camera position, acquiring the elevation of any point M from the digital elevation model along the sight line direction;
step 9.2, calculating the projection height of the M point in the vertical direction;
step 9.3, calculating the sum of the elevation of the M point and the projection height of the M point;
step 9.4, calculating the real height of the camera based on the height calibration parameters;
step 9.5, comparing the sum with the real height of the camera; if they are equal, the coordinates of point M give the geographic position of the target point; otherwise, return to step 9.1 for loop iteration.
2. The GIS-assistance-based monocular camera calibration and target geographic location extraction method of claim 1, wherein: the requirements of the control points selected in the step 2 are as follows:
the control point marks are clear, and the camera can clearly observe the target object;
the control point is close to the ground surface, so that no perspective influence is caused;
the control point has relatively stable topography and low elevation change;
the control points are uniformly distributed in four directions of the camera, namely the east, the west, the south and the north;
the distance between the control point and the camera is evenly distributed.
3. The GIS-assistance-based monocular camera calibration and target geographic location extraction method of claim 1, wherein: the accurate azimuth angle α₀ and accurate pitch angle β₀ of the target point relative to the camera in step 8 are obtained as follows:
α₀ = α″ − Δα,
β₀ = β″ − Δβ,
wherein α″ and β″ are respectively the azimuth angle and pitch angle of the target point when centered, Δα is the azimuth calibration parameter, and Δβ is the pitch angle calibration parameter.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211356471.7A CN115760999B (en) | 2022-11-01 | 2022-11-01 | Monocular camera calibration and target geographic position extraction method based on GIS assistance |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211356471.7A CN115760999B (en) | 2022-11-01 | 2022-11-01 | Monocular camera calibration and target geographic position extraction method based on GIS assistance |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115760999A CN115760999A (en) | 2023-03-07 |
CN115760999B true CN115760999B (en) | 2023-07-18 |
Family
ID=85355075
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211356471.7A Active CN115760999B (en) | 2022-11-01 | 2022-11-01 | Monocular camera calibration and target geographic position extraction method based on GIS assistance |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115760999B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116758119B (en) * | 2023-06-27 | 2024-04-19 | 重庆比特数图科技有限公司 | Multi-target circulation detection tracking method and system based on motion compensation and linkage |
CN117593414A (en) * | 2024-01-17 | 2024-02-23 | 亿海蓝(北京)数据技术股份公司 | Method and system for drawing shooting area of camera, electronic equipment and storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100005378A (en) * | 2008-07-07 | 2010-01-15 | 한국전자통신연구원 | Method and apparatus for controlling camera position using of geographic information system |
EP3157255A1 (en) * | 2016-06-01 | 2017-04-19 | Continental Automotive GmbH | Calibration apparatus and calibration method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101852623B (en) * | 2010-06-02 | 2011-12-21 | 中国资源卫星应用中心 | On-track calibration method for internal element of satellite optical remote sensing camera |
JP5923422B2 (en) * | 2012-09-24 | 2016-05-24 | クラリオン株式会社 | Camera calibration method and apparatus |
CN107247458A (en) * | 2017-05-24 | 2017-10-13 | 中国电子科技集团公司第二十八研究所 | UAV Video image object alignment system, localization method and cloud platform control method |
US11631197B2 (en) * | 2020-10-01 | 2023-04-18 | Ford Global Technologies, Llc | Traffic camera calibration |
CN114838740B (en) * | 2022-05-20 | 2024-04-26 | 北京市遥感信息研究所 | A geometric calibration method for satellite images considering different latitude and longitude regions |
- 2022-11-01: Application CN202211356471.7A filed in China; patent CN115760999B granted and active.
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100005378A (en) * | 2008-07-07 | 2010-01-15 | 한국전자통신연구원 | Method and apparatus for controlling camera position using of geographic information system |
EP3157255A1 (en) * | 2016-06-01 | 2017-04-19 | Continental Automotive GmbH | Calibration apparatus and calibration method |
Also Published As
Publication number | Publication date |
---|---|
CN115760999A (en) | 2023-03-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115760999B (en) | Monocular camera calibration and target geographic position extraction method based on GIS assistance | |
CN112949478B (en) | Target detection method based on tripod head camera | |
CN108648241B (en) | A PTZ camera on-site calibration and focusing method | |
CN112013830B (en) | Accurate positioning method for inspection image detection defects of unmanned aerial vehicle of power transmission line | |
CN107063189A (en) | The alignment system and method for view-based access control model | |
CN104200086A (en) | Wide-baseline visible light camera pose estimation method | |
CN109146958B (en) | Traffic sign space position measuring method based on two-dimensional image | |
CN107330927B (en) | Airborne visible light image positioning method | |
CN103017666A (en) | Process and device for determining the position of a measuring point in geometric space | |
CN114067533A (en) | Geological disaster photographing monitoring and early warning method | |
CN106871900A (en) | Image matching positioning method in ship magnetic field dynamic detection | |
Martins et al. | Monocular camera calibration for autonomous driving—a comparative study | |
CN115511961A (en) | Three-dimensional space positioning method, system and storage medium | |
CN118209081B (en) | Method for measuring distance and positioning target by multi-photoelectric linkage with turntable | |
CN119131157A (en) | A method and computer device for realizing target positioning based on a single camera | |
CN118247187B (en) | Video image correction method, device and storage medium for fusing DEM and orthographic image | |
CN111796311B (en) | Method and device for monitoring state of target object and computer readable medium | |
CN110068313A (en) | A kind of digital zenith instrument orientation method based on projective transformation | |
GB2573090A (en) | Calibration of object position-measuring apparatus | |
CN107505611B (en) | Real-time correction method for video distance estimation of ship photoelectric reconnaissance equipment | |
CN110702145A (en) | Map error compensation method and system of two-dimensional navigation system | |
CN117651114A (en) | Shooting method and device for inspection image of power system and computer equipment | |
CN114255457B (en) | Direct geographic positioning method and system based on airborne LiDAR point cloud assistance | |
CN110866951B (en) | Method for correcting optical axis inclination of monocular camera | |
CN116205992A (en) | Total station long-focus camera internal reference and installation angle calibration method, device and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |