CN109816774B - Three-dimensional reconstruction system and three-dimensional reconstruction method based on unmanned aerial vehicle - Google Patents
- Publication number
- CN109816774B CN201811651070.8A
- Authority
- CN
- China
- Prior art keywords
- data
- laser
- aerial vehicle
- unmanned aerial
- point cloud
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Length Measuring Devices By Optical Means (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The invention discloses a three-dimensional reconstruction system and a three-dimensional reconstruction method based on an unmanned aerial vehicle, wherein the method comprises the following steps: acquiring joint calibration parameters; registering the laser data; projecting the laser data onto the visual data by means of the joint calibration, so that part of the image obtains actual three-dimensional coordinates and data fusion is realized; and calibrating to obtain the fully registered laser point cloud data and reconstruct the three-dimensional scene. According to the invention, the unmanned aerial vehicle carries a three-dimensional laser radar and a visible-light gimbal camera to collect data, so the accuracy of the collected data, and hence the reconstruction accuracy, is higher. Compared with common binocular and multi-view stereo cameras, the feasibility is increased and the phantom scenes that can appear with a single camera are avoided.
Description
Technical Field
The invention belongs to the technical field of three-dimensional reconstruction, and particularly relates to a three-dimensional reconstruction system and a three-dimensional reconstruction method based on an unmanned aerial vehicle.
Background
One existing acquisition method for three-dimensional reconstruction images controls a periodic brightness change of each of at least two spatially separated light sources and captures the images with cameras at no fewer than three positions. Image-based three-dimensional reconstruction is the process in which a computer automatically matches two or more two-dimensional images taken of an object or scene, calculates the two-dimensional geometric information and depth information of the object or scene, and builds a three-dimensional model. This approach has drawbacks. First, image-based modeling cannot be used when no real perceived image of the scene to be reconstructed can be acquired, for example when the object or scene does not exist at all and is fictitious, or when the scene is at the design and planning stage and changes constantly. Second, because objects in the scene become two-dimensional objects in the images, it is difficult for users to interact with these two-dimensional graphic objects to obtain the information they need. There are also certain requirements on the cameras and photographic equipment, since realistic perceived images must be obtained, and the large number of image files requires sufficient storage space.
In the photovoltaic industry, pipelines are often damaged, but large-scale renovation cannot be carried out immediately, so the current scene is usually recorded first so that the damage can be located when renovation is later performed. Three-dimensional reconstruction is widely used in such practical scenarios: the whole environment is reconstructed at intervals, the environment having changed each time, and the differences can be found in later comparison and maintenance. Traditional three-dimensional reconstruction scans the environment to be reconstructed with a hand-held laser, which is not very reliable; a laser radar alone obtains only point cloud information, and the reconstruction yields a model skeleton that cannot truly restore the real scene, so a visual sensor must be used in addition to obtain shape, texture, color and other information.
Disclosure of Invention
In order to solve the above problems, the invention provides a three-dimensional reconstruction method based on a three-dimensional laser and a camera, which fuses the laser data and the visual data and avoids the phantom scenes that can appear with a single camera.
The technical scheme of the invention is as follows: a three-dimensional reconstruction method based on three-dimensional laser and a camera comprises the following steps:
(1) Acquiring joint calibration parameters: extracting corner features from the laser data and the visual data of the same timestamp respectively, and obtaining joint calibration parameters based on the corner features;
(2) Registration of laser data: continuously registering the point cloud data acquired at different moments, namely registering the newly added point cloud data into the historical point cloud data, and calibrating the newly added point cloud data;
(3) Extracting features of the visual data;
(4) Projecting the laser data onto the visual data by means of the joint calibration of step (1), so that part of the image obtains actual three-dimensional coordinates and the fusion of the data is realized;
(5) Calibrating to obtain the fully registered laser point cloud data, and reconstructing the three-dimensional scene.
Preferably, in the step (1), a rotation matrix R[α_x, α_y, α_z] and a translation matrix T[t_x, t_y, t_z] are obtained so that a mutual conversion can be established; as shown in formula (a), the laser data can be converted into the camera coordinate system:
P_A = R·P_B + T (a);
wherein P_B = [x_B, y_B, z_B]^T are the coordinates of the laser data in the laser coordinate system, and P_A = [x_A, y_A, z_A]^T are the coordinates of the laser data in the camera coordinate system.
Preferably, in the step (2), the positioning navigation data obtained by the RTK is used to calibrate the newly added point cloud data, where the calibration is performed using a closest point iterative algorithm, and the closest point iterative algorithm includes:
in each iteration, for every laser point in one frame of the point cloud, the closest point in the other frame of the point cloud is searched, and a rotation matrix is calculated from the corresponding point pairs so that the total distance over all point pairs is minimized, as shown in formula (b):
(R, T) = argmin Σ_i ‖Q_i − (R·P_i + T)‖² (b);
wherein P and Q are respectively the two frames of laser point cloud data.
Preferably, in the step (3), corner points are used as tracking points, wherein during corner feature extraction a window is moved over the image and the gray-level change inside the window is calculated: if the gray level changes little, there is no corner point in the area; if the gray value changes greatly when the window moves along one direction but hardly changes along the other directions, the area may contain a straight line; if the gray value changes greatly when the window moves in any direction, the area is considered to contain corner points.
Preferably, in the step (4), when the laser data and the visual data are fused, the laser data first undergo a certain rotation and translation and are then projected onto the corresponding visual data image.
The invention also provides a three-dimensional reconstruction system based on the unmanned aerial vehicle, which comprises an unmanned aerial vehicle body, a power supply part and a camera for acquiring visual data, and further comprises a laser radar for acquiring laser data, wherein the unmanned aerial vehicle body comprises a load board for carrying and fixing the camera and the laser radar, the camera and the laser radar being respectively mounted on the two sides of the load board.
In the invention, power must be supplied to the whole system: the flight-control suite, the image transmission module, the laser radar and the camera. All power comes from the unmanned aerial vehicle batteries, two 6S batteries used in parallel; a single battery supplies 24 V. Except for the image transmission module, which requires 12 V, everything can be powered directly from the batteries, so a step-down module is connected to the power section to provide 12 V for the image transmission.
Preferably, the unmanned aerial vehicle body further comprises an arm, and a fixing bolt for fixing the RTK antennas and the GPS is arranged on the arm.
Preferably, the distance between the two RTK antennas is greater than 30 cm, and the arrow of the GPS points in the direction of the nose of the unmanned aerial vehicle body.
Preferably, the unmanned aerial vehicle further comprises a remote control server for remotely controlling the unmanned aerial vehicle body, the laser radar and the camera.
The software environment is built as follows: the whole system is based on a Linux system with the ROS operating system added, used in conjunction with a Windows system.
In order to carry out development and run commands on the onboard computer of the unmanned aerial vehicle, XShell is installed on the user computer (Windows) so that the server, that is, the onboard computer, can be logged into remotely, and the corresponding development work can then be done.
The onboard computer of the unmanned aerial vehicle is logged into remotely through XShell, and its IP address is modified immediately so that the unmanned aerial vehicle and the laser remain in the same network segment, ensuring normal operation of the laser.
An MQTT client (mosquitto) is installed on the user computer (Linux) so that flight commands can be issued to the onboard computer and related information about the unmanned aerial vehicle can be obtained. (Remote communication is realized through the 4G link of the onboard computer, which runs an MQTT service (paho-mqtt); the user computer can issue a flight command through MQTT to trigger the onboard computer to execute tasks.)
In order to control the flight of the unmanned aerial vehicle from the PC, a ground station is installed on the user computer (Windows); the flight state and position information of the aircraft can then be seen in real time, and the video shot by the small gimbal camera can be watched in real time to observe the surrounding environment, and so on.
Laser point cloud information is collected and stored by capturing the real-time data of the laser with a packet-capture tool while the laser is working (the packet-capture tool wireshark or tcpdump is installed on the onboard computer, which is logged into remotely through XShell; a start-up script is created so that packet capture and temporary storage of the laser data begin automatically at boot).
RSView is installed on the user computer (Windows), where the point cloud images from the captured laser data can be saved and viewed; alternatively, the point cloud can be viewed through rviz and saved on the user computer (Linux/ROS).
Compared with the prior art, the invention has the beneficial effects that:
according to the invention, the unmanned aerial vehicle is used for carrying the three-dimensional laser radar and the visible light cradle head camera to collect data, so that the accuracy of the collected data, namely, the reconstruction accuracy is higher. Compared with the common stereoscopic camera with multiple eyes, the method has the advantages that the feasibility is increased, and the situation that a single camera has an imaginary scene is avoided.
Drawings
FIG. 1 is a block diagram of the connection of a hardware system according to the present invention.
Fig. 2 is a schematic view of the structure in which the laser radar and the camera are mounted on the body of an unmanned aerial vehicle according to the present invention.
FIG. 3 is a flow chart of the three-dimensional reconstruction method of the present invention.
Fig. 4 is a schematic diagram of an installation mode of a propeller of an unmanned aerial vehicle in the invention.
Fig. 5 is a schematic structural diagram of the interior of the unmanned aerial vehicle body according to the present invention.
Fig. 6 is a schematic structural diagram of an unmanned aerial vehicle body in the present invention.
Detailed Description
Example 1
The embodiment comprises an unmanned aerial vehicle body, a flight control system, a data link system, a power supply part and so on. A load board is designed to effectively carry the laser radar and the visible-light gimbal camera, so that laser data and video information are obtained.
The hardware system block diagram is shown in fig. 1, and comprises various components of the whole unmanned aerial vehicle system.
The invention mainly improves the design and installation of the load board for the laser radar and the gimbal camera. A carbon fiber board is redesigned and fabricated to suspend the laser and the small gimbal pod. The laser radar is mounted lying flat, which ensures that the landing gear of the unmanned aerial vehicle does not block the laser. The laser radar weighs 840 g and is mounted on one side; to balance the whole, the small gimbal pod, the image transmission air end and the power adapter of the laser are mounted on the other side.
An aluminum alloy fixing bolt for fixing the RTK antennas and the GPS is made on the arm; the distance between the two RTK antennas is kept greater than 30 cm and the arrows of the two GPS units point in the direction of the nose of the unmanned aerial vehicle to ensure stable navigation. The installation of the laser radar and the camera is shown in fig. 2 (in the figure, blue marks the two RTK antennas and dark red the GPS, one GPS being omitted; purple represents the laser and its power adapter, the side surface of the cylinder being the laser working area; yellow represents the small gimbal pod; the image transmission air end is hidden behind the small gimbal and is therefore not shown).
The power supply part supplies power to the whole system: the flight-control suite, the image transmission module, the laser radar and the camera. All power comes from the unmanned aerial vehicle batteries, two 6S batteries used in parallel; a single battery supplies 24 V. Except for the image transmission module, which requires 12 V, everything can be powered directly from the batteries, so a step-down module is connected to the power section to provide 12 V for the image transmission.
The power part of the unmanned aerial vehicle comprises the motors and the electronic speed controllers (ESCs); since the unmanned aerial vehicle is a hexacopter, the propellers are mounted according to the arrows in fig. 4.
As shown in fig. 5 and 6, in the present invention the original carbon fiber board of the fuselage is removed; there is some space inside, in which the FCU of the flight controller (right side up, with its arrow pointing in the nose direction) and the PMU are installed, together with the main board and NCU of the RTK; the receiver of the remote controller is also installed in an available slot. When the space of the innermost layer is used up, the original carbon fiber board is put back. The onboard computer (equivalent to an industrial personal computer and suitable for development) and the data transmission air end are mounted on the carbon fiber board.
The debugging procedure in the invention is as follows: the parameter tuning interface is connected to a computer with a USB cable, and the PC-side tuning software is used. First the multi-rotor type is set to hexacopter; the remote controller is then calibrated and set to American, Japanese or Chinese hands according to personal habit (American hands here); the remote controller receiver is of the S.Bus type; ESC calibration, GPS calibration and horizontal calibration are then completed one by one according to the user manual.
The software environment is built as follows: the whole system is based on a Linux system with the ROS operating system added, used in conjunction with a Windows system.
In order to carry out development and run commands on the onboard computer of the unmanned aerial vehicle, XShell is installed on the user computer (Windows) so that the server, that is, the onboard computer, can be logged into remotely, and the corresponding development work can then be done.
The onboard computer of the unmanned aerial vehicle is logged into remotely through XShell, and its IP address is modified immediately so that the unmanned aerial vehicle and the laser remain in the same network segment, ensuring normal operation of the laser.
An MQTT client (mosquitto) is installed on the user computer (Linux) so that flight commands can be issued to the onboard computer and related information about the unmanned aerial vehicle can be obtained. (Remote communication is realized through the 4G link of the onboard computer, which runs an MQTT service (paho-mqtt); the user computer can issue a flight command through MQTT to trigger the onboard computer to execute tasks.)
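A minimal sketch of this command path, assuming the paho-mqtt 1.x Python client; the broker address and the topic name below are placeholders, not values from the original disclosure:

```python
import paho.mqtt.client as mqtt

BROKER = "192.168.1.100"       # placeholder: address reachable over the onboard 4G link
TOPIC_CMD = "uav/flight/cmd"   # hypothetical topic name; depends on the deployment

client = mqtt.Client()         # paho-mqtt 1.x style constructor
client.connect(BROKER, 1883)   # 1883 is the default MQTT port
client.publish(TOPIC_CMD, "takeoff")  # issue a flight command to the onboard computer
client.disconnect()
```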
In order to control the flight of the unmanned aerial vehicle from the PC, a ground station is installed on the user computer (Windows); the flight state and position information of the aircraft can then be seen in real time, and the video shot by the small gimbal camera can be watched in real time to observe the surrounding environment, and so on.
Laser point cloud information is collected and stored by capturing the real-time data of the laser with a packet-capture tool while the laser is working (the packet-capture tool wireshark or tcpdump is installed on the onboard computer, which is logged into remotely through XShell; a start-up script is created so that packet capture and temporary storage of the laser data begin automatically at boot).
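One possible shape of such a self-starting capture script, sketched in Python around tcpdump; the network interface and the UDP data port are assumptions that depend on the actual lidar:

```python
import subprocess
import time

IFACE = "eth0"   # assumed network interface connected to the lidar
PORT = 6699      # assumed UDP data port of the lidar; check the sensor manual

pcap_name = time.strftime("laser_%Y%m%d_%H%M%S.pcap")
# tcpdump options come first (-i interface, -w output file), the capture filter last.
# The call blocks and keeps writing packets until tcpdump is stopped.
subprocess.run(["tcpdump", "-i", IFACE, "-w", pcap_name, "udp", "port", str(PORT)])
```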
RSView is installed on the user computer (Windows), where the point cloud images from the captured laser data can be saved and viewed; alternatively, the point cloud can be viewed through rviz and saved on the user computer (Linux/ROS).
The embodiment also relates to a three-dimensional reconstruction algorithm, the overall idea of which is shown in fig. 3. It specifically comprises the following steps:
1. The algorithm realizes the joint calibration of the three-dimensional laser and the gimbal camera to obtain the joint calibration parameters
Corner features are extracted from the laser data and the visual data of the same timestamp respectively, and the joint calibration parameters are solved based on these corner features. The purpose is that the laser data can be transformed into the visual coordinate system and, conversely, the visual data into the laser coordinate system, which facilitates the later data fusion.
Specifically: the joint calibration parameters of the laser and the camera are obtained, namely a rotation matrix R[α_x, α_y, α_z] and a translation matrix T[t_x, t_y, t_z], so that a mutual conversion can be established. As shown in formula (a), the laser data can be converted into the camera coordinate system: P_A = R·P_B + T, where P_B = [x_B, y_B, z_B]^T are the coordinates of the laser data in the laser coordinate system and P_A = [x_A, y_A, z_A]^T are the coordinates of the laser data in the camera coordinate system.
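As a minimal numpy sketch of formula (a), assuming the rotation matrix R and translation vector T have already been solved by the joint calibration (the values below are placeholders):

```python
import numpy as np

def laser_to_camera(points_laser, R, T):
    """Apply P_A = R*P_B + T (formula (a)) to every laser point.

    points_laser: (N, 3) array of [x_B, y_B, z_B] in the laser frame.
    R: (3, 3) rotation matrix; T: (3,) translation vector.
    Returns the (N, 3) coordinates in the camera frame.
    """
    return points_laser @ R.T + T

# Placeholder extrinsics; the real values come from the joint calibration step.
R = np.eye(3)
T = np.array([0.10, 0.0, -0.05])  # assumed lever arm between laser and camera, in metres
points_cam = laser_to_camera(np.random.rand(100, 3), R, T)
```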
2. Algorithm for realizing registration of laser data
Three-dimensional reconstruction requires continuously registering the point cloud data of different moments, that is, registering the newly added point cloud data to the historical point cloud data; the newly added point cloud data are calibrated using the positioning and navigation data obtained by the RTK, with the closest-point iterative algorithm. Specifically: in each iteration, for every laser point in one frame of the point cloud, the closest point in the other frame of the point cloud is searched, and a rotation matrix is calculated from the corresponding point pairs so that the total distance over all point pairs is minimized, as shown in formula (b): (R, T) = argmin Σ_i ‖Q_i − (R·P_i + T)‖², where P and Q are respectively the two frames of laser point cloud data.
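A compact sketch of this closest-point iteration, assuming two (N, 3) numpy point clouds P (new frame) and Q (history); the rigid transform of formula (b) is recovered with the standard SVD (Kabsch) solution:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(P, Q, iterations=20):
    """Align point cloud P to Q by iterating closest-point matching (formula (b))."""
    tree = cKDTree(Q)                  # nearest-neighbour index over the fixed cloud
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        # For every transformed point of P, find the closest point in Q.
        matched = Q[tree.query(P @ R.T + t)[1]]
        # Closed-form rigid transform minimising sum ||Q_i - (R*P_i + t)||^2.
        mu_p, mu_q = P.mean(axis=0), matched.mean(axis=0)
        U, _, Vt = np.linalg.svd((P - mu_p).T @ (matched - mu_q))
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # reject reflections
        R = Vt.T @ D @ U.T
        t = mu_q - R @ mu_p
    return R, t
```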
3. Algorithm implementation of feature extraction on visual data
In order to make the images of successive frames correspond, corner feature extraction is performed on the visual data, and the corners are used as tracking points to better match adjacent images. Specifically, a window is moved over the image and the gray-level change inside the window is computed: if the gray level changes little, there is no corner in the area; if the gray value changes greatly when the window moves along one direction but hardly changes along the other directions, the area may contain a straight line; if the gray value changes greatly when the window moves in any direction, the area is considered to contain corner points.
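The window-based gray-level test described here is essentially the Harris corner criterion; a brief OpenCV sketch follows (the image file name is a placeholder):

```python
import cv2
import numpy as np

img = cv2.imread("frame_000123.png")   # placeholder frame from the gimbal camera
gray = np.float32(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY))

# Harris response: large in every direction means a corner,
# large along a single direction means an edge (straight line).
response = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)
corners = np.argwhere(response > 0.01 * response.max())  # (row, col) tracking points
```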
4. Algorithm implementation of fusion data information
The three-dimensional laser data are projected onto the visual image data using the joint calibration method of step 1, so that part of the image obtains actual three-dimensional coordinates and the fusion of the data is realized. Specifically:
(1) If the laser adopted the same working frequency as the camera, its data could be matched to the same timestamps; in general, however, the working frequencies differ.
(2) Because the working frequency of the laser differs from that of the camera, the timestamps are not necessarily exactly aligned; therefore, to fuse the two kinds of data, the laser data must undergo a certain rotation and translation and then be projected onto the corresponding image, as sketched below.
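A short sketch of item (2): rotate and translate each laser point into the camera frame, then project it through a pinhole model; the intrinsic matrix K below is an assumed placeholder for the gimbal camera:

```python
import numpy as np

def project_to_image(points_laser, R, T, K):
    """Rotate/translate laser points into the camera frame and project to pixels.

    Returns (M, 2) pixel coordinates and the depth of each kept point, so the
    matched pixels receive actual three-dimensional coordinates.
    """
    pts_cam = points_laser @ R.T + T   # rigid motion, formula (a)
    depth = pts_cam[:, 2]
    keep = depth > 0                   # only points in front of the camera
    uv = (pts_cam[keep] / depth[keep, None]) @ K.T
    return uv[:, :2], depth[keep]

# Assumed pinhole intrinsics (focal lengths and principal point in pixels).
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
```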
5. Three-dimensional scene reconstruction based on fusion data is realized by algorithm
All calibrations, including the estimation of the motion of the laser points, are performed with conventional methods; the fully registered laser point cloud data are finally obtained, and the reconstruction of the three-dimensional scene is realized, as outlined in the sketch below.
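Putting the steps together, a hedged outline of the whole loop, reusing the icp and project_to_image sketches above (with numpy imported as np); it illustrates the flow of fig. 3 and is not the literal implementation of the embodiment:

```python
def reconstruct(scans, R_cal, T_cal, K):
    """Steps 2-5: register each new laser frame to the history, then fuse with imagery."""
    world = scans[0]
    for scan in scans[1:]:
        R, t = icp(scan, world)                     # step 2: closest-point registration
        world = np.vstack([world, scan @ R.T + t])  # append the calibrated new points
    uv, depth = project_to_image(world, R_cal, T_cal, K)  # step 4: data fusion
    return world, uv, depth                         # step 5: fully registered cloud
```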
Claims (6)
1. The three-dimensional reconstruction method based on the three-dimensional laser and the camera is characterized by comprising the following steps of:
(1) Acquiring joint calibration parameters: extracting corner features from the laser data and the visual data of the same timestamp respectively, and obtaining joint calibration parameters based on the corner features;
(2) Registration of laser data: registering the point cloud data at different moments continuously, namely registering the newly added point cloud data into the historical point cloud data, and calibrating the newly added point cloud data;
(3) Extracting features of the visual data, extracting corner features of the visual data, and matching adjacent images by taking the corner as a tracking point;
(4) The laser data is projected onto the visual data by utilizing the joint calibration method in the step (1), so that the actual three-dimensional coordinates of partial images can be obtained, and the fusion of the data is realized;
(5) Calibrating to obtain the fully registered laser point cloud data, and reconstructing the three-dimensional scene;
in the step (1), a rotation matrix R[α_x, α_y, α_z] and a translation matrix T[t_x, t_y, t_z] are obtained so that a mutual conversion can be established; as shown in formula (a), the laser data can be converted into the camera coordinate system:
P_A = R·P_B + T (a);
wherein P_B = [x_B, y_B, z_B]^T are the coordinates of the laser data in the laser coordinate system;
P_A = [x_A, y_A, z_A]^T are the coordinates of the laser data in the camera coordinate system;
in the step (2), positioning navigation data obtained by the RTK is utilized to calibrate newly added point cloud data, wherein a nearest point iterative algorithm is used for calibration, and the nearest point iterative algorithm comprises:
in each iteration, for every laser point in one frame of the point cloud, the closest point in the other frame of the point cloud is searched, and a rotation matrix is calculated from the corresponding point pairs so that the total distance over all point pairs is minimized, as shown in formula (b):
(R, T) = argmin Σ_i ‖Q_i − (R·P_i + T)‖² (b);
wherein P and Q are respectively two frames of laser point cloud data;
in the step (3), corner points are used as tracking points, wherein during corner feature extraction a window is moved over the image and the gray-level change inside the window is calculated; if the gray level changes little, there is no corner point in the area; if the gray value changes greatly when the window moves along one direction but hardly changes along the other directions, the area may contain a straight line; if the gray value changes greatly when the window moves in any direction, the area is considered to contain corner points.
2. The three-dimensional reconstruction method based on three-dimensional laser and camera according to claim 1, wherein in the step (4), when the laser data and the visual data are fused, the laser data need to be first subjected to a certain rotational translation and then projected onto the corresponding visual data image.
3. An unmanned aerial vehicle-based three-dimensional reconstruction system using the three-dimensional laser and camera-based three-dimensional reconstruction method according to claim 1, comprising an unmanned aerial vehicle body, a power supply part and a camera for acquiring the visual data, and further comprising a laser radar for acquiring the laser data, wherein the unmanned aerial vehicle body comprises a load board for carrying and fixing the camera and the laser radar, the camera and the laser radar being respectively mounted on the two sides of the load board.
4. The unmanned aerial vehicle-based three-dimensional reconstruction system of claim 3, wherein the unmanned aerial vehicle body further comprises an arm on which a fixing bolt for fixing the RTK antennas and the GPS is disposed.
5. The unmanned aerial vehicle-based three-dimensional reconstruction system of claim 3, wherein the distance between the two RTK antennas is greater than 30 cm, and the arrow of the GPS points in the direction of the nose of the unmanned aerial vehicle body.
6. The unmanned aerial vehicle-based three-dimensional reconstruction system of claim 3, further comprising a remote manipulation server for remotely manipulating the unmanned aerial vehicle body, the lidar and the camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811651070.8A CN109816774B (en) | 2018-12-31 | 2018-12-31 | Three-dimensional reconstruction system and three-dimensional reconstruction method based on unmanned aerial vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811651070.8A CN109816774B (en) | 2018-12-31 | 2018-12-31 | Three-dimensional reconstruction system and three-dimensional reconstruction method based on unmanned aerial vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109816774A CN109816774A (en) | 2019-05-28 |
CN109816774B true CN109816774B (en) | 2023-11-17 |
Family
ID=66603299
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811651070.8A Active CN109816774B (en) | 2018-12-31 | 2018-12-31 | Three-dimensional reconstruction system and three-dimensional reconstruction method based on unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109816774B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110672091B (en) * | 2019-09-29 | 2023-05-23 | 哈尔滨飞机工业集团有限责任公司 | Flexible drag nacelle positioning system of time domain aircraft |
CN110864725A (en) * | 2019-10-24 | 2020-03-06 | 大连理工大学 | Panoramic three-dimensional color laser scanning system and method based on lifting motion |
CN111047631B (en) * | 2019-12-04 | 2023-04-07 | 广西大学 | Multi-view three-dimensional point cloud registration method based on single Kinect and round box |
CN111199578B (en) * | 2019-12-31 | 2022-03-15 | 南京航空航天大学 | Unmanned aerial vehicle three-dimensional environment modeling method based on vision-assisted laser radar |
CN112767475B (en) * | 2020-12-30 | 2022-10-18 | 重庆邮电大学 | Intelligent roadside sensing system based on C-V2X, radar and vision |
CN112419417B (en) * | 2021-01-25 | 2021-05-18 | 成都翼比特自动化设备有限公司 | Unmanned aerial vehicle-based photographing point positioning method and related device |
CN113379910B (en) * | 2021-06-09 | 2023-06-02 | 山东大学 | Mine scene reconstruction method and system for mobile robot based on SLAM |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104268935A (en) * | 2014-09-18 | 2015-01-07 | 华南理工大学 | Feature-based airborne laser point cloud and image data fusion system and method |
WO2018072433A1 (en) * | 2016-10-19 | 2018-04-26 | 杭州思看科技有限公司 | Three-dimensional scanning method including a plurality of lasers with different wavelengths, and scanner |
CN108828606A (en) * | 2018-03-22 | 2018-11-16 | 中国科学院西安光学精密机械研究所 | Laser radar and binocular visible light camera-based combined measurement method |
- 2018-12-31: application CN201811651070.8A filed in China (CN); granted as CN109816774B; status Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104268935A (en) * | 2014-09-18 | 2015-01-07 | 华南理工大学 | Feature-based airborne laser point cloud and image data fusion system and method |
WO2018072433A1 (en) * | 2016-10-19 | 2018-04-26 | 杭州思看科技有限公司 | Three-dimensional scanning method including a plurality of lasers with different wavelengths, and scanner |
CN108828606A (en) * | 2018-03-22 | 2018-11-16 | 中国科学院西安光学精密机械研究所 | Laser radar and binocular visible light camera-based combined measurement method |
Non-Patent Citations (1)
Title |
---|
Three-dimensional scene reconstruction based on synchronized laser and stereo vision data; Du Yunan et al.; Software (《软件》); 2012-11-15 (No. 11); full text *
Also Published As
Publication number | Publication date |
---|---|
CN109816774A (en) | 2019-05-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109816774B (en) | Three-dimensional reconstruction system and three-dimensional reconstruction method based on unmanned aerial vehicle | |
US20210239815A1 (en) | Movable object performing real-time mapping using a payload assembly | |
Udin et al. | Assessment of photogrammetric mapping accuracy based on variation flying altitude using unmanned aerial vehicle | |
EP3678100A1 (en) | Augmented reality system using enhanced models | |
CN111091613A (en) | Three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey | |
CN103472847A (en) | Method and system for monitoring track of unmanned aerial vehicle power line inspection | |
CN105739512A (en) | Unmanned aerial vehicle automatic tour inspection system and method | |
US20220113421A1 (en) | Online point cloud processing of lidar and camera data | |
CN110060332A (en) | High-precision three-dimensional based on airborne acquisition equipment builds figure and modeling | |
CN109297978B (en) | Power line UAV inspection and defect intelligent diagnosis system based on binocular imaging | |
CN105758384A (en) | Unmanned aerial vehicle rocking oblique photograph system | |
CN110880202B (en) | Three-dimensional terrain model creating method, device, equipment and storage medium | |
WO2020088414A1 (en) | A movable object performing real-time mapping using a payload assembly | |
CN111244822B (en) | Fixed-wing unmanned aerial vehicle line patrol method, system and device in complex geographic environment | |
Ahmed et al. | Development of smart quadcopter for autonomous overhead power transmission line inspections | |
CN104638562A (en) | Helicopter electric power inspection system and method | |
CN111340942A (en) | Three-dimensional reconstruction system based on unmanned aerial vehicle and method thereof | |
CN110864725A (en) | Panoramic three-dimensional color laser scanning system and method based on lifting motion | |
CN113110534A (en) | Unmanned aerial vehicle control and perception system | |
CN113031462A (en) | Port machine inspection route planning system and method for unmanned aerial vehicle | |
CN110598359A (en) | A 3D Modeling System for Aircraft Structure Maintenance | |
CN204258162U (en) | Helicopter in electric inspection process system | |
CN113379908A (en) | Three-dimensional GISVR circuit live-action platform building system for automatic inspection of power equipment | |
CN205809689U (en) | A kind of airframe checks system | |
CN205594455U (en) | Three -dimensional modeling system of transmission line shaft tower |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20191230 Address after: 213031 No.2 Tianhe Road, Xinbei District, Changzhou City, Jiangsu Province Applicant after: Jiangsu Tianze Robot Technology Co.,Ltd. Address before: 213022, No. 2, Tianhe Road, Tianhe photovoltaic industrial park, Xinbei District, Jiangsu, Changzhou Applicant before: TRINASOLAR Co.,Ltd. |
|
GR01 | Patent grant | ||
GR01 | Patent grant |