CN112235041A - Real-time point cloud processing system and method and airborne data acquisition device and method - Google Patents
- Publication number
- CN112235041A CN112235041A CN202011510697.9A CN202011510697A CN112235041A CN 112235041 A CN112235041 A CN 112235041A CN 202011510697 A CN202011510697 A CN 202011510697A CN 112235041 A CN112235041 A CN 112235041A
- Authority
- CN
- China
- Prior art keywords
- data
- ground
- airborne
- acquisition device
- time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B7/00—Radio transmission systems, i.e. using radiation field
- H04B7/14—Relay systems
- H04B7/15—Active relay systems
- H04B7/185—Space-based or airborne stations; Stations for satellite systems
- H04B7/18502—Airborne stations
- H04B7/18506—Communications with or from aircraft, i.e. aeronautical mobile service
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/49—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/003—Transmission of data between radar, sonar or lidar systems and remote stations
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0841—Registering performance data
- G07C5/085—Registering performance data using electronic data carriers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L69/00—Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
- H04L69/04—Protocols for data compression, e.g. ROHC
Abstract
The system comprises an airborne data acquisition device, a ground data acquisition device, and a ground data processing device. The airborne data acquisition device acquires the operating data of an airborne laser radar and compresses it, the ground data acquisition device acquires the reference station data for the airborne laser radar, and the ground data processing device obtains point cloud data from the compressed operating data and the reference station data. Processing is thereby offloaded to the ground, reducing the size, weight, and power consumption of the onboard computer and improving operating efficiency during flight.
Description
Technical Field
The present application relates to the field of computer technology, and in particular to a real-time point cloud processing system, a real-time point cloud processing method, an airborne data acquisition device, and an airborne data acquisition method.
Background
In recent years, with the rapid development of radar technology (e.g., laser radar) and unmanned aerial vehicle technology, UAV-mounted laser radar systems have been applied ever more widely. In fields such as surveying and mapping, agricultural plant protection, forestry investigation, and electric power inspection, airborne laser radar is used extensively for operations.
Considering that if a sensor fails or satellite navigation system (GNSS) data is lost at the aircraft end (including unmanned aerial vehicles, passenger planes, cargo planes, etc.), the data of the entire sortie becomes invalid, real-time processing schemes have been proposed in the industry.
Specifically, pose data, laser scanner data, and photos are collected at the aircraft end, processed by an onboard computer, and the processed data is then downlinked to a ground terminal through a data link for display.
Because data processing must be performed on the onboard computer, its configuration requirements are high, and an onboard computer meeting those requirements is relatively large, heavy, and power-hungry. Onboard processing therefore adds weight and power consumption to the system, which shortens the flight time of the aircraft (such as an unmanned aerial vehicle), reducing operating efficiency and raising operating costs.
Disclosure of Invention
The present application provides a real-time point cloud processing system in which an airborne data acquisition device collects and compresses the operating data of an airborne laser radar, a ground data acquisition device collects the reference station data of the airborne laser radar, and a ground data processing device obtains point cloud data from the compressed operating data and the reference station data, thereby reducing the power consumption of the aircraft and improving operating efficiency. The application also provides a corresponding real-time point cloud processing method, an airborne data acquisition device, and an airborne data acquisition method.
In a first aspect, the present application provides a system for processing a real-time point cloud, the system comprising an airborne data acquisition device, a ground data acquisition device and a ground data processing device, wherein,
the airborne data acquisition device is used for acquiring the operation data of the airborne laser radar, compressing the operation data and transmitting the compressed operation data to the ground data processing device;
the ground data acquisition device is used for acquiring reference station data of the airborne laser radar and then transmitting the reference station data to the ground data processing device;
and the ground data processing device is used for obtaining point cloud data according to the compressed operating data and the reference station data.
In some possible implementations, the airborne data acquisition device comprises an upper computer and a lower computer, wherein the upper computer is used for acquiring photo data and measurement data, and the lower computer is used for acquiring pose data.
In some possible implementations, the pose data includes inertial data and navigation data, and the lower computer is further configured to time-synchronize the inertial data according to the navigation data.
In some possible implementations, the lower computer is configured to time-synchronize the inertial data by recording the initial synchronization time and the increment of each subsequent synchronization time relative to it.
In some possible implementations, the lower computer comprises a single-chip microcomputer with a real-time operating system, and the upper computer comprises an embedded system with a high-level operating system.
In some possible implementations, the airborne data acquisition device is specifically configured to compress the operating data by down-sampling.
In a second aspect, the present application provides a method for processing a real-time point cloud, including:
the airborne data acquisition device acquires the operation data of the airborne laser radar, compresses the operation data and transmits the compressed operation data to the ground data processing device;
the ground data acquisition device acquires reference station data of the airborne laser radar and then transmits the reference station data to the ground data processing device;
and the ground data processing device acquires point cloud data according to the compressed operating data and the reference station data.
In some possible implementations, the airborne data acquisition device comprises an upper computer and a lower computer, wherein the upper computer is used for acquiring photo data and measurement data, and the lower computer is used for acquiring pose data.
In some possible implementations, the pose data includes inertial data and navigation data, and the lower computer is further configured to time-synchronize the inertial data according to the navigation data.
In some possible implementations, the lower computer is configured to time-synchronize the inertial data by recording the initial synchronization time and the increment of each subsequent synchronization time relative to it.
In some possible implementations, the lower computer comprises a single-chip microcomputer with a real-time operating system, and the upper computer comprises an embedded system with a high-level operating system.
In some possible implementations, the airborne data acquisition device is specifically configured to compress the operating data by down-sampling.
In a third aspect, the present application provides an onboard data acquisition device comprising an acquisition module, a compression module, and a transmission module, wherein,
the acquisition module is used for acquiring the operating data of the airborne laser radar;
the compression module is used for compressing the operation data;
and the transmission module is used for transmitting the compressed operation data to the ground data processing device.
In some possible implementations, the airborne data acquisition device comprises an upper computer and a lower computer, wherein the upper computer is used for acquiring photo data and measurement data, and the lower computer is used for acquiring pose data.
In some possible implementations, the pose data includes inertial data and navigation data, and the lower computer is further configured to time-synchronize the inertial data according to the navigation data.
In some possible implementations, the lower computer is configured to time-synchronize the inertial data by recording the initial synchronization time and the increment of each subsequent synchronization time relative to it.
In some possible implementations, the lower computer comprises a single-chip microcomputer with a real-time operating system, and the upper computer comprises an embedded system with a high-level operating system.
In some possible implementations, the compression module is specifically configured to compress the operating data by down-sampling.
In a fourth aspect, the present application provides an airborne data acquisition method, which is applied to an airborne data acquisition device, and includes:
collecting operation data of an airborne laser radar;
compressing the operating data;
and transmitting the compressed operation data to a ground data processing device.
In some possible implementations, the airborne data acquisition device comprises an upper computer and a lower computer, wherein the upper computer is used for acquiring photo data and measurement data, and the lower computer is used for acquiring pose data.
In some possible implementations, the pose data includes inertial data and navigation data, and the lower computer is further configured to time-synchronize the inertial data according to the navigation data.
In some possible implementations, the lower computer is configured to time-synchronize the inertial data by recording the initial synchronization time and the increment of each subsequent synchronization time relative to it.
In some possible implementations, the lower computer comprises a single-chip microcomputer with a real-time operating system, and the upper computer comprises an embedded system with a high-level operating system.
In some possible implementations, compressing the operating data specifically includes compressing it by down-sampling.
The present application can further combine to provide more implementations on the basis of the implementations provided by the above aspects.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the embodiments are briefly described below.
Fig. 1 is an architecture diagram of a real-time point cloud processing system according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of data acquisition by an onboard data acquisition device according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a process for obtaining point cloud data by a ground data processing apparatus according to an embodiment of the present disclosure;
fig. 4 is a schematic flow chart of a real-time point cloud processing method provided in an embodiment of the present application;
Fig. 5 is an architecture diagram of an onboard data acquisition device according to an embodiment of the present application;
fig. 6 is a schematic flow chart of an onboard data acquisition method according to an embodiment of the present application.
Detailed Description
The terms "first" and "second" in the embodiments of the present application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature.
Some technical terms referred to in the embodiments of the present application will be first described.
Point cloud data refers to a collection of vectors in a three-dimensional coordinate system. These vectors are usually expressed as X, Y, Z three-dimensional coordinates and are mainly used to represent the shape of an object's external surface. Moreover, beyond the geometric position (X, Y, Z), a point may also carry an RGB color, gray value, depth, segmentation result, and so on. Point cloud data is widely used in shape detection and classification, stereoscopic vision, structure from motion, and multi-view reconstruction, and the storage, compression, and rendering of point clouds are active research topics.
The primary method of collecting point cloud data is lidar (laser radar). Lidar operates on a radar-like principle: light is emitted from the device, bounces off an object, and the distance is calculated from the light's round-trip time and the speed of light. The difference is that radar uses long-wavelength radio waves, whereas lidar uses short-wavelength laser light, achieving higher accuracy.
With the continuous development of laser radar and unmanned aerial vehicle technology, UAV-mounted laser radar systems are applied ever more widely, for example in surveying and mapping, agricultural plant protection, forestry investigation, and electric power inspection, where real-time point cloud data collected by laser radar is used for operations.
In the existing real-time point cloud processing scheme for airborne laser radar, pose data, laser scanner data, and photos are collected at the aircraft end, processed by an onboard computer, and the processed data is downlinked to a ground terminal through a data link for display. Because data processing must be performed on the onboard computer, its configuration requirements are high, and an onboard computer meeting those requirements is relatively large, heavy, and power-hungry; onboard processing therefore adds weight and power consumption and affects the endurance of the UAV. Existing data show that with this scheme, the UAV's flight time is about 20% shorter than in normal flight, which reduces operating efficiency and hurts the user's economic return.
On this basis, the present application provides a real-time point cloud processing system comprising an airborne data acquisition device, a ground data acquisition device, and a ground data processing device. The airborne data acquisition device collects and compresses the operating data of the airborne laser radar, the ground data acquisition device collects the reference station data of the airborne laser radar, and the ground data processing device obtains point cloud data from the compressed operating data and the reference station data, thereby reducing the power consumption of the aircraft and improving operating efficiency. By moving the data processing that originally ran onboard to the ground, the configuration requirements of the onboard computer can be lowered, reducing its size, weight, and power consumption and thereby extending the aircraft's endurance and improving operating efficiency.
For the convenience of understanding, the processing system of the real-time point cloud provided by the embodiment of the present application is described below with reference to the accompanying drawings.
Referring to the architecture diagram of the processing system for real-time point clouds shown in fig. 1, the system 100 includes: an onboard data acquisition device 102, a surface data acquisition device 104, and a surface data processing device 106. The various components of the system 100 are described in detail below.
The process of collecting data by the onboard data collection device 102 is shown in fig. 2.
S202: the onboard data acquisition device 102 acquires operating data of the onboard laser radar.
In some possible implementations, the onboard data collection device 102 includes an upper computer and a lower computer that communicate using a network.
In some possible implementations, the upper computer may be an embedded system running a high-level operating system such as Linux or Windows, with a CPU and large-capacity storage to facilitate program development, and the lower computer may be a single-chip microcomputer with a real-time operating system.
The acquisition of the operation data of the airborne laser radar by the airborne data acquisition device 102 mainly comprises the following steps: collecting pose data, collecting photo data and collecting measurement data.
Pose refers to the position and attitude of an object; in this application, pose data includes inertial data (IMU data) and navigation data (GNSS data). The IMU data comprises the angular velocity and acceleration of the aircraft, and the GNSS data comprises the position of the aircraft, such as latitude and longitude.
A global navigation satellite system (GNSS) is a satellite system providing autonomous spatial positioning with global coverage: a small electronic receiver determines its position (longitude, latitude, and altitude) to within about 10 meters using time signals broadcast along a line of sight from satellites. The precise time and position computed by the receiver can also serve as a reference for scientific experiments. Examples include the Global Positioning System (GPS), the BeiDou Navigation Satellite System (BDS), and the global navigation satellite system GLONASS.
Because the real-time requirements on the pose data are very high, the lower computer's single-chip microcomputer with a real-time operating system is used for its acquisition. The IMU data must be time-synchronized with the GNSS data; optionally, the synchronization accuracy should be better than 1 millisecond. The single-chip microcomputer forwards the raw GNSS data and the time-synchronized IMU data to the upper computer.
Acquiring photo data requires time-synchronizing each photo: when the camera takes a ground photo, the lower computer captures the exposure synchronization signal output by the camera and records the time, which is synchronized to GNSS time. After completing the photo synchronization, the lower computer generates an interrupt signal and sends it to the upper computer; upon receiving it, the upper computer reads the photo's exposure time from the lower computer and fetches the latest photo data from the camera.
The measurement data, comprising distance and angle data, is acquired from the laser scanner by the upper computer; because the laser scanner has its own time synchronization function, the acquired measurement data already carries GNSS-synchronized time.
S204: the onboard data collection device 102 compresses the operational data.
Because the IMU data rate is usually several hundred hertz, and the laser scanner's measurement data rate is even higher, usually several hundred kilohertz, the amount of data to process is large and the data must be compressed.
Specifically, each piece of data collected in S202 carries a corresponding time. To preserve precision, times are normally stored as double-precision floating-point values, but transmitting them this way consumes considerable bandwidth. In the present application, the lower computer therefore records the start time t0 when the first GNSS time synchronization completes; at each subsequent synchronization it subtracts t0 from the current time tc and records only the increment in milliseconds, i.e., the current synchronization time t = tc - t0. Times can thus be recorded as integer data, saving half of the storage space and transmission bandwidth.
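The integer-increment scheme above can be sketched in Python (the class and method names are illustrative, not from the patent):

```python
class DeltaTimeEncoder:
    """Stores synchronization times as integer millisecond increments
    relative to the first GNSS synchronization instant t0, instead of
    double-precision absolute timestamps (a sketch of the scheme above)."""

    def __init__(self):
        self.t0 = None  # absolute start time in seconds, fixed on first use

    def encode(self, t_current):
        # The first call fixes t0; later calls return (tc - t0) in whole ms.
        if self.t0 is None:
            self.t0 = t_current
        return round((t_current - self.t0) * 1000)

    def decode(self, delta_ms):
        # Recover the absolute time from the stored increment.
        return self.t0 + delta_ms / 1000.0
```

At millisecond resolution a 32-bit increment covers roughly 24 days, so a 4-byte integer comfortably replaces the 8-byte double, matching the claimed halving of storage and bandwidth.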
The scanning interval of the laser scanner is very small and the density of the collected laser points is very high; since the real-time point cloud is usually used for status monitoring, where the requirement on point density is modest, the point cloud can be thinned.
In particular, since the airborne laser radar only needs to acquire scan data toward the ground, the effective field of view is typically small (usually less than 90 degrees). Data outside the expected field angle is therefore filtered out; with the sampling ratio set to N (N > 1), one valid value is then extracted out of every N data points as the data to be transmitted, achieving the point cloud thinning.
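A minimal sketch of the filtering-and-thinning step described above (the sample layout and parameter names are assumptions for illustration):

```python
def thin_scan(points, fov_deg=90.0, n=4):
    """Drop points outside the effective field of view, then keep one
    point out of every n (sampling ratio N > 1), as described above.
    `points` is a list of (angle_deg, distance) samples; the angle is
    measured from nadir, so the effective half-angle is fov_deg / 2."""
    half_fov = fov_deg / 2.0
    in_fov = [p for p in points if abs(p[0]) <= half_fov]
    return in_fov[::n]  # only 1 of every n in-view points survives
```

With n = 4 and a 90-degree field of view, the transmitted stream shrinks by the out-of-view fraction plus a further factor of four, at the cost of point density that the patent deems acceptable for status monitoring.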
S206: the onboard data collection device 102 transmits the compressed operational data to the surface data processing device 106.
The compressed IMU data, GNSS data, laser scanner data, and photo data all include synchronization time, and the data are sent to the ground data processing device 106 via a data link.
In some possible implementations, the data link may be a broadband line-of-sight link or a 5G link.
The ground data acquisition device 104 is used for acquiring reference station data of the airborne laser radar and then transmitting the reference station data to the ground data processing device 106.
The reference station is a ground fixed observation station which continuously observes satellite navigation signals for a long time and transmits observation data to a data center in real time or at regular time through a communication facility.
The ground data acquisition device 104 is usually a GNSS reference station, such as a BeiDou Navigation Satellite System reference station; in some possible implementations, a Continuously Operating Reference Stations (CORS) network station may be used instead.
The process of obtaining point cloud data from the compressed operating data and reference station data by the ground data processing device 106 is shown in fig. 3.
The ground data processing device 106 is typically a high-performance computer, and communicates with an external device through a network or other hardware interface to transmit data.
S302: the ground data processing device 106 performs track calculation by using the pose data and the reference station data to obtain the track and attitude data of the aircraft.
The pose data includes IMU data and GNSS data, and is compressed and transmitted by the airborne data acquisition device 102, and the reference station data is acquired and transmitted by the ground data acquisition device 104. The attitude data of the aircraft includes roll angle, pitch angle, yaw angle, etc. of the aircraft.
S304: the ground data processing device 106 calculates the projection coordinates of the picture pixel points to the geodetic coordinate system by using the picture data and the synchronization time.
Specifically, the ground data processing device 106 uses the photo's synchronization time to obtain the position and attitude of the aircraft at that moment, then computes the camera's exterior orientation elements, i.e., the specific coordinates of the camera relative to the geodetic coordinate system, from the camera's installation position and mounting-angle correction data, and from these derives the rotation matrix T_P from photo pixels to the geodetic coordinate system. Optionally, the photo's distortion is corrected according to the camera's calibration parameters, yielding the coordinates P_k of each pixel in the photo. Combining each pixel's coordinates P_k with the rotation matrix T_P, the pixel coordinates are converted into the geodetic coordinate system: P_k^W = T_P * P_k, where P_k^W is the pixel's coordinate in the geodetic coordinate system.
S306: the ground data processing device 106 calculates projection coordinates of the laser scanning point to the geodetic coordinate system using the laser scanner data and the synchronization time.
Specifically, the synchronization time of each piece of laser scanner data is used to obtain the position and attitude of the aircraft at that moment; the specific coordinates of the laser scanner relative to the geodetic coordinate system are then computed from the scanner's installation position and mounting-angle correction data, giving the rotation matrix T_L from laser scan points to the geodetic coordinate system. Combining each scan point's three-dimensional coordinates P_i in the scanner's own coordinate system with the rotation matrix T_L, the scan point coordinates are converted into the geodetic coordinate system: P_i^W = T_L * P_i, where P_i^W is the scan point's coordinate in the geodetic coordinate system.
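The per-point transform P_i^W = T_L * P_i can be sketched with NumPy; splitting T_L into a 3x3 rotation R plus a translation t is an illustrative decomposition, not the patent's exact formulation:

```python
import numpy as np

def scan_to_geodetic(points, R, t):
    """Project N x 3 scan points from the scanner frame into the geodetic
    frame: P_i^W = R @ P_i + t, applied row-wise to all points at once."""
    return points @ R.T + t

# Example: 90-degree rotation about z plus a 5 m vertical offset
# rotates (1, 0, 0) to (0, 1, 0) and lifts it to z = 5.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
p = np.array([[1.0, 0.0, 0.0]])
geodetic = scan_to_geodetic(p, R, np.array([0.0, 0.0, 5.0]))
```

Vectorizing with `points @ R.T` transforms the whole scan in one matrix product instead of looping per point, which matters at the several-hundred-kilohertz rates quoted earlier.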
S308: the ground data processing device 106 colors the corresponding laser points by the color of the pixel points.
Specifically, for each laser point, the spatially nearest photo pixel is determined from the projection coordinates, and the laser point is colored with the color of that pixel.
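The coloring step amounts to a nearest-neighbour lookup over the projected coordinates, sketched here with a brute-force search for clarity (a spatial index such as a k-d tree would be used at scale; all names are illustrative):

```python
import numpy as np

def colorize_points(laser_xy, pixel_xy, pixel_rgb):
    """Color each projected laser point with the RGB value of the
    spatially nearest projected photo pixel (brute-force search)."""
    laser_xy = np.asarray(laser_xy, dtype=float)
    pixel_xy = np.asarray(pixel_xy, dtype=float)
    pixel_rgb = np.asarray(pixel_rgb)
    colors = np.empty((len(laser_xy), 3), dtype=pixel_rgb.dtype)
    for i, p in enumerate(laser_xy):
        # Index of the pixel closest to this laser point's projection.
        nearest = np.argmin(np.sum((pixel_xy - p) ** 2, axis=1))
        colors[i] = pixel_rgb[nearest]
    return colors

# Two pixels: red at the origin, green at (10, 10).
pix_xy = [[0.0, 0.0], [10.0, 10.0]]
pix_rgb = [[255, 0, 0], [0, 255, 0]]
colored = colorize_points([[1.0, 1.0], [9.0, 9.0]], pix_xy, pix_rgb)
```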
S310: the display presents the laser point cloud data obtained through the conversion.
Therefore, the ground data processing device 106 can obtain point cloud data according to the compressed operation data and the reference station data.
The processing system 100 for real-time point cloud provided by the embodiment of the present application is described in detail above with reference to fig. 1, and next, a processing method for real-time point cloud provided by the embodiment of the present application is described with reference to the accompanying drawings.
Referring to fig. 4, a flow chart of a method for processing a real-time point cloud is shown. The method is applied to the system 100 shown in fig. 1, which includes an onboard data acquisition device 102, a ground data acquisition device 104 and a ground data processing device 106. The method includes:
s402: the airborne data acquisition device 102 acquires the operation data of the airborne laser radar, compresses the operation data, and transmits the compressed operation data to the ground data processing device 106.
In some possible implementations, the onboard data collection device 102 includes an upper computer for collecting the photo data and the measurement data and a lower computer for collecting the pose data.
In some possible implementations, the pose data includes inertial data and navigation data, and the lower computer is further configured to time-synchronize the inertial data according to the navigation data.
In some possible implementations, the lower computer is configured to time-synchronize the inertial data by recording the primary synchronization time and the increment of each subsequent synchronization time relative to it.
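One way to read this increment-based scheme is sketched below: store the primary (first) synchronization time once, then keep only small per-pulse offsets and reconstruct absolute timestamps on demand. The class and method names are assumptions, not from the patent:

```python
class IncrementalClock:
    """Keep the primary synchronization time plus per-pulse increments,
    reconstructing each absolute timestamp on demand."""

    def __init__(self):
        self.primary = None    # first synchronization time, seconds
        self.increments = []   # offset of each pulse from the primary time

    def sync(self, gnss_time):
        """Record a synchronization pulse taken from the navigation data."""
        if self.primary is None:
            self.primary = gnss_time
        self.increments.append(gnss_time - self.primary)

    def timestamp(self, index):
        """Absolute time of the index-th recorded pulse."""
        return self.primary + self.increments[index]

clock = IncrementalClock()
for t in (1000.0, 1000.005, 1000.010):
    clock.sync(t)
```

Storing small increments instead of full timestamps keeps the lower computer's per-pulse record compact, which suits a microcontroller with limited memory.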
In some possible implementations, the lower computer includes a single-chip microcomputer running a real-time operating system, and the upper computer includes an embedded system running a high-level operating system.
In some possible implementations, the on-board data acquisition device 102 is specifically configured to compress the operational data by down-sampling.
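Down-sampling as a compression step can be as simple as keeping every k-th record before downlink. A minimal sketch (the patent does not fix the sampling ratio, and the names are illustrative):

```python
def downsample(records, keep_every):
    """Compress an operating-data stream by keeping every keep_every-th
    record, trading point density for downlink bandwidth."""
    if keep_every < 1:
        raise ValueError("keep_every must be a positive integer")
    return records[::keep_every]

thinned = downsample(list(range(10)), 3)  # keeps indices 0, 3, 6, 9
```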
S404: the ground data acquisition device 104 acquires reference station data of the airborne lidar and then transmits the reference station data to the ground data processing device 106.
S406: the ground data processing device 106 obtains point cloud data according to the compressed operating data and the reference station data.
In some possible implementations, the onboard data collection device 102 includes an upper computer for collecting the photo data and the measurement data and a lower computer for collecting the pose data.
In some possible implementations, the pose data includes inertial data and navigation data, and the lower computer is further configured to time-synchronize the inertial data according to the navigation data.
In some possible implementations, the lower computer is configured to time-synchronize the inertial data by recording the primary synchronization time and the increment of each subsequent synchronization time relative to it.
In some possible implementations, the lower computer includes a single-chip microcomputer running a real-time operating system, and the upper computer includes an embedded system running a high-level operating system.
In some possible implementations, the on-board data acquisition device 102 is specifically configured to compress the operational data by down-sampling.
The present embodiment also provides an onboard data collection apparatus 102, as shown in fig. 5, which includes a collection module 502, a compression module 504, and a transmission module 506, wherein,
an acquisition module 502, configured to acquire operating data of the airborne laser radar;
a compression module 504, configured to compress the operation data;
a transmission module 506, configured to transmit the compressed operation data to the ground data processing device 106.
In some possible implementations, the onboard data collection device 102 includes an upper computer for acquiring the photo data and the measurement data and a lower computer for acquiring the pose data.
In some possible implementations, the pose data includes inertial data and navigation data, and the lower computer is further configured to time-synchronize the inertial data according to the navigation data.
In some possible implementations, the lower computer is configured to time-synchronize the inertial data by recording the primary synchronization time and the increment of each subsequent synchronization time relative to it.
In some possible implementations, the lower computer includes a single-chip microcomputer running a real-time operating system, and the upper computer includes an embedded system running a high-level operating system.
In some possible implementations, the compression module 504 is specifically configured to compress the operation data in a down-sampling manner.
Correspondingly, the present application also provides an airborne data acquisition method, which is applied to the airborne data acquisition device 102, as shown in fig. 6, and includes:
S602: the acquisition module 502 acquires the operating data of the airborne lidar.
S604: the compression module 504 compresses the operational data.
S606: the transmission module 506 transmits the compressed operational data to the surface data processing device 106.
In some possible implementations, the onboard data collection device 102 includes an upper computer for collecting the photo data and the measurement data and a lower computer for collecting the pose data.
In some possible implementations, the pose data includes inertial data and navigation data, and the lower computer is further configured to time-synchronize the inertial data according to the navigation data.
In some possible implementations, the lower computer is configured to time-synchronize the inertial data by recording the primary synchronization time and the increment of each subsequent synchronization time relative to it.
In some possible implementations, the lower computer includes a single-chip microcomputer running a real-time operating system, and the upper computer includes an embedded system running a high-level operating system.
In some possible implementations, compressing the operation data specifically includes compressing the operation data in a down-sampling manner.
The steps in the methods of the embodiments of the present application may be reordered, combined or deleted according to actual needs; likewise, the modules in the apparatuses of the embodiments of the present application may be divided, combined or deleted according to actual needs.
The foregoing embodiments have been described in detail to illustrate the principles and implementations of the present application; the description of the embodiments is provided only to help understand the method and the core concept of the present application. Meanwhile, a person skilled in the art may, following the idea of the present application, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present application.
Claims (10)
1. The system for processing the real-time point cloud is characterized by comprising an airborne data acquisition device, a ground data acquisition device and a ground data processing device;
the airborne data acquisition device is used for acquiring the operation data of an airborne laser radar, compressing the operation data and transmitting the compressed operation data to the ground data processing device;
the ground data acquisition device is used for acquiring reference station data of the airborne laser radar and then transmitting the reference station data to the ground data processing device;
and the ground data processing device is used for obtaining point cloud data according to the compressed operating data and the reference station data.
2. The system of claim 1, wherein the onboard data acquisition device comprises an upper computer and a lower computer, the upper computer is used for acquiring photo data and measurement data, and the lower computer is used for acquiring pose data.
3. The system of claim 2, wherein the pose data comprises inertial data and navigation data, the lower computer further configured to time synchronize the inertial data according to the navigation data.
4. The system of claim 3, wherein the lower computer is specifically configured to time-synchronize the inertial data by recording the primary synchronization time and the increment of each subsequent synchronization time relative to it.
5. The system according to any one of claims 2 to 4, wherein the lower computer comprises a single chip microcomputer with a real-time operating system, and the upper computer comprises an embedded system with a high-level operating system.
6. The system according to any one of claims 1 to 4, characterized in that said onboard data acquisition means are particularly adapted to compress said operating data by means of down-sampling.
7. The processing method of the real-time point cloud is characterized by comprising the following steps:
the method comprises the following steps that an airborne data acquisition device acquires operation data of an airborne laser radar, compresses the operation data, and then transmits the compressed operation data to a ground data processing device;
a ground data acquisition device acquires reference station data of the airborne laser radar and then transmits the reference station data to the ground data processing device;
and the ground data processing device acquires point cloud data according to the compressed operating data and the reference station data.
8. An airborne data acquisition device is characterized by comprising an acquisition module, a compression module and a transmission module, wherein,
the acquisition module is used for acquiring the operating data of the airborne laser radar;
the compression module is used for compressing the operation data;
and the transmission module is used for transmitting the compressed operating data to a ground data processing device.
9. The device of claim 8, wherein the acquisition module comprises an upper computer and a lower computer, the upper computer is used for acquiring photo data and measurement data, and the lower computer is used for acquiring pose data.
10. An airborne data collection method, the method comprising:
collecting operation data of an airborne laser radar;
compressing the operating data;
and transmitting the compressed operation data to a ground data processing device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011510697.9A CN112235041A (en) | 2020-12-18 | 2020-12-18 | Real-time point cloud processing system and method and airborne data acquisition device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112235041A true CN112235041A (en) | 2021-01-15 |
Family
ID=74124921
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011510697.9A Pending CN112235041A (en) | 2020-12-18 | 2020-12-18 | Real-time point cloud processing system and method and airborne data acquisition device and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112235041A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113267186A (en) * | 2021-07-16 | 2021-08-17 | 成都纵横大鹏无人机科技有限公司 | Data synchronous acquisition system and data synchronous acquisition method |
CN114124909A (en) * | 2021-09-14 | 2022-03-01 | 福州大学 | System and method for real-time acquisition, compression and transmission of point cloud based on lidar |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101335431A (en) * | 2008-07-27 | 2008-12-31 | 广西电力工业勘察设计研究院 | Overhead power transmission line optimized line selection method based on airborne laser radar data |
CN207717980U (en) * | 2017-09-25 | 2018-08-10 | 张亮 | Unmanned plane carry looks into separated apparatus and system |
CN108415034A (en) * | 2018-04-27 | 2018-08-17 | 绵阳天眼激光科技有限公司 | A kind of laser radar real-time imaging devices |
US20180290748A1 (en) * | 2017-04-03 | 2018-10-11 | Versatol, Llc | Autonomous in-tunnel intelligence, surveillance, and reconnaissance drone |
EP3696577A1 (en) * | 2017-10-13 | 2020-08-19 | Chongqing Survey Institute | Method and device for acquiring point cloud data in the absence of gnss signal |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20210115 |