
CN113985487B - Three-dimensional rendering method and system for underground buried object based on three-dimensional ground penetrating radar - Google Patents


Info

Publication number
CN113985487B
Authority
CN
China
Prior art keywords: data, channel, radar, dimensional, dimensional rendering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110963447.9A
Other languages
Chinese (zh)
Other versions
CN113985487A (en)
Inventor
狄毅
库捷峰
董超
黄钰琳
唐剑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Ande Space Technology Co ltd
Original Assignee
Shenzhen Ande Space Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ande Space Technology Co ltd
Priority to CN202110963447.9A
Publication of CN113985487A
Application granted
Publication of CN113985487B
Legal status: Active
Anticipated expiration

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V3/00Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation
    • G01V3/12Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation operating with electromagnetic waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V3/00Electric or magnetic prospecting or detecting; Measuring magnetic field characteristics of the earth, e.g. declination, deviation
    • G01V3/38Processing data, e.g. for analysis, for interpretation, for correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Geology (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geophysics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a three-dimensional rendering method and system for underground buried objects based on three-dimensional ground penetrating radar. The three-dimensional rendering method comprises the following steps: step S1, extracting target radar data according to a designated position; step S2, preprocessing the extracted radar data, including removing the direct wave from the extracted target radar data, eliminating the background transverse wave in each channel's data, applying an inverse amplitude-attenuation gain to each channel's data and removing the channel's low-frequency noise; step S3, calculating and projecting the processed radar data to obtain a full-channel data slice set; step S4, processing the data of the full-channel data slice set with an edge detection algorithm to obtain a slice set in which the target waveform is separated from the background clutter; and step S5, performing three-dimensional rendering on the targets separated in the slice set. By combining multiple filtering and data-preprocessing operations, the invention can effectively separate underground buried objects and render them in three dimensions.

Description

Three-dimensional rendering method and system for underground buried object based on three-dimensional ground penetrating radar
Technical Field
The invention relates to a three-dimensional rendering method for underground buried objects, in particular to a three-dimensional rendering method based on three-dimensional ground penetrating radar, and further relates to a three-dimensional rendering system for underground buried objects that adopts this method.
Background
Ground penetrating radar (GPR), also called geological radar, detects the underground medium by emitting high-frequency pulse electromagnetic waves (with frequencies between 1 MHz and 1 GHz) and determines its distribution. It offers simple operation, high detection precision, non-destructive measurement and high acquisition speed, and is currently the most active detection technology for engineering detection and investigation.
Traditional detection technology mostly uses two-dimensional radar, where a single two-dimensional longitudinal vertical data profile reflects the underground spatial information. In recent years, three-dimensional ground penetrating radar has been widely accepted and used: through highly dynamic, rapid real-time sampling it acquires 16 two-dimensional longitudinal vertical data profiles at sufficiently small fixed intervals (i.e. a three-dimensional data volume), so that the shape, position, trend and other properties of underground buried objects can be reflected more accurately. At present, however, whether for two-dimensional or three-dimensional ground penetrating radar data, data-processing software does not fully exploit the advantages of the three-dimensional data volume to construct interpretation results in volume form; instead, results are presented as single two-dimensional section waveform diagrams or two-dimensional plan views, which cannot intuitively convey the shape, position and trend of underground buried objects and are therefore difficult for non-professionals to understand deeply.
In addition, ground penetrating radar waves have complex waveforms, are prone to energy attenuation and suffer from considerable interference clutter, so locating underground buried objects requires various filtering treatments. Moreover, a buried object lies deep underground, cannot be observed directly and is easily confused with its surroundings, so it must be detected against the background radar waves by a professional or a dedicated algorithm.
Disclosure of Invention
The technical problem the invention aims to solve is to provide a three-dimensional rendering method for underground buried objects based on three-dimensional ground penetrating radar which, unlike rendering of targets in other environments, addresses the special application environment of buried objects: by combining multiple filtering and data-preprocessing operations, it can separate underground buried objects and render them effectively in three dimensions.
In this regard, the invention provides a three-dimensional rendering method of an underground buried object based on a three-dimensional ground penetrating radar, which comprises the following steps:
step S1, extracting target radar data according to a designated position;
step S2, preprocessing the extracted radar data, wherein the preprocessing comprises removing the direct wave from the extracted target radar data, eliminating the background transverse wave in each channel's data, applying an inverse amplitude-attenuation gain to each channel's data and removing the channel's low-frequency noise;
step S3, calculating and projecting the processed radar data to obtain a full-channel data slice set once projection is complete;
step S4, processing the data of the full-channel data slice set with an edge detection algorithm to obtain a slice set in which the target waveform is separated from the background clutter;
and step S5, performing three-dimensional rendering on the targets separated in the slice set of step S4.
A further improvement of the invention is that extracting target radar data in step S1 means extracting radar amplitude-intensity data of a specified area of each survey line: start and stop trace numbers are specified, and the radar data within that range are extracted for each line, comprising the following substeps:
step S101, obtaining the original data of each measuring line of the three-dimensional ground penetrating radar;
Step S102, according to the three-dimensional rendering target position and with reference to the line position file, determine the start trace number Tr1 and end trace number Tr2 in the corresponding raw radar data; look up the number of samples per trace nSam and the data type from the single-channel detailed configuration file; determine the number of bytes per datum nByte from the data type; calculate the total number of bytes to extract for each channel's data as totalB = nTr × nSam × nByte, and the number of bytes to skip when extracting the target data as ignB = Tr1 × nSam × nByte, where nTr = Tr2 - Tr1 is the number of traces to extract for each channel's data.
The invention is further improved in that the step S2 of preprocessing the extracted radar data comprises the following sub-steps:
Step S201, for single-channel data, select any one trace and determine the position sam1 of the first amplitude value lower than a preset value; take sam1 as the stop sample of the channel's direct wave, and discard all sample data from the start sample sam0 to the stop sample for every trace of the channel, leaving totalB' = nTr × (nSam - (sam1 - sam0)) × nByte bytes;
Step S202, let M(nSam, nTr) be the matrix form of the channel data after the direct wave has been removed in step S201; average the samples at the same position across the traces of M(nSam, nTr) to obtain a vector BackGr(nSam, 1), and subtract BackGr(nSam, 1) from each trace of M(nSam, nTr) to obtain a new data matrix M0;
step S203, multiply each trace of the data matrix M0 obtained in step S202 by the normalized gain factor g(t) = max A(t) / A(t) to apply the inverse amplitude-attenuation gain to each channel's data, where A(t) is the fitted attenuation model, calculated as A(t) = c(1)exp(-a(1)t) + … + c(n)exp(-a(n)t); c is a median-and-mean compensation function of the forward and inverse Fourier transform, a is the amplitude value of each sample, n is a linearly adjustable parameter representing the sample points, t is the time axis representing depth, and max A(t) is the maximum value of the fitted attenuation curve;
step S204, the data matrix obtained in the step S203 is processed through zero-phase high-pass finite impulse response filtering.
A further improvement of the present invention is that in step S201 the total number of samples is limited to nSam0, where nSam0 < nSam - (sam1 - sam0); for each trace of the channel only the samples from sam1 to sam1 + nSam0 are retained, so the size of the resulting data is totalB' = nTr × nSam0 × nByte.
A further development of the invention is that said step S3 comprises the sub-steps of:
step S301, calculating the mean value m and standard deviation S of all data points of all channels;
step S302, setting the minimum value aMin and maximum value aMax of the radar data amplitude according to the formulas aMin = m - p×s and aMax = m + p×s, where p is a linearly adjustable parameter representing the amplitude range;
in step S303, all data points of all channels are projected into the image-domain range [0,1] according to the maximum value aMax and the minimum value aMin, and the full-channel data slice set is obtained once projection is complete.
A further improvement of the present invention is that in step S303 all data points of all channels are projected into the image-domain range [0,1] by the projection formula Proj = (Amp - aMin) / (aMax - aMin), where Amp is the amplitude value of the data point.
A further development of the invention is that said step S4 comprises the sub-steps of:
step S401, denoising and smoothing the data of the full-channel data slice set obtained in the step S3 through Gaussian filtering;
step S402, calculating the amplitude and the direction of the pixel gradient by using a sobel operator;
step S403, performing non-maximum suppression on the image, and reserving the point with the maximum local gradient;
Step S404, setting a first pixel threshold and a second pixel threshold, wherein pixel points with gradient amplitude exceeding the first pixel threshold are reserved as edge pixels; discarding pixels with gradient magnitudes below a second threshold of pixels;
Step S405, reserving pixel points communicated with the edge pixels according to the edge continuous rule for the pixel points with gradient amplitude values between the first pixel threshold value and the second pixel threshold value, and discarding the pixel points which cannot be communicated with the edge pixels;
Step S406, performing secondary communication on the discontinuous edge obtained in the step S405 to form a closed edge, generating a logic matrix according to the closed edge, wherein the inside of the closed edge is 1, and the outside of the closed edge is 0; repeating the steps once for each channel to obtain a closed edge logic matrix of each channel;
in step S407, the closed edge logic matrix of each channel is multiplied by the data matrix to obtain a slice set for separating the target waveform from the background clutter.
A further development of the invention is that said step S5 comprises the sub-steps of:
step S501, generating a tiff picture in a multi-frame format from a slice set with a target waveform separated from background clutter;
step S502, three-dimensional rendering is carried out through tiff pictures in a multi-frame format.
A further improvement of the present invention is that said step S501 comprises the sub-steps of:
step S5011, setting the picture frame number idx by using the tiff function of matlab, where 1 ≤ idx ≤ number of channels;
step S5012, setting the picture length and width by using the tiff function of matlab, where the picture length nWidth is less than or equal to the trace number nTr of the target data, and the picture width nHeight is less than or equal to the number of samples nSam;
step S5013, writing the slice set obtained by the step S4 into a tiff function of matlab according to a hierarchical sequence;
Step S5014, repeating steps S5011 to S5013 until all channels have been written;
step S502 includes the sub-steps of:
Step S5021, arranging the slice sets into a three-dimensional matrix according to the channel sequence;
Step S5022, determining an observation direction aiming at the three-dimensional matrix, and carrying out equidistant sampling along the observation direction;
Step S5023, calculating the color value and the opacity of each sampling point by using a cubic smoothing interpolation method;
Step S5024, synthesizing sampling points in the observation direction in a sequence from front to back or from back to front;
Step S5025, projecting accumulated color values of all pixel points along the observation direction on a screen to generate a visual three-dimensional rendering chart; when the observation direction changes, steps S5022 to S5025 are repeated to obtain a new three-dimensional rendering map.
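Steps S5021 to S5025 describe a volume ray-casting pass; the compositing of step S5024 can be sketched as follows. This is an illustration under assumptions: the viewing direction is taken as axis-aligned (so the equidistant samples of S5022 are the slices themselves), per-sample colors and opacities (S5023) are precomputed inputs, and the function name is hypothetical.

```python
import numpy as np

def composite_front_to_back(colors, alphas):
    """Front-to-back compositing of samples along the viewing direction
    (step S5024): colors and alphas are (depth, H, W) arrays of per-sample
    color values and opacities (step S5023)."""
    acc_c = np.zeros(colors.shape[1:])   # accumulated color per screen pixel
    acc_a = np.zeros(alphas.shape[1:])   # accumulated opacity per screen pixel
    for c, a in zip(colors, alphas):     # traverse samples front to back
        acc_c += (1.0 - acc_a) * a * c
        acc_a += (1.0 - acc_a) * a
    return acc_c                         # projected onto the screen (step S5025)
```

Because each sample's contribution is weighted by the remaining transparency (1 - acc_a), a fully opaque front sample hides everything behind it, which is the expected behaviour of front-to-back compositing.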
The invention also provides a three-dimensional rendering system for underground buried objects based on three-dimensional ground penetrating radar, which adopts the three-dimensional rendering method described above and comprises the following modules:
the data acquisition module, which extracts target radar data according to the designated position;
the data preprocessing module, which preprocesses the extracted radar data, the preprocessing comprising removing the direct wave from the extracted target radar data, eliminating the background transverse wave in each channel's data, applying an inverse amplitude-attenuation gain to each channel's data and removing the channel's low-frequency noise;
the data projection module, which calculates and projects the processed radar data to obtain a full-channel data slice set once projection is complete;
the background separation module, which processes the data of the full-channel data slice set with an edge detection algorithm to obtain a slice set in which the target waveform is separated from the background clutter;
and the three-dimensional rendering module, which performs three-dimensional rendering on the targets separated in the slice set by the background separation module.
Compared with the prior art, the invention has the following beneficial effects. Unlike three-dimensional rendering in other environments, the invention targets the special application environment of underground buried objects: the object cannot be directly observed or detected, the underground environment in which it lies is complex, and the interference factors are extremely large, making it difficult to separate the target from the environment effectively. For these technical difficulties, a scheme combining multiple filtering and data-preprocessing operations has been independently designed and optimized, so that underground buried objects can be presented visually. This optimizes the detection and description of buried objects, improves the effectiveness and efficiency of geological detection, reduces the difficulty of high-quality geological detection, and lowers the threshold for cross-industry cooperation.
Drawings
FIG. 1 is a schematic workflow architecture diagram of one embodiment of the present invention;
FIG. 2 is a diagram showing a comparison of radar waveforms before and after data preprocessing in accordance with one embodiment of the present invention;
FIG. 3 is a full channel data slice set simulation diagram of one embodiment of the present invention;
FIG. 4 is a graph showing the effect of edge detection on the edge morphology of a target according to one embodiment of the present invention;
FIG. 5 is a three-dimensional rendering effect diagram of one embodiment of the present invention.
Detailed Description
Preferred embodiments of the present invention will be described in further detail below with reference to the accompanying drawings.
Firstly, it should be noted that, unlike three-dimensional rendering in other environments, this example targets the special application environment of underground buried objects: the object cannot be directly observed or detected, the underground environment where it lies is complex, and interference factors are extremely large, so it is difficult to separate the target from the environment effectively. Even where the prior art offers schemes for detecting and rendering underground targets, their effect cannot meet actual requirements because target and environment are not well separated.
As shown in fig. 1, this example provides a three-dimensional rendering method of an underground buried object based on a three-dimensional ground penetrating radar, which includes:
step S1, extracting target radar data according to a designated position;
step S2, preprocessing the extracted radar data, wherein the preprocessing comprises removing the direct wave from the extracted target radar data, eliminating the background transverse wave in each channel's data, applying an inverse amplitude-attenuation gain to each channel's data and removing the channel's low-frequency noise;
step S3, calculating and projecting the processed radar data to obtain a full-channel data slice set once projection is complete;
step S4, processing the data of the full-channel data slice set with an edge detection algorithm to obtain a slice set in which the target waveform is separated from the background clutter;
and step S5, performing three-dimensional rendering on the targets separated in the slice set of step S4.
In this example, extracting target radar data in step S1 means extracting radar amplitude-intensity data of a specified area of each survey line: start and stop trace numbers are specified (i.e. a start trace number Tr1 and an end trace number Tr2), and the radar data within that range are extracted for each line, comprising step S101 and step S102.
Step S101, obtain the raw data of each survey line of the three-dimensional ground penetrating radar and confirm its data structure. One part of the raw data of each line consists of data-information files concerning the whole line: a position file (gps), containing all position information of the radar while acquiring the line; a time file (time), containing all time information of the radar while acquiring the line; a line location file (cor), which integrates the time and position information and uses the radar trace-number information as a common index; a line marking file (mrk), which records the positions of manual marks made during acquisition; and a channel configuration file (ord), which stores the configuration of each channel during acquisition of the line and is used to fuse the data of the channels into one data volume.
The other part consists of the data files of each channel of each line: the single-channel detailed configuration file (iprh), which records the total trace number (nTr) of the channel data, the number of samples per trace (nSam) and other relevant parameters; and the radar amplitude-intensity file (iprb), the raw data file, which contains the echo amplitude intensities of all samples received by the radar from the first trace to the last trace of the line.
Step S102, according to the three-dimensional rendering target position and with reference to the line position file (cor), determine the start trace number Tr1 and end trace number Tr2 in the corresponding raw radar data; look up the number of samples per trace nSam and the data type from the single-channel detailed configuration file (iprh); determine the number of bytes per datum nByte from the data type; calculate the total number of bytes to extract for each channel's data as totalB = nTr × nSam × nByte, and the number of bytes to skip for each channel's data as ignB = Tr1 × nSam × nByte, where nTr = Tr2 - Tr1 is the number of traces to extract for each channel's data.
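The byte arithmetic of step S102 can be sketched as follows. This is an illustration only: the patent does not specify an implementation language, and the function name `extract_channel` and the default 16-bit little-endian sample type are assumptions.

```python
import numpy as np

def extract_channel(path, Tr1, Tr2, nSam, nByte=2, dtype="<i2"):
    """Extract traces Tr1 (inclusive) to Tr2 (exclusive) of one channel from a
    raw amplitude file (iprb), following totalB = nTr * nSam * nByte and
    ignB = Tr1 * nSam * nByte from step S102."""
    nTr = Tr2 - Tr1                  # number of traces to extract
    ignB = Tr1 * nSam * nByte        # bytes to skip before the target data
    totalB = nTr * nSam * nByte      # total bytes to read
    with open(path, "rb") as f:
        f.seek(ignB)
        buf = f.read(totalB)
    # one trace per column: matrix M(nSam, nTr), matching the form used in step S202
    return np.frombuffer(buf, dtype=dtype).reshape(nTr, nSam).T
```

The transpose at the end puts samples on rows and traces on columns, so later steps can treat each column as one trace.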
As shown in fig. 2, the preprocessing of the extracted radar data in step S2 is the first combination of preprocessing operations: it filters out the direct wave, background clutter and other noise in the radar signal, enhances the lower part of the signal and highlights the signal characteristics of the underground buried object. Preferably, it comprises steps S201 to S204.
In this example, step S201 removes the direct wave from the extracted target radar data. This substep is designed because the direct wave differs between channels, so it must be removed for each channel separately. Specifically, for single-channel data, the position sam1 of the first amplitude value lower than a preset value is determined and taken as the stop sample of the channel's direct wave; all sample data from the start sample sam0 to the stop sample are discarded for every trace of the channel, so the size of the resulting data is totalB' = nTr × (nSam - (sam1 - sam0)) × nByte. The preset value is preferably -5000; it can be preset according to the actual situation or adjusted to actual requirements.
It should be noted that, to avoid the channels ending up with data of different sizes after direct-wave removal, step S201 of this example preferably limits the total number of samples to nSam0, where nSam0 < nSam - (sam1 - sam0); only the samples between sam1 and sam1 + nSam0 are retained for each trace of the channel, and the final data size is totalB' = nTr × nSam0 × nByte.
Step S202 in this example eliminates the background transverse wave in each channel's data. It is designed because, owing to differences in antenna energy, the background transverse wave must be eliminated between channels. The method is as follows: let M(nSam, nTr) be the matrix form of the channel data after direct-wave removal in step S201, where nSam is the number of samples and nTr the number of traces of the target data; average the data matrix M(nSam, nTr) laterally, i.e. average the samples at the same position in each trace to obtain a vector BackGr(nSam, 1), then subtract BackGr(nSam, 1) from each trace of M(nSam, nTr) to obtain a new data matrix M0 of the same size as M.
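The mean-trace subtraction of step S202 is a one-line operation on the matrix M(nSam, nTr); a minimal sketch (the function name is illustrative):

```python
import numpy as np

def remove_background(M):
    """Step S202: subtract the per-depth mean trace BackGr(nSam, 1)
    from every trace of the matrix M(nSam, nTr)."""
    BackGr = M.mean(axis=1, keepdims=True)  # average across traces at each sample
    return M - BackGr                        # M0, same size as M
```

Any horizontally coherent background (e.g. the residual transverse wave) is constant along each row of M and is cancelled exactly, while laterally varying target reflections are preserved up to their row mean.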
Step S203 in this example applies the inverse amplitude-attenuation gain to each channel's data. Let M0(nSam, nTr) be the matrix form of a channel's data after step S202; each trace of M0 is multiplied by the normalized gain factor g(t) = max A(t) / A(t), where A(t) is the fitted attenuation model, calculated as A(t) = c(1)exp(-a(1)t) + … + c(n)exp(-a(n)t); c is a median-and-mean compensation function of the forward and inverse Fourier transform, a is the amplitude value of each sample, n is a linearly adjustable parameter representing the sample points, t is the time axis representing depth, and max A(t) is the maximum value of the fitted attenuation curve.
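A sketch of the inverse-gain idea of step S203, under simplifying assumptions: a single-term exponential is fitted instead of the patent's multi-term sum, the decay envelope is estimated from the mean absolute amplitude per depth, and the function names are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay_model(t, c, a):
    # single-term version of A(t) = c(1)exp(-a(1)t) + ...; the patent fits a sum
    return c * np.exp(-a * t)

def apply_inverse_gain(M0):
    """Step S203 (simplified): fit the amplitude decay over depth, then
    multiply each depth row by the normalized gain factor max A(t) / A(t)."""
    t = np.arange(M0.shape[0], dtype=float)
    env = np.abs(M0).mean(axis=1)            # mean absolute amplitude per depth
    (c, a), _ = curve_fit(decay_model, t, env, p0=(env.max(), 0.01))
    A = decay_model(t, c, a)
    g = A.max() / A                          # normalized gain factor
    return M0 * g[:, None]
```

Because g(t) grows with depth exactly as fast as the fitted model decays, deep reflections are restored toward the amplitude level of shallow ones.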
Step S204 removes low-frequency noise (dewow): the data matrix obtained in step S203 is processed by zero-phase high-pass finite impulse response (FIR) filtering, with the cut-off frequency preferably set to 2% of the Nyquist frequency. This is the preferred parameter value for buried objects; in practical application it can be adjusted if the actual requirements change.
As shown in fig. 3, step S3 in this example is the second stage of data processing and in fact corresponds to the second combination of preprocessing before rendering. By setting a suitable linearly adjustable parameter p, the characteristic waveform of the buried object can be made more prominent; at the same time, the data of adjacent channels are arranged together, which facilitates three-dimensional rendering of the buried object. Preferably, it comprises steps S301 to S303.
The step S3 in this example preferably comprises the following sub-steps:
step S301, calculating the mean value m and standard deviation S of all data points of all channels;
Step S302, set the minimum value aMin and maximum value aMax of the radar data amplitude according to the formulas aMin = m - p×s and aMax = m + p×s, where p is a linearly adjustable parameter representing the amplitude range; p can be set and adjusted according to actual requirements and is preferably 1 to 3;
In step S303, all data points of all channels are projected into the image-domain range [0,1] according to the maximum aMax and the minimum aMin, and the full-channel data slice set is obtained once projection is complete. More preferably, the projection formula Proj = (Amp - aMin) / (aMax - aMin) is adopted, where Amp is the amplitude value of the data point.
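Steps S301 to S303 amount to a min-max projection around the global mean; a minimal sketch (the function name is illustrative, and clipping of amplitudes that fall outside the p-sigma window is an assumption not stated in the text):

```python
import numpy as np

def project_slices(M_all, p=2.0):
    """Steps S301-S303: linearly project amplitudes into [0,1] using
    aMin = m - p*s, aMax = m + p*s and Proj = (Amp - aMin) / (aMax - aMin).
    Values outside the window are clipped to [0,1] (an assumption here)."""
    m, s = M_all.mean(), M_all.std()        # S301: global mean and standard deviation
    aMin, aMax = m - p * s, m + p * s       # S302: amplitude window
    return np.clip((M_all - aMin) / (aMax - aMin), 0.0, 1.0)  # S303
```

A smaller p compresses more of the amplitude range into the extremes of [0,1], making strong target reflections stand out from the background.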
As shown in fig. 4, in the embodiment, the step S4 outlines the target by using an edge detection algorithm, separates the target from the background clutter, can more completely retain the characteristic waveform of the buried object, more thoroughly remove the background clutter, reduce the difficulty of three-dimensional rendering of the underground buried object, and promote the effect of three-dimensional rendering. Preferably, the method includes steps S401 to S407:
step S401, denoising and smoothing the data of the full-channel data slicing set obtained in step S3 through Gaussian filtering, wherein 5*5 Gaussian filtering is preferably adopted (in practical application, the data can be adjusted according to requirements);
Step S402, calculating the amplitude and direction of a pixel gradient by using a sobel operator, wherein a region with a larger gradient generally belongs to a region with enhanced image, and calculating the pixel gradient by using the sobel operator can roughly acquire edge information;
Step S403, performing non-maximum suppression on the image, and reserving the point with the maximum local gradient, namely reserving possible edge pixel points;
step S404, setting a first pixel threshold and a second pixel threshold; pixel points whose gradient magnitude exceeds the first pixel threshold are kept as edge pixels, and pixel points whose gradient magnitude is below the second pixel threshold are discarded; the first pixel threshold refers to the high pixel threshold, a preset criterion for accepting edge pixels; the second pixel threshold refers to the low pixel threshold, below which the pixel gradient fails the edge requirement;
Step S405, for pixel points whose gradient magnitude lies between the first pixel threshold and the second pixel threshold, keeping those connected to edge pixels according to the edge continuity rule, i.e., weak edge pixel points connected to strong edge pixel points are retained, which improves reliability, while pixel points that cannot be connected to edge pixels are discarded;
Step S406, performing secondary connection on the discontinuous edges obtained in step S405 to form a closed edge, and generating a logic matrix according to the closed edge, wherein the inside of the closed edge is 1 and the outside is 0; the above steps are repeated for each channel to obtain the closed-edge logic matrix of each channel;
in step S407, the closed-edge logic matrix of each channel is multiplied element-wise by the corresponding data matrix to obtain a slice set in which the target waveform is separated from the background clutter.
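The pipeline of steps S401 to S407 is essentially a Canny-style detector followed by logic-matrix masking. The following numpy sketch illustrates the idea under stated simplifications: the non-maximum suppression of step S403 and the closed-edge filling of step S406 are omitted, and all names and thresholds are illustrative rather than taken from the patent:

```python
import numpy as np

def filter2(img, k):
    """'Same'-size 2-D cross-correlation with zero padding (small kernels)."""
    kh, kw = k.shape
    pad = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros(img.shape)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def edge_mask(slice2d, t_low, t_high):
    """Simplified S401-S407: Gaussian smoothing, Sobel gradient magnitude,
    double threshold with hysteresis; returns a 0/1 logic mask."""
    g1 = np.array([1.0, 4.0, 6.0, 4.0, 1.0])
    smooth = filter2(slice2d, np.outer(g1, g1) / 256.0)          # S401: 5x5 kernel
    sx = np.array([[-1.0, 0.0, 1.0], [-2.0, 0.0, 2.0], [-1.0, 0.0, 1.0]])
    mag = np.hypot(filter2(smooth, sx), filter2(smooth, sx.T))   # S402: Sobel
    strong, weak = mag > t_high, mag > t_low                     # S404
    mask = strong.copy()
    while True:                                                  # S405: hysteresis
        nb = np.zeros_like(mask)
        nb[1:, :] |= mask[:-1, :]; nb[:-1, :] |= mask[1:, :]
        nb[:, 1:] |= mask[:, :-1]; nb[:, :-1] |= mask[:, 1:]
        grown = mask | (weak & nb)       # weak pixels touching strong edges survive
        if (grown == mask).all():
            return mask.astype(float)    # 0/1 matrix, cf. S406/S407
        mask = grown
```

Multiplying the returned 0/1 mask element-wise with the channel's data matrix plays the role of step S407: target waveforms inside the detected edges are kept, background clutter is zeroed.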
As shown in fig. 5, step S5 in this embodiment performs three-dimensional rendering on the targets separated in the slice set, and includes step S501 and step S502:
step S501, generating a tiff picture in a multi-frame format from a slice set with a target waveform separated from background clutter;
step S502, three-dimensional rendering is carried out through tiff pictures in a multi-frame format.
Step S501 in this example includes the following sub-steps:
step S5011, setting the picture frame number idx by using the tiff functions of matlab, preferably the TIFFSetDirectory(tiff, idx) function, wherein idx satisfies 1 ≤ idx ≤ the number of channels;
step S5012, setting the picture length and the picture width by using the tiff functions of matlab, preferably the TIFFSetField(tiff, nWidth, nHeight) function, wherein the picture length nWidth is less than or equal to the track number nTr of the target data, and the picture width nHeight is less than or equal to the number of sampling points nSam;
Step S5013, writing the slice set obtained in step S4 into the tiff file in layer order by using the tiff functions of matlab, preferably the TIFFWriteScanline function;
Step S5014, repeating steps S5011 to S5013 until writing of all channels is completed;
step S502 includes the sub-steps of:
Step S5021, arranging the slice sets into a three-dimensional matrix according to the channel sequence;
Step S5022, determining an observation direction aiming at the three-dimensional matrix, and carrying out equidistant sampling along the observation direction;
Step S5023, calculating the color value and the opacity of each sampling point by using a cubic smoothing interpolation method;
Step S5024, synthesizing sampling points in the observation direction in a sequence from front to back or from back to front;
Step S5025, projecting accumulated color values of all pixel points along the observation direction on a screen to generate a visual three-dimensional rendering chart; when the observation direction changes, steps S5022 to S5025 are repeated to obtain a new three-dimensional rendering map.
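Steps S5021 to S5025 describe classic front-to-back volume ray casting. The following is a minimal numpy sketch under simplifying assumptions: the viewing direction is fixed along the slice-stacking axis, the opacity of each sample is taken directly from the normalized amplitude, and the cubic interpolation of step S5023 is replaced by nearest samples; all names are illustrative:

```python
import numpy as np

def composite_ray(colors, alphas):
    """Step S5024: accumulate the samples of one viewing ray front to back
    with the standard 'over' operator, stopping once the ray is opaque."""
    c_acc, a_acc = 0.0, 0.0
    for c, a in zip(colors, alphas):
        c_acc += (1.0 - a_acc) * a * c
        a_acc += (1.0 - a_acc) * a
        if a_acc >= 0.999:          # early ray termination
            break
    return c_acc

def render_volume(volume):
    """Steps S5021-S5025 with the first axis of the slice stack taken as
    the viewing direction; one ray is cast per screen pixel."""
    depth, height, width = volume.shape   # S5021: slices stacked in channel order
    image = np.empty((height, width))
    for y in range(height):
        for x in range(width):
            ray = volume[:, y, x]         # S5022: equidistant samples along the view
            image[y, x] = composite_ray(ray, ray)   # S5024/S5025: accumulate, project
    return image
```

When the observation direction changes, a real implementation resamples the volume along the new direction and repeats the compositing, which is exactly the loop S5022 to S5025 describes.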
The embodiment also provides a three-dimensional rendering system of an underground buried object based on a three-dimensional ground penetrating radar, which adopts the above three-dimensional rendering method of an underground buried object based on a three-dimensional ground penetrating radar and comprises:
The data acquisition module is used for extracting target radar data according to the designated position;
The data preprocessing module is used for preprocessing the extracted radar data, wherein the preprocessing comprises the steps of removing direct waves in the extracted target radar data, eliminating background transverse waves in the data of each channel, applying reverse amplitude attenuation gain to the data of each channel and removing low-frequency noise of the channel;
The data projection module is used for calculating and projecting the processed radar data to obtain a full-channel data slice set after projection is completed;
The background separation module is used for processing the data of the full-channel data slice set by utilizing an edge detection algorithm to obtain a slice set for separating the target waveform from background clutter;
And the three-dimensional rendering module is used for performing three-dimensional rendering on the targets separated in the slice set by the background separation module.
In summary, unlike three-dimensional rendering in other environments, this embodiment addresses the special application environment of underground buried objects: the buried object cannot be directly observed or probed, the underground environment in which it lies is complex, and interference is severe, so the target is difficult to separate effectively from its surroundings. For these technical difficulties, a scheme combining multiple filtering and data preprocessing operations is independently designed and optimized, as described in detail in steps S1 to S4, so that the underground buried object can be effectively separated from the complex underground environment and presented intuitively through the rendering of step S5. This optimizes the detection and display of underground buried objects, improves the effectiveness and efficiency of geological detection, reduces the difficulty of high-quality geological detection, and lowers the threshold of cross-industry cooperation.
The foregoing is a further detailed description of the invention in connection with the preferred embodiments, and it is not intended that the invention be limited to the specific embodiments described. It will be apparent to those skilled in the art that several simple deductions or substitutions may be made without departing from the spirit of the invention, and these should be considered to be within the scope of the invention.

Claims (8)

1. The three-dimensional rendering method of the underground buried object based on the three-dimensional ground penetrating radar is characterized by comprising the following steps of:
step S1, extracting target radar data according to a designated position;
s2, preprocessing the extracted radar data, wherein the preprocessing comprises the steps of removing direct waves in the extracted target radar data, eliminating background transverse waves in data of each channel, applying reverse amplitude attenuation gain to the data of each channel and removing low-frequency noise of the channel;
step S3, calculating and projecting the processed radar data to obtain a full-channel data slice set after projection is completed;
S4, processing data of the full-channel data slice set by utilizing an edge detection algorithm to obtain a slice set for separating a target waveform from background clutter;
Step S5, performing three-dimensional rendering on the targets in the slice set obtained in step S4, in which the target waveform is separated from the background clutter;
The step S1 of extracting target radar data refers to extracting radar amplitude intensity data of a specified area of each measuring line, specifying a start and stop number, and extracting radar data within a range of the start and stop number of each measuring line, and includes the following sub-steps:
step S101, obtaining the original data of each measuring line of the three-dimensional ground penetrating radar;
Step S102, according to the three-dimensional rendering target position and with reference to the line position file, determining the initial track number Tr1 and the final track number Tr2 in the corresponding radar original data, looking up the sample number nSam of each track and the data type of the data file in the single-channel data detailed configuration file, determining the byte number nByte of each datum according to the data type, calculating the total byte number totalB of the data to be extracted for each channel according to the formula totalB = nTr × nSam × nByte, and calculating the byte number ignB to be skipped when extracting the target data according to the formula ignB = Tr1 × nSam × nByte; wherein nTr is the number of tracks to be extracted for each channel, nTr = Tr2 - Tr1;
the process of preprocessing the extracted radar data in the step S2 comprises the following substeps:
Step S201, for the single-channel data, an arbitrary track of the data is selected, the position sam1 of the first sample point whose amplitude value is lower than a preset value is determined and taken as the stop sample point of the direct wave of the channel, and all sample point data from the start sample point sam0 to the stop sample point sam1 of each track of the channel are discarded, the remaining data size being given by the formula totalB' = nTr × (nSam - (sam1 - sam0)) × nByte;
Step S202, setting M (nSam, nTr) as a matrix form of the channel data after the direct wave is removed in the step S201, averaging sample points at the same position of each track in the data matrix M (nSam, nTr) to obtain a vector BackGr (nSam, 1), and subtracting the vector BackGr (nSam, 1) from each track of data of the data matrix M (nSam, nTr) to obtain a new data matrix M0;
step S203, each track of data of the data matrix M0 obtained in the step S202 is multiplied by the normalized gain factor g(t) = max(a)/a(t), effecting application of the inverse amplitude attenuation gain to the data of the respective channel, wherein a(t) is the fitted attenuation model calculated by the formula a(t) = c(1)×exp(-a(1)×t) + ... + c(n)×exp(-a(n)×t), c is the median and mean compensation function of the forward and inverse Fourier transform, a is the amplitude value of each sampling point, n is a linearly adjustable parameter representing the sampling point, t is the time axis representing depth, and max(a) is the maximum value of the fitted attenuation curve;
step S204, the data matrix obtained in the step S203 is processed through zero-phase high-pass finite impulse response filtering.
2. The three-dimensional ground penetrating radar-based three-dimensional rendering method of underground buried object according to claim 1, wherein in the step S201, the total number of defined sample points is nSam0, wherein nSam0 < nSam- (sam 1-sam 0), for each data of the channel, only sample points from sam1 to sam1+ nSam0 are reserved, and the size of the resulting data is: totalB' = nTr × nSam0 × nByte.
3. The three-dimensional rendering method of an underground buried object based on a three-dimensional ground penetrating radar according to claim 1, wherein the step S3 comprises the sub-steps of:
step S301, calculating the mean value m and standard deviation S of all data points of all channels;
step S302, respectively setting a minimum value aMin and a maximum value aMax of the radar data amplitude according to formulas aMin =m-p×s and aMax =m+p×s, wherein p is a linearly adjustable parameter representing the amplitude variation;
in step S303, all data points of all channels are projected into the image domain range of [0,1] according to the maximum value of aMax and the minimum value of aMin, and a full-channel data slice set is obtained after projection is completed.
4. The method for three-dimensional rendering of an underground buried object based on three-dimensional ground penetrating radar according to claim 3, wherein in the step S303, the projection formula proj = (Amp - aMin)/(aMax - aMin) is used to project all data points of all channels into the image domain range of [0,1]; wherein Amp is the amplitude value of the data point.
5. The three-dimensional rendering method of an underground buried object based on a three-dimensional ground penetrating radar according to claim 1, wherein the step S4 includes the sub-steps of:
step S401, denoising and smoothing the data of the full-channel data slice set obtained in the step S3 through Gaussian filtering;
step S402, calculating the amplitude and the direction of the pixel gradient by using a sobel operator;
step S403, performing non-maximum suppression on the image, and reserving the point with the maximum local gradient;
Step S404, setting a first pixel threshold and a second pixel threshold, wherein pixel points with gradient amplitude exceeding the first pixel threshold are reserved as edge pixels; discarding pixels with gradient magnitudes below a second threshold of pixels;
Step S405, reserving pixel points communicated with the edge pixels according to the edge continuous rule for the pixel points with gradient amplitude values between the first pixel threshold value and the second pixel threshold value, and discarding the pixel points which cannot be communicated with the edge pixels;
Step S406, performing secondary connection on the discontinuous edges obtained in the step S405 to form a closed edge, generating a logic matrix according to the closed edge, wherein the inside of the closed edge is 1, and the outside of the closed edge is 0; repeating the steps once for each channel to obtain a closed edge logic matrix of each channel;
Step S407, multiplying the closed edge logic matrix of each channel with the data matrix obtained in step S2 to obtain a slice set for separating the target waveform from the background clutter.
6. The three-dimensional rendering method of an underground buried object based on a three-dimensional ground penetrating radar according to claim 1, wherein the step S5 comprises the sub-steps of:
step S501, generating a tiff picture in a multi-frame format from a slice set with a target waveform separated from background clutter;
step S502, three-dimensional rendering is carried out through tiff pictures in a multi-frame format.
7. The three-dimensional rendering method of an underground buried object based on a three-dimensional ground penetrating radar according to claim 6, wherein the step S501 includes the sub-steps of:
S5011, setting the picture frame number idx by using the tiff function of matlab, wherein idx satisfies 1 ≤ idx ≤ the number of channels;
S5012, setting the picture length and the picture width by using the tiff function of matlab, wherein the picture length nWidth is less than or equal to the track number nTr of the target data, and the picture width nHeight is less than or equal to the number of sampling points nSam;
step S5013, writing the slice set obtained in step S4 into the tiff file in layer order by using the tiff function of matlab;
Step S5014, repeating steps S5011 to S5013 until writing of all channels is completed;
step S502 includes the sub-steps of:
Step S5021, arranging the slice sets into a three-dimensional matrix according to the channel sequence;
Step S5022, determining an observation direction aiming at the three-dimensional matrix, and carrying out equidistant sampling along the observation direction;
Step S5023, calculating the color value and the opacity of each sampling point by using a cubic smoothing interpolation method;
Step S5024, synthesizing sampling points in the observation direction in a sequence from front to back or from back to front;
Step S5025, projecting accumulated color values of all pixel points along the observation direction on a screen to generate a visual three-dimensional rendering chart; when the observation direction changes, steps S5022 to S5025 are repeated to obtain a new three-dimensional rendering map.
8. A three-dimensional ground penetrating radar-based three-dimensional rendering system for an underground buried object, characterized in that the three-dimensional ground penetrating radar-based three-dimensional rendering method for an underground buried object according to any one of claims 1 to 7 is adopted, and comprises:
The data acquisition module is used for extracting target radar data according to the designated position;
The data preprocessing module is used for preprocessing the extracted radar data, wherein the preprocessing comprises the steps of removing direct waves in the extracted target radar data, eliminating background transverse waves in the data of each channel, applying reverse amplitude attenuation gain to the data of each channel and removing low-frequency noise of the channel;
The data projection module is used for calculating and projecting the processed radar data to obtain a full-channel data slice set after projection is completed;
The background separation module is used for processing the data of the full-channel data slice set by utilizing an edge detection algorithm to obtain a slice set for separating the target waveform from background clutter;
and the three-dimensional rendering module is used for performing three-dimensional rendering on the targets in the slice set, obtained by the background separation module, in which the target waveform is separated from the background clutter.
CN202110963447.9A 2021-08-20 2021-08-20 Three-dimensional rendering method and system for underground buried object based on three-dimensional ground penetrating radar Active CN113985487B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110963447.9A CN113985487B (en) 2021-08-20 2021-08-20 Three-dimensional rendering method and system for underground buried object based on three-dimensional ground penetrating radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110963447.9A CN113985487B (en) 2021-08-20 2021-08-20 Three-dimensional rendering method and system for underground buried object based on three-dimensional ground penetrating radar

Publications (2)

Publication Number Publication Date
CN113985487A CN113985487A (en) 2022-01-28
CN113985487B true CN113985487B (en) 2024-10-18

Family

ID=79735153

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110963447.9A Active CN113985487B (en) 2021-08-20 2021-08-20 Three-dimensional rendering method and system for underground buried object based on three-dimensional ground penetrating radar

Country Status (1)

Country Link
CN (1) CN113985487B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114578348B (en) * 2022-05-05 2022-07-29 深圳安德空间技术有限公司 Autonomous intelligent scanning and navigation method for ground penetrating radar based on deep learning
CN116559867A (en) * 2023-07-12 2023-08-08 深圳安德空间技术有限公司 Scanning detection method for three-dimensional SAR imaging of unmanned aerial vehicle-mounted two-dimensional ground penetrating radar

Citations (2)

Publication number Priority date Publication date Assignee Title
CN111551927A (en) * 2020-05-19 2020-08-18 上海圭目机器人有限公司 Underground pipeline diameter measuring method based on three-dimensional ground penetrating radar
CN113256562A (en) * 2021-04-22 2021-08-13 深圳安德空间技术有限公司 Road underground hidden danger detection method and system based on radar images and artificial intelligence

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JPH09288188A (en) * 1996-04-23 1997-11-04 Osaka Gas Co Ltd Method and apparatus for detecting object buried underground
FR2802303B1 (en) * 1999-12-14 2002-03-08 Centre Nat Rech Scient METHOD FOR OBTAINING BASEMENT IMAGING USING GROUND PENETRATION RADAR
US6700526B2 (en) * 2000-09-08 2004-03-02 Witten Technologies Inc. Method and apparatus for identifying buried objects using ground penetrating radar
JP4644816B2 (en) * 2006-04-14 2011-03-09 国立大学法人東北大学 Ground penetrating radar apparatus and image signal processing method
CN104020495B (en) * 2014-06-24 2015-05-06 中国矿业大学(北京) Automatic underground pipeline parameter recognizing method on basis of ground penetrating radar
CN105974405B (en) * 2016-05-04 2018-07-06 哈尔滨工业大学 Ground Penetrating Radar rear orientation projection imaging method based on amplitude weighting
CN111562574B (en) * 2020-05-22 2022-08-16 中国科学院空天信息创新研究院 MIMO ground penetrating radar three-dimensional imaging method based on backward projection
CN112132946B (en) * 2020-09-29 2023-03-10 深圳安德空间技术有限公司 Data extraction and display method for three-dimensional ground penetrating radar
CN112232392B (en) * 2020-09-29 2022-03-22 深圳安德空间技术有限公司 Data interpretation and identification method for three-dimensional ground penetrating radar

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN111551927A (en) * 2020-05-19 2020-08-18 上海圭目机器人有限公司 Underground pipeline diameter measuring method based on three-dimensional ground penetrating radar
CN113256562A (en) * 2021-04-22 2021-08-13 深圳安德空间技术有限公司 Road underground hidden danger detection method and system based on radar images and artificial intelligence

Also Published As

Publication number Publication date
CN113985487A (en) 2022-01-28

Similar Documents

Publication Publication Date Title
CN113985487B (en) Three-dimensional rendering method and system for underground buried object based on three-dimensional ground penetrating radar
CN113238190B (en) A Denoising Method of Ground Penetrating Radar Echo Signal Based on EMD Joint Wavelet Threshold
CN105353373B (en) One kind is based on Hough transform Ground Penetrating Radar target extraction method and device
US20060184021A1 (en) Method of improving the quality of a three-dimensional ultrasound doppler image
CN101980287B (en) Method for detecting image edge by nonsubsampled contourlet transform (NSCT)
CN108776336A (en) A kind of adaptive through-wall radar static human body object localization method based on EMD
CN112666552B (en) Ground penetrating radar data background clutter self-adaptive removing method
CN108710888B (en) A kind of Coherent Noise in GPR Record method for registering
CN112324422B (en) Electric imaging logging fracture and hole identification method, system and pore structure characterization method
Chen et al. Automatic scaling of F layer from ionograms
CN105676230B (en) Real-time fishing net autonomous classification device and recognition methods for the navigation of underwater avoidance
CN113759337A (en) Three-dimensional ground penetrating radar real-time interpretation method and system for underground space data
DE202013105253U1 (en) imaging device
CN110031854A (en) A kind of more echoes of real-time high-precision laser are apart from extracting method
CN116973914A (en) Road hidden disease three-dimensional reconstruction method based on three-dimensional ground penetrating radar
CN115494496A (en) Single-bit radar imaging system, method and related equipment
EP2103954B1 (en) Adaptive clutter signal filtering in an ultrasound system
CN109002777B (en) An infrared small target detection method for complex scenes
CN113688692A (en) Water supply pipeline leakage detection method based on time-frequency scale characteristics of ground penetrating radar
CN114578348B (en) Autonomous intelligent scanning and navigation method for ground penetrating radar based on deep learning
CN111627035A (en) Method for fusing ground penetrating radar attribute features by utilizing wavelet transform
CN102693530B (en) Synthetic aperture radar (SAR) image despeckle method based on target extraction and speckle reducing anisotropic diffusion (SRAD) algorithm
CN113126089A (en) Ground penetrating radar data display method
CN101930605A (en) SAR image target extraction method and system based on two-dimensional hybrid transformation
CN119805440B (en) Integrated coupling imaging device and method for ground penetrating radar and millimeter wave radar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant