
CN120180200A - Intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion - Google Patents

Intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion

Info

Publication number
CN120180200A
CN120180200A (application CN202510652324.1A)
Authority
CN
China
Prior art keywords
data
module
signal
fusion
sensor
Prior art date
Legal status
Granted
Application number
CN202510652324.1A
Other languages
Chinese (zh)
Other versions
CN120180200B (en)
Inventor
吕志超
邢学飞
杜立彬
于菲
刘铭扬
王刚
吕晨龙
Current Assignee
Shandong University of Science and Technology
Original Assignee
Shandong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Shandong University of Science and Technology filed Critical Shandong University of Science and Technology
Priority to CN202510652324.1A
Publication of CN120180200A
Application granted
Publication of CN120180200B
Status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00Measuring or testing not otherwise provided for
    • G01D21/02Measuring two or more variables by means not covered by a single other subclass
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/10Pre-processing; Data cleansing
    • G06F18/15Statistical pre-processing, e.g. techniques for normalisation or restoring missing data
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02Preprocessing
    • G06F2218/04Denoising
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08Feature extraction
    • G06F2218/10Feature extraction by analysing the shape of a waveform, e.g. extracting parameters relating to peaks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract


The present invention discloses an intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion, belonging to the field of underwater signal processing. Multiple sensors collect the underwater acoustic signal, linear acceleration, angular velocity and other information of an underwater target and output continuous analog signals, which are transmitted to a signal conditioning module that dynamically adjusts the gain according to the target intensity and outputs the amplified differential signal to an analog-to-digital conversion module. A 24-bit delta-sigma ADC digitizes the signal with high precision and outputs the digital signal to the main control unit, which performs multi-sensor data fusion and target azimuth calculation. Finally, the main control unit uploads the underwater acoustic signal and device attitude information to the host computer over the RS485 bus for human-computer interaction. Based on multi-sensor data fusion, this solution achieves high-precision target detection and positioning in dynamic underwater environments and has broad promotion and application value.

Description

Intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion
Technical Field
The invention belongs to the field of underwater signal processing, and particularly relates to an intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion.
Background
With the continuous promotion of the fields of ocean resource development, submarine engineering monitoring, underwater safety precaution and the like, the requirements on the accuracy and the instantaneity of underwater target detection and positioning are continuously improved. The traditional underwater detection system mainly relies on sonar technology, underwater sound signals are collected through a hydrophone array, and then parameters such as phase and amplitude difference are utilized to calculate the target position.
However, in practical applications, conventional systems suffer from several shortcomings:
On one hand, most systems adopt a single sensor or a simply combined multi-sensor scheme; differences among sensors in acquisition timing, sensitivity and response speed degrade the data fusion result and easily introduce errors that reduce positioning accuracy. On the other hand, the complexity and variability of the underwater environment, including factors such as temperature, pressure, flow velocity and ocean clutter, aggravate background noise, and conventional systems lack effective dynamic calibration and error compensation mechanisms, so weak target signals are difficult to extract accurately.
In addition, underwater vibration or motion signals are very weak and can only be processed after multistage amplification, filtering and high-precision analog-to-digital conversion; existing signal conditioning schemes introduce delay in interference rejection, real-time acquisition and digitization, and thus cannot meet the requirement of rapid response. Moreover, underwater data transmission is easily disturbed on the communication link, with high bit error rates and significant latency, which further restricts the system's real-time monitoring and remote control capability.
Disclosure of Invention
Aiming at the problems of insufficient multi-sensor cooperative precision, limited dynamic environment adaptability, lack of real-time emergency processing capability and the like in the prior art, the invention provides an intelligent underwater sound signal processing system and method based on multi-sensor data fusion.
The intelligent underwater acoustic signal processing system based on multi-sensor data fusion comprises a main control unit, and a data processing module and a multi-sensor module which are connected with the main control unit, wherein the multi-sensor module comprises a hydrophone, an underwater acceleration sensor, an attitude sensor and a temperature sensor, and acquires an underwater acoustic signal, a linear acceleration signal, an angular velocity signal and a temperature signal correspondingly;
The data processing module comprises a signal conditioning module and an analog-to-digital conversion module, wherein the analog signals acquired by the sensor module are amplified and conditioned through the signal conditioning module, and the amplified and conditioned signals are sent to the analog-to-digital conversion module for digitization;
The main control unit comprises a data fusion module, an azimuth calculation module and a calibration module. The data fusion module receives the output of the data processing module, performs data fusion, and feeds the fusion result into the azimuth calculation module, which performs beamforming and target direction estimation with the Eigen-AMVDR algorithm and finally outputs the target azimuth and motion state parameters; the calibration module ensures the accuracy of the output;
the main control unit is connected with the upper computer through the communication module so as to realize data transmission and remote monitoring.
Further, the data fusion module performs fusion processing including inertial data fusion and underwater sound signal fusion, and specifically:
The inertial data fusion comprises: first fusing the linear acceleration and angular velocity signals, computing the current attitude angle with a complementary filtering method; constructing a rotation matrix from the current attitude angle to apply attitude compensation to the linear acceleration signals; performing weighted fusion with dynamic weight distribution, taking the multi-channel weighted average as the fusion result; and finally synthesizing the corrected linear acceleration data;
The underwater acoustic signal fusion corrects the receiving direction of the underwater acoustic signals in real time using the rotation matrix constructed for attitude compensation; the corrected underwater acoustic signals and the inertially fused data are synchronously packaged under a unified timestamp and attitude reference and passed to the azimuth calculation module.
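The inertial-fusion steps just described (complementary filtering for the attitude angle, a rotation matrix for attitude compensation, and dynamic-weight averaging) can be sketched as below. This is a minimal illustration only: the filter coefficient alpha, the Z-Y-X Euler convention and all function names are assumptions, not details taken from the patent.

```python
import numpy as np

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyro integration with the accelerometer-derived angle."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def rotation_matrix(roll, pitch, yaw):
    """Z-Y-X Euler rotation matrix used for attitude compensation."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def fuse_acceleration(accel_channels, weights):
    """Dynamic-weight fusion: normalized weighted average across channels."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return np.average(np.asarray(accel_channels, dtype=float), axis=0, weights=w)
```

In practice the gyro rate would be integrated per axis and the weights updated from each channel's signal energy, as the text describes.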
Further, the azimuth calculation module builds a covariance matrix for the received signals and performs eigendecomposition on it to obtain eigenvalues and eigenvectors. Eigenvectors associated with large eigenvalues correspond to the signal subspace, and those with small eigenvalues correspond to the noise subspace; the received signals are projected onto the signal and noise subspaces using these eigenvectors to separate the signal and noise components;
The Eigen-AMVDR algorithm is applied in the signal subspace to calculate the optimal weight vector W that maximizes the gain in the target direction while suppressing interference and noise from other directions, specifically:
(1) Obtaining the fused linear acceleration, angular velocity and underwater acoustic signal data from the data fusion module, constructing sample data according to a predefined time window, calculating a covariance matrix R_i from the sample data, then performing eigenvalue decomposition on R_i to extract the principal components of the signal and obtain the distribution of the signal and noise subspaces;
(2) Second-order statistics across the multiple channels are fused, and a weighted-fusion power spectrum is designed from them:

P(θ) = Σ_i w_i · P_i(θ);

where P_i(θ) denotes the power response function of the i-th channel in direction θ, and the weight w_i is determined by the signal energy of each channel and its statistical characteristics; the composite beam-pointing response is computed by weighted fusion of the multi-channel information;
(3) Traversing θ within a set angle range, the power spectrum P(θ) is calculated for each angle and the results are plotted to form a beam pattern; the information in the signal subspace is then weighted-fused with the weighted-fusion power spectrum;
Specifically, an enhanced beam function is defined:

B(θ) = 1 / ( a^H(θ) R⁻¹ a(θ) );

where a(θ) is the steering vector in the desired direction θ and a^H(θ) is its conjugate transpose;
(4) The actual result B(θ) is regarded as the convolution of the true target azimuth distribution S(θ) with a point spread function PSF:

B(θ) = S(θ) * PSF(θ);

A deconvolution algorithm iteratively processes B(θ) to solve for the target distribution S(θ); the iterative update formula is:

S^(k+1)(θ) = S^(k)(θ) · [ ( B(θ) / (S^(k) * PSF)(θ) ) * PSF(−θ) ];

where S^(0)(θ) is initialized from B(θ), and PSF(−θ) is the flipped version of PSF(θ);
The final target direction estimate θ̂ is the output after the iteration converges:

θ̂ = argmax_θ S^(K)(θ);

where K is the number of steps at which the iteration termination condition is satisfied; the estimated target azimuth distribution S^(K)(θ) obtained through iterative updating forms a clear peak in the correct direction, so that the target bearing is determined accurately.
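Steps (1) through (3) above can be illustrated with a small NumPy sketch. It assumes a uniform linear array with half-wavelength spacing and the conventional MVDR form of the enhanced beam function; the element count, the diagonal loading term eps, and the function names are illustrative assumptions, not details from the patent.

```python
import numpy as np

def subspace_split(samples, num_sources):
    """Step (1): sample covariance R_i, eigendecomposition, subspace split.
    samples: (channels, snapshots) array; returns signal/noise eigenbases."""
    R = samples @ samples.conj().T / samples.shape[1]
    vals, vecs = np.linalg.eigh(R)          # ascending eigenvalues
    vecs = vecs[:, np.argsort(vals)[::-1]]  # largest eigenvalues first
    return vecs[:, :num_sources], vecs[:, num_sources:]

def steering_vector(theta, n_elem, d_over_lambda=0.5):
    """a(theta) for a uniform linear array (theta in radians)."""
    k = np.arange(n_elem)
    return np.exp(-2j * np.pi * d_over_lambda * k * np.sin(theta))

def enhanced_beam_spectrum(R, thetas, eps=1e-6):
    """Steps (2)-(3): scan B(theta) = 1 / (a^H R^-1 a) over the angle grid."""
    n = R.shape[0]
    Rinv = np.linalg.inv(R + eps * np.eye(n))
    spec = np.empty(len(thetas))
    for i, th in enumerate(thetas):
        a = steering_vector(th, n)
        spec[i] = 1.0 / np.real(a.conj() @ Rinv @ a)
    return spec
```

A source simulated at 20° produces a spectrum whose maximum falls on that bearing, which is the "clear peak in the correct direction" the text refers to.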
Further, the calibration module is configured to implement real-time dynamic compensation:
During system operation, the output data of each sensor are continuously collected and compared in real time. The linear acceleration data collected in each pass are analyzed with a sliding-window technique, and their mean and variance are calculated; if the data of several consecutive windows exceed a set tolerance range, a deviation is declared. When a deviation is detected, the calibration module applies a low-pass filter to remove burst noise, recalculates new bias and gain values, and dynamically loads the new bias and gain parameters into the calibration module for calibration.
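The sliding-window deviation check described above might look like the following sketch. The window length, tolerance thresholds and consecutive-window count are illustrative parameters, and the low-pass filtering and bias/gain recomputation that follow a confirmed deviation are omitted.

```python
import numpy as np
from collections import deque

class DriftDetector:
    """Sliding-window check: flag a bias when several consecutive full
    windows of linear-acceleration data leave the tolerance band."""
    def __init__(self, window=50, tol_mean=0.2, tol_var=0.5, consec=3):
        self.buf = deque(maxlen=window)
        self.tol_mean, self.tol_var, self.consec = tol_mean, tol_var, consec
        self.hits = 0

    def update(self, sample):
        self.buf.append(sample)
        if len(self.buf) < self.buf.maxlen:
            return False                     # window not yet full
        data = np.asarray(self.buf)
        out_of_band = abs(data.mean()) > self.tol_mean or data.var() > self.tol_var
        self.hits = self.hits + 1 if out_of_band else 0
        return self.hits >= self.consec      # deviation confirmed -> recalibrate
```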
Further, the signal conditioning module adopts a differential amplification circuit design. The differential amplification circuit uses a differential amplifier OPA1632 to construct a variable-gain differential amplification network comprising a first OPA module and a second OPA module connected in series. The positive input of the first OPA module is connected to a resistor R1 and its negative input to a resistor R2; a resistor R3 and a capacitor C1 are connected in parallel between the positive input and the negative output of the first OPA module, and a resistor R4 and a capacitor C2 are connected in parallel between the negative input and the positive output. The positive input of the second OPA module is connected to a resistor R5 and its negative input to a resistor R6; a resistor R8 and a capacitor C4 are connected in parallel between the negative input and the positive output of the second OPA module, and the negative and positive outputs of the second OPA module are each terminated to ground through resistors R9 and R10 respectively.
Further, after the data processed by the data processing module is transmitted to the main control unit, data preprocessing is performed first, the preprocessed data is fused by the data fusion module, and when the data preprocessing is performed, the method specifically comprises the following steps:
1) Time-space alignment, namely marking different sensor sampling data through uniform time stamps to realize time synchronization of multi-source data;
2) Static reference calibration, namely collecting data in a static state at a system starting stage, and constructing a linear calibration model;
3) And temperature compensation, namely correcting the underwater sound signal, the angular velocity and the linear acceleration measured value by combining a temperature drift model of each sensor based on temperature data acquired in real time.
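The three preprocessing steps can be sketched as below; the linear-interpolation alignment, the unity gain in the static calibration, and the first-order drift coefficient are all simplifying assumptions for illustration, not values from the patent.

```python
import numpy as np

def align_to_timebase(t_ref, t_src, values):
    """Step 1: interpolate a sensor stream onto the unified timestamps."""
    return np.interp(t_ref, t_src, values)

def linear_calibration(static_samples):
    """Step 2: bias from data collected at rest at system start-up."""
    bias = float(np.mean(static_samples))
    gain = 1.0                       # gain left at unity in this sketch
    return bias, gain

def temperature_compensate(raw, temp, temp_ref=25.0, drift_coeff=0.002):
    """Step 3: first-order drift correction (drift_coeff is illustrative)."""
    return raw - drift_coeff * (temp - temp_ref)
```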
The invention further provides an intelligent underwater acoustic signal processing method based on multi-sensor data fusion, which comprises the following steps of:
Step A, data acquisition and preprocessing, wherein a multi-sensor module acquires underwater related signals, analog signals acquired by the sensor module are amplified and conditioned through a signal conditioning module, the amplified and conditioned signals are sent to an analog-to-digital conversion module for digitizing, and the signals are uniformly input into a main control unit for signal preprocessing;
When the main control unit performs signal preprocessing, the method comprises the following steps:
1) Time-space alignment, namely marking different sensor sampling data through uniform time stamps to realize time synchronization of multi-source data;
2) Static reference calibration, namely collecting data in a static state at a system starting stage, and constructing a linear calibration model;
3) Temperature compensation, namely correcting the underwater sound signal, the angular velocity and the linear acceleration measured value based on temperature data acquired in real time by combining a temperature drift model of each sensor so as to reduce zero point offset and gain error caused by temperature change;
Step B, data fusion and filtering: multi-source information fusion is performed on the preprocessed data by the data fusion module, attitude compensation is applied to the linear acceleration signal, and noise is removed by filtering;
Step C, position and direction calculation, namely, combining an azimuth calculation module, using an Eigen-AMVDR algorithm, extracting a characteristic vector of a signal by calculating a covariance matrix of the signal, and further estimating the direction and azimuth of the target;
Step D, error calibration and compensation, namely continuously acquiring output data of each sensor and comparing the output data in real time in the running process of the system, analyzing the linear acceleration data acquired each time by utilizing a sliding window technology, and calculating the mean value and variance of the linear acceleration data;
and E, data monitoring and displaying, namely displaying and monitoring the finally processed data through an upper computer.
Further, in the step C, when the azimuth calculation is performed, the principle is as follows:
The azimuth calculation module builds a covariance matrix for the received signals and performs eigendecomposition on it to obtain eigenvalues and eigenvectors. Eigenvectors associated with large eigenvalues correspond to the signal subspace, and those with small eigenvalues correspond to the noise subspace; the received signals are projected onto the signal and noise subspaces using these eigenvectors to separate the signal and noise components;
The Eigen-AMVDR algorithm is applied in the signal subspace to calculate the optimal weight vector W that maximizes the gain in the target direction while suppressing interference and noise from other directions, specifically:
(1) Obtaining the fused linear acceleration, angular velocity and underwater acoustic signal data from the data fusion module, constructing sample data according to a predefined time window, calculating a covariance matrix R_i from the sample data, then performing eigenvalue decomposition on R_i to extract the principal components of the signal and obtain the distribution of the signal and noise subspaces;
(2) Second-order statistics across the multiple channels are fused, and a weighted-fusion power spectrum is designed from them:

P(θ) = Σ_j w_j · P_j(θ);

where P_j(θ) denotes the power response function of the j-th channel in direction θ, and the weight w_j is determined by the signal energy of each channel and its statistical characteristics; the composite beam-pointing response is computed by weighted fusion of the multi-channel information;
(3) Traversing θ within a set angle range, the power spectrum P(θ) is calculated for each angle and the results are plotted to form a beam pattern; the information in the signal subspace is then weighted-fused with the weighted-fusion power spectrum;
Specifically, an enhanced beam function is defined:

B(θ) = 1 / ( a^H(θ) R⁻¹ a(θ) );

where a(θ) is the steering vector in the desired direction θ and a^H(θ) is its conjugate transpose;
(4) The actual result B(θ) is regarded as the convolution of the true target azimuth distribution S(θ) with a point spread function PSF:

B(θ) = S(θ) * PSF(θ);

A deconvolution algorithm iteratively processes B(θ) to solve for the target distribution S(θ); the iterative update formula is:

S^(k+1)(θ) = S^(k)(θ) · [ ( B(θ) / (S^(k) * PSF)(θ) ) * PSF(−θ) ];

where S^(0)(θ) is initialized from B(θ), and PSF(−θ) is the flipped version of PSF(θ);
The final target direction estimate θ̂ is the output after the iteration converges:

θ̂ = argmax_θ S^(K)(θ);

where K is the number of steps at which the iteration termination condition is satisfied; the estimated target azimuth distribution S^(K)(θ) obtained through iterative updating forms a clear peak in the correct direction, so that the target bearing is determined accurately.
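The iterative deconvolution of step (4) matches a Richardson-Lucy-style update, which uses exactly the flipped-PSF correction described in the text; the angle-grid size, the Gaussian PSF and the iteration count in this sketch are illustrative assumptions.

```python
import numpy as np

def rl_deconvolve(B, psf, iters=50):
    """Richardson-Lucy-style deconvolution of the beam pattern B.

    B   : measured beam power over the angle grid (nonnegative).
    psf : point spread function sampled on the same grid.
    Returns the sharpened target distribution S^(K)."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1]                     # flipped PSF for the update
    S = B.copy().astype(float)               # S^(0) initialized from B
    for _ in range(iters):
        conv = np.convolve(S, psf, mode="same")
        ratio = B / np.maximum(conv, 1e-12)  # guard against division by zero
        S = S * np.convolve(ratio, psf_flip, mode="same")
    return S
```

On a simulated beam pattern, the iteration concentrates the smeared power back into a sharp peak at the true bearing index.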
Compared with the prior art, the invention has the advantages and positive effects that:
Through multi-sensor data fusion and intelligent signal processing, this scheme effectively distinguishes underwater target signals, which are amplified and conditioned by the signal conditioning module. Accurate target direction estimation is achieved with the Eigen-AMVDR algorithm and related techniques. Combining the RS485 bus with a differential compression algorithm greatly reduces the volume of transmitted data while preserving real-time performance, meeting the requirements of long-distance seabed monitoring and real-time interaction. In addition, with the calibration module design and built-in anomaly detection, timeout retransmission and fault isolation mechanisms, the system can provide timely feedback and automatic compensation when data anomalies or equipment faults occur, ensuring long-term stable operation.
Drawings
FIG. 1 is a block diagram of a system according to an embodiment of the present invention;
FIG. 2 is a schematic circuit diagram of setting PGA amplification gain in an analog-to-digital conversion module according to an embodiment of the present invention;
FIG. 3 is a schematic circuit diagram of a signal conditioning module according to an embodiment of the present invention;
Fig. 4 is a schematic flow chart of a method according to an embodiment of the invention.
Detailed Description
In order that the above objects, features and advantages of the invention will be more readily understood, a further description of the invention will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced otherwise than as described herein, and therefore the present invention is not limited to the specific embodiments disclosed below.
Currently, underwater target detection and positioning mainly face the following key problems:
(1) The information dimension of a single sensor is limited and can hardly meet the requirement of high-precision three-dimensional perception. Most current underwater target detection systems rely on a single underwater acceleration sensor to acquire the signal in one direction only, and cannot comprehensively describe the real motion state of a target in three-dimensional space. Especially in tasks such as target-track recognition and fine positioning, single-axis data leave blind zones in the viewing angle, which severely constrains system performance;
(2) The underwater environment is dynamic and complex, and sensor errors are time-varying and difficult to compensate independently. The underwater environment is disturbed by multiple factors such as temperature, pressure, flow velocity and ocean clutter, so zero drift, gain change and transient anomalies readily occur in actual deployment. Conventional systems lack the capability of dynamically identifying error sources and performing joint calibration, have poor robustness, and cannot cope with local sensor failure or data anomalies. Existing schemes generally lack a redundancy mechanism: once a single sensor fails or its data suffers strong interference, the entire sensing function of the system is paralyzed.
To address these problems, multi-sensor data fusion technology is introduced. Multi-source data are fused in time, space and statistical characteristics to improve three-dimensional perception: underwater acoustic signals, linear acceleration, angular velocity and temperature information are fused to obtain a complete description of target motion, and system robustness and interference immunity are enhanced. Through multi-channel comparison, dynamic weighting and anomaly rejection, "perceptual redundancy" and "fault-tolerant fusion" are achieved, so that the system operates stably even if some sensors fail.
Embodiment 1 of the intelligent underwater acoustic signal processing system based on multi-sensor data fusion is shown in Fig. 1. It comprises a main control unit together with a data processing module and a multi-sensor module connected to the main control unit. The multi-sensor module comprises a hydrophone, an underwater acceleration sensor, an attitude sensor and a temperature sensor; the underwater acceleration sensor and the attitude sensor adopt an ADXL335 triaxial accelerometer and an MPU6050 attitude sensor respectively, enabling motion monitoring in six degrees of freedom while maintaining high sensitivity in small-vibration detection. The data processing module communicates with the main control unit through an SPI digital interface. The main control unit comprises a data fusion module, an azimuth calculation module and a calibration module, and is connected to the host computer through the communication module to realize data transmission and remote monitoring; the host computer displays the acceleration curve, azimuth angle, environmental parameters and other data of the underwater target.
The data processing module comprises a signal conditioning module and an analog-to-digital conversion module. Because signals in the underwater environment are weak and easily disturbed by noise, the analog signals must be amplified and digitized by the signal conditioning module and the analog-to-digital conversion module before being transmitted to the main control unit for data fusion processing. Specifically:
First, state changes at the hydrophone cause it to continuously output very weak analog signals, which are transmitted to the data processing module for processing: the sensor's analog signals are amplified and conditioned by the signal conditioning module, and the conditioned signals are sent to the analog-to-digital conversion module for digitization. Meanwhile, the main control unit continuously collects data from the underwater acceleration sensor and the attitude sensor to determine the operating attitude of the equipment and ensure its safe operation.
The signal conditioning module adopts a differential amplifying circuit design, specifically, in combination with fig. 3, the signal conditioning module is as follows:
The differential amplification circuit uses a differential amplifier OPA1632 to construct a variable-gain differential amplification network comprising a first OPA module and a second OPA module connected in series. The positive input of the first OPA module is connected to a resistor R1 and its negative input to a resistor R2; a resistor R3 and a capacitor C1 form one parallel feedback branch of the first OPA module, and a resistor R4 and a capacitor C2 form the other. The positive input of the second OPA module is connected to a resistor R5 and its negative input to a resistor R6; a resistor R8 and a capacitor C4 form a parallel branch between the negative input and the positive output of the second OPA module, a capacitor C5 is connected at the negative output, and the negative and positive outputs of the second OPA module are terminated to ground through resistors R9 and R10 respectively. In this embodiment, the OPA1632 forms a fully differential amplifier in which the input resistors R1 and R2 and the feedback resistors R3 and R4 constitute a symmetrical bridge-type negative feedback structure. Compared with a conventional inverting-amplifier feedback design, this structure reduces the number of matching resistors by about 50%, simplifying the design and improving system consistency. The differential gain is approximately G = 2·R3/R1, which offers higher gain-adjustment sensitivity and enables more precise gain control for the same resistor tolerance; the capacitors C1, C2, C3 and C4 provide filtering and decoupling to suppress high-frequency noise and stabilize the supply voltage.
Conventional differential amplification circuits are typically implemented using an operational amplifier + feedback resistor, and the present invention uses a differential amplifier OPA1632 that internally optimizes the matching and biasing circuitry to provide better CMRR (common mode rejection ratio) and distortion performance. The classical differential amplifying circuit generally realizes the gain setting of signals directly through resistors, and the scheme uses coupling capacitors (C1, C2, C3 and C4) at the input end when the circuit is designed, so that direct current offset can be filtered, and the drift problem caused by direct current amplification is avoided. In addition, the input ends (R1-R4) of the circuit have higher impedance, so that the load influence on the front-stage circuit can be effectively reduced. The output end of the circuit is provided with R9 and R10 as terminal matching resistors, and the classical differential amplifying circuit generally does not need additional terminal resistors, so that the design is beneficial to optimizing impedance matching of differential signals, reducing signal reflection and improving signal integrity.
The analog-to-digital conversion module adopts a multichannel synchronously-sampling 24-bit delta-sigma analog-to-digital converter (ADS1292) with a built-in programmable gain amplifier (PGA), internal reference and on-board oscillator. Each channel has two inputs, MuxP (positive) and MuxN (negative): the signal OUT+ end is connected to the positive signal input of the ADS1292, the OUT− end to the negative signal input, and the channel-2 positive and negative inputs of the ADS1292 are shorted together and tied to the +2.5 V supply. Resistors Ra and Rs set the amplifier gain, and the PGA output passes through an RC filter before entering the ADC. The filter consists of an internal resistor Rs = 2 kΩ and an external capacitor CFILTER (typically 4.7 nF); the larger the capacitance, the worse the total harmonic distortion (THD) performance, and since the internal Rs resistor has a tolerance of about ±15%, the actual bandwidth varies accordingly.
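As a rough numerical check, the corner frequency of that PGA output filter follows the standard first-order RC expression; a sketch only, with the ±15% Rs tolerance taken from the text above:

```python
import math

def rc_corner_hz(r_ohms: float, c_farads: float) -> float:
    """First-order RC low-pass corner frequency: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

R_S = 2_000.0       # internal filter resistor (ohms), about +/-15% tolerance
C_FILTER = 4.7e-9   # external filter capacitor (farads), typical value

nominal = rc_corner_hz(R_S, C_FILTER)
fast = rc_corner_hz(R_S * 0.85, C_FILTER)  # Rs at -15% -> wider bandwidth
slow = rc_corner_hz(R_S * 1.15, C_FILTER)  # Rs at +15% -> narrower bandwidth
print(f"corner frequency ~{nominal/1e3:.1f} kHz "
      f"(range {slow/1e3:.1f} to {fast/1e3:.1f} kHz over the Rs tolerance)")
```

The nominal corner lands near 17 kHz, illustrating how the stated resistor tolerance alone spreads the actual bandwidth by several kilohertz.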
After the digital signal enters the main control unit, data preprocessing is performed first, followed by data fusion. In this embodiment, a differential compression algorithm reduces the transmitted data volume by about 60%. Independent timer resources are allocated in the main control unit for multi-sensor data fusion, and every linear-acceleration, angular-velocity and underwater acoustic acquisition is given an accurate timestamp so that the sensor data are aligned on the same time axis. The RS485 bus is optimized through a DMA (direct memory access) mechanism in the main control unit to realize zero-delay transmission, and the data stream is buffered in SRAM as fixed-length blocks, which reduces CPU load and provides batch data for subsequent algorithm processing. Data preprocessing in the main control unit comprises the following steps:
1) Time-space alignment, namely marking different sensor sampling data through uniform time stamps to realize time synchronization of multi-source data;
2) Static reference calibration, namely collecting data in a static state at a system starting stage, and constructing a linear calibration model;
3) And temperature compensation, namely correcting the underwater sound signal, the angular velocity and the linear acceleration measured value by combining a temperature drift model of each sensor based on temperature data acquired in real time so as to reduce zero point offset and gain error caused by temperature change.
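The three preprocessing steps can be sketched as follows; this is a minimal illustration in which the drift coefficient, reference temperature, timestamps and sample values are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) Time-space alignment: resample a sensor stream onto the unified time axis.
def align_by_timestamp(ts_src, x_src, ts_ref):
    return np.interp(ts_ref, ts_src, x_src)

# 2) Static reference calibration: at rest the true input is zero, so the mean
#    of the startup window estimates the zero-point bias (linear-model offset).
def static_bias(rest_samples):
    return rest_samples.mean()

# 3) Temperature compensation with an assumed linear drift model.
def temp_compensate(raw, temp_c, drift_per_c=0.002, ref_temp_c=25.0):
    return raw - drift_per_c * (temp_c - ref_temp_c)

ts_accel = np.array([0.0, 1.0, 2.0, 3.0])        # raw accelerometer timestamps
accel = np.array([0.10, 0.12, 0.11, 0.13])       # raw samples
ts_ref = np.array([0.5, 1.5, 2.5])               # unified time axis
aligned = align_by_timestamp(ts_accel, accel, ts_ref)

bias = static_bias(rng.normal(0.05, 0.001, 200)) # startup rest window
calibrated = temp_compensate(aligned - bias, temp_c=10.0)
```

In practice the bias and drift coefficients would come from the startup calibration and each sensor's temperature-drift model rather than the fixed constants used here.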
According to the invention, multi-source information fusion combines the underwater acceleration sensor, attitude sensor, temperature sensor and hydrophone. Attitude compensation is applied to the linear-acceleration data to remove the extra linear-acceleration components caused by attitude change. The fused underwater acoustic signals, linear acceleration and angular velocity are then sent to the azimuth calculation module, which, combined with the Eigen-AMVDR algorithm, accurately estimates the target direction and calculates the target's motion-state data and azimuth angle, such that the signal in the desired direction is undistorted and its output power unchanged while noise outside the desired direction is minimized; minimizing the final beam output power yields the target azimuth. The fusion processing is divided into two parts: inertial data fusion, used for attitude/linear-acceleration compensation, and underwater acoustic signal fusion, used for underwater acoustic array direction correction. Specifically:
inertial data fusion:
1) Attitude calculation: fuse the linear-acceleration and angular-velocity data and, using a complementary filtering method, calculate the current attitude angles of the equipment (pitch, roll, yaw);
2) Attitude compensation: construct a rotation matrix from the current attitude angles and apply attitude compensation to the triaxial acceleration data, eliminating the spurious linear-acceleration component caused by changes in equipment attitude;
3) Weighted fusion: dynamically assign weights according to each sensor's historical stability and current deviation; the higher the stability and the smaller the deviation, the larger the assigned weight.
The multi-channel weighted average, in which each value is weighted by its assigned weight, is calculated as the fusion result;
4) Anomaly rejection: when the data of a channel deviates markedly from the statistical intervals of the other sensors, dynamically reduce its weight or temporarily reject that channel, enhancing system robustness;
5) Three-dimensional vector synthesis: combine the corrected X/Y/Z triaxial acceleration data through the Euclidean norm to represent the overall disturbance amplitude of the equipment in three-dimensional space, reflecting its current motion trend or stability state.
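Steps 1) to 5) above can be sketched as follows; this is a simplified illustration assuming a Z-Y-X Euler convention and a 0.98 complementary-filter coefficient, neither of which is specified in the text:

```python
import numpy as np

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    # Step 1): fuse the integrated gyro rate (accurate short-term) with the
    # accelerometer-derived angle (stable long-term).
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def rotation_matrix(pitch, roll, yaw):
    # Z-Y-X rotation matrix built from the attitude angles (radians).
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def compensate(accel_body, pitch, roll, yaw, g=9.81):
    # Step 2): rotate body-frame acceleration into the reference frame and
    # subtract gravity, removing the spurious component caused by tilt.
    return rotation_matrix(pitch, roll, yaw) @ accel_body - np.array([0.0, 0.0, g])

# Step 5): Euclidean norm of the corrected triaxial data = disturbance amplitude.
at_rest = np.array([0.0, 0.0, 9.81])                 # level, stationary device
disturbance = np.linalg.norm(compensate(at_rest, 0.0, 0.0, 0.0))
angle = complementary_filter(0.0, 1.0, 0.0, 0.01)    # one filter update
```

A level, stationary device yields zero disturbance amplitude after compensation, which is the expected baseline for the anomaly-rejection step.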
And (3) fusion of underwater acoustic signals:
After signal conditioning and analog-to-digital conversion in the preprocessing module, the system uses the rotation matrix constructed in the attitude-compensation step to correct the receiving direction of the underwater acoustic signals in real time, compensating the pointing deviation of the hydrophone array caused by attitude disturbance and keeping the array beam consistent with the reference direction. Finally, the corrected underwater acoustic signals, linear acceleration and angular velocity are synchronously packaged under a unified timestamp and attitude reference and passed to the azimuth calculation module.
It should be noted that, since this embodiment uses a fixed underwater device, all data collected by the linear-acceleration and attitude sensors describe the state of the device body itself, not the acceleration of an external target. The fused angular velocity and the attitude- and temperature-compensated triaxial acceleration data assist the coordinate mapping in beam-direction correction and target-direction estimation, as well as the detection and compensation of equipment attitude changes, ensuring the stability of the reference frame and the accuracy of direction estimation in underwater acoustic signal processing.
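The beam-direction correction described above can be illustrated for the yaw axis alone; a toy sketch, since the patent's rotation matrix uses all three attitude angles:

```python
import numpy as np

def look_direction(azimuth_rad):
    # Unit vector of the desired look direction in the reference frame.
    return np.array([np.cos(azimuth_rad), np.sin(azimuth_rad), 0.0])

def yaw_matrix(yaw):
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def to_array_frame(d_ref, yaw):
    # Express the reference look direction in the rotated array frame:
    # a platform yawed by +yaw sees the target shifted by -yaw in body axes.
    return yaw_matrix(yaw).T @ d_ref

d_body = to_array_frame(look_direction(0.0), np.deg2rad(10.0))
apparent_az = np.degrees(np.arctan2(d_body[1], d_body[0]))
```

Steering the array toward this corrected body-frame direction keeps the beam aligned with the reference direction despite the attitude disturbance.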
The data fusion result is the corrected triaxial acceleration, angular velocity, underwater acoustic space-time-mapped data and temperature data obtained after temperature compensation, attitude compensation, weighted fusion and anomaly rejection; it characterizes the motion state, target-azimuth sensing and environmental response of the underwater equipment in a unified reference coordinate system. The fusion data frame format is shown in Table 1; each frame has a fixed length and is transmitted in hexadecimal form.
Table 1 fusion data frame format
Byte position | Bytes | Field name | Description
0-1 | 2 | Frame header | Fixed identifier (0xAA55) marking the frame start
2-5 | 4 | Timestamp | Microsecond timer value recording the acquisition time
6-11 | 6 | Underwater acoustic signal | Underwater acoustic data
12-17 | 6 | Triaxial acceleration | X/Y/Z triaxial acceleration
18-23 | 6 | Triaxial angular velocity | Pitch, roll, yaw angular velocities
24-25 | 2 | Temperature data | Temperature value
26 | 1 | Status flags | Bit0: calibration state; Bit1: sensor anomaly
27-28 | 2 | Check code | CRC-16 or checksum for data-integrity verification
29-30 | 2 | Frame tail | Fixed frame-tail identifier (0x55AA)
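A frame of this layout can be packed and parsed as follows; byte order and field encodings are assumptions (the table fixes only sizes and offsets), and a simple additive checksum stands in for the CRC-16 option:

```python
import struct

FRAME_FMT = ">HI3h3h3hhB"   # header..status flags (27 bytes); big-endian assumed
HEADER, TAIL = 0xAA55, 0x55AA

def checksum16(body: bytes) -> int:
    # Simple 16-bit additive checksum; the table allows CRC-16 or a checksum.
    return sum(body) & 0xFFFF

def pack_frame(ts_us, hydro, accel, gyro, temp, status):
    # hydro/accel/gyro are 3-tuples of int16 samples (an assumed encoding).
    body = struct.pack(FRAME_FMT, HEADER, ts_us, *hydro, *accel, *gyro, temp, status)
    return body + struct.pack(">HH", checksum16(body), TAIL)

def parse_frame(frame: bytes):
    if len(frame) != 31:
        raise ValueError("frame must be 31 bytes")
    body = frame[:27]
    hdr, ts, *rest = struct.unpack(FRAME_FMT, body)
    chk, tail = struct.unpack(">HH", frame[27:])
    if hdr != HEADER or tail != TAIL or chk != checksum16(body):
        raise ValueError("bad header/tail/checksum")
    return {"ts": ts, "hydro": rest[0:3], "accel": rest[3:6],
            "gyro": rest[6:9], "temp": rest[9], "status": rest[10]}

frame = pack_frame(123456, (100, -200, 300), (10, 20, 30), (1, 2, 3), 253, 0b01)
parsed = parse_frame(frame)
```

Validating header, tail and check code before accepting a frame is what lets the receiver resynchronize on the 0xAA55 marker after a corrupted byte on the RS485 link.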
The data fusion result is input to the azimuth calculation module. The corrected underwater acoustic signal is the module's main input and is used to construct the directional covariance matrix, the core basis for calculating the target azimuth. Combined with the Eigen-AMVDR algorithm for target-direction estimation, the module finally outputs the target azimuth and the motion-state parameters; the latter comprise the equipment attitude angles and the synthesized acceleration and assist target dynamic-behavior analysis and system attitude compensation. The azimuth calculation module is based on the following principle:
First, the covariance matrix of the received signals is calculated; it captures the spatial characteristics of the array's received signals. Eigen-decomposition of the covariance matrix yields its eigenvalues and eigenvectors: the eigenvectors with larger eigenvalues span the signal subspace, while those with smaller eigenvalues span the noise subspace. The received signal is projected onto the signal and noise subspaces using these eigenvectors, thereby separating the signal and noise components.
The Eigen-AMVDR algorithm is applied in the signal subspace to calculate an optimal weight vector W that maximizes the gain in the target direction while suppressing interference and noise from other directions. The algorithm separates signal and noise more accurately and improves the beamformer's resolution. Specifically:
(1) Obtain the fused linear-acceleration, angular-velocity and underwater acoustic signal data from the data fusion module, construct sample data according to a predefined time window, and use the sample data to calculate the covariance matrix R_i, namely
R_i = (1/N) Σ_{n=1}^{N} x_i(n) x_i(n)^H;
where x_i(n) (i = 1, 2, 3) is the n-th snapshot output vector of the underwater acceleration sensor, attitude sensor and hydrophone respectively, and N is the number of snapshots in the window. The data are collected from the multichannel underwater acceleration sensor, the attitude sensor and the hydrophone; the signal covariance information of the sensor array is extracted, reflecting the spatial distribution characteristics of the signals and providing the basic data for target direction estimation.
Eigenvalue decomposition of the covariance matrix R_i gives
R_i = U_i Λ_i U_i^H;
where U_i is the eigenvector matrix and Λ_i is the diagonal matrix of corresponding eigenvalues. Eigen-decomposition extracts the principal components of the signal and yields the distribution of the signal subspace and the noise subspace; through the decomposed eigenvalues and eigenvectors, signal and noise can be effectively distinguished and the target-detection capability improved.
(2) Whereas the traditional method uses only the channel information of a single acceleration sensor for direction estimation, this embodiment further fuses the second-order statistics among multiple channels (linear acceleration, angular velocity and underwater acoustic signals) and designs a weighted-fusion power spectrum from them:
P(θ) = Σ_j w_j P_j(θ);
where P_j(θ) denotes the power response function of the j-th channel in direction θ, and the weight w_j is determined by the energy of each channel's signal and its statistical properties. Weighting together the information of the multiple channels yields a comprehensive beam-pointing response; optimizing the direction estimate with data from several sensors reduces the influence of single-channel noise and improves robustness.
(3) Traverse θ over a set angle range (e.g., −90° to +90°), calculate the power spectrum P(θ) at each angle, and plot the result to form a beam pattern. The information in the signal subspace and the weighted-fusion power spectrum are then weighted and fused; specifically, an enhanced beam function is defined:
B(θ) = a(θ)^H U_s U_s^H a(θ) · P(θ);
where a(θ) is the steering vector in the desired direction θ and a(θ)^H is its conjugate transpose. This step projects onto the signal subspace U_s and combines the projection with the weighted-fusion power spectrum P(θ) to improve the contrast of the direction estimate. By enhancing the signal energy, the resolving power for the target azimuth is improved and the target angle becomes easier to identify; the noise-suppression advantage of the signal subspace is retained while the rich spatial information provided by the second-order statistics is introduced, giving a narrower main lobe and lower side lobes.
(4) The actual result B(θ) can be regarded as the true target azimuth distribution f(θ) convolved with a point spread function (PSF), namely:
B(θ) = f(θ) * PSF(θ);
This formula shows that the enhanced beam function is a spread (blurred) version of the target azimuth distribution, meaning the original direction estimate may be broadened; further optimization is therefore required to obtain sharper target-azimuth information. A deconvolution algorithm processes f(θ) iteratively to recover a sharper target distribution. The iterative update formula is:
f^(k+1)(θ) = f^(k)(θ) · [ PSF(−θ) * ( B(θ) / (PSF * f^(k))(θ) ) ];
where f^(0)(θ) is initialized from B(θ), and PSF(−θ) is the flipped version of PSF(θ).
The final target direction estimate, the output after the iteration converges, is:
θ̂ = arg max_θ f^(K)(θ);
where K is the number of steps at which the iteration termination condition is satisfied. After the iterative updates, the estimated target azimuth distribution f^(K)(θ) forms a clear peak in the correct direction; these significant peaks constitute the direction-estimation result and accurately determine the target bearing.
It can be seen that the Eigen-AMVDR algorithm removes the noise subspace and retains only the signal subspace in the output covariance matrix. The noise output power is reduced while the output energy in the desired direction remains essentially unchanged, which improves the output signal-to-noise ratio and yields a sharper direction-estimation spectrum.
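The overall flow (sample covariance, eigendecomposition, signal-subspace beam scan, deconvolution sharpening) can be sketched on simulated data. This is a generic subspace scan with a Richardson-Lucy-style deconvolution step, not the patented Eigen-AMVDR weighting; the uniform linear array, source angle and Gaussian PSF are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, d = 8, 200, 0.5          # sensors, snapshots, spacing in wavelengths
true_doa = 20.0                # simulated source direction, degrees

def steering(theta_deg):
    # Plane-wave steering vector for a uniform linear array.
    k = 2.0 * np.pi * d * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(M))

# Simulated snapshots: one narrowband source plus white noise.
s = rng.normal(size=N) + 1j * rng.normal(size=N)
noise = 0.1 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))
X = np.outer(steering(true_doa), s) + noise

# (1) Sample covariance matrix and its eigendecomposition.
R = X @ X.conj().T / N
eigvals, U = np.linalg.eigh(R)           # eigenvalues in ascending order
Us = U[:, -1:]                           # signal subspace (one source assumed)

# (2)-(3) Beam scan: project each steering vector onto the signal subspace.
angles = np.arange(-90.0, 90.5, 0.5)
B = np.array([(steering(a).conj() @ Us @ Us.conj().T @ steering(a)).real
              for a in angles])

# (4) Richardson-Lucy-style deconvolution against an assumed Gaussian PSF.
psf = np.exp(-np.linspace(-10, 10, 41) ** 2 / 8.0)
psf /= psf.sum()
f = B.copy()
for _ in range(20):
    denom = np.convolve(f, psf, mode="same") + 1e-12
    f = f * np.convolve(B / denom, psf[::-1], mode="same")

est = angles[np.argmax(f)]               # sharpened direction estimate
```

The deconvolution narrows the main lobe of the scan rather than moving it, so the peak of f stays at the simulated source direction.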
In addition, considering possible anomalies in a real underwater environment such as sensor damage or sudden noise spikes, the data processing module introduces a sliding-window statistical anomaly-detection mechanism that raises an alarm or automatically restarts the sensor calibration process as soon as a data anomaly is found, ensuring overall system stability. The calibration module dynamically compensates sensor errors, guaranteeing the system's data accuracy and stability during long-term underwater operation.
In the real-time dynamic compensation step, the system continuously collects sensor output data during operation and compares it in real time. Each batch of linear-acceleration data is analyzed with a sliding-window technique, calculating its mean and variance; if the data of several consecutive windows exceed the set tolerance range, a deviation is deemed to have occurred. When a deviation is detected, the system automatically triggers a "secondary calibration": a low-pass filter removes sudden noise, new bias and gain values are recalculated, and the parameters are dynamically loaded into the system's calibration module. The recalibrated data can then continue to be used in the real-time processing chain, ensuring the accuracy of the system's output signals.
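The sliding-window deviation check can be sketched as follows; window size, tolerance thresholds and the consecutive-window count are assumptions, since the text fixes only the mean/variance test and the trigger-on-consecutive-deviation behavior:

```python
from collections import deque
import statistics

class SlidingWindowMonitor:
    """Flag a deviation when several consecutive full windows exceed tolerance,
    signalling that a 'secondary calibration' should be triggered."""

    def __init__(self, window=50, mean_tol=0.2, var_tol=0.05, consecutive=3):
        self.buf = deque(maxlen=window)
        self.mean_tol, self.var_tol = mean_tol, var_tol
        self.consecutive, self.hits = consecutive, 0

    def update(self, sample, ref_mean=0.0):
        self.buf.append(sample)
        if len(self.buf) < self.buf.maxlen:
            return False                      # window not yet full
        m = statistics.fmean(self.buf)
        v = statistics.pvariance(self.buf)
        out_of_range = abs(m - ref_mean) > self.mean_tol or v > self.var_tol
        self.hits = self.hits + 1 if out_of_range else 0
        return self.hits >= self.consecutive  # trigger secondary calibration

mon = SlidingWindowMonitor(window=10, consecutive=2)
triggered = any(mon.update(1.0) for _ in range(15))   # biased stream -> triggers
```

On trigger, the caller would low-pass filter the recent samples and recompute bias and gain, as described above.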
The communication module of this embodiment uses the RS485 bus standard for data transmission. The core controller of the main control unit is a microcontroller (STM32F103); the RS485 transceiver is connected to the main control unit's USART interface through the TX, RX and DE/RE pins and completes the physical-layer signal-level conversion. To ensure signal quality, 120 Ω termination matching resistors are added at both ends of the bus, and to improve the stability of the idle bus state, bias resistors are added to the A+/B− lines so that the bus sits at a known level when idle. Differential signalling at the RS485 layer gives strong immunity to interference. Because real-time data must be transmitted, a relatively high baud rate (115200 bps) is used, meeting the real-time requirement while keeping the error rate as low as possible.
When the system starts, the underwater acceleration sensor, temperature sensor, attitude sensor and the like are first placed in a static, stable state to collect a segment of reference data; the initial zero offset and gain error of each sensor are calculated by averaging multiple sampling points, and the initial calibration result is stored in non-volatile memory as the reference for subsequent dynamic compensation. During operation, the real-time data stream is used to continuously monitor changes in sensor output. When a deviation is detected in data collected over a long static period, the system automatically enters a secondary calibration mode: the latest samples are low-pass filtered to remove transient noise and a new calibration coefficient is calculated. By integrating the temperature sensor, ambient-temperature data are acquired in real time, a curve relating sensor output to temperature is established, and the calibration parameters are adjusted dynamically. The data of the sensors are compared and fused through the fusion algorithm: the consistency of each sensor's output is checked, abnormal data are identified, and the sensor weights are adjusted according to the statistical model to minimize error. The attitude sensor provides real-time equipment attitude information for attitude correction of the linear-acceleration data, and a periodic calibration task, run as a timed interrupt in the main control unit, keeps the calibration parameters optimal during long-term operation. When the main control unit detects abnormal sampling data, the system automatically triggers an alarm mechanism, feeds the anomaly information back to the upper computer and resets to the initial calibration values.
Through the specific implementation of the calibration module and error compensation described above, the scheme can correct sensor zero drift, gain error and the deviations caused by temperature, pressure and other factors in real time in a complex and changeable underwater environment, ensuring the high precision and stability of the output data.
In the embodiment, each module is integrated in one underwater device, so that stable operation of the system in an underwater complex dynamic environment is effectively ensured. The whole system has high integration and reliability and can still work normally in severe environments. In practical application, the system can be deployed in the fields of submarine resource detection, ocean engineering monitoring, ocean safety precaution and the like, provides accurate azimuth information by monitoring vibration signals and motion states of underwater targets in real time, helps operators judge target positions and motion tracks, and provides reliable data support for subsequent action decisions.
Embodiment 2, a signal processing method of the intelligent underwater acoustic signal processing system based on multi-sensor data fusion proposed in embodiment 1, with reference to fig. 4, includes the following steps:
Step A, data acquisition and pretreatment:
The system collects the triaxial acceleration and angular-velocity information of the equipment body in the underwater environment through an underwater acceleration sensor (e.g., ADXL335) and an attitude sensor (e.g., MPU6050), sensing the platform's disturbance trend and attitude stability. Meanwhile, the array hydrophone collects the underwater acoustic signals in real time for subsequent direction estimation and sound-source identification. The analog signals output by all sensors are amplified by the amplifying circuit, converted into digital signals, preprocessed, and uniformly input to the main control unit, laying the foundation for subsequent fusion processing and beamforming.
And B, data fusion and filtering:
Data from the multiple sensors (hydrophone, underwater acceleration sensor, attitude sensor, temperature sensor, etc.) are fused by the data fusion module and filtered to remove noise, ensuring data accuracy. The data are synchronized using space-time alignment coding so that the multi-sensor data cooperate accurately in time.
Step C, calculating the position and the direction:
Combined with the azimuth calculation module, the Eigen-AMVDR algorithm calculates the covariance matrix of the signal and extracts its eigenvectors to estimate the target's direction and azimuth angle. The processed data are transmitted over the RS485 bus.
Step D, error calibration and compensation:
Combined with the calibration module, reference calibration is performed through static data acquisition to obtain each sensor's zero offset and gain error. During actual operation, the system monitors data changes in real time and, through the dynamic compensation mechanism, calibrates out the errors caused by temperature, pressure and other factors.
Step E, data monitoring and display: the finally processed data are displayed and monitored on the upper computer. The user can view the target's motion state, acceleration curve and azimuth information in real time, and the upper computer communicates bidirectionally with the equipment through the serial port for configuration adjustment, parameter modification and other operations.
Specifically, in the step C, when the position and the azimuth are calculated, the following method is specifically adopted:
(1) Obtain the fused linear-acceleration, angular-velocity and underwater acoustic signal data from the data fusion module, construct sample data according to a predefined time window, and use the sample data to calculate the covariance matrix R_i, namely
R_i = (1/N) Σ_{n=1}^{N} x_i(n) x_i(n)^H;
where x_i(n) is the n-th snapshot output vector of the i-th sensor channel and N is the number of snapshots. The data are collected from the multichannel underwater acceleration sensor, the attitude sensor and the hydrophone; the signal covariance information of the sensor array is extracted, reflecting the spatial distribution characteristics of the signals and providing the basic data for target direction estimation.
Eigenvalue decomposition of the covariance matrix R_i gives
R_i = U_i Λ_i U_i^H;
where U_i is the eigenvector matrix and Λ_i is the diagonal matrix of corresponding eigenvalues. Eigen-decomposition extracts the principal components of the signal and yields the distribution of the signal subspace and the noise subspace; through the decomposed eigenvalues and eigenvectors, signal and noise can be effectively distinguished and the target-detection capability improved.
(2) In addition to using the information of a single channel, this embodiment further calculates the second-order statistics between channels and designs a weighted-fusion power spectrum from them:
P(θ) = Σ_j w_j P_j(θ);
where P_j(θ) represents the directional response contributed by the j-th channel and the weight w_j is determined by the energy of each channel's signal and its statistical properties. Weighting together the information of the multiple channels yields a comprehensive beam-pointing response; optimizing the direction estimate with data from several sensors reduces the influence of single-channel noise and improves robustness.
(3) The information in the signal subspace and the weighted-fusion power spectrum are weighted and fused; specifically, an enhanced beam function is defined:
B(θ) = a(θ)^H U_s U_s^H a(θ) · P(θ);
where a(θ) is the steering vector in the desired direction θ. This step projects onto the signal subspace U_s and combines the projection with the original statistics P(θ) to improve the contrast of the direction estimate. By enhancing the signal energy, the resolving power for the target azimuth is improved and the target angle becomes easier to identify; the noise-suppression advantage of the signal subspace is retained while the rich spatial information provided by the second-order statistics is introduced, giving a narrower main lobe and lower side lobes.
(4) The actual result B(θ) can be regarded as the true target azimuth distribution f(θ) convolved with a point spread function (PSF), namely:
B(θ) = f(θ) * PSF(θ);
This formula shows that the enhanced beam function is a spread (blurred) version of the target azimuth distribution, meaning the original direction estimate may be broadened; further optimization is therefore required to obtain sharper target-azimuth information. A deconvolution algorithm processes f(θ) iteratively to recover a sharper target distribution.
The iterative update formula is:
f^(k+1)(θ) = f^(k)(θ) · [ PSF(−θ) * ( B(θ) / (PSF * f^(k))(θ) ) ];
where f^(0)(θ) is initialized from B(θ), and PSF(−θ) is the flipped version of PSF(θ).
The final target direction estimate, the output after the iteration converges, is:
θ̂ = arg max_θ f^(K)(θ);
where K is the number of steps at which the iteration termination condition is satisfied. After the iterative updates, the estimated target azimuth distribution f^(K)(θ) forms a clear peak in the correct direction; these significant peaks constitute the direction-estimation result and accurately determine the target bearing.
It can be seen from the equation that the Eigen-AMVDR algorithm removes the noise subspace, leaving only the signal subspace in the output covariance matrix. The noise output power is reduced, and meanwhile, the output energy in the expected direction is basically kept unchanged, so that the signal-to-noise ratio of the output is improved, and a sharper direction estimation spectrum is obtained.
The scheme provides a brand new solution for underwater target detection by integrating a high-precision sensor, multistage data processing, intelligent calibration and high-efficiency communication technology, effectively improves detection precision and system stability, and has wide application potential.
The present invention is not limited to the above embodiments; equivalent embodiments obtained by changing or modifying the technical content disclosed above may be applied to other fields, but any simple modification, equivalent change or refinement made to the above embodiments according to the technical substance of the present invention, without departing from the content of the technical solution, still falls within the protection scope of the technical solution of the present invention.

Claims (10)

1. The intelligent underwater acoustic signal processing system based on multi-sensor data fusion is characterized by comprising a main control unit, and a data processing module and a multi-sensor module which are connected with the main control unit, wherein the multi-sensor module comprises a hydrophone, an underwater acceleration sensor, an attitude sensor and a temperature sensor, and respectively and correspondingly acquires an underwater acoustic signal, a linear acceleration signal, an angular velocity signal and a temperature signal;
The data processing module comprises a signal conditioning module and an analog-to-digital conversion module, wherein the analog signals acquired by the sensor module are amplified and conditioned through the signal conditioning module, and the amplified and conditioned signals are sent to the analog-to-digital conversion module for digitization;
The main control unit comprises a data fusion module, an azimuth calculation module and a calibration module, wherein the data fusion module receives output information of the data processing module for data fusion, inputs a data fusion result into the azimuth calculation module, performs beam forming and target direction estimation by combining Eigen-AMVDR algorithm, finally outputs target azimuth and motion state parameters, and ensures output accuracy by combining the calibration module;
the main control unit is connected with the upper computer through the communication module so as to realize data transmission and remote monitoring.
2. The intelligent underwater acoustic signal processing system based on multi-sensor data fusion according to claim 1, wherein the fusion processing performed by the data fusion module comprises inertial data fusion and underwater acoustic signal fusion:
The inertial data fusion comprises the steps of firstly fusing linear acceleration and angular velocity signals, adopting a complementary filtering method to calculate a current attitude angle, constructing a rotation matrix based on the current attitude angle to perform attitude compensation on the linear acceleration signals, combining dynamic weight distribution to perform weighted fusion, calculating a multi-channel weighted average value as a fusion result, and finally synthesizing corrected linear acceleration data;
The underwater acoustic signal fusion comprises correcting the receiving direction of the underwater acoustic signals in real time using the rotation matrix, synchronously packaging the corrected underwater acoustic signals and the inertial-fusion data under a unified timestamp and attitude reference, and passing them to the azimuth calculation module.
3. The intelligent underwater acoustic signal processing system based on multi-sensor data fusion according to claim 2, wherein the azimuth calculation module constructs a covariance matrix of the received signals and performs eigen-decomposition on it to obtain eigenvalues and eigenvectors, wherein the eigenvectors with large eigenvalues correspond to the signal subspace and those with small eigenvalues correspond to the noise subspace; the received signals are projected onto the signal subspace and the noise subspace using the eigenvectors to separate the signal and noise components;
Applying an Eigen-AMVDR algorithm in a signal subspace, calculating an optimal weight vector W so as to maximize the gain in a target direction, and simultaneously suppressing interference and noise in other directions, specifically:
(1) Obtaining the linear acceleration, the angular velocity and the underwater sound signal data after fusion from a data fusion module, constructing sample data according to a predefined time window, calculating a covariance matrix R i by using the sample data, then carrying out eigenvalue decomposition on the covariance matrix R i, extracting main components of the signals, and obtaining the distribution condition of a signal subspace and a noise subspace;
(2) Second-order statistics among multiple channels are fused, and a weighted-fusion power spectrum is designed from them:
P(θ) = Σ_j w_j P_j(θ);
where P_j(θ) indicates the power response function of the j-th channel in direction θ, and the weight w_j is determined by the signal energy of each channel and its statistical characteristics; the comprehensive beam-pointing response is calculated by weighting and fusing the information of the multiple channels;
(3) θ is traversed within a set angle range, the power spectrum P(θ) is calculated for each angle, the result is drawn to form a beam pattern, and the information in the signal subspace is weighted and fused with the weighted-fusion power spectrum;
Specifically, an enhanced beam function is defined:
B(θ) = a^H(θ) U_s U_s^H a(θ) · P(θ);
wherein a(θ) is the steering vector in the desired direction θ, a^H(θ) is the conjugate transpose of a(θ), and U_s is the matrix of signal-subspace eigenvectors;
(4) The actual result B(θ) is regarded as the convolution of the true target azimuth distribution D(θ) with a point spread function PSF(θ):
B(θ) = D(θ) * PSF(θ);
B(θ) is processed iteratively with a deconvolution algorithm to solve for the target distribution D(θ), the iterative update formula being:
D^(k+1)(θ) = D^(k)(θ) · { [ B(θ) / (D^(k)(θ) * PSF(θ)) ] * PSF(−θ) };
wherein D^(0)(θ) is initialized from B(θ), and PSF(−θ) is the flipped version of PSF(θ);
The final target direction estimate θ̂ is the output after the iterative process converges:
θ̂ = arg max_θ D^(K)(θ);
wherein K is the number of iteration steps when the termination condition is satisfied; through the iterative updates, the finally estimated target azimuth distribution D^(K)(θ) forms a clear peak in the correct direction, so that the target bearing is determined accurately.
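The pipeline of steps (1)–(4) can be sketched in Python with NumPy. The array geometry (uniform linear array at half-wavelength spacing), the signal-subspace beam expression and all parameter values are illustrative assumptions rather than the patent's exact formulation:

```python
import numpy as np

def steering_vector(theta, n_sensors, spacing=0.5):
    # Uniform linear array steering vector; spacing in wavelengths (assumed).
    k = np.arange(n_sensors)
    return np.exp(-2j * np.pi * spacing * k * np.sin(theta))

def eigen_power_spectrum(snapshots, thetas, n_sources=1):
    # snapshots: (n_sensors, n_samples) complex samples of one time window.
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]  # covariance matrix
    vals, vecs = np.linalg.eigh(R)                           # eigenvalues ascending
    Us = vecs[:, -n_sources:]                                # signal subspace
    spec = np.empty(len(thetas))
    for i, th in enumerate(thetas):
        a = steering_vector(th, snapshots.shape[0])
        proj = Us.conj().T @ a                               # subspace projection
        mvdr = 1.0 / np.real(a.conj() @ np.linalg.solve(R, a))  # MVDR response
        spec[i] = np.real(proj.conj() @ proj) * mvdr
    return spec / spec.max()

def richardson_lucy(B, psf, n_iter=50):
    # Deconvolve the beam pattern B = D * psf by iterative multiplicative
    # updates; the flipped psf implements correlation with PSF(-theta).
    D = B.copy()
    psf_flipped = psf[::-1]
    for _ in range(n_iter):
        denom = np.convolve(D, psf, mode="same")
        denom[denom < 1e-12] = 1e-12                         # avoid divide-by-zero
        D = D * np.convolve(B / denom, psf_flipped, mode="same")
    return D
```

Scanning `eigen_power_spectrum` over a grid of angles yields the beam pattern of step (3); passing that pattern through `richardson_lucy` sharpens the main lobe as in step (4), and the grid angle at the maximum of the result is the estimated bearing.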
4. The intelligent underwater acoustic signal processing system based on multi-sensor data fusion according to claim 1, wherein the calibration module realizes real-time dynamic compensation as follows:
During system operation, the output data of each sensor are continuously collected and compared in real time; the linear acceleration data collected each time are analyzed with a sliding-window technique and their mean and variance are calculated; if the data of several consecutive windows exceed a set tolerance range, a deviation is deemed to exist; when a deviation is detected, the calibration module applies a low-pass filter to remove burst noise, recalculates new bias and gain values, and dynamically loads the new bias and gain parameters into the calibration module for calibration.
5. The intelligent underwater acoustic signal processing system based on multi-sensor data fusion according to claim 1, wherein the signal conditioning module is designed as a differential amplification circuit; the differential amplification circuit uses the fully differential amplifier OPA1632 to build a variable-gain differential amplification network, and comprises a first OPA module and a second OPA module connected in series; the positive input end of the first OPA module is connected with a resistor R1, and the negative input end of the first OPA module is connected with a resistor R2; a resistor R3 and a capacitor C1 are connected in parallel between the positive input end and the negative output end of the first OPA module, and a resistor R4 and a capacitor C2 are connected in parallel between the negative input end and the positive output end of the first OPA module; the positive input end of the second OPA module is connected with a resistor R5, and the negative input end of the second OPA module is connected with a resistor R6; a resistor R7 and a capacitor C3 are connected in parallel between the positive input end and the negative output end of the second OPA module, and a resistor R8 and a capacitor C4 are connected in parallel between the negative input end and the positive output end of the second OPA module; the output ends of the second OPA module are connected with capacitors C5 and C6 and resistors R9 and R10 respectively.
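For a fully differential stage of this type, the differential gain is set by the ratio of the feedback resistor to the input resistor, and the parallel RC in the feedback path rolls the gain off above a cutoff frequency. A minimal sketch of these two design relations, with component values chosen purely for illustration (they are not stated in the patent):

```python
import math

def diff_stage_gain(r_in, r_f):
    # Differential gain of one fully differential stage: G = Rf / Rin.
    return r_f / r_in

def feedback_cutoff_hz(r_f, c_f):
    # Parallel RC in the feedback path gives a first-order low-pass
    # with cutoff f_c = 1 / (2 * pi * Rf * Cf).
    return 1.0 / (2 * math.pi * r_f * c_f)

# Illustrative values: R1 = 1 kOhm, R3 = 10 kOhm, C1 = 1.6 nF
gain = diff_stage_gain(1e3, 10e3)      # gain of 10 (20 dB)
fc = feedback_cutoff_hz(10e3, 1.6e-9)  # cutoff near 10 kHz
```

Cascading two such stages multiplies their gains, which is how a series pair of OPA modules yields a variable overall gain.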
6. The intelligent underwater acoustic signal processing system based on multi-sensor data fusion according to claim 1, wherein after the data processed by the data processing module are transmitted to the main control unit, data preprocessing is performed first and the preprocessed data are then fused by the data fusion module; the data preprocessing specifically comprises:
1) Time-space alignment, namely marking the sampling data of the different sensors with unified time stamps to achieve time synchronization of the multi-source data;
2) Static reference calibration, namely collecting data in a static state during the system start-up stage and constructing a linear calibration model;
3) Temperature compensation, namely correcting the underwater acoustic signal, angular velocity and linear acceleration measurements based on temperature data acquired in real time, in combination with the temperature drift model of each sensor.
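The three preprocessing steps above can be sketched as small helper functions; the linear interpolation for alignment, the zero-bias estimate from the static window and the first-order drift model (with assumed coefficient `k_drift` and reference temperature) are illustrative choices:

```python
import numpy as np

def align_to_common_time(t_common, t_sensor, values):
    # Time-space alignment: resample one sensor stream onto the unified
    # time-stamp grid by linear interpolation.
    return np.interp(t_common, t_sensor, values)

def static_reference_bias(static_samples, true_value=0.0):
    # Static reference calibration: in the stationary start-up phase the
    # true value is known (e.g. zero angular rate), so the window mean
    # gives the zero bias of the linear model  measured = true + bias.
    return float(np.mean(static_samples)) - true_value

def temperature_compensate(raw, temp_c, k_drift, t_ref=25.0):
    # Temperature compensation with a first-order drift model:
    # corrected = raw - k_drift * (T - T_ref).
    return raw - k_drift * (temp_c - t_ref)
```

Each sensor stream is first aligned, then de-biased with its static calibration, then temperature-corrected before entering the data fusion module.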
7. A method of the intelligent underwater acoustic signal processing system based on multi-sensor data fusion according to claim 3, characterized by comprising the following steps:
Step A, data acquisition and preprocessing: the multi-sensor module acquires the relevant underwater signals; the analog signals acquired by the sensor module are amplified and conditioned by the signal conditioning module, the amplified and conditioned signals are sent to the analog-to-digital conversion module for digitization, and the digitized signals are uniformly input into the main control unit for signal preprocessing;
Step B, data fusion and filtering: the data preprocessed by the main control unit undergo multi-source information fusion in the data fusion module, attitude compensation is applied to the linear acceleration signal, and noise is removed by filtering;
Step C, position and direction calculation: in combination with the azimuth calculation module, the Eigen-AMVDR algorithm is used to extract the eigenvectors of the signal by calculating its covariance matrix, and the direction and azimuth of the target are thereby estimated;
Step D, error calibration and compensation: in combination with the calibration module, reference calibration is performed through static data acquisition to obtain the zero bias and gain error of each sensor, and during actual operation calibration is performed by monitoring data changes in real time in combination with a dynamic compensation mechanism;
Step E, data monitoring and display: the finally processed data are displayed and monitored by an upper computer.
8. The method of the intelligent underwater acoustic signal processing system based on multi-sensor data fusion according to claim 7, wherein in step A the main control unit performs signal preprocessing as follows:
1) Time-space alignment, namely marking the sampling data of the different sensors with unified time stamps to achieve time synchronization of the multi-source data;
2) Static reference calibration, namely collecting data in a static state during the system start-up stage and constructing a linear calibration model;
3) Temperature compensation, namely correcting the underwater acoustic signal, angular velocity and linear acceleration measurements based on temperature data acquired in real time, in combination with the temperature drift model of each sensor, so as to reduce the zero-point offset and gain error caused by temperature changes.
9. The method of claim 7, wherein in step C the azimuth calculation is performed on the following principle:
The azimuth calculation module constructs a covariance matrix for the received signals and performs eigendecomposition on the covariance matrix to obtain eigenvalues and eigenvectors, wherein the eigenvectors associated with large eigenvalues correspond to the signal subspace and the eigenvectors associated with small eigenvalues correspond to the noise subspace, and the received signals are projected onto the signal subspace and the noise subspace by means of the eigenvectors so as to separate the signal and noise components;
The Eigen-AMVDR algorithm is applied in the signal subspace to calculate an optimal weight vector W that maximizes the gain in the target direction while suppressing interference and noise from other directions, specifically:
(1) The fused linear acceleration, angular velocity and underwater acoustic signal data are obtained from the data fusion module, sample data are constructed according to a predefined time window, a covariance matrix R_i is calculated from the sample data, eigenvalue decomposition is then performed on R_i, the principal components of the signal are extracted, and the distribution of the signal subspace and the noise subspace is obtained;
(2) Second-order statistics among the plurality of channels are fused, and a weighted-fusion power spectrum is designed from them:
P(θ) = Σ_j w_j · P_j(θ);
wherein P_j(θ) denotes the power response function of the j-th channel in the direction θ, the weight w_j is determined by the signal energy of each channel and its statistical characteristics, and the composite beam-pointing response is obtained by weighted fusion of the information of the plurality of channels;
(3) θ is traversed within a set angle range, the power spectrum P(θ) is calculated for each angle, the result is drawn to form a beam pattern, and the information in the signal subspace is weighted and fused with the weighted-fusion power spectrum;
Specifically, an enhanced beam function is defined:
B(θ) = a^H(θ) U_s U_s^H a(θ) · P(θ);
wherein a(θ) is the steering vector in the desired direction θ, a^H(θ) is the conjugate transpose of a(θ), and U_s is the matrix of signal-subspace eigenvectors;
(4) The actual result B(θ) is regarded as the convolution of the true target azimuth distribution D(θ) with a point spread function PSF(θ):
B(θ) = D(θ) * PSF(θ);
B(θ) is processed iteratively with a deconvolution algorithm to solve for the target distribution D(θ), the iterative update formula being:
D^(k+1)(θ) = D^(k)(θ) · { [ B(θ) / (D^(k)(θ) * PSF(θ)) ] * PSF(−θ) };
wherein D^(0)(θ) is initialized from B(θ), and PSF(−θ) is the flipped version of PSF(θ);
The final target direction estimate θ̂ is the output after the iterative process converges:
θ̂ = arg max_θ D^(K)(θ);
wherein K is the number of iteration steps when the termination condition is satisfied; through the iterative updates, the finally estimated target azimuth distribution D^(K)(θ) forms a clear peak in the correct direction, so that the target bearing is determined accurately.
10. The method of claim 7, wherein step D is realized on the following principle:
During system operation, the output data of each sensor are continuously collected and compared in real time; the linear acceleration data collected each time are analyzed with a sliding-window technique and their mean and variance are calculated; if the data of several consecutive windows exceed a set tolerance range, a deviation is deemed to exist; when a deviation is detected, the calibration module applies a low-pass filter to remove burst noise, recalculates new bias and gain values, and dynamically loads the new bias and gain parameters into the calibration module for calibration.
CN202510652324.1A 2025-05-21 2025-05-21 Intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion Active CN120180200B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202510652324.1A CN120180200B (en) 2025-05-21 2025-05-21 Intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion


Publications (2)

Publication Number Publication Date
CN120180200A true CN120180200A (en) 2025-06-20
CN120180200B CN120180200B (en) 2025-07-29

Family

ID=96036270

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202510652324.1A Active CN120180200B (en) 2025-05-21 2025-05-21 Intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion

Country Status (1)

Country Link
CN (1) CN120180200B (en)


Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6658234B1 (en) * 1995-06-02 2003-12-02 Northrop Grumman Corporation Method for extending the effective dynamic range of a radio receiver system
US20100130873A1 (en) * 2008-04-03 2010-05-27 Kai Sensors, Inc. Non-contact physiologic motion sensors and methods for use
US20100152600A1 (en) * 2008-04-03 2010-06-17 Kai Sensors, Inc. Non-contact physiologic motion sensors and methods for use
US20120093344A1 (en) * 2009-04-09 2012-04-19 Ntnu Technology Transfer As Optimal modal beamformer for sensor arrays
CN110652299A (en) * 2019-08-20 2020-01-07 南京航空航天大学 Multi-source sensing information fusion system for soft exoskeleton of lower limbs
CN115690368A (en) * 2022-10-28 2023-02-03 中国科学院深圳先进技术研究院 Nuclear radiation dose augmented reality interaction visualization method and system based on three-dimensional tracking registration
CN117387606A (en) * 2023-10-12 2024-01-12 北京理工大学 A method and device for multi-sensor information fusion of high-rotating flying objects throughout the entire process
CN118776606A (en) * 2023-04-04 2024-10-15 全研科技有限公司 Dynamic system and method for measuring precision alignment platform using meta-neural sensing device
US20240361445A1 (en) * 2023-04-26 2024-10-31 Nxp B.V. Updating radar sensor accuracy measurements for object tracking
CN119334353A (en) * 2024-10-30 2025-01-21 深圳叩鼎科技有限责任公司 Smart wristband location tracking and navigation method, device, equipment and storage medium
CN119541080A (en) * 2025-01-23 2025-02-28 深圳富士伟业科技有限公司 A method and system for analyzing automobile data based on intelligent diagnostic instrument
CN119688009A (en) * 2025-02-24 2025-03-25 贵州大学 Intelligent sensing method for state of pressure forming equipment by fusing multi-source sensor signals
CN119739281A (en) * 2024-12-03 2025-04-01 华南师范大学 A somatosensory glove and system based on flexible photoelectric dual-modal sensor
CN119989079A (en) * 2025-01-14 2025-05-13 华东交通大学 Bearing intelligent diagnosis method, system, readable storage medium and computer


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIBIN DU: "A method for underwater acoustic target recognition based on the delay-doppler joint feature", Remote Sensing, 2 June 2024 (2024-06-02) *
SUN Yushan; WAN Lei; PANG Yongjie: "Research Status and Prospects of Underwater Vehicle Navigation Technology", Robot Technique and Application, no. 01, 15 February 2010 (2010-02-15) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120973867A (en) * 2025-10-16 2025-11-18 河北省唐山水文勘测研究中心(河北省唐山水平衡测试中心) Multi-sensor fusion method for synchronous acquisition of multi-modal hydrological data
CN120973867B (en) * 2025-10-16 2026-01-27 河北省唐山水文勘测研究中心(河北省唐山水平衡测试中心) Multi-sensor fusion method for synchronous acquisition of multi-modal hydrological data

Also Published As

Publication number Publication date
CN120180200B (en) 2025-07-29

Similar Documents

Publication Publication Date Title
CN120180200B (en) Intelligent underwater acoustic signal processing system and method based on multi-sensor data fusion
CN109143942B (en) Attitude sensor control system
CN112985384A (en) Anti-interference magnetic course angle optimization system
CN116499465B (en) Multi-source fusion positioning method for underground pipe network robot based on adaptive factor graph weights
CN115435786A (en) Real-time monitoring method and monitoring system for attitude of power transmission line
CN117848320A (en) An airborne multi-source fusion navigation method based on data screening and adjustable trust
CN120314529A (en) A system and method for error correction of marine environment observation data
CN105737793A (en) Roll angle measurement unit and measurement method
CN117146810B (en) Combined positioning system based on satellite navigation and MEMS inertial navigation
CN110672103A (en) A multi-sensor target tracking filtering method and system
CN114624671B (en) A method for recovering the characteristics of saturated waveform signals from satellite-borne laser altimetry
KR101724330B1 (en) Remote measuring system and compensation method of the doppler frequency shift usinf the same, and apparatus thereof
CN116380054B (en) Aircraft attitude calculation method
CN111491368A (en) Correction method and correction device suitable for AOA algorithm positioning base station
JP2965039B1 (en) High bandwidth attitude control method and high bandwidth attitude control device for artificial satellite
CN104792336A (en) Measurement method and device of flying state
CN102075302B (en) Error estimation method and device for vector sensor array
CN115576337A (en) Real-time terrain and landform following method for mine field detection flight
CN115389783A (en) A non-linear compensation circuit and compensation method for accelerometer based on ARM
CN120122106B (en) A method, system and storage medium for ultrasonic height measurement of aircraft based on dynamic attitude compensation
CN119276408B (en) High dynamic motion carrier communication signal synchronization method based on motion information measurement
CN118565434B (en) Vehicle altitude analysis method and system based on dynamic air pressure monitoring
CN120907535A (en) State estimation method for mobile robot navigation
CN120781128A (en) A multi-sensor attitude confidence solution method for UAV flight control
CN118565471A (en) A fault-tolerant intelligent connected vehicle collaborative positioning method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant