CN120746510A - Low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving - Google Patents
- Publication number
- CN120746510A (application CN202511240561.3A)
- Authority
- CN
- China
- Prior art keywords
- data
- target
- rule
- alarm
- low
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/886—Radar or analogous systems specially adapted for specific applications for alarm systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/21—Design, administration or maintenance of databases
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/253—Fusion techniques of extracted features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/04—Inference or reasoning models
- G06N5/041—Abduction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/103—Workflow collaboration or project management
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Business, Economics & Management (AREA)
- Strategic Management (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Engineering & Computer Science (AREA)
- Human Resources & Organizations (AREA)
- Artificial Intelligence (AREA)
- Entrepreneurship & Innovation (AREA)
- Evolutionary Computation (AREA)
- Databases & Information Systems (AREA)
- Software Systems (AREA)
- Quality & Reliability (AREA)
- Computing Systems (AREA)
- Computational Linguistics (AREA)
- Evolutionary Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Economics (AREA)
- Marketing (AREA)
- Operations Research (AREA)
- Mathematical Physics (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Electromagnetism (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention relates to the field of low-altitude airspace monitoring, and in particular to a low-altitude monitoring system driven by multi-source heterogeneous perception fusion and a rule engine. The system comprises a perception access layer, a processing and computing layer, and an application support layer: the perception access layer performs data standardization and protocol adaptation and publishes data streams through message middleware; the processing and computing layer generates structured alarm events; and the application support layer provides three-dimensional visual interactive display of target situation data and alarm events.
Description
Technical Field
The invention relates to the field of low-altitude airspace monitoring, in particular to a low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving.
Background
With the rapid development of novel low-altitude aircraft such as unmanned aerial vehicles (UAVs) and eVTOLs (electric vertical take-off and landing aircraft), low-altitude flight activity has become high-frequency, multi-type, and highly concurrent, posing unprecedented challenges to traditional airspace management. In low-altitude airspace (typically from ground level to 3000 m), aircraft are small, fly at unstable speeds along complex trajectories, and are frequently occluded by the environment; existing medium- and high-altitude airway monitoring systems (such as ADS-B and SSR) struggle to meet low-altitude requirements in sensing range, recognition precision, and response capability. Mainstream technologies in the low-altitude monitoring and early-warning field include civil aviation airspace management systems, single-point counter-UAV systems, video+AI recognition systems, and local multi-sensor integrated systems. However, these technical paths have significant limitations when dealing with the complexity and diversity of the low-altitude domain.
Civil aviation air traffic control systems mainly rely on fixed routes, high-altitude equipment, and authorized communication protocols; they suit only traditional general-aviation and civil-aviation scenarios and have limited ability to perceive illegal low-altitude targets. Single-point counter-UAV systems usually employ radio-frequency detection, radar monitoring, or electronic jamming equipment, but such systems are decentralized and non-standardized, are prone to environmental interference, and have low identification accuracy. Optoelectronic video-based target detection schemes introduce AI recognition technology but, being limited to a single data source, suffer from viewing-angle, weather, and occlusion problems, small coverage, and high false-recognition rates. Some emerging platforms attempt to integrate radar, video, ADS-B, and other sensing means, but most are monolithic or closed platforms lacking a cross-device, cross-protocol data fusion mechanism; sensing capability is therefore fragmented, data are hard to share, algorithms extend poorly, and system upgrades are costly.
The core technical problem of existing low-altitude monitoring systems is the lack of efficient multi-source heterogeneous data fusion and intelligent processing. Existing multi-sensor systems generally achieve only simple superposition or independent processing of data and struggle to produce unified low-altitude target situation data through spatio-temporal registration and algorithmic fusion. This fragmented perception yields low target recognition precision and discontinuous track construction, and cannot accurately capture complex flight behaviors (such as boundary crossing, circling, and loss of link). In addition, the software of existing systems is limited to basic data display and manual rule setting, lacking a configurable intelligent rule engine and an automatic early-warning mechanism, so the system can hardly adapt dynamically to supervision requirements in changing scenarios.
Disclosure of Invention
To address these problems, the invention provides a global low-altitude intelligent monitoring and early-warning software system driven by multi-source heterogeneous sensing fusion and a rule engine. Through a layered architecture of a sensing access layer, a processing and computing layer, and an application support layer, combined with an event-driven asynchronous communication mechanism, the system achieves unified access of multi-source data, intelligent fusion, rule-driven automatic early warning, and three-dimensional visual interactive display. It thereby solves the main technical problems of insufficient multi-source heterogeneous data fusion and weak intelligent processing, and provides a systematic, intelligent, engineered solution for global monitoring and efficient supervision of low-altitude airspace.
The invention is realized by the following technical scheme:
A low-altitude monitoring system based on multi-source heterogeneous awareness fusion and rule engine driving, comprising:
The sensing access layer accesses multi-source heterogeneous data from radar, optoelectronic, radio-frequency, ADS-B, and meteorological sensors, completes data standardization and protocol adaptation, and publishes data streams through message middleware;
The processing and computing layer implements data fusion, target identification, rule-driven judgment, alarm generation, and event distribution on a micro-service architecture, wherein the data fusion and target identification services receive data streams from the perception access layer in real time and output fused target situation data, and the rule engine service consumes the target situation data in real time and performs rule matching to generate structured alarm events;
The application support layer provides three-dimensional visual interactive display of target situation data and alarm events, and supports real-time alarm push, closed-loop management of alarm handling, and historical data analysis, wherein all functional services communicate via message queues with event-driven asynchronous communication, distributed deployment, and elastic scaling.
Furthermore, the perception access layer adopts a plug-in protocol-adapter architecture that supports dynamic loading and unloading of data protocol adapters for the radar, video, radio-frequency, ADS-B, and meteorological sensors;
the data pre-parsing component of the perception access layer extracts target position coordinates, speed vectors, target types, signal strength, and timestamps from raw sensor data in real time, packages them uniformly into structured messages, and publishes them to a Kafka message queue; a Topic partitioning strategy based on sensor type and geographic area allows service modules to subscribe independently to different partitions for parallel consumption and high-concurrency processing.
Furthermore, the data fusion service of the processing and computing layer adopts a spatio-temporal registration algorithm to perform time synchronization and unified spatial coordinate mapping of multi-source data, and uses an extended Kalman filter for target state estimation and smoothing; the target recognition service integrates a deep learning model to process video streams from the optoelectronic sensors in real time, outputting target types, bounding boxes, and confidence scores, and the results are further fused through the message queue.
Furthermore, the rule engine service adopts Drools or a similar rule engine and defines rules in a structured JSON format, including geofence rules, speed/altitude limit rules, link-loss and boundary-crossing rules, black/white-list rules, and combined-behavior rules; the rule engine subscribes to the fused target situation data stream in real time and checks rule trigger conditions one by one; when a rule is hit, it generates an early-warning event containing the target ID, rule type, alarm level, trigger position, trigger time, and other information, and forwards it to the alarm service.
Furthermore, the application support layer uses a three-dimensional GIS engine to build a situation display page that shows the three-dimensional spatial distribution, historical tracks, alarm areas, and rule-hit records of low-altitude targets in real time, and provides an alarm management interface with a real-time alarm list.
Furthermore, the overall system is deployed using Docker containers and the Kubernetes orchestration tool, with cross-region distributed deployment and elastic scaling capability; within the event-stream-driven framework, each event carries source event chain information, including data sources, acquisition time, target identification history, rule-hit history, and evidence indexes, enabling full-link data backtracking and audit trails.
Furthermore, the system uses a Redis database for real-time shared state storage, including the current online target states, sensor health states, and the active alarm event list;
the system uses PostgreSQL with the PostGIS extension for rule definitions, user management, alarm event records, and audit logs, providing structured data storage and spatial data retrieval;
the system uses a time-series database to store historical target track data and alarm trend data, supporting efficient historical backtracking analysis and rule optimization.
Furthermore, the system integrates a multi-modal large target-recognition model to process radar, optoelectronic, and infrared images, improving the accuracy, anti-interference capability, and target-type generalization of low-altitude target recognition;
the system also integrates a large behavior-modeling model to analyze a target's time-series track data and identify high-risk flight behaviors, the results of which serve as input feature conditions for the rule engine.
The invention has the beneficial effects that:
(1) Through the plug-in protocol adapter, the invention achieves unified access for multiple sensor types (radar, optoelectronic, radio-frequency, ADS-B, and meteorological) and, by combining spatio-temporal registration with extended Kalman filtering, realizes target state estimation and track smoothing, overcoming the limitation of existing systems that handle only a single data source or simple superposition, and significantly improving target recognition precision and track continuity;
(2) The invention introduces a rule-engine-driven mechanism supporting multiple rule configurations, such as geofences, speed/altitude limits, link-loss and boundary-crossing, black/white lists, and combined behaviors; it can dynamically match target situations with second-level response, enabling real-time early warning and graded alarms in complex scenarios and overcoming the inability of traditional fixed-threshold systems to adapt to changing scenes;
(3) The invention adopts an event-stream-driven framework supported by message queues, combined with micro-services and containerized deployment, achieving decoupling between modules, elastic scaling, and cross-region cooperative processing; this solves the poor extensibility and insufficient fault tolerance of traditional centralized monolithic architectures and guarantees system stability and real-time performance under high concurrency and large-scale sensor access;
(4) The invention attaches complete source event chain information to each alarm in the event stream, including data source, acquisition time, identification history, and rule-hit records, and builds unified storage combining a Redis cache, PostgreSQL, and a time-series database, supporting real-time online monitoring, post-hoc track playback, alarm tracing, and rule optimization, and meeting compliance-supervision and judicial-evidence requirements;
(5) Through the application support layer, the invention realizes low-altitude situation visualization based on a three-dimensional GIS engine, supporting real-time target display, historical track playback, alarm-area presentation, and rule configuration, markedly improving supervisors' visual perception of complex airspace and their ability to respond quickly;
(6) By integrating a multi-modal large target-recognition model and a large track-behavior-modeling model, the invention offers significant advantages in target recognition precision, complex behavior analysis, and risk assessment; it can reduce false-alarm and missed-alarm rates and improves adaptability to novel low-altitude aircraft and complex scenarios.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and a person skilled in the art may obtain other drawings from them without inventive effort.
FIG. 1 is a block diagram of the overall architecture of a low-altitude monitoring system based on multi-source heterogeneous awareness fusion and rule engine driving according to the present invention;
FIG. 2 is a schematic diagram of the interaction flow of core modules of the low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving;
FIG. 3 is a schematic diagram of the workflow of the AI enhancement module of the low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving according to the present invention;
Fig. 4 is a schematic diagram of the working principle of the rule-driven intelligent early warning module of the low-altitude monitoring system based on multi-source heterogeneous sensing fusion and rule engine driving;
FIG. 5 is a schematic flow chart of a deployment and collaboration scheme of a low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving;
FIG. 6 is a schematic diagram of a terminal device of a low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving;
FIG. 7 is a schematic diagram of a readable storage medium of a low-altitude monitoring system based on multi-source heterogeneous awareness fusion and rule engine driving according to the present invention;
In the figure, 200-terminal device, 210-memory, 211-RAM, 212-cache, 213-ROM, 214-program/utility, 215-program module, 220-processor, 230-bus, 240-external device, 250-I/O interface, 260-network adapter, 300-program product.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is further described in detail below with reference to the examples and accompanying drawings. The exemplary embodiments and their descriptions are intended only to illustrate the invention and should not be construed as limiting it.
Example 1
Referring to FIG. 1, this embodiment describes in detail the implementation of a global low-altitude intelligent monitoring and early-warning software system for urban and complex-terrain low-altitude scenarios from 0 to 3000 m. The implementation includes the technical features of different modules, which map respectively to the perception access layer, the processing and computing layer, and the application support layer, forming an event-driven, micro-service-decoupled distributed platform that supports low-altitude space monitoring and early warning with high precision, high real-time performance, and strong extensibility.
The sensor access and standardization adaptation module resides in the sensing access layer and is the interface between the system and physical sensors. It is responsible for protocol decoupling, data adaptation, standardization, and time synchronization of multi-source heterogeneous data, ensuring uniform and efficient data streams and solving the problems of complex multi-source sensor access and insufficient data standardization described in the background art. It adopts a plug-in protocol-adapter architecture that supports dynamic loading and unloading of communication protocols for various sensor devices, including TCP/IP, UDP, serial ports, HTTP, WebSocket, and RTSP. Supported sensors include radar (2D/3D, millimeter-wave), optoelectronic devices (visible/infrared cameras), radio-frequency detection devices (spectrometers, signal reconnaissance), ADS-B receivers, and ground weather stations (measuring wind speed, temperature, visibility, and precipitation). In this embodiment, radar data can be parsed via the ASTERIX protocol, optoelectronic devices provide video streams via RTSP, ADS-B is parsed from aviation broadcast messages, and weather stations transmit data via serial ports.
Data pre-parsing: a data pre-parser extracts key fields from raw data: for radar, plot positions (longitude, latitude, altitude), speed vectors, RCS values, and timestamps; for optoelectronics, video frame rate, resolution, and frame timestamps; for radio frequency, frequency bands, signal strength, and fingerprint features; for ADS-B, target identification codes, positions, speeds, and altitudes; for weather, wind speed, wind direction, temperature, and precipitation. The extracted fields are packaged uniformly into structured data packets in JSON Schema or Protobuf format containing unified metadata (such as device ID and timestamp).
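As a minimal illustration of such a structured packet (the field names and layout below are assumptions for the sketch, not taken from the patent), a pre-parser might wrap a radar plot like this:

```python
import json
import time

def package_radar_plot(device_id, lon, lat, alt, speed_mps, rcs, ts=None):
    """Wrap raw radar fields into a structured JSON packet with unified
    metadata. All field names here are illustrative assumptions."""
    packet = {
        "device_id": device_id,
        "sensor_type": "radar",
        "timestamp_ms": int((ts if ts is not None else time.time()) * 1000),
        "position": {"lon": lon, "lat": lat, "alt_m": alt},
        "speed_mps": speed_mps,
        "rcs": rcs,
    }
    return json.dumps(packet)

msg = package_radar_plot("RADAR_001", 121.47, 31.23, 120.0, 18.5, 0.85, ts=1700000000.0)
```

A Protobuf schema would carry the same fields with a fixed binary layout; JSON is shown only because it is self-describing.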
Time synchronization: cross-device time alignment is achieved via the NTP protocol or a GPS clock source, with millisecond-level precision, ensuring the temporal consistency of subsequent data fusion. For example, radar data sampled at 10 Hz is aligned with optoelectronic video frames at 2 FPS through temporal interpolation, eliminating the sampling-frequency difference.
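The interpolation step can be sketched as follows (a simplified 1-D linear interpolation; sample values are illustrative):

```python
def interpolate_position(track, t_query):
    """Linearly interpolate a (timestamp, value) track at time t_query.
    track: sorted list of (t, value) pairs, e.g. 10 Hz radar samples;
    t_query: e.g. the timestamp of a 2 FPS video frame."""
    for (t0, v0), (t1, v1) in zip(track, track[1:]):
        if t0 <= t_query <= t1:
            w = (t_query - t0) / (t1 - t0)
            return v0 + w * (v1 - v0)
    raise ValueError("t_query outside track span")

radar = [(0.0, 100.0), (0.1, 101.0), (0.2, 102.0)]  # 10 Hz samples
print(interpolate_position(radar, 0.15))  # 101.5
```

In practice each coordinate (longitude, latitude, altitude) would be interpolated the same way against the video frame's timestamp.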
Data transmission and partitioning: standardized data are pushed to downstream modules through a Kafka message queue and written to different Topics according to sensor type, such as raw_radar_data, raw_opto_data, and raw_adsb_data. Kafka Topics are partitioned by geographic area; for example, data from the northern urban area is written to partition 0 and the eastern industrial park to partition 1, supporting parallel consumption and high-concurrency processing.
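The routing logic can be sketched without a broker (region names and the two-partition layout below follow the example in the text; a real deployment would pass the result to a Kafka producer client):

```python
# Topic selection by sensor type, partition selection by geographic area.
TOPIC_BY_SENSOR = {
    "radar": "raw_radar_data",
    "opto": "raw_opto_data",
    "adsb": "raw_adsb_data",
}
PARTITION_BY_REGION = {"city_north": 0, "east_industrial_park": 1}

def route(sensor_type, region):
    """Return the (topic, partition) a standardized message is written to."""
    return TOPIC_BY_SENSOR[sensor_type], PARTITION_BY_REGION[region]

print(route("radar", "city_north"))  # ('raw_radar_data', 0)
```

Pinning a region to a fixed partition preserves per-region message ordering while letting consumers bind to individual partitions for parallelism.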
Sensor health monitoring: the module monitors sensor connection states in real time and detects data interruption or delay. For example, if no data arrives for 10 consecutive seconds, an automatic reconnection mechanism is triggered or an alarm prompt is generated to notify operations staff, ensuring continuity of data acquisition.
The module is deployed in Docker containers, with each device driver running in an independent container, supporting dynamic loading of new device plugins. If a millimeter-wave radar is added in an urban airport scenario, the system loads its driver plugin and parses the binary data stream without modifying core code. Data pre-parsing is implemented with high-performance parsing libraries, such as libpcap for network data and FFmpeg for RTSP video streams. Parsed data is packaged into JSON containing the device ID, timestamp, and sensor-specific fields, ensuring unified processing by downstream modules. The time synchronization mechanism is based on an NTP server or GPS module, periodically calibrating sensor timestamps to within 1 ms. The Kafka Topic partitioning strategy supports localized deployment; for example, in a multi-region collaboration scenario, northern urban radar data is written to raw_radar_data partition 0 and the eastern park to partition 1, and consumer services can bind to specific partitions to improve throughput. Sensor health states are stored in Redis (key format sensor:health:<id>) for real-time inspection on the operations panel, and alarms are pushed via WebSocket when anomalies occur. Through plug-in design and asynchronous transmission, the module solves the background-art problems of non-uniform interface protocols and insufficient data timeliness, and supports high-density device access and cross-region expansion. In practical deployment, the module supports high-density sensor access in urban core areas (such as airport clearance zones), for example processing the real-time data streams of 10 radars, 20 optoelectronic cameras, and 5 radio-frequency detectors simultaneously.
The module scales automatically through Kubernetes, dynamically allocating container instances when devices are added, ensuring the stability and real-time performance of data acquisition.
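The staleness check behind the health monitoring above can be sketched as follows (a plain dict stands in for Redis here; the 10-second threshold follows the embodiment, while the function shape is an assumption):

```python
import time

STALE_AFTER_S = 10  # per the embodiment: 10 s without data triggers action

def check_health(store, sensor_id, now=None):
    """Return 'ok' or 'stale' for key 'sensor:health:<id>'.
    `store` stands in for Redis (a dict of key -> last-seen epoch seconds)."""
    now = now if now is not None else time.time()
    last = store.get(f"sensor:health:{sensor_id}")
    if last is None or now - last > STALE_AFTER_S:
        return "stale"   # real system: trigger reconnect / push WebSocket alarm
    return "ok"

store = {"sensor:health:RADAR_001": 1000.0}
print(check_health(store, "RADAR_001", now=1005.0))  # ok
print(check_health(store, "RADAR_001", now=1015.0))  # stale
```

With real Redis, the adapter would refresh the key on every received message (e.g. with a TTL), so the operations panel sees liveness without polling the sensors directly.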
The multi-source data fusion and target identification module resides in the processing and computing layer. It integrates multi-source heterogeneous data into unified target entities, performs target identification, classification, and track construction, and generates target situation data, solving the background-art problems of inaccurate target identification, insufficient fusion capacity, and unstable track construction.
Data preprocessing: the module subscribes to Kafka's raw_sensor_data Topic and performs streaming data processing, including the following steps:
Noise filtering: extended Kalman filtering (EKF), particle filtering, or median filtering removes radar plot jitter, weather interference in optoelectronic images, and radio-frequency signal noise. For example, Kalman filtering smooths radar plot positions, and particle filtering handles cloud and fog interference in optoelectronic images.
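As a minimal illustration of the smoothing idea (the text specifies an EKF; the linear constant-velocity, 1-D case below is a simplified stand-in, and the noise parameters are assumptions):

```python
def kalman_smooth_1d(measurements, dt=0.1, q=0.01, r=1.0):
    """Constant-velocity Kalman filter over noisy 1-D positions.
    State is [pos, vel]; q = process noise, r = measurement noise.
    The EKF used in the system extends this to nonlinear motion and
    measurement models; this sketch shows only the linear core."""
    x = [measurements[0], 0.0]                 # state estimate [pos, vel]
    P = [[1.0, 0.0], [0.0, 1.0]]               # state covariance
    out = []
    for z in measurements:
        # Predict: x <- F x, P <- F P F^T + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Update with scalar position measurement (H = [1, 0])
        s = P[0][0] + r
        k = [P[0][0] / s, P[1][0] / s]
        y = z - x[0]
        x = [x[0] + k[0] * y, x[1] + k[1] * y]
        P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
             [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
        out.append(x[0])
    return out
```

In the real pipeline each coordinate of a radar plot would be filtered jointly, and the same filter supplies the state estimates used later for track smoothing.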
Outlier processing: data beyond reasonable ranges are eliminated, such as radar plots with speed > 100 m/s or outliers in optoelectronic images.
Data normalization: data are normalized to uniform units, e.g., positions in the WGS84 coordinate system and speeds in meters per second.
Spatio-temporal alignment: data are mapped to a unified coordinate system and time axis based on sensor layout positions (longitude, latitude, altitude), fields of view, and sampling delays. For example, radar plots (10 Hz) are timestamp-aligned with optoelectronic frames (2 FPS) through linear interpolation, ensuring fusion accuracy.
Target fusion: nearest-neighbor, gating, joint probabilistic data association (JPDA), and multiple hypothesis tracking (MHT) algorithms integrate multi-sensor observations into single target entities. For example, radar track positions and speeds are associated with optoelectronic bounding boxes via JPDA to generate a unique target ID, and the MHT algorithm handles target loss or overlap in complex scenes to ensure track continuity.
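The simplest of the listed association schemes, gated nearest-neighbor, can be sketched as follows (JPDA and MHT replace the greedy choice with probabilistic weighting and hypothesis trees; the gate size and coordinates are illustrative assumptions):

```python
import math

def associate(radar_plots, optical_targets, gate_m=50.0):
    """Gated nearest-neighbor association: pair each radar plot with the
    closest unclaimed optical target within gate_m. Positions are (x, y)
    in a local metric frame. Returns (radar_idx, optical_idx) pairs."""
    pairs, used = [], set()
    for i, rp in enumerate(radar_plots):
        best, best_d = None, gate_m
        for j, ot in enumerate(optical_targets):
            if j in used:
                continue
            d = math.dist(rp, ot)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            pairs.append((i, best))
            used.add(best)
    return pairs

print(associate([(0, 0), (500, 500)], [(10, 5), (480, 510)]))  # [(0, 0), (1, 1)]
```

The gate rejects implausible pairings outright; JPDA then softens the remaining hard assignments into association probabilities, which is what lets the system keep tracks through ambiguous, crowded scenes.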
Target recognition: a deep learning model (such as a YOLO-family detector, e.g., YOLOv8, or EfficientDet) processes optoelectronic video frames and outputs the target type (such as rotor UAV, helicopter, or small fixed-wing), bounding-box coordinates, and a confidence score. Radar RCS features, Doppler frequency, and radio-frequency fingerprints are combined to complete multi-modal joint identification, generating comprehensive attributes including target type, speed, heading, confidence, and the list of source sensors.
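A composite target record of the kind described might be assembled as follows (the field names and the simple merge are illustrative assumptions, not the patent's actual fusion logic):

```python
def fuse_attributes(vision_det, radar_meas, rf_fingerprint=None):
    """Merge per-sensor attributes into one composite target record,
    keeping the list of contributing sensors for later traceability."""
    sources = [vision_det["sensor"], radar_meas["sensor"]]
    if rf_fingerprint is not None:
        sources.append(rf_fingerprint["sensor"])
    return {
        "type": vision_det["type"],        # class label from the detector
        "bbox": vision_det["bbox"],
        "confidence": vision_det["confidence"],
        "speed_mps": radar_meas["speed_mps"],
        "rcs": radar_meas["rcs"],
        "sources": sources,
    }

tgt = fuse_attributes(
    {"sensor": "CAMERA_002", "type": "rotor_uav",
     "bbox": [100, 150, 200, 250], "confidence": 0.93},
    {"sensor": "RADAR_001", "speed_mps": 12.0, "rcs": 0.85},
)
```

The retained source list is what later feeds the source event chain used for alarm tracing and audit.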
Data storage: fusion results are written in real time to a Redis cache (key format target:<id>) for second-level situation rendering, and synchronously persisted to InfluxDB or TimescaleDB (Measurement: tracks) to support historical track queries.
Model optimization: the YOLO model is accelerated with TensorRT or ONNX Runtime using INT8 quantization, keeping inference time within 50 ms, with support for batched parallel inference (processing multiple frames simultaneously). The model is incrementally trained at regular intervals on manually labeled historical detection data and hot-updated through a CI/CD pipeline, improving recognition precision and generalization.
The module runs on a Kubernetes cluster, deploying multiple Docker container instances that subscribe to the raw_sensor_data Topic. The preprocessing service estimates the target state (position and speed) and smooths trajectories with a Kalman filter, for example filtering random jitter in radar plots. Spatio-temporal alignment is based on sensor metadata (position and field of view), aligning timestamps by linear interpolation to ensure data consistency. The fusion process uses JPDA to compute association probabilities between radar, optoelectronic, and radio-frequency data; for example, a radar plot (longitude/latitude 31.23/121.47, altitude 120 m) is associated with an optoelectronic target (box [100, 150, 200, 250]), generating a target ID (e.g., TGT-000341). The MHT algorithm maintains multiple hypothesis trajectories to handle occlusion or target loss in complex scenes.
The target recognition service loads the YOLOv model, processes 640×640-pixel photoelectric video frames, and outputs the target type (e.g. "rotary-wing UAV"), bounding-box coordinates and confidence (e.g. 0.93). The model is optimized through TensorRT with INT8 quantization, keeping inference time within 50 ms. The recognition result is fused with radar RCS (e.g. 0.85), Doppler frequency and radio-frequency fingerprint to generate a comprehensive target state, stored in Redis and InfluxDB. For example, the target state contains position, speed, type, confidence and a source-sensor list (e.g. RADAR_001, CAMERA_002). Historical data is stored in MinIO and manually labeled to form a training dataset, and the model is updated through a CI/CD pipeline to improve recognition of novel UAVs. Through an efficient fusion algorithm and AI optimization, the module solves the problems of target loss, misjudgment and discontinuous tracks noted in the background art.
In an urban core-area scenario, the module processes multi-sensor data (e.g. 10 radars and 20 photoelectric cameras), fuses it into a real-time situation of hundreds of targets, and maintains track continuity with recognition accuracy above 95%, making it suitable for supervising unauthorized ("black flight") drones.
The rule-driven intelligent early-warning module, located in the processing and computing layer, is the core of behavior judgment and anomaly detection. It supports a configurable rule engine that matches target situations in real time and generates early-warning events, addressing weak early-warning intelligence and the lack of flexible rule configuration.
Rule definition: based on the Drools framework, rules are defined in a structured JSON format and stored in a PostgreSQL database, comprising:
Geofence rules: polygonal areas and altitude ranges expressed in GeoJSON or WKT, such as an airport clearance area (0-120 m).
Flight restriction rules: thresholds on altitude (>500 m), speed (>30 m/s) and dwell time (>10 s).
Lost-track and boundary-crossing rules: the target is not tracked for 10 consecutive seconds, or leaves the designated airspace.
Blacklist/whitelist rules: exemptions or forced alarms based on target identification code or radio-frequency fingerprint.
Behavior combination rules: e.g. entering a no-fly zone + overspeed + U-turn flight.
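A geofence rule of the kind listed above reduces to a point-in-polygon test plus an altitude-range check. The sketch below uses a ray-casting test; the rule and target field names are illustrative, not the patented JSON schema.

```python
def point_in_polygon(lon, lat, ring):
    """Ray-casting test; ring is a list of (lon, lat) vertices."""
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge straddles the horizontal ray
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def geofence_hit(target, rule):
    """True when the target is inside the rule's polygon and altitude band."""
    lo, hi = rule["alt_range_m"]
    return (point_in_polygon(target["lon"], target["lat"], rule["polygon"])
            and lo <= target["alt_m"] <= hi)
```

In production this check would typically be delegated to PostGIS (`ST_Contains`), but the logic the rule expresses is the same.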
Rule management: version control (historical versions retained), hot loading (effective within seconds), timed start/stop, and dynamic parameter replacement (e.g. adjusting speed thresholds) are supported.
Real-time processing: the engine subscribes to the tracked_targets Topic of Kafka in real time and performs rule matching on each target entity. On a hit, an early-warning event is generated, including the target ID, rule ID, trigger time, position and alarm level, and written to the warning_events Topic.
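The per-target matching loop can be sketched as below. The condition keys (`min_speed_mps`, `max_alt_m`, `blacklist`) are illustrative stand-ins for the JSON rule schema, not Drools itself; the emitted event carries the fields named above.

```python
import time

def match_rules(target, rules, now=None):
    """Evaluate one target snapshot against simple threshold rules and emit
    early-warning events shaped like those written to warning_events.
    Condition keys are illustrative assumptions."""
    events = []
    for rule in rules:
        cond = rule["conditions"]
        hit = True
        if "min_speed_mps" in cond:
            hit = hit and target["speed_mps"] > cond["min_speed_mps"]
        if "max_alt_m" in cond:
            hit = hit and target["alt_m"] > cond["max_alt_m"]
        if "blacklist" in cond:
            hit = hit and target.get("rf_fingerprint") in cond["blacklist"]
        if hit:
            events.append({
                "target_id": target["id"],
                "rule_id": rule["id"],
                "triggered_at": now or time.time(),
                "position": (target["lat"], target["lon"], target["alt_m"]),
                "level": rule["level"],
            })
    return events
```

A Kafka consumer would call this for every message on tracked_targets and produce the resulting events to warning_events.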
AI rule generation: a fine-tuned language model (such as ChatGLM) is integrated, allowing a user to generate a rule template from a natural-language description; for example, the input "a drone enters a sensitive area and overspeeds" is automatically converted into Drools syntax and stored as a JSON rule.
The module runs in a Docker container, with rules stored in a JSONB field of PostgreSQL; example rules include region coordinates, trigger conditions and alarm levels. The engine subscribes to the tracked_targets Topic and matches rules target by target. For example, a target (ID: TGT-000342) enters a no-fly zone (GeoJSON definition) at a speed of 35 m/s and its radio-frequency fingerprint hits the blacklist, triggering a GEO_FENCE + blacklist rule; an early-warning event is generated, comprising the target ID, rule ID, trigger time, position and danger level, and written to the warning_events Topic.
Rule configuration is realized through a Vue.js front-end interface: the user can draw a fence and set thresholds (e.g. speed >30 m/s), and the rules are automatically saved and activated. The AI rule generation service parses natural-language input into Drools syntax, e.g. converting "a drone entering the airport area overspeeds" into the condition speed >30 & geo_fence = air. The engine supports second-level response, dynamically adjusts strategies and reduces the false-alarm rate through rule hit-rate analysis. Through flexible rule management and efficient stream processing, the module solves the problems of static rules and insufficient recognition of complex behaviors noted in the background art.
In an urban supervision scenario, the module processes situation data for hundreds of targets in real time, matches complex rules and triggers high-risk alarms, with response time <1 second and a false-alarm rate kept below 5%.
The event enrichment and multichannel alarm distribution module, located in the processing and computing layer, performs context enhancement on the early-warning events generated by the rule engine to form structured alarm objects; through multichannel distribution it solves the problems of incomplete alarm information and insufficient linkage capability noted in the background art.
Event enrichment: the module monitors the warning_events Topic of Kafka and retrieves the target's historical track (InfluxDB), image/video clips (MinIO), rule details (PostgreSQL) and classification labels, generating a structured alarm object comprising the target ID, position, time, alarm level, screenshot, video link and rule summary.
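The enrichment step amounts to joining the raw event with the three stores. In the sketch below, plain dictionaries stand in for the InfluxDB, MinIO and PostgreSQL lookups; the store interfaces and field names are assumptions.

```python
def enrich_event(event, track_store, media_store, rule_store):
    """Turn a raw warning event into the structured alarm object described
    above. The *_store arguments stand in for InfluxDB, MinIO and
    PostgreSQL queries; their interfaces here are assumptions."""
    rule = rule_store[event["rule_id"]]
    media = media_store.get(event["target_id"], {})
    return {
        "target_id": event["target_id"],
        "position": event["position"],
        "time": event["triggered_at"],
        "level": event["level"],
        "track_30s": track_store.get(event["target_id"], []),  # recent track
        "snapshot_url": media.get("image"),                    # MinIO screenshot
        "video_url": media.get("clip"),                        # MinIO 5 s clip
        "rule_summary": rule["description"],
    }
```

The resulting object is what gets distributed over WebSocket, REST, SMS and voice channels.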
Multichannel distribution: in this embodiment, the following push modes are supported:
WebSocket/Server-Sent Events (SSE): push to the large screen and Web console.
RESTful API/gRPC: interfaces with the airspace-safety and emergency platforms.
SMS/mail gateway: notifies the on-duty operator.
Voice broadcast: interfaces with the on-site public-address system.
Alarm classification: alarms are graded as attention, warning or danger, each level with its own push strategy.
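The level-dependent push strategy can be expressed as a simple routing table. The channel sets per level below are illustrative defaults, not the patented configuration.

```python
# Push strategy per alarm level; channel sets are illustrative assumptions.
ROUTES = {
    "attention": {"websocket"},
    "warning": {"websocket", "rest_api", "sms"},
    "danger": {"websocket", "rest_api", "sms", "email", "voice"},
}

def channels_for(level):
    """Return the distribution channels configured for an alarm level;
    unknown levels fall back to the on-screen channel only."""
    return ROUTES.get(level, {"websocket"})
```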
Semantic summary generation: an integrated large model (such as BLOOM) generates a semantic text description of the alarm, e.g. "a rotary-wing drone has entered a no-fly zone and exceeded the speed limit".
Alarm events are stored in PostgreSQL with fields such as location, screenshot, rule ID, processing state and handler; primary-key indexes are established against tracks and videos to support later tracing.
The module runs in a Docker container and subscribes to the warning_events Topic. It enriches early-warning events: for example, when a target (ID: TGT-000344) triggers a "no-fly zone + blacklist" rule, InfluxDB is queried for the latest 30-second track (longitude, latitude, altitude, speed), MinIO is searched for an image screenshot and a 5-second video clip, and rule details (ID, description, trigger fields) are loaded from PostgreSQL, generating an alarm object comprising the target ID, position, time, level (danger) and evidence links. The large model generates a semantic summary, e.g. "a rotary-wing UAV (ID: TGT-000344) entered the airport clearance area at 2025-08-18 14:30 at a speed of 35 m/s, triggering a high-risk alarm".
The alarm is pushed through WebSocket to a CesiumJS three-dimensional large screen, which displays the target position and an alarm frame (red flashing + audio prompt); it is synchronized to the emergency platform through a REST API; the on-duty operator is notified by SMS/mail containing the position, type and summary; and the voice broadcast module drives the on-site public-address system to announce the drone's intrusion into the restricted zone. The alarm event is stored in PostgreSQL, associated with track and evidence indexes, supporting judicial evidence collection. Through multi-modal evidence and multichannel distribution, the module improves the completeness and linkage efficiency of alarms.
In a major-event security scenario, the module pushes hundreds of alarms in real time with response time below 1 second, supports multiparty collaboration (e.g. public security and air traffic control), and maintains a complete evidence chain that meets post-event audit requirements.
The microservice architecture and event-driven mechanism design module, located in the processing and computing layer, defines the core architecture of the system. It adopts an event-driven, microservice-decoupled design, realizing module autonomy, distributed deployment and elastic scaling through asynchronous Kafka communication, and solves the problems of poor scalability and insufficient concurrency of the centralized architecture noted in the background art.
The microservices are divided into independent service nodes: a data access service for collecting and converting sensor data; a stream processing service for fusion, recognition and early-warning judgment; a rule service for maintaining the rule set and executing it in real time; an alarm service for enrichment and distribution; a situation service for maintaining the three-dimensional airspace target distribution; and a UI service and API gateway supporting user access and inter-system calls.
Event-driven communication: asynchronous communication is realized through Kafka message queues; Topics include raw_sensor_data, tracked_targets and warning_events, partitioned by sensor type or region.
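Region-based partitioning can be made deterministic with a stable hash of the region key, so the same region always lands on the same partition and per-region ordering is preserved. The partition count and key choice below are illustrative; Kafka clients apply an equivalent keyed-partitioner internally.

```python
import zlib

def partition_for(region, num_partitions=4):
    """Map a region name deterministically to a Kafka partition index.
    zlib.crc32 is used instead of Python's salted hash() so the mapping
    is stable across processes and restarts."""
    return zlib.crc32(region.encode("utf-8")) % num_partitions
```

Producing with `key=region` achieves the same effect with a standard Kafka client; the point is that one region's target stream is processed in order by one consumer.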
Based on Docker containers and a Kubernetes cluster, the module supports automatic scaling, load balancing and service discovery. The central node runs the core services (such as the rule engine and alarm service), regional substations process local perception data, and the two cooperate over a VPN.
State sharing: Redis stores real-time state, including online target states, sensor health status and the active alarm list.
Fault tolerance and scaling: a cold/hot backup mechanism is supported, high availability is ensured through active/standby switchover, and hot updates and fast rollback of modules are realized through the CI/CD pipeline.
Each service runs in a Docker container, managed by Kubernetes, which supports dynamic scaling. For example, the stream processing service automatically adds instances bound to tracked_targets Topic partitions as the number of targets grows. Kafka Topics are partitioned by region: for example, target data from the northern urban area is written to partition 0 of tracked_targets and the eastern park to partition 1, ensuring parallel processing. Redis stores real-time state (e.g. target position, speed, type) for quick access by the situation and UI services. The central node deploys PostgreSQL, InfluxDB and MinIO; regional substations deploy local Kafka and Redis, with data synchronized over the VPN. The CI/CD pipeline pushes updates through the container image registry, e.g. the rule service hot-loads new rule versions. Through event-driven and microservice design, the module improves the stability, scalability and high-concurrency processing capability of the system.
In a cross-regional deployment, the system supports 10 regional substations and hundreds of sensors, handles high-concurrency data streams (tens of thousands of plots per second), maintains stability, and completes scale-out in <1 minute.
The data visualization and history tracing module, located in the application support layer, provides the user interaction interface and supervision aids, supporting three-dimensional situation display, alarm management, rule configuration, track playback and statistical analysis; it solves the problems of insufficient visual interaction and weak historical data management noted in the background art.
Three-dimensional situation display: a Vue.js or React front end with the CesiumJS three-dimensional engine displays target position, track, altitude, type and alarm state in real time. Map layer switching (satellite, terrain), eagle-eye overview, target filtering and detail viewing (track, sensor sources, alarm records) are supported.
Alarm management: the real-time alarm list is updated through WebSocket/SSE, supporting detail viewing, state changes (processed/unprocessed), responsible-person assignment and comment entry.
Rule configuration: an interactive interface for drawing geofences (GeoJSON), setting thresholds (speed, altitude) and selecting notification modes.
History tracing: a target ID and time period can be selected, tracks are played back on the three-dimensional situation map, and hotspot areas are analyzed with heat maps.
Statistical analysis: rule hit frequencies, alarm type distributions and hotspot heat maps are generated and stored in PostgreSQL.
Data storage: alarm events, rule definitions and operation logs are stored in PostgreSQL + PostGIS; tracks are stored in InfluxDB/TimescaleDB; images/videos are stored in MinIO.
The module interacts with the back end through the API gateway. The front end uses Vue.js and CesiumJS to render the three-dimensional situation map, subscribing to target states in Redis and displaying target position (longitude, latitude, altitude), track and alarm state (red high-risk marker). The alarm list panel receives the warning_events Topic in real time, displaying the target ID, trigger time, level and rule ID; users can change states and assign responsible persons, with operation logs stored in PostgreSQL. The rule configuration interface supports drawing polygons (WKT format) and setting speed thresholds (e.g. 30 m/s), saved to PostgreSQL through the REST API. The track playback function queries InfluxDB for a target ID and time period (e.g. 2025-08-18 14:00-14:30) and reproduces the track in the three-dimensional interface. Statistical analysis generates heat maps showing no-fly-zone alarm hotspots, used to optimize rule configuration. Data storage adopts cold/hot separation: Redis holds real-time data, PostgreSQL holds structured data, and MinIO holds unstructured evidence. Through the interactive interface and closed-loop management, the module improves user experience and supervision efficiency.
In urban low-altitude traffic management, the module lets a command-center large screen display the situation of hundreds of targets in real time, handles thousand-level alarm volumes and plays back historical tracks, meeting supervision and audit requirements.
The AI enhancement module, an additional component of the processing and computing layer, integrates large models for multi-modal target recognition, behavior analysis, rule generation, alarm summary generation and regional collaboration, raising the system's intelligence level and addressing the insufficient behavior modeling and dynamic adaptability noted in the background art.
Multi-modal target recognition: a self-trained large aircraft-recognition model accepts joint input of radar, photoelectric and infrared images and is deployed through an ONNX-optimized inference engine, improving robustness on small and rare targets.
Behavior analysis: a Transformer architecture models the time series of target tracks to identify high-risk behaviors (such as long-dwell hovering or rapid crossing of a restricted zone) and generates semantic labels for the rule engine.
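As a rule-based stand-in for the Transformer behavior model (the real system learns such patterns from trajectory sequences), "long-dwell hovering" can be approximated by a bounded-displacement window over recent track points. The thresholds and the local metric frame are illustrative assumptions.

```python
import math

def detect_loitering(track, window=30, radius_m=50.0):
    """Flag long-dwell hovering when the target stays within radius_m of the
    window's start point for `window` consecutive samples. A simplified
    stand-in for the Transformer behavior model described above; track
    points are (x_m, y_m) in a local metric frame, thresholds illustrative."""
    if len(track) < window:
        return False
    recent = track[-window:]
    x0, y0 = recent[0]
    return all(math.hypot(x - x0, y - y0) <= radius_m for x, y in recent)
```

A hit would emit a semantic label such as "high_risk_loitering" for the rule engine, mirroring the label flow described in this module.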
Rule generation: a fine-tuned language model (e.g. ChatGLM) supports natural-language rule templates, e.g. converting the input "a drone entering the airport overspeeds" into Drools syntax.
Alarm summary generation: a large model produces a semantic text description of the alarm, e.g. "a drone has entered a no-fly zone and exceeded the speed limit".
Regional collaboration: a regional-collaboration large model is deployed to reason about threats from cross-regional data and optimize resource scheduling.
The module runs in a Docker container; the target recognition model is deployed through ONNX Runtime and processes multi-modal input (photoelectric images, infrared images and radar plots) with recognition accuracy reaching 95%. The behavior analysis model is a Transformer processing track sequences from InfluxDB; it identifies "long-dwell hovering" behavior and generates labels (e.g. "high_risk_loitering") as input to the rule engine. The rule generation model parses natural language into JSON rules stored in PostgreSQL. The alarm summary model combines alarm fields into a description pushed by SMS/mail. The regional collaboration model runs on the central node, analyzing multi-region data and optimizing the alarm distribution strategy. Enhanced by large models, the module raises the intelligence of complex behavior recognition and rule configuration.
In a border patrol scenario, the module identifies rare targets (such as small fixed-wing aircraft), analyzes high-risk behaviors and generates dynamic rules, raising alarm accuracy above 90%.
The data storage and management module covers storage at every layer, adopting cold/hot separation and index-optimization strategies to support high-concurrency access and history tracing.
Kafka: stores raw perception data (raw_radar_data, etc.), partitioned by sensor type.
Redis: stores real-time target states, sensor health status and the active alarm list.
PostgreSQL + PostGIS: stores rule definitions, alarm events, user management and operation logs; supports spatial data retrieval.
InfluxDB/TimescaleDB: stores historical tracks and alarm trends; supports efficient queries.
MinIO/Ceph: stores image/video evidence, named by object ID + timestamp, with an appended metadata index.
Kafka Topics store raw data partitioned by region with 7-day retention; Redis stores real-time state with a TTL of 1 hour; PostgreSQL stores structured data, with PostGIS supporting fence queries; InfluxDB stores track data optimized for time-range queries; MinIO stores evidence with filenames of the form target_<id>_<timestamp>.jpg. This design ensures efficient access and traceability of data.
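The target_<id>_<timestamp>.jpg naming convention can be generated as below. The exact timestamp format is an assumption (the source leaves it unspecified); a compact UTC form keeps names sortable.

```python
from datetime import datetime, timezone

def evidence_object_name(target_id, ts, ext="jpg"):
    """Build the target_<id>_<timestamp>.<ext> object name used for MinIO
    evidence storage. The timestamp format (compact UTC) is an assumption."""
    stamp = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return f"target_{target_id}_{stamp}.{ext}"
```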
In a judicial evidence-collection scenario, the system rapidly retrieves alarm events, tracks and video evidence with response time below 1 second, meeting compliance requirements.
Referring to fig. 2, this embodiment provides the core module interaction flow of the monitoring system, covering acquisition and preprocessing, multi-source fusion and recognition, the rule engine, alarm generation and linkage, and AI enhancement. Data flows between modules in the form of events, mainly through Kafka message queues (shown as cylinders) for asynchronous communication, realizing decoupling between modules.
Perception data (raw_sensor_data) first enters the system; fusion and recognition generate unified target situation data (tracked_targets); the rule engine matches the target situation in real time and triggers warning events (warning_events). The alarm service enriches the early-warning events and distributes alarms, while real-time state data is held in the Redis cache for quick access by each module. PostgreSQL and a time-series database persist rules, alarm logs and historical track data. The object store (MinIO/Ceph) stores image/video evidence.
The final alarm information can be pushed to the front-end UI and to external linkage platforms (e.g. SMS, mail, emergency platform). In this embodiment, the modules communicate asynchronously through message queues, improving throughput, scalability and fault tolerance; each module runs and is deployed independently, facilitating development, maintenance and upgrades; a complete processing chain from data acquisition to alarm handling is formed, and historical data tracing is supported.
Example 2
Based on embodiment 1, this embodiment describes the specific working principle of the AI enhancement module of the monitoring system.
Referring to fig. 3, the main operation principles of the AI enhancement module include:
1. Video stream access and preliminary processing: the photoelectric sensor transmits the video stream in real time; after reception, the system performs frame extraction and image preprocessing to prepare standardized image data for the AI model.
2. AI inference core: the preprocessed image is input into an AI inference service container, in which a YOLOv or larger AI model is preloaded; accelerated inference is performed through an optimization engine such as TensorRT/ONNX Runtime, and the detection result for each target (type, position, confidence) is output.
3. Post-processing and result output: AI inference results are post-processed, e.g. by non-maximum suppression (NMS), to remove duplicate and low-confidence targets, then packaged as structured data and published to the video_detection_results topic of Kafka.
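Step 3's duplicate removal is standard greedy NMS. A minimal sketch over (box, confidence, label) tuples follows; the thresholds are illustrative.

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(detections, iou_thresh=0.5, conf_thresh=0.25):
    """Greedy non-maximum suppression over (box, confidence, label) tuples:
    drop low-confidence detections, then repeatedly keep the best remaining
    box and discard boxes overlapping it too much. Thresholds illustrative."""
    kept = []
    pool = sorted((d for d in detections if d[1] >= conf_thresh),
                  key=lambda d: d[1], reverse=True)
    while pool:
        best = pool.pop(0)
        kept.append(best)
        pool = [d for d in pool if iou(best[0], d[0]) < iou_thresh]
    return kept
```

The surviving detections are then serialized and published to video_detection_results as described.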
4. System integration: the main system's multi-source fusion service subscribes to the AI recognition results and fuses them with other sensor data such as radar and radio frequency to form unified target situation data (tracked_targets) for subsequent modules.
5. Model iteration and deployment: the system also includes a closed loop that continuously optimizes the AI model. Historical detection data is manually labeled for model training, and the new model is hot-updated into the inference service through an automated pipeline, continuously improving recognition accuracy.
Example 3
Based on embodiment 1, this embodiment describes the specific working principle of the rule-driven intelligent early-warning module of the monitoring system.
Referring to fig. 4, the system identifies abnormal behavior of low-altitude flying targets in real time and triggers early warnings through configured rules. On the left of the figure is the tracked_targets topic of the Kafka message queue, which carries the real-time target situation data produced by multi-source fusion and recognition and forms the basis for the rule engine's judgments.
In the middle of the figure is the rule engine service, comprising a rule execution engine responsible for actually executing rule logic (e.g. based on the Drools framework) and a rule matcher that matches rule conditions against input target data; rules are loaded from the database and support dynamic update (hot loading).
On the right of the figure are the sources and management of rules: a PostgreSQL database storing the definitions of all rules, usually in structured JSON; a rule configuration interface for users (administrators) to create, edit and manage rules; and an AI rule template generation service that helps users generate rule templates from natural language through an AI large model.
When the rule engine detects that the target meets a certain early warning condition, the module generates an early warning event and issues the early warning event to the warning_events theme of Kafka for the subsequent alarm processing module to use.
Example 4
The embodiment proposes a deployment and cooperation scheme of a monitoring system based on embodiment 1.
Referring to fig. 5, the deployment and collaboration scheme comprises a central node, regional substation/edge nodes, and various types of sensors.
The central node is located at a core position and is responsible for global management, core service logic and data aggregation. The central node comprises a Kubernetes cluster, deploys core business services (such as rule engine and alarm service) and central data services (such as PostgreSQL, influxDB, minIO/Ceph), and exchanges data through a Kafka message queue.
The regional substations/edge nodes are distributed in different geographical areas and are responsible for accessing sensor data nearby and performing preliminary processing. Each regional node also comprises a Kubernetes cluster, is provided with a perception access service and a processing computing service, and is provided with local Kafka and Redis caches so as to realize localization processing and real-time response of data.
The sensors of various types are deployed in the actual environment and are sources of data, and raw perception data are transmitted to the nearest regional substation.
At the top of FIG. 5 are a CI/CD pipeline and a container image registry, responsible for automated building, deployment and updating of system modules.
The system can finally perform data sharing and alarm linkage with an external linkage platform (such as public security and air traffic control) and is accessed by a user through a front-end interface.
Example 5
Referring to fig. 6, based on embodiment 1, this embodiment proposes a terminal device of a low-altitude monitoring system based on multi-source heterogeneous awareness fusion and rule engine driving, where the terminal device 200 includes at least one memory 210, at least one processor 220, and a bus 230 connecting different platform systems.
Memory 210 may include readable media in the form of volatile memory, such as RAM 211 and/or cache memory 212, and may further include ROM 213.
The memory 210 further stores a computer program, and the computer program may be executed by the processor 220, so that the processor 220 executes any application of the low-altitude monitoring system based on multi-source heterogeneous awareness fusion and rule engine driving in the embodiment of the present application, and a specific implementation manner of the application is consistent with an implementation manner and an achieved technical effect described in the embodiment of the application, and some contents are not repeated. Memory 210 may also include a program/utility 214 having a set (at least one) of program modules 215 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Accordingly, the processor 220 may execute the computer programs described above, as well as the program/utility 214.
Bus 230 may represent one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor bus, or a local bus using any of a variety of bus architectures.
Terminal device 200 can also communicate with one or more external devices 240, such as a keyboard, pointing device, bluetooth device, etc., as well as one or more devices capable of interacting with the terminal device 200, and/or with any device (e.g., router, modem, etc.) that enables the terminal device 200 to communicate with one or more other computing devices. Such communication may occur through the I/O interface 250. Also, terminal device 200 can communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through network adapter 260. Network adapter 260 may communicate with other modules of terminal device 200 via bus 230. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with terminal device 200, including, but not limited to, microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, among others.
Example 6
The embodiment proposes a readable storage medium based on a multi-source heterogeneous sensing fusion and rule engine driven low-altitude monitoring system, and an instruction is stored on the computer readable storage medium, and when the instruction is executed by a processor, any one of the low-altitude monitoring systems based on the multi-source heterogeneous sensing fusion and rule engine driven low-altitude monitoring system is realized, and a specific implementation manner of the low-altitude monitoring system is consistent with an implementation manner and an achieved technical effect recorded in the embodiment of the application, and a part of contents are not repeated.
Fig. 7 shows a program product 300 provided by the present embodiment for implementing the above application, which may employ a portable compact disc read-only memory (CD-ROM) and comprise program code, and may be run on a terminal device, such as a personal computer. However, the program product 300 of the present invention is not limited thereto, and in the present embodiment, the readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. Program product 300 may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of a readable storage medium include an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable storage medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable storage medium may also be any readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
The foregoing has shown and described the basic principles, main features, and advantages of the present invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above; the above embodiments and descriptions merely illustrate the principles of the present invention, and various changes and modifications may be made without departing from the spirit and scope of the invention. The scope of the invention is defined by the appended claims and their equivalents.
Claims (8)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202511240561.3A CN120746510A (en) | 2025-09-02 | 2025-09-02 | Low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN120746510A true CN120746510A (en) | 2025-10-03 |
Family
ID=97180532
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202511240561.3A Pending CN120746510A (en) | 2025-09-02 | 2025-09-02 | Low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN120746510A (en) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN121000334A (en) * | 2025-10-24 | 2025-11-21 | 北京融合汇控科技有限公司 | A method for driving away drones in urban environments based on low-power interference optimization |
| CN121048592A (en) * | 2025-10-29 | 2025-12-02 | 中国船舶集团有限公司第七〇七研究所 | A dual-decoupled architecture vehicle-mounted mapping system, method, and vehicle-mounted computer |
| CN121193367A (en) * | 2025-11-24 | 2025-12-23 | 杰能科世智能安全科技(杭州)有限公司 | Unmanned aerial vehicle detection countering method, system, equipment and medium |
| CN121048592B (en) * | 2025-10-29 | 2026-02-06 | 中国船舶集团有限公司第七〇七研究所 | Vehicle-mounted mapping system and method with double decoupling structures and vehicle-mounted computer |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107578646A (en) * | 2017-08-28 | 2018-01-12 | 梁晓龙 | Low slow small target detection monitoring management system and method |
| CN113486351A (en) * | 2020-06-15 | 2021-10-08 | 中国民用航空局空中交通管理局 | Civil aviation air traffic control network safety detection early warning platform |
| CN113626616A (en) * | 2021-08-25 | 2021-11-09 | 中国电子科技集团公司第三十六研究所 | Aircraft safety early warning method, device and system |
| CN118645019A (en) * | 2024-08-14 | 2024-09-13 | 威海市华美航空科技股份有限公司 | Intelligent UAV data processing system |
| CN120319068A (en) * | 2025-04-10 | 2025-07-15 | 中山大学 | A method, system, device and medium for multi-modal collaborative control of low-altitude traffic |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN109447048B (en) | Artificial intelligence early warning system | |
| CN202549080U (en) | Fusion system of radar data, flight plan data and ADS-B data | |
| Lv et al. | LiDAR-enhanced connected infrastructures sensing and broadcasting high-resolution traffic information serving smart cities | |
| CN120746510A (en) | Low-altitude monitoring system based on multi-source heterogeneous perception fusion and rule engine driving | |
| EP4198944A1 (en) | Roadside sensing system and traffic control method | |
| Jiang et al. | Ultra large-scale crowd monitoring system architecture and design issues | |
| US20180090012A1 (en) | Methods and systems for unmanned aircraft systems (uas) traffic management | |
| CN102637040A (en) | Unmanned aerial vehicle cluster visual navigation task coordination method and system | |
| CN112634663A (en) | General aviation flight plan and monitoring target association system and method | |
| CN105139606B (en) | A low-altitude aircraft information exchange system | |
| CN118824068B (en) | Aircraft collision prediction method, device, equipment, medium and program product | |
| CN109523836A (en) | A kind of unmanned plane aviation management access General Platform | |
| Zheng et al. | Intelligent airport collaborative decision making (A-CDM) system | |
| Ding et al. | Edge-to-cloud intelligent vehicle-infrastructure based on 5G time-sensitive network integration | |
| Blasch et al. | Uncertainty ontology for veracity and relevance | |
| Fu et al. | AI-Powered CPS-Enabled Urban Transportation Digital Twin: Methods and Applications | |
| Asahara et al. | International standard “OGC moving features” to address “4Vs” on locational BigData | |
| CN116484118A (en) | UAV flight environment data service system and method | |
| CN205961178U (en) | Low -altitude surveillance and management service system | |
| Deliparaschos et al. | A preliminary investigation of an autonomous vehicle validation infrastructure for smart cities | |
| Zhang et al. | Cooperative Safety Intelligence in V2X-Enabled Transportation: A Survey | |
| Fiorin De Carvalho | Integration of VRUs in Cooperative Perception: Extending CARLA and VaN3Twin | |
| CN121483084A (en) | AI-based multi-source heterogeneous traffic big data accident early warning intervention system | |
| CHEN et al. | Information Collection Technology in ITS | |
| Seabra | Plataforma de Gestão para Sistemas de Transportes Cooperativos |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |