CN112580702B - Multi-sensor collaborative sensing method and device
- Publication number
- CN112580702B (application CN202011434559.7A)
- Authority
- CN
- China
- Prior art keywords
- sensor
- information
- sensing information
- original sensing
- original
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/006—Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/02—Knowledge representation; Symbolic representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computing arrangements using knowledge-based models
- G06N5/04—Inference or reasoning models
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The invention provides a multi-sensor collaborative sensing method and device, wherein the method comprises the following steps: calculating the coverage effect of the sensor nodes according to an artificial bee colony algorithm; acquiring the original sensing information corresponding to each of a plurality of sensors according to the coverage effect of the sensor nodes; and fusing the original sensing information corresponding to the plurality of sensors to form target sensing information. Because the artificial bee colony algorithm is used as the optimization algorithm and its positive and negative feedback mechanisms are applied to calculate the coverage effect of the sensor nodes, fewer sensor nodes can be used to obtain more comprehensive original sensing information. The original sensing information corresponding to the sensors is then fused, which improves the accuracy and efficiency of the multi-sensor collaborative sensing method.
Description
Technical Field
The invention relates to the field of sensor perception, in particular to a multi-sensor collaborative perception method and device.
Background
Sensors are important components of intelligent machines and systems, functioning much like human sense organs: they perceive the state of the surrounding environment and provide the system with the information it needs. As working environments and tasks become increasingly complex, higher demands are placed on the performance of intelligent systems.
Multi-sensor data contain a large amount of uncertain information, and fusing the data of multiple sensors can improve the accuracy with which environmental characteristics are described. However, because of differences in information space, difficulties in data association and time synchronization, unsuitable data fusion methods, unreasonable data fusion structures and the like, the accuracy and efficiency of current multi-sensor collaborative sensing methods still need to be improved.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a multi-sensor collaborative sensing method that enables multiple sensors to sense cooperatively with accuracy and efficiency.
In order to solve this technical problem, the invention adopts the following technical scheme. The multi-sensor collaborative sensing method comprises the following steps:
calculating the coverage effect of the sensor nodes according to an artificial bee colony algorithm;
acquiring the original sensing information corresponding to each of a plurality of sensors according to the coverage effect of the sensor nodes;
and fusing the original sensing information corresponding to the plurality of sensors to form target sensing information.
Further, the step of fusing the original sensing information corresponding to each of the plurality of sensors includes: acquiring a sensor information fusion model, wherein the sensor information fusion model comprises at least two data fusion modes;
and selecting a data fusion mode to fuse the original sensing information according to the priority of the data fusion modes.
Specifically, the step of selecting a data fusion mode to fuse the original sensing information includes: determining the characteristic information of the original sensing information by applying the data fusion mode;
judging whether the original sensing information is suitable for the data fusion mode according to the characteristic information of the original sensing information and corresponding preset information;
and fusing the original sensing information according to the data fusion mode suitable for the sensing information.
It should be understood that the characteristic information of the original sensing information includes a specific value in the original sensing information and/or a specific image generated by fusing the original sensing information according to the data fusion mode.
Further, the step of calculating the coverage effect of the sensor node according to the artificial bee colony algorithm includes:
applying the positions, the network coverage rate, the deployment process and the deployment speed of the sensors according to the artificial bee colony algorithm to calculate the coverage effect of the sensor nodes.
Specifically, the step of fusing original sensing information corresponding to each of the plurality of sensors includes:
applying a neural network to fuse the original information of the heterogeneous sensors according to the type of the original sensing information;
the types of the original sensing information comprise one or more of spatial position information, ambient temperature information and safety information.
The heterogeneous sensors comprise a spatial position sensor, an ambient temperature sensor or a safety information sensor;
the spatial position sensor comprises a displacement sensor and an ultrasonic sensor;
the environment temperature sensor comprises a temperature sensor and a heat-sensitive sensor;
the safety information sensor comprises a collision sensor, a sound sensor, a vibration sensor and a flame sensor.
The application also provides a multi-sensor cooperative sensing device, which comprises:
the coverage module is used for calculating the coverage effect of the sensor nodes according to the artificial bee colony algorithm;
the acquisition module is used for acquiring original sensing information corresponding to the plurality of sensors respectively according to the coverage effect of the sensor nodes;
and the fusion module is used for fusing the original sensing information corresponding to the plurality of sensors respectively to form the target sensing information.
The application also provides a terminal comprising a processor, a memory and a display, wherein the processor is coupled with the memory and the display, and the memory stores a computer program executable on the processor; the processor executes the computer program to implement the method described above.
The present application also provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above method.
The invention has the following beneficial effects: the artificial bee colony algorithm is used as the optimization algorithm, and its positive and negative feedback mechanisms are applied to calculate the coverage effect of the sensor nodes, so that fewer sensor nodes can be used to obtain more comprehensive original sensing information; the original sensing information corresponding to the sensors is then fused, which improves the accuracy and efficiency of the multi-sensor collaborative sensing method.
Drawings
The specific construction of the present invention is described in detail below with reference to the accompanying drawings.
FIG. 1 is a flow chart of a multi-sensor collaborative awareness method according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating coverage effects of a sensor node according to a first embodiment of the present invention;
FIG. 3 is a schematic diagram of selecting the original sensing information fusion in a second embodiment of the present invention;
FIG. 4 is a flow chart of selecting the original sensing information fusion in a third embodiment of the present invention;
FIG. 5 is a schematic diagram of selecting the original sensing information fusion in a third embodiment of the present invention;
FIG. 6 is a schematic diagram of applying neural network training in a fourth embodiment of the present invention;
FIG. 7 is a schematic diagram of a knowledge base of sensor information fusion in a fifth embodiment of the invention.
Detailed Description
In order to describe the technical content, the constructional features, the achieved objects and effects of the present invention in detail, the following description is made in connection with the embodiments and the accompanying drawings.
Referring to fig. 1 and 2, fig. 1 is a flowchart of a multi-sensor collaborative sensing method according to a first embodiment of the present invention; fig. 2 is a schematic diagram illustrating the coverage effect of a sensor node according to the first embodiment of the present invention.
The application provides a multi-sensor collaborative sensing method, which comprises the following steps:
Step S100, calculating the coverage effect of the sensor nodes according to an artificial bee colony algorithm;
Step S200, acquiring the original sensing information corresponding to each of a plurality of sensors according to the coverage effect of the sensor nodes;
and Step S300, fusing the original sensing information corresponding to the plurality of sensors to form target sensing information.
In this method, the artificial bee colony algorithm is used as the optimization algorithm, and its positive and negative feedback mechanisms are applied to calculate the coverage effect of the sensor nodes. Thus, fewer sensor nodes can be used to obtain more comprehensive original sensing information. The original sensing information corresponding to the sensors is then fused, which improves the accuracy and efficiency of the multi-sensor collaborative sensing method.
The design of the artificial bee colony algorithm is derived from the foraging behavior of bees in nature. The bee foraging model has three components (employed bees, unemployed bees and food sources) and two basic behaviors (recruiting bees to collect nectar and abandoning a food source).
The unemployed bees include onlooker (observation) bees and scout bees. The food sources represent potential solutions to the optimization problem, and each food source typically corresponds to one employed bee. An employed bee refines its food source based on the current food-source information; the onlooker bees use the food-source information provided by the employed bees to choose to forage in its vicinity; and, according to the current richness of a food source, the bees may discard a poor-quality food source and search for a new one.
As an intelligent optimization algorithm, the artificial bee colony algorithm realizes its positive and negative feedback mechanisms through the self-organizing ability of the bees. The positive feedback mechanism: when the nectar quality of a food source is higher, the number of onlooker bees assigned to it increases, so the food source is developed in a better direction. The negative feedback mechanism: when the nectar quality of a food source has not improved for a long time, the employed and onlooker bees gradually reduce their optimization of it and eventually discard it and search for a new food source.
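For illustration only, the following Python sketch shows a generic artificial bee colony loop in which the fitness-proportional selection performed by the onlooker (observation) bees realises the positive feedback, while the trial counter and the scout phase realise the negative feedback. All function names, parameters and numeric defaults are assumptions made for this sketch and are not part of the claimed method; the fitness callable is left abstract and, in the coverage problem discussed below, would be the network coverage rate.

```python
import random

def abc_optimize(fitness, dim, bounds, n_sources=10, limit=20, iters=100):
    """Generic artificial bee colony loop (illustrative sketch only).

    fitness : callable mapping a candidate solution (list of floats) to a
              non-negative score to be maximised ("nectar quality").
    limit   : abandonment threshold, i.e. the negative feedback mechanism.
    """
    low, high = bounds
    sources = [[random.uniform(low, high) for _ in range(dim)] for _ in range(n_sources)]
    fits = [fitness(s) for s in sources]
    trials = [0] * n_sources

    def try_improve(i):
        # Employed/onlooker bees search in the vicinity of food source i.
        k, j = random.randrange(n_sources), random.randrange(dim)
        cand = sources[i][:]
        cand[j] += random.uniform(-1, 1) * (sources[i][j] - sources[k][j])
        cand[j] = min(max(cand[j], low), high)
        f = fitness(cand)
        if f > fits[i]:                          # greedy replacement
            sources[i], fits[i], trials[i] = cand, f, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_sources):               # employed bee phase
            try_improve(i)
        total = sum(fits) or 1e-12
        for _ in range(n_sources):               # onlooker phase: positive feedback,
            r, acc, pick = random.random() * total, 0.0, 0   # richer sources attract more bees
            for idx, f in enumerate(fits):
                acc += f
                if acc >= r:
                    pick = idx
                    break
            try_improve(pick)
        for i in range(n_sources):               # scout phase: negative feedback,
            if trials[i] > limit:                # stagnant food sources are abandoned
                sources[i] = [random.uniform(low, high) for _ in range(dim)]
                fits[i], trials[i] = fitness(sources[i]), 0

    best = max(range(n_sources), key=lambda i: fits[i])
    return sources[best], fits[best]
```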
Specifically, in this embodiment, in step S100, the step of calculating the coverage effect of the sensor node according to the artificial bee colony algorithm includes:
applying the positions, the network coverage rate, the deployment process and the deployment speed of the sensors according to the artificial bee colony algorithm to calculate the coverage effect of the sensor nodes.
The positions of the sensors correspond to the positions of the nectar sources, the network coverage rate of the sensors corresponds to the richness of a food source, the deployment process of the sensors corresponds to the nectar-collecting process, and the deployment speed of the sensors corresponds to the speed of searching for nectar sources and collecting nectar. The coverage optimization problem of the sensing network is thereby converted into the process of searching for the optimal food source in the artificial bee colony algorithm. The coverage optimization problem of a sensor network is to arrange a number of nodes capable of sensing and communicating in a space of known size so that the network coverage is maximized while fewer sensor nodes are used. The commonly used node sensing measures include two models, namely the binary measurement model and the probability measurement model.
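As a hedged illustration of the binary (disc) measurement model, the sketch below estimates the network coverage rate over a sampling grid; the area size, sensing radius and grid resolution are assumptions chosen only for the example. A function of this kind could serve as the fitness ("nectar quality") evaluated by an artificial bee colony optimizer such as the `abc_optimize` sketch above.

```python
import math

def coverage_rate(flat_positions, radius=10.0, area=(100.0, 100.0), grid=20):
    """Binary (disc) sensing model: fraction of grid points lying within the
    sensing radius of at least one node. flat_positions is a flat list
    [x1, y1, x2, y2, ...] so that it can serve directly as one food source."""
    nodes = [(flat_positions[i], flat_positions[i + 1])
             for i in range(0, len(flat_positions), 2)]
    width, height = area
    covered = 0
    for gx in range(grid):
        for gy in range(grid):
            px, py = (gx + 0.5) * width / grid, (gy + 0.5) * height / grid
            if any(math.hypot(px - x, py - y) <= radius for x, y in nodes):
                covered += 1
    return covered / (grid * grid)

# Example: coverage rate achieved by 8 nodes laid out on two rows.
nodes_xy = [12.5, 25.0, 37.5, 25.0, 62.5, 25.0, 87.5, 25.0,
            12.5, 75.0, 37.5, 75.0, 62.5, 75.0, 87.5, 75.0]
print(coverage_rate(nodes_xy))
```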
Step S200, acquiring the original sensing information corresponding to each of the plurality of sensors according to the coverage effect of the sensor nodes.
It should be understood that in practical applications a high-level task is decomposed into a plurality of sub-tasks according to the complexity of the original sensing information, and the sub-tasks may be further subdivided into smaller tasks, in a manner similar to a binary-tree classification.
It should also be appreciated that the development of a knowledge base system can be regarded as the process of converting human knowledge into a specific knowledge base, and research has focused on various formalized knowledge representation methods. Knowledge representation refers to the process by which knowledge is symbolized and passed to a computer. It has two layers of meaning: representing knowledge with a given knowledge structure according to certain principles of organization, and explaining the meaning of the represented knowledge. In form, a knowledge representation is a data structure used to organize the knowledge needed to solve a problem. The same knowledge may therefore have different representations, and different representations may produce different effects.
Knowledge representation is a core research field in AI, knowledge engineering and knowledge base systems, and is one of the keys to effectively representing, applying and managing knowledge in a knowledge base system. Many knowledge representation methods exist in AI, such as first-order predicate logic, semantic networks, frames, production rules, scripts and object-oriented representations. The reasoning mechanism is an important component of a knowledge system; at present, reasoning mechanisms in knowledge systems are mainly of three kinds:
1) Rule-based reasoning (RBR) mechanisms. Rule-based reasoning, i.e., reasoning based on the knowledge and experience of domain experts, abstracts that knowledge and experience into IF-THEN rules used in the reasoning process. Its advantages are that it is intuitive, the reasoning process is easy to understand, and the reasoning efficiency is high. However, the expert knowledge and experience needed for RBR are difficult to obtain, so the method is mainly applicable to small and medium-sized knowledge systems whose domain is not overly complex and whose expert knowledge is easy to collect.
2) Model-based reasoning (MBR) mechanisms. Model-based reasoning builds a mathematical model of the problem to be solved from the characteristics and principles of the system structure or its component elements, and then combines the mathematical model with the problem conditions to reason about and evaluate the system. The semantic-network representation of expert knowledge in expert systems uses this reasoning approach. Obviously, not every system is suitable for MBR; some complex systems cannot be described by a corresponding mathematical model at all, which limits the application of MBR.
3) Case-based reasoning (CBR) mechanisms. In view of the difficulties encountered by RBR and MBR in practice, particularly in complex systems, a new reasoning mechanism is needed. Case-based reasoning is an important knowledge-based problem-solving and learning method that has recently emerged in the field of artificial intelligence. Its distinctive reasoning style and successful applications have demonstrated its vitality and attracted wide interest in the international AI community.
The technical scheme of the invention uses quantitative analysis and an object-oriented reasoning design: a frame with strong flexibility and a wide range of application packages the various kinds of knowledge information into relatively independent units. When this model is used, multiple knowledge representations, such as inference methods and membership functions, are organically unified into the object through the individual knowledge units.
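Purely as an illustration of such a frame-style, object-oriented organisation, the sketch below packages IF-THEN rules and a membership function into one relatively independent knowledge unit. The class name, slots and the triangular membership function are assumptions made for the example, not the patent's concrete knowledge base.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

def triangular(a: float, b: float, c: float) -> Callable[[float], float]:
    """Simple triangular membership function, used here only for illustration."""
    def mu(x: float) -> float:
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

@dataclass
class KnowledgeUnit:
    """Frame-style unit: rules and membership functions packaged together."""
    name: str
    rules: List[Tuple[Callable[[Dict], bool], str]] = field(default_factory=list)
    membership: Dict[str, Callable[[float], float]] = field(default_factory=dict)

    def infer(self, facts: Dict) -> List[str]:
        # Rule-based reasoning: fire every IF-THEN rule whose condition holds.
        return [conclusion for condition, conclusion in self.rules if condition(facts)]

unit = KnowledgeUnit(
    name="ambient_temperature",
    rules=[(lambda f: f.get("temp_c", 0) > 60, "possible_fire"),
           (lambda f: f.get("temp_c", 0) < -10, "sensor_icing_risk")],
    membership={"hot": triangular(30.0, 60.0, 90.0)},
)
print(unit.infer({"temp_c": 72}), unit.membership["hot"](72))
```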
Based on this, referring to fig. 3, fig. 3 is a schematic diagram illustrating the selection of the original sensing information fusion according to a second embodiment of the present invention; step S300, the step of fusing the original sensing information corresponding to each of the plurality of sensors includes:
step S310, acquiring a sensor information fusion model; the sensor information fusion model comprises at least two data fusion modes.
In a specific embodiment, after the system receives a certain instruction as a significant sign, a cause-effect correspondence method is used to select the data fusion model. The sensor information fusion model comprises n data fusion modes, and these data fusion modes are called hypotheses during reasoning. This step belongs to qualitative analysis: the reasoning is mainly based on expert experience and knowledge and is usually given in the form of rules.
Step S320, selecting a data fusion mode to fuse the original sensing information according to the priority of the data fusion modes. This approach has the advantage of high reasoning speed, but its reasoning accuracy is limited.
Steps S310 to S320 above belong to reasoning based on qualitative knowledge, which is mainly performed according to expert experience and knowledge and is often expressed in the form of rules.
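A minimal sketch of this qualitative, priority-driven selection follows; the mode names, priority values and rules are assumptions for illustration rather than the patent's actual knowledge base.

```python
# Illustrative only: fusion modes with expert-assigned priorities and rules.
fusion_modes = [
    {"name": "weighted_average", "priority": 3, "applies": lambda info: info["numeric"]},
    {"name": "image_overlay",    "priority": 2, "applies": lambda info: info["has_image"]},
    {"name": "voting",           "priority": 1, "applies": lambda info: True},
]

def modes_by_priority(info):
    """Return candidate fusion modes in priority order (highest first),
    keeping only those whose qualitative rule fires for this input."""
    ordered = sorted(fusion_modes, key=lambda m: m["priority"], reverse=True)
    return [m["name"] for m in ordered if m["applies"](info)]

print(modes_by_priority({"numeric": True, "has_image": False}))
# -> ['weighted_average', 'voting']
```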
Referring to fig. 4 and 5, fig. 4 is a flowchart illustrating the selection of the original sensing information fusion according to a third embodiment of the present invention; fig. 5 is a schematic diagram of selecting the fusion of the original sensing information according to the third embodiment of the present invention. Step S320 specifically includes:
and S321, determining the characteristic information of the original sensing information by applying the data fusion mode. Wherein, the characteristic information of the original sensing information comprises: and fusing a specific value in the characteristic information of the original sensing information and/or fusing a specific image generated by the original sensing information according to the data fusion mode. The reasoning process corresponding to this step is to assign the most likely data fusion object, observe some phenomenon of this object, or some parameter, according to different assumptions.
Step S322, judging whether the original sensing information is suitable for the data fusion mode according to the characteristic information of the original sensing information and the corresponding preset information. The reasoning process corresponding to the step is to call the corresponding static knowledge from the knowledge base.
Step S323, fusing the original sensing information according to a data fusion mode suitable for the sensing information. The reasoning process corresponding to the step is that the observed phenomenon or the test result is compared with the static knowledge of the object, whether the assumption is true is judged according to the comparison result, if the assumption is true, the assumption is described as a reason of data fusion, the process is recorded, and the next assumption is selected to continue to reason and repeat the process until the assumption set is empty.
Steps S321-S323, which belong to quantitative analysis, are mainly to compare the difference between an actual value and an expected value, analyze the reason for causing the difference, and determine the data fusion reason; it should be understood that, by applying steps S321 to S323 of the present embodiment, the principle of data fusion may be generated, and a repair suggestion may also be generated.
The second embodiment and the third embodiment are combined, and the steps S310 to S323 are applied, so that the defects of the second embodiment and the third embodiment are overcome by organically combining the reasoning based on qualitative knowledge and the reasoning based on quantitative knowledge, and a good balance point is found between the reasoning speed and the accuracy.
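Read together, steps S310 to S323 amount to a hypothesis-verification loop: the candidate fusion modes (hypotheses) are tried in priority order, the characteristic information of the original sensing information is computed, compared with the preset static knowledge, and the first mode that fits is used for fusion. The sketch below renders this loop under assumed feature names, thresholds and mode names; it is illustrative only.

```python
from statistics import mean, pstdev

# Assumed static knowledge: expected feature ranges per data fusion mode.
PRESET = {
    "weighted_average": {"max_spread": 2.0},   # readings must agree closely
    "median_select":    {"max_spread": 10.0},  # tolerate the odd outlier
}

def feature_info(readings):
    """S321: characteristic information -- here a specific value (the spread)."""
    return {"spread": pstdev(readings)}

def fits_mode(features, mode):
    """S322: compare the characteristic value with the corresponding preset."""
    return features["spread"] <= PRESET[mode]["max_spread"]

def fuse(readings, hypotheses=("weighted_average", "median_select")):
    """S323: fuse with the first applicable mode, trying hypotheses in priority order."""
    features = feature_info(readings)
    for mode in hypotheses:
        if fits_mode(features, mode):
            if mode == "weighted_average":
                return mode, mean(readings)
            if mode == "median_select":
                return mode, sorted(readings)[len(readings) // 2]
    return None, None                            # hypothesis set exhausted

print(fuse([20.1, 20.3, 19.9]))    # tight spread -> ('weighted_average', ...)
print(fuse([20.1, 20.3, 35.0]))    # outlier      -> ('median_select', 20.3)
```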
It should be appreciated that neural networks may be categorized by network topology and by the direction of information flow. Classified by topology, that is, by the connection pattern among neurons, neural network models can be divided into hierarchical, interconnected and sparse structures. Classified by the direction of internal information flow, they can be divided into feedforward neural network models and feedback neural network models.
The hierarchical structure model comprises three kinds of layers, namely an input layer, hidden layers and an output layer; the hidden part may be a single layer or multiple layers, and when layers are added it is generally the number of hidden layers that increases. The input layer receives and processes external information, which is then further processed by each hidden layer and delivered by the output layer. The common hierarchical network models are the simple model, the output-to-input connection model and the intra-layer connection model. All three are layered structures; the difference is that, besides receiving and processing the information of the previous layer, the neurons of a layer may also process information between layers or among the neurons of the same layer. The structural model is selected according to the actual situation in an application.
A multi-layer feedforward neural network with hidden layers can greatly improve the classification capability of the network; because such a network is trained with the error back-propagation algorithm, a multi-layer feedforward neural network with these characteristics is called a BP neural network. The learning process of the BP neural network is divided into two parts: forward propagation of the signal and backward propagation of the error.
In forward propagation, a sample is fed in at the input layer, processed layer by layer through the hidden layers, and passed to the output layer. If the actual output differs from the expected output, the error back-propagation phase is entered: the output error is transmitted back to the hidden layers in a certain form and then further back to the input layer, so that the error is apportioned to all units and serves as the basis for correcting each unit's weights. These two learning processes of the BP neural network are repeated, and by constantly adjusting the weights the error is reduced to an acceptable level.
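The two phases can be illustrated with a minimal BP network written with NumPy. The layer sizes, learning rate and toy data below are assumptions made for the example and are not the network actually used in the embodiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy readings -> fused target; the data and weights below are illustrative only.
X = rng.random((64, 3))                          # e.g. position, temperature, vibration
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2]).reshape(-1, 1)

n_in, n_hidden, n_out, lr = 3, 8, 1, 0.5
W1, b1 = rng.normal(0.0, 0.5, (n_in, n_hidden)), np.zeros(n_hidden)
W2, b2 = rng.normal(0.0, 0.5, (n_hidden, n_out)), np.zeros(n_out)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(2000):
    # Forward propagation: input layer -> hidden layer -> output layer.
    h = sigmoid(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y                                # output error

    # Error back-propagation: apportion the error and correct the weights.
    grad_out = err / len(X)
    grad_W2 = h.T @ grad_out
    grad_h = grad_out @ W2.T * h * (1.0 - h)     # derivative of the sigmoid
    grad_W1 = X.T @ grad_h

    W2 -= lr * grad_W2
    b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * grad_W1
    b1 -= lr * grad_h.sum(axis=0)

print("final mean squared error:", float(np.mean(err ** 2)))
```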
Optionally, referring to fig. 6, fig. 6 is a schematic diagram illustrating application of neural network training in a fourth embodiment of the present invention. In step S300, the fusing the original sensing information corresponding to each of the plurality of sensors further includes:
step S340, applying a neural network to fuse the original information of the heterogeneous sensor according to the type of the original sensing information; the kind of the original sensing information comprises one, two or more of space position information, environment temperature information and safety information. Therefore, the original sensing information can be trained by applying the BP neural network, and the original information of the heterogeneous sensor can be efficiently and quickly integrated, so that the information fusion can be performed more quickly.
Referring to fig. 7, fig. 7 is a schematic diagram of a knowledge base for sensor information fusion according to a fifth embodiment of the invention; the heterogeneous sensor comprises a spatial position sensor, an ambient temperature sensor or a safety information sensor; the spatial position sensor comprises a displacement sensor and an ultrasonic sensor;
the environment temperature sensor comprises a temperature sensor and a heat-sensitive sensor;
the safety information sensor comprises a collision sensor, a sound sensor, a vibration sensor and a flame sensor.
Optionally, in step S300, the step of fusing the original sensing information corresponding to each of the plurality of sensors further includes:
Step S350, applying a neural network to fuse sensors of the same type distributed at different positions.
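For illustration, the sketch below shows one way in which readings from a subset of the heterogeneous sensors of FIG. 7 (displacement, ultrasonic and vibration sensors) and from several temperature sensors of the same type placed at different positions might be normalised and assembled into a single input vector for such a fusion network. The sensor names and value ranges are assumptions made for the example.

```python
from typing import Dict, List

# Assumed value ranges, used only for normalisation in this sketch.
RANGES = {
    "displacement_mm": (0.0, 500.0),
    "ultrasonic_mm":   (0.0, 5000.0),
    "temperature_c":   (-20.0, 120.0),
    "vibration_g":     (0.0, 16.0),
}

def normalise(name: str, value: float) -> float:
    lo, hi = RANGES[name]
    return (min(max(value, lo), hi) - lo) / (hi - lo)

def fusion_input(heterogeneous: Dict[str, float],
                 same_type_temps: List[float]) -> List[float]:
    """Assemble one network input: heterogeneous channels in a fixed order,
    plus the mean of several temperature sensors placed at different positions."""
    ordered = [normalise(k, heterogeneous[k]) for k in sorted(heterogeneous)]
    temps = [normalise("temperature_c", v) for v in same_type_temps]
    ordered.append(sum(temps) / len(temps))
    return ordered

vec = fusion_input(
    {"displacement_mm": 120.0, "ultrasonic_mm": 850.0, "vibration_g": 0.4},
    same_type_temps=[21.5, 22.0, 21.8],
)
print(vec)   # one flat vector ready to feed the BP fusion network
```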
The application also discloses a multi-sensor cooperative sensing device, which comprises:
the coverage module is used for calculating the coverage effect of the sensor nodes according to the artificial bee colony algorithm;
the acquisition module is used for acquiring original sensing information corresponding to the plurality of sensors respectively according to the coverage effect of the sensor nodes;
and the fusion module is used for fusing the original sensing information corresponding to the plurality of sensors respectively to form the target sensing information.
The application further provides a terminal comprising a processor, a memory and a display, wherein the processor is coupled with the memory and the display, and the memory stores a computer program executable on the processor; the processor executes the computer program to implement the method described above.
The present application also provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method described above.
The foregoing description is only illustrative of the present invention and is not intended to limit its scope; all equivalent structures or equivalent processes, and their direct or indirect application in other related technical fields, are likewise included within the scope of the present invention.
Claims (8)
1. A multi-sensor collaborative sensing method, characterized by comprising the following steps:
according to the artificial bee colony algorithm, calculating the coverage effect of the sensor nodes;
acquiring original sensing information corresponding to a plurality of sensors respectively according to the coverage effect of the sensor nodes;
fusing original sensing information corresponding to the plurality of sensors respectively to form target sensing information;
the step of fusing the original sensing information respectively corresponding to the plurality of sensors comprises the following steps:
acquiring a sensor information fusion model; the sensor information fusion model comprises at least two data fusion modes;
according to the priority of the data fusion mode, applying the data fusion mode to determine the characteristic information of the original sensing information;
judging whether the original sensing information is suitable for the data fusion mode or not according to the characteristic information of the original sensing information and corresponding preset information;
and fusing the original sensing information according to a data fusion mode suitable for the sensing information.
2. The multi-sensor collaborative sensing method of claim 1, wherein the characteristic information of the original sensing information includes:
a specific value in the original sensing information, and/or,
a specific image generated by fusing the original sensing information according to the data fusion mode.
3. The multi-sensor collaborative awareness method according to claim 1, wherein the step of calculating a coverage effect of sensor nodes according to an artificial bee colony algorithm comprises:
applying the positions, the network coverage rate, the deployment process and the deployment speed of the sensors according to the artificial bee colony algorithm to calculate the coverage effect of the sensor nodes.
4. The multi-sensor collaborative sensing method according to any one of claims 1-3, wherein the step of fusing the original sensing information corresponding to each of the plurality of sensors comprises:
applying a neural network to fuse the original information of the heterogeneous sensors according to the type of the original sensing information;
the types of the original sensing information comprise one or more of spatial position information, ambient temperature information and safety information.
5. The multi-sensor collaborative sensing method of claim 4, wherein the heterogeneous sensors comprise a spatial position sensor, an ambient temperature sensor, or a safety information sensor;
the spatial position sensor comprises a displacement sensor and an ultrasonic sensor;
the environment temperature sensor comprises a temperature sensor and a heat-sensitive sensor;
the safety information sensor comprises a collision sensor, a sound sensor, a vibration sensor and a flame sensor.
6. A multi-sensor cooperative sensing apparatus, comprising:
the coverage module is used for calculating the coverage effect of the sensor nodes according to the artificial bee colony algorithm;
the acquisition module is used for acquiring original sensing information corresponding to the plurality of sensors respectively according to the coverage effect of the sensor nodes;
the fusion module is used for acquiring a sensor information fusion model; the sensor information fusion model comprises at least two data fusion modes; according to the priority of the data fusion mode, applying the data fusion mode to determine the characteristic information of the original sensing information; judging whether the original sensing information is suitable for the data fusion mode or not according to the characteristic information of the original sensing information and corresponding preset information; and fusing the original sensing information according to a data fusion mode suitable for the sensing information to form target sensing information.
7. A terminal comprising a processor, a memory and a display, the processor being coupled to the memory and the display, the memory having stored thereon a computer program executable on the processor; the processor executing the computer program for implementing the method of one of claims 1 to 5.
8. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202011434559.7A (published as CN112580702B) | 2020-12-10 | 2020-12-10 | Multi-sensor collaborative sensing method and device
Publications (2)
Publication Number | Publication Date
---|---
CN112580702A (en) | 2021-03-30
CN112580702B (en) | 2024-01-23
Family
ID=75132057
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202011434559.7A (CN112580702B, Active) | Multi-sensor collaborative sensing method and device | 2020-12-10 | 2020-12-10
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112580702B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116227363B (en) * | 2023-04-25 | 2023-08-15 | 湖南省水务规划设计院有限公司 | Flood early warning method based on sensor distribution optimization |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102149158A (en) * | 2011-04-18 | 2011-08-10 | 武汉理工大学 | Method for fusing sensor grid data based on grid clustering |
CN103731848A (en) * | 2012-10-12 | 2014-04-16 | 北京源微达科技有限公司 | Data fusion method for multi-sensor distributed system |
CN104571079A (en) * | 2014-11-25 | 2015-04-29 | 东华大学 | Wireless long-distance fault diagnosis system based on multiple-sensor information fusion |
CN105704729A (en) * | 2016-01-22 | 2016-06-22 | 南京大学 | Wireless sensor deployment method employing improved artificial bee colony algorithm |
CN110992298A (en) * | 2019-12-02 | 2020-04-10 | 深圳市唯特视科技有限公司 | Genetic algorithm-based radiation source target identification and information analysis method |
CN110989618A (en) * | 2019-12-23 | 2020-04-10 | 福建工程学院 | A bee swarm carrier vehicle cooperative carrying control system and method |
CN111582130A (en) * | 2020-04-30 | 2020-08-25 | 长安大学 | A traffic behavior perception fusion system and method based on multi-source heterogeneous information |
CN111860589A (en) * | 2020-06-12 | 2020-10-30 | 中山大学 | Multi-sensor multi-target cooperative detection information fusion method and system |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10410113B2 (en) * | 2016-01-14 | 2019-09-10 | Preferred Networks, Inc. | Time series data adaptation and sensor fusion systems, methods, and apparatus |
US11210570B2 (en) * | 2018-01-23 | 2021-12-28 | Intelligent Fusion Technology, Inc. | Methods, systems and media for joint manifold learning based heterogenous sensor data fusion |
Non-Patent Citations (5)
Title |
---|
Huadong Wang et al.; "Research on efficient-efficient routing protocol for WSNs based on improved artificial bee colony algorithm"; IET Wireless Sensor Systems; vol. 07, no. 01; pp. 15-20 *
于文杰; "Research on wireless sensor network deployment based on the artificial bee colony algorithm"; China Doctoral Dissertations Full-text Database, Information Science and Technology; no. 06; I140-16 *
徐瑞; "Coverage optimization based on directional visual sensor networks"; China Master's Theses Full-text Database, Information Science and Technology; no. 09; I140-200 *
文政颖 et al.; "Wireless sensor network coverage optimization based on a chaotic artificial bee colony algorithm"; Computer Measurement & Control; vol. 22, no. 05; pp. 1609-1612 *
刘先省 et al.; "Composition and analysis of the closed-loop control mode of multi-sensor data fusion systems"; Information and Control; no. 02; pp. 145-151 *
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 
 | GR01 | Patent grant | 