CN116804552A - Multi-bionic robot dog cross-floor three-dimensional inspection method and system - Google Patents
Multi-bionic robot dog cross-floor three-dimensional inspection method and system
- Publication number
- CN116804552A (application CN202310516688.8A)
- Authority
- CN
- China
- Prior art keywords
- task
- robot
- dog
- robot dog
- floor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a multi-bionic robot dog cross-floor three-dimensional inspection method and system, relating to the technical field of robot dog cross-floor three-dimensional inspection and comprising the following steps: the bionic robot dogs navigate across floors and build a map; automatic navigation and obstacle avoidance are performed on the built map through path planning to complete specified inspection tasks; and multi-robot-dog task allocation is performed using a behavior-based task allocation method. According to the multi-bionic robot dog cross-floor three-dimensional inspection method, the cooperative work of multiple robot dogs allows more inspection tasks to be completed in a shorter time and improves overall inspection efficiency; the real-time obstacle avoidance algorithm and task allocation method effectively reduce the collision risks between robot dogs and between robot dogs and environmental obstacles; the behavior-based task allocation method adjusts task priorities in real time so that the robot dogs can respond more flexibly to sudden tasks or environmental changes; and the overall moving path length of the robot dogs is reduced by optimizing task allocation and path planning.
Description
Technical Field
The invention relates to the technical field of machine dog cross-floor three-dimensional inspection, in particular to a multi-bionic machine dog cross-floor three-dimensional inspection method and system.
Background
With the development of technology, robotics has been widely used in many fields, especially in industry and daily life. Among them, a bionic robot dog has become a research hotspot as a robot dog with high autonomy and flexibility. The bionic robot dog has good stability, maneuverability and obstacle crossing capability, and can be used for tasks in various complex environments.
At present, multi-machine dog cross-floor inspection generally adopts different floor navigation technologies. Firstly, the machine dog identifies and positions the floor, and the floor where the machine dog is located is determined; secondly, the machine dog needs to interact with the elevator, and the machine dog is transported to a target floor which needs to be reached; and finally, repositioning and switching the destination floor navigation map after reaching the destination floor.
However, in the multi-robot-dog cross-floor inspection process, this approach is prone to path overlap, and as the number of robot dogs increases, the probability of collision and deadlock also rises sharply. Meanwhile, as environmental complexity increases, the load on the centralized controller grows, and real-time performance is difficult to guarantee.
Disclosure of Invention
This section is intended to outline some aspects of embodiments of the application and to briefly introduce some preferred embodiments. Some simplifications or omissions may be made in this section as well as in the description of the application and in the title of the application, which may not be used to limit the scope of the application.
The present invention has been made in view of the above-described problems.
Therefore, the technical problems solved by the invention are as follows: paths of multiple robot dogs overlap during cross-floor inspection, the robot dogs collide with each other and become deadlocked, the load on the centralized controller is excessive, and the inspection quality and efficiency of the robot dogs need to be ensured.
In order to solve the technical problems, the invention provides the following technical scheme: the multi-bionic robot dog cross-floor three-dimensional inspection method comprises the following steps:
the bionic robot dog navigates across floors and builds a map;
performing automatic navigation obstacle avoidance on the built map through path planning to complete a specified inspection task;
and performing multi-machine dog task allocation by adopting a task allocation method based on behaviors.
As a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection method, the cross-floor navigation comprises the following steps:
based on 3D SLAM navigation technology, navigation is performed through the ROS system; the parallel mapping and positioning technique based on particle filtering and the Cartographer algorithm are adopted; sensors are deployed to acquire surrounding environment information to realize map construction and judge the environment condition; and according to the environment condition, the sensors measure the distance a between the robot dog and a target obstacle to judge the safety level.
As a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection method, the deployment sensor comprises: deploying a camera, a laser radar and an ultrasonic sensor;
initializing a sensor, capturing a real-time environment image by using a camera, evaluating the current environment condition by using an image processing algorithm, and carrying out dynamic weight adjustment on a laser radar, the camera and an ultrasonic sensor by monitoring illumination conditions and distances in real time according to performance characteristics of the sensor under different environment conditions;
cases where lidar weights are increasing: the ambient illumination does not meet the threshold, various material surfaces exist in the environment, and a large amount of accurate distance measurement and angle information are required in the environment;
the condition of camera weight increase: the ambient illumination accords with a threshold value, and the target object needs to be identified, classified or extracted with more visual characteristics, and navigation is needed based on the shape and color information of the target object;
the weight of the ultrasonic sensor increases: the environment is provided with a large number of barriers, the barrier avoiding operation is required to be frequently carried out, and the environment is provided with a large number of irregular surfaces or soft objects, when the distance is less than or equal to 1 meter.
As a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection method, judging the safety level comprises the following steps:
when the distance a is more than 3 meters, it is judged as level a3: the robot dog keeps running at the current speed while adjusting its running direction in real time according to the navigation and obstacle avoidance algorithm;
when the distance a is more than or equal to 1 meter and less than or equal to 3 meters, it is judged as level a2: if the obstacle is stationary or moving opposite to the robot dog's direction of travel, the robot dog continues to advance at reduced speed; if the obstacle is moving in the robot dog's direction of travel, the robot dog slows down and searches for other feasible paths;
when the distance a is less than 1 meter, it is judged as level a1: there is a risk of collision, so the robot dog immediately reduces its speed to zero, searches for a feasible path according to the obstacle avoidance algorithm during deceleration and tries to bypass the obstacle, and starts again once the obstruction is cleared or the distance is greater than or equal to 1 meter.
As a preferred scheme of the multi-bionic robot dog cross-floor three-dimensional inspection method, the path planning comprises: after the map is constructed, global planning and local planning are carried out;
the global planning comprises searching for an optimal path between two points using known information such as the built map; global route planning based on the grid method is adopted to construct the indoor trackless environment model; the map is divided into a grid form, and global path planning for the robot dog is performed on the grid map with the Dijkstra algorithm; the direction of the end point relative to the start point is determined in the map; and following the greedy strategy of the Dijkstra algorithm, the nodes around the source point are repeatedly evaluated and compared to obtain the shortest path from the start point to the end point, thereby determining the shortest path planning between the inspection points;
The local planning comprises the steps of issuing a speed instruction to assist in obstacle avoidance, judging a walking path through local information, detecting surrounding obstacle information in the running process by a sensor, and adjusting the current local path to realize an obstacle avoidance function.
As a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection method, the specified inspection task comprises the following steps: issuing corresponding speed and direction instructions for each machine dog according to the final navigation result, so that the machine dogs can carry out inspection according to the planned path;
the final navigation result is expressed as:
wG*G + wL*L
wherein wG represents the global planning weight coefficient, wL represents the local planning weight coefficient, G represents the global planning strategy, and L represents the local obstacle avoidance strategy;
when level a3 is judged, the global planning weight is increased and the local planning weight is decreased;
when level a2 is judged, global planning and local planning are weighted equally;
when level a1 is judged, the local planning weight is increased and the global planning weight is decreased;
and dynamically adjusting the weights of the global path planning and the local obstacle avoidance strategy according to the environmental analysis result, generating a global optimal path and a local obstacle avoidance strategy according to the adjusted weights, inputting the generated path and the generated obstacle avoidance strategy into a controller of the machine dog, executing navigation tasks in real time, continuously evaluating the performance of the machine dog in the navigation process, and optimizing if the performance is found to be insufficient.
As a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection method, the task allocation method comprises the following steps:
identifying and collecting all tasks to be completed, and carrying out initial task allocation according to each robot dog's own capability and the task requirements; each robot dog evaluates the utility value of each task using data from the deployed sensors and shares these utility values through the communication network; after receiving the utility values of the other robot dogs, each robot dog executes a greedy strategy, compares its own utility value for each task with those of the other robot dogs, and selects the task for which its own utility is highest as its high-priority task;
the utility value of the task is expressed as:
U=w1*T+w2*D+w3*E+w4*S+w5*Q
wherein w1, w2, w3, w4, w5 represent weight coefficients, T represents required time, D represents current task amount, E represents risk assessment level, S represents task priority, and Q represents energy consumption required for execution.
As a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection method, in the process of executing tasks, if an obstacle is encountered or a new task condition is found, the utility value of each task is updated and task reassignment is performed according to the new utility values;
By maintaining a discrete performance assessment method comprising acquiescence and patience, when a certain machine dog is executing a task, the acquiescence value of the machine dog per se for the task gradually increases along with the time, and the patience degree of other machine dogs for the task is reduced, so that the utility assessment of the machine dog for the task is gradually reduced, and each machine dog synchronously and independently refreshes the performance assessment of each machine dog under the action of communication heartbeat frames;
the robot dog and a remote access computer are connected to the same local area network; a maintainer logs in to the background management software on the remote access computer, selects the robot dog to be controlled and connects to it remotely; after the connection succeeds, remote control, viewing of on-site environment parameters, formulation of inspection tasks and storage of related parameters are realized through the functional components of the background management software; these operations are converted into corresponding instructions and sent to the robot dog; and after receiving the instructions, the robot dog executes the tasks according to the instruction content and returns confirmation information.
Another object of the present invention is to provide a multi-bionic robot dog cross-floor three-dimensional inspection system, which can solve the problems of unreasonable resource allocation, low efficiency and insufficient adaptability to complex environments in the conventional inspection mode through a task allocation and cooperation mechanism.
In order to solve the technical problems, the invention provides the following technical scheme: a multi-bionic robot dog cross-floor three-dimensional inspection system, comprising:
the system comprises a task management module, a performance evaluation module, a communication module and an environment perception module;
as a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection system, the task management module is used for evaluating data and distributing tasks according to the utility value of each robot dog and adjusting task distribution in real time according to the task progress and communication heartbeat frames among the robot dogs;
as a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection system, the efficiency evaluation module is used for calculating the utility value of each task, updating the efficiency evaluation of the robot dog on the task in real time and optimizing the task allocation;
as an optimal scheme of the multi-bionic machine dog cross-floor three-dimensional inspection system, the communication module is used for establishing and maintaining communication connection among the multi-machine dogs and transmitting communication heartbeat frames among the machine dogs in real time;
as a preferable scheme of the multi-bionic robot dog cross-floor three-dimensional inspection system, the environment sensing module is used for planning a proper path according to a floor map and positioning information of the robot dog, and carrying out real-time adjustment when encountering an obstacle or finding a new task condition.
A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method as described above when executing the computer program.
A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method as described above.
The invention has the following beneficial effects: according to the multi-bionic robot dog cross-floor three-dimensional inspection method, the cooperative work of multiple robot dogs allows more inspection tasks to be completed in a shorter time and improves overall inspection efficiency; the real-time obstacle avoidance algorithm and task allocation method effectively reduce the collision risks between robot dogs and between robot dogs and environmental obstacles; task priorities are adjusted in real time by the behavior-based task allocation method, so that the robot dogs can respond more flexibly to sudden tasks or environmental changes; the overall movement path length of the robot dogs is reduced by optimizing task allocation and path planning, lowering energy consumption; through sensor fusion, the robot dogs obtain more accurate environmental information under different environmental conditions, improving the quality of task completion; the running state and task progress of the multiple robot dogs are mastered in real time through the monitoring and visualization module, making it convenient for personnel to monitor and manage the whole inspection process; and because each robot dog refreshes its assessments synchronously and independently under the action of communication heartbeat frames, communication pressure is reduced and communication efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art. Wherein:
fig. 1 is an overall flowchart of a multi-bionic robot dog cross-floor three-dimensional inspection method according to a first embodiment of the present invention;
fig. 2 is an overall structure diagram of a multi-bionic robot dog cross-floor three-dimensional inspection system according to a second embodiment of the present invention;
fig. 3 is a comparison chart of collision probability in a multi-bionic robot dog cross-floor three-dimensional inspection method according to a fourth embodiment of the present invention.
Detailed Description
So that the manner in which the above recited objects, features and advantages of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways other than those described herein, and persons skilled in the art will readily appreciate that the present invention is not limited to the specific embodiments disclosed below.
Further, reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic can be included in at least one implementation of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
While the embodiments of the present invention have been illustrated and described in detail in the drawings, the cross-sectional view of the device structure is not to scale in the general sense for ease of illustration, and the drawings are merely exemplary and should not be construed as limiting the scope of the invention. In addition, the three-dimensional dimensions of length, width and depth should be included in actual fabrication.
Also in the description of the present invention, it should be noted that the orientation or positional relationship indicated by the terms "upper, lower, inner and outer", etc. are based on the orientation or positional relationship shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the apparatus or elements referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first, second, or third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The terms "mounted, connected, and coupled" should be construed broadly in this disclosure unless otherwise specifically indicated and defined, such as: can be fixed connection, detachable connection or integral connection; it may also be a mechanical, electrical or direct connection, or may be an indirect connection via an intermediary, or may be a communication between two elements. The specific meaning of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
Example 1
Referring to fig. 1, for one embodiment of the present invention, a multi-bionic robot dog cross-floor three-dimensional inspection method is provided, comprising:
the bionic robot dog navigates across floors and builds a map;
performing automatic navigation obstacle avoidance on the built map through path planning to complete a specified inspection task;
and performing multi-machine dog task allocation by adopting a task allocation method based on behaviors.
The cross-floor navigation includes: based on 3D SLAM navigation technology, navigation is performed through the ROS system; the parallel mapping and positioning technique based on particle filtering and the Cartographer algorithm are adopted; sensors are deployed to acquire surrounding environment information to realize map construction and judge the environment condition; and according to the environment condition, the sensors measure the distance a between the robot dog and a target obstacle to judge the safety level.
In order to realize automatic inspection, the robot dog automatically walks and stops according to a preset route and stop positions, identifies obstacles, is preset with the 6.5-meter-level and 0-meter-level environments, automatically navigates up and down the industrial stairs, and so on; the parallel mapping and positioning technique (SLAM) based on particle filtering and the Cartographer algorithm are adopted, and surrounding environment information is acquired through devices such as the laser radar and other sensors to realize map construction. After the robot dog successfully acquires the inspection map, it performs path planning on the known map information to realize automatic navigation and obstacle avoidance and complete the inspection task.
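By way of illustration only, a minimal sketch of how the distance a to the nearest obstacle could be obtained from the laser radar under ROS is given below; the node structure, the /scan and /obstacle_distance topic names and the use of rospy are assumptions for illustration rather than part of the described system.

```python
# Sketch (assumption): a ROS node that reads the lidar scan and publishes the
# distance "a" to the nearest obstacle for the safety-level logic.
import rospy
from sensor_msgs.msg import LaserScan
from std_msgs.msg import Float32

def scan_callback(scan, pub):
    # Keep only readings inside the sensor's valid range.
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        pub.publish(Float32(min(valid)))  # distance a = closest return

def main():
    rospy.init_node("obstacle_distance_node")
    pub = rospy.Publisher("/obstacle_distance", Float32, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, scan_callback, callback_args=pub)
    rospy.spin()

if __name__ == "__main__":
    main()
```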
The deployment sensor includes: deploying a camera, a laser radar and an ultrasonic sensor; initializing a sensor, capturing a real-time environment image by using a camera, evaluating the current environment condition by using an image processing algorithm, and carrying out dynamic weight adjustment on the laser radar, the camera and the ultrasonic sensor by monitoring illumination conditions and distances in real time according to the performance characteristics of the sensor under different environment conditions.
Cases where the lidar weight is increased: the ambient illumination does not meet the threshold, surfaces of various materials exist in the environment, or a large amount of accurate distance and angle information is required. Cases where the camera weight is increased: the ambient illumination meets the threshold, the target object needs to be identified, classified or have more visual features extracted, or navigation must be based on the shape and color information of the target object. Cases where the ultrasonic sensor weight is increased: the environment contains a large number of obstacles and obstacle avoidance must be performed frequently, the environment contains many irregular surfaces or soft objects, or the distance is less than or equal to 1 meter.
The camera can determine the illumination intensity by analyzing the captured image. A simple method is to calculate the brightness of the image, i.e. the average of the gray values of all pixels; an image with lower average brightness generally indicates weaker light.
The calculated average luminance is compared with a preset threshold value. If the average brightness is below the threshold, it may be determined that the light is weak.
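A minimal sketch of this brightness check is given below; the numerical threshold is an assumed example, since the description above only requires comparison against a preset threshold.

```python
# Sketch: estimate illumination from a camera frame by averaging gray values
# and compare against a preset threshold (the value 80 is an assumed example).
import cv2

BRIGHTNESS_THRESHOLD = 80.0  # assumed threshold on the 0-255 gray scale

def is_light_weak(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    avg_brightness = float(gray.mean())  # average gray value over all pixels
    return avg_brightness < BRIGHTNESS_THRESHOLD, avg_brightness
```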
Judging the safety level includes: when the distance a is more than 3 meters, it is judged as level a3, and the robot dog keeps running at the current speed while adjusting its running direction in real time according to the navigation and obstacle avoidance algorithm; when the distance a is more than or equal to 1 meter and less than or equal to 3 meters, it is judged as level a2: if the obstacle is stationary or moving opposite to the robot dog's direction of travel, the robot dog continues to advance at reduced speed, and if the obstacle is moving in the robot dog's direction of travel, the robot dog slows down and searches for other feasible paths; when the distance a is less than 1 meter, it is judged as level a1: there is a risk of collision, so the robot dog immediately reduces its speed to zero, searches for a feasible path according to the obstacle avoidance algorithm during deceleration and tries to bypass the obstacle, and starts again once the obstruction is cleared or the distance is greater than or equal to 1 meter.
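The distance-based grading can be summarised as a small decision function, sketched below; only the 1-meter and 3-meter thresholds come from the description, while the speed factors are illustrative assumptions.

```python
# Sketch of the safety-level decision described above; the speed factors are
# illustrative assumptions, only the 1 m / 3 m thresholds come from the text.
def safety_level_and_speed(distance_a, current_speed, obstacle_moving_same_dir):
    if distance_a > 3.0:          # level a3: keep the current speed
        return "a3", current_speed
    if distance_a >= 1.0:         # level a2: slow down, possibly replan
        if obstacle_moving_same_dir:
            return "a2", current_speed * 0.3   # slow down and search other paths
        return "a2", current_speed * 0.5       # keep advancing at reduced speed
    return "a1", 0.0              # level a1: stop immediately, try to bypass
```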
The laser radar has higher measurement precision and resolution in a medium-long distance range, and is suitable for a large-range indoor and outdoor environment. However, lidar may not acquire accurate data in very close range because the laser beam may be blocked or not returned to the sensor.
The camera can capture clear images in different distance ranges, but the measurement accuracy is affected by factors such as the resolution of the camera, the illumination condition, the size of a target object and the like. The camera has better performance in a middle distance range, and can be used for tasks such as object identification, classification, visual navigation and the like. The performance of the camera may be affected when the distance is large or the illumination condition is poor.
The ultrasonic sensor has higher measurement accuracy in a short distance range, and is suitable for obstacle avoidance and ranging tasks in a short distance. However, in a long distance range, the performance of the ultrasonic sensor may be affected by factors such as ambient noise, air humidity, and surface characteristics of the target object, resulting in a decrease in measurement accuracy.
Raw data from each sensor are preprocessed to eliminate noise and other environmental disturbances. For the camera, image enhancement and filtering algorithms can be adopted to improve image quality; for the laser radar and the ultrasonic sensor, smoothing and denoising filters can be adopted to improve the accuracy of the measured data.
And fusing the processed sensor data, and distributing proper weight for each sensor by combining the characteristics of each sensor. The weight can be dynamically adjusted according to the performance characteristics of the sensor under different conditions. For example, when the illumination condition is poor, the weight of the camera is reduced, and the weights of the laser radar and the ultrasonic sensor are increased; when the distance is relatively close, the weight of the ultrasonic sensor is increased, and the weight of the laser radar is reduced.
And calculating the distance according to the fused data by using a distance measuring method of a laser radar, a camera and an ultrasonic sensor respectively. For example, a lidar calculates a distance by measuring a time difference between transmitting a laser beam and receiving a reflected laser beam; the camera can calculate the distance by adopting a monocular vision algorithm or a binocular vision algorithm; the ultrasonic sensor calculates a distance by measuring a time difference between the transmitted ultrasonic wave and the received echo.
And carrying out weighted fusion on the distance values calculated by the sensors to obtain a final measurement result. The weight of each sensor can be dynamically adjusted according to the actual application scene and the measurement condition, so that the accuracy and the robustness of the measurement result are improved.
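A minimal sketch of this weighted fusion is given below; the qualitative weight-adjustment rules follow the description above, while the specific numerical weights are assumptions.

```python
# Sketch: dynamic weighting and fusion of the lidar, camera and ultrasonic
# distance estimates. The numerical weights are illustrative assumptions.
def fuse_distances(d_lidar, d_camera, d_ultra, light_is_weak, nearest_est):
    w = {"lidar": 1.0, "camera": 1.0, "ultra": 1.0}
    if light_is_weak:            # poor illumination: trust the camera less
        w["camera"] *= 0.5
        w["lidar"] *= 1.5
        w["ultra"] *= 1.2
    if nearest_est <= 1.0:       # very close range: trust the ultrasonic sensor more
        w["ultra"] *= 2.0
        w["lidar"] *= 0.5
    total = w["lidar"] + w["camera"] + w["ultra"]
    return (w["lidar"] * d_lidar + w["camera"] * d_camera + w["ultra"] * d_ultra) / total
```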
The path planning includes: after the map is constructed, global planning and local planning are carried out. The global planning comprises searching for an optimal path between two points using known information such as the built map; global route planning based on the grid method is adopted to construct the indoor trackless environment model; the map is divided into a grid form, and global path planning for the robot dog is performed on the grid map with the Dijkstra algorithm; the direction of the end point relative to the start point is determined in the map; and following the greedy strategy of the Dijkstra algorithm, the nodes around the source point are repeatedly evaluated and compared to obtain the shortest path from the start point to the end point, thereby determining the shortest path planning between the inspection points. The local planning comprises issuing speed instructions to assist obstacle avoidance and judging the walking path from local information; the sensors detect surrounding obstacle information during operation, and the current local path is adjusted to realize the obstacle avoidance function.
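A compact sketch of grid-based Dijkstra planning is given below for illustration; the 4-connected neighbourhood and unit step cost are assumptions, since the description above does not fix them.

```python
# Sketch: Dijkstra shortest path on an occupancy grid (0 = free, 1 = obstacle).
# 4-connected neighbours and unit step cost are illustrative assumptions.
import heapq

def dijkstra_grid(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1.0
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    if goal not in dist:
        return []                 # no feasible path on this grid
    path, node = [goal], goal
    while node != start:          # reconstruct the path from goal back to start
        node = prev[node]
        path.append(node)
    return path[::-1]
```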
The appointed inspection task comprises: issuing corresponding speed and direction instructions for each machine dog according to the final navigation result, so that the machine dogs can carry out inspection according to the planned path;
the final navigation result is expressed as:
wG*G + wL*L
wherein wG represents the global planning weight coefficient, wL represents the local planning weight coefficient, G represents the global planning strategy, and L represents the local obstacle avoidance strategy;
when level a3 is judged, the global planning weight is increased and the local planning weight is decreased; when level a2 is judged, global planning and local planning are weighted equally; when level a1 is judged, the local planning weight is increased and the global planning weight is decreased; the weights of the global path planning and the local obstacle avoidance strategy are dynamically adjusted according to the environmental analysis result, a global optimal path and a local obstacle avoidance strategy are generated with the adjusted weights, the generated path and obstacle avoidance strategy are input into the controller of the robot dog, navigation tasks are executed in real time, the performance of the robot dog is continuously evaluated during navigation, and optimization is performed if the performance is found to be insufficient.
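A hedged sketch of blending the global and local strategies according to the safety level is given below; the per-level weight values are illustrative assumptions consistent with the qualitative rules above.

```python
# Sketch: blend global-planning and local-obstacle-avoidance velocity commands
# as wG*G + wL*L according to the current safety level. Weight values are
# illustrative assumptions, not values prescribed by the method.
LEVEL_WEIGHTS = {
    "a3": (0.8, 0.2),   # far from obstacles: favour the global plan
    "a2": (0.5, 0.5),   # medium range: balance both strategies
    "a1": (0.2, 0.8),   # close range: favour local obstacle avoidance
}

def blended_command(level, global_cmd, local_cmd):
    wG, wL = LEVEL_WEIGHTS[level]
    # Applied component-wise to a (linear, angular) velocity command.
    return tuple(wG * g + wL * l for g, l in zip(global_cmd, local_cmd))
```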
When the robot dog is found to frequently encounter obstacles at safety level a2 or a3, the weight coefficient of the local obstacle avoidance strategy is increased to improve obstacle avoidance capability; when the robot dog performs poorly in a specific environment, the navigation strategy is optimized with a machine learning algorithm; experience data collected under different environments provide references for similar situations; an effective feedback mechanism is established so that the robot dog can adjust itself according to its real-time performance; and by sharing task execution results with the other robot dogs, the navigation strategies are learned and optimized jointly.
The task allocation method comprises the following steps: identifying and collecting all tasks to be completed, and carrying out initial task allocation according to each robot dog's own capability and the task requirements; each robot dog evaluates the utility value of each task using data from the deployed sensors and shares these utility values through the communication network; after receiving the utility values of the other robot dogs, each robot dog executes a greedy strategy, compares its own utility value for each task with those of the other robot dogs, and selects the task for which its own utility is highest as its high-priority task;
the utility value of the task is expressed as:
U=w1*T+w2*D+w3*E+w4*S+w5*Q
wherein w1, w2, w3, w4, w5 represent weight coefficients, T represents required time, D represents current task amount, E represents risk assessment level, S represents task priority, and Q represents energy consumption required for execution.
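The utility formula above can be implemented directly, as in the sketch below; it assumes that T, D, E, S and Q have already been normalised to comparable scales and that the weights are chosen by the operator, neither of which is prescribed above.

```python
# Sketch: task utility U = w1*T + w2*D + w3*E + w4*S + w5*Q from the formula
# above; T, D, E, S, Q are assumed to be pre-normalised to comparable ranges.
def task_utility(T, D, E, S, Q, weights=(0.2, 0.2, 0.2, 0.2, 0.2)):
    w1, w2, w3, w4, w5 = weights
    return w1 * T + w2 * D + w3 * E + w4 * S + w5 * Q
```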
When the task needs to be completed as soon as possible, the required time weight coefficient w1 is increased, so that the robot dog is more concerned about the time required for completing the task during task allocation.
When the task amount is larger, the task amount weight coefficient w2 is increased, so that the robot dog is more concerned with the number of tasks during task allocation.
When the safety in the task execution process is important, the risk assessment grade weight coefficient w3 is increased, so that the machine dog is more concerned with the risk degree of the task during task distribution, the high-risk task which possibly causes self damage is avoided being executed, the machine dog can better ensure the self safety through the increased risk assessment grade weight coefficient, and the damage risk is reduced.
When the task urgency is higher, the task priority weight coefficient w4 is increased, so that the machine dog is more concerned about the urgency of the task during task allocation.
When the energy of the robot dog is limited or energy needs to be saved, the weight coefficient w5 of the energy consumption required for execution is increased, so that the robot dog pays more attention to energy consumption during task allocation, makes full use of its limited energy and preferentially selects tasks with lower energy consumption; by increasing the energy consumption weight coefficient, the robot dog can complete tasks more energy-efficiently and its working time is prolonged.
In the process of executing the tasks, if an obstacle is encountered or a new task condition is found, the utility value of each task is updated and task reassignment is performed according to the new utility values.
When a robot dog encounters an obstacle during execution of a task, it needs to evaluate the impact of the obstacle on task execution in real time. For example, an obstacle may cause the robot dog to fail to follow the original planned path, thereby requiring a new path to be found. At this point, the machine dog will recalculate the utility value for each task, taking into account the additional time and energy consumption required to avoid the obstacle. The tasks are then reassigned according to the new utility values to ensure optimization of overall task execution efficiency.
During execution of a task, the robot dog may discover new task conditions, such as equipment failure during inspection. When such a situation is encountered, the machine dog needs to add new tasks to the task list and recalculate the utility value for each task. In calculating the utility value of a new task, the robot dog needs to consider the urgency of the task, the time required, the risk assessment level, and the energy consumption. And then, reassigning the tasks according to the new utility values to ensure that the tasks with urgent and higher utility values are preferentially processed.
After updating the task utility value, the robot dog reassigns the task according to the new utility value. In the reassignment process, the machine dog can preferentially select the task with the highest self utility value, and simultaneously cooperates with other machine dogs to avoid repeatedly executing the same task. In this process, the machine dogs share their own utility value for each task via the communication network to achieve an optimal task allocation scheme.
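A minimal sketch of the greedy selection over shared utility values is given below; the dictionary layout keyed by task id and dog id is an assumption made for illustration.

```python
# Sketch of the greedy allocation: each dog compares its own utility for every
# task with the utilities shared by the other dogs and claims the task for
# which it is the best candidate with the highest own utility.
def greedy_pick(my_id, shared_utilities):
    # shared_utilities: {task_id: {dog_id: utility}} gathered over the network.
    best_task, best_u = None, float("-inf")
    for task_id, utilities in shared_utilities.items():
        if not utilities:
            continue
        my_u = utilities.get(my_id, float("-inf"))
        if my_u >= max(utilities.values()) and my_u > best_u:
            best_task, best_u = task_id, my_u
    return best_task
```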
In the process of executing tasks, the machine dog needs to dynamically adjust the task execution plan according to actual conditions. For example, when the priority of a task changes or the machine dog encounters unexpected difficulties in executing the task, the machine dog needs to update the utility value of the task in time and adjust the task execution sequence according to the new utility value. This helps to increase the efficiency and flexibility of overall task execution.
By maintaining a discrete performance assessment method that includes acquiescence and patience, as a certain machine dog is executing a task, its own acquiescence to the task gradually increases over time, while the patience of other machine dogs to the task decreases, resulting in a gradual decrease in the performance assessment of the machine dog for the task, each machine dog refreshing the performance assessment of each machine dog synchronously and independently under the influence of the communication heartbeat frame.
Each task has associated with it a acquiescence and a tolerance level. When a certain robot dog is executing a task, its own acquiescence for the task increases gradually over time, indicating that the robot dog is willing to devote more time and resources in the task execution process. At the same time, the tolerance of other machine dogs for the task gradually decreases, indicating that their willingness to pay for the task is reduced.
As the acquiescence increases and the tolerance decreases, the task's utility rating for the task gradually decreases and if other machine dogs are able to more effectively complete the task, the task-executing machine dog will prefer to yield the task for the other machine dogs to pick up. This helps to ensure efficiency and flexibility in task execution.
Communication heartbeat frames are sent periodically among the machine dogs, and each machine dog sends own performance evaluation data to other machine dogs and receives the performance evaluation data of other machine dogs in each heartbeat frame period. In this way, each machine dog can independently refresh its own performance assessment for each task, thereby achieving optimization of task allocation.
Each machine dog can redistribute tasks according to own task priority, in the task distribution process, the machine dog can preferentially select the task with highest efficiency evaluation, so that the overall task execution efficiency is improved, and when the execution progress of a certain task is affected, the machine dog can dynamically adjust the task execution sequence according to the efficiency evaluation data adjusted in real time, so that stable task execution is ensured.
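A hedged sketch of the acquiescence/patience bookkeeping refreshed on each communication heartbeat frame is given below; the data layout and the increment/decrement rates are assumptions, not values prescribed by the method.

```python
# Sketch: discrete acquiescence/patience update applied on every communication
# heartbeat frame. The increment/decrement rates are illustrative assumptions.
def heartbeat_update(state, executing_dog, task_id, dt=1.0,
                     acq_rate=0.1, pat_rate=0.05):
    # state: {dog_id: {"acquiescence": {task: v}, "patience": {task: v}}}
    for dog_id, s in state.items():
        if dog_id == executing_dog:
            # The executing dog grows more committed to its current task.
            acq = s["acquiescence"].get(task_id, 0.0)
            s["acquiescence"][task_id] = acq + acq_rate * dt
        else:
            # Other dogs slowly lose patience with the occupied task, so their
            # performance assessment of it decreases over time.
            pat = s["patience"].get(task_id, 1.0)
            s["patience"][task_id] = max(0.0, pat - pat_rate * dt)
    return state
```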
The robot dog and a remote access computer are connected to the same local area network; a maintainer logs in to the background management software on the remote access computer, selects the robot dog to be controlled and connects to it remotely; after the connection succeeds, remote control, viewing of on-site environment parameters, formulation of inspection tasks and storage of related parameters are realized through the functional components of the background management software; these operations are converted into corresponding instructions and sent to the robot dog; and after receiving the instructions, the robot dog executes the tasks according to the instruction content and returns confirmation information.
Example 2
Referring to fig. 2, for one embodiment of the present invention, a multi-bionic robot dog cross-floor stereo inspection system is provided, comprising:
the system comprises a task management module 100, a performance evaluation module 200, a communication module 300 and an environment perception module 400;
the task management module 100 is used for evaluating data and distributing tasks according to the utility value of each machine dog, and adjusting task distribution in real time according to the task progress and the communication heartbeat frame among the machine dogs;
the efficiency evaluation module 200 is used for calculating the utility value of each task, updating the efficiency evaluation of the task by the machine dog in real time, and optimizing the task allocation;
the communication module 300 is used for establishing and maintaining communication connection among the multiple machine dogs and transmitting communication heartbeat frames among the machine dogs in real time;
the environment sensing module 400 is used for planning a proper path according to the floor map and the positioning information of the robot dog, and performing real-time adjustment when encountering obstacles or finding new task conditions.
Example 3
One embodiment of the present invention, which is different from the first two embodiments, is:
the functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). In addition, the computer readable medium may even be paper or other suitable medium on which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, may be implemented using any one or combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits having suitable combinational logic gates, programmable Gate Arrays (PGAs), field Programmable Gate Arrays (FPGAs), and the like.
Example 4
Referring to fig. 3, for one embodiment of the present invention, a multi-bionic robot dog cross-floor three-dimensional inspection method is provided, and in order to verify the beneficial effects of the present invention, scientific demonstration is performed through economic benefit calculation and simulation experiments.
MATLAB and CloudSim were used to evaluate the algorithm. The simulations were run in an environment with an Intel processor and 16 GB of RAM, on a 64-bit Windows 11 Ultimate operating system. The point system was simulated in the MATLAB programming language, the records were connected, and the data distribution was constructed.
Setting up an experimental environment: several inspection points are deployed in a simulated multi-floor building environment, including different floors, different rooms, and different types of obstacles. A certain number of dynamic barriers (such as walking people or mobile equipment) are set in the environment to simulate the actual application scene.
Preparing equipment: two groups of machine dogs are prepared, and the method and the traditional method are respectively applied, wherein the two groups of machine dogs have the same hardware configuration and basic functions.
Task allocation: the same inspection tasks are distributed to the two groups of machine dogs, wherein the inspection tasks comprise inspection point positions, inspection sequences, task priorities and the like, and task lists are distributed to the machine dogs adopting the method and the traditional method.
Experiment was performed: the robot dogs of the experimental group and the control group were started separately to perform the inspection tasks in the simulated environment, and the following data were collected: total time required to complete the tasks, number of effective obstacle avoidances, number of collisions, battery consumption, and task allocation efficiency. Ten experiments were carried out and the result of each experiment was recorded; the collated experimental data are shown in the following table.
Table 1. Comparative data for the method of the invention and the conventional method

| Metric | Method of the invention | Conventional method | Improvement |
| --- | --- | --- | --- |
| Task completion time | 80 minutes | 100 minutes | 20% |
| Effective obstacle avoidance count | 115 times | 100 times | 15% |
| Number of collisions | 7 times | 10 times | 30% |
| Battery consumption | 80% | 90% | 10% |
| Task allocation efficiency | 90% | 65% | 25% |
From the data analysis of the 10 experiments on the multi-bionic robot dog cross-floor three-dimensional inspection method, the following conclusions can be drawn: the average task completion time is reduced by 20%, because the method is optimized in aspects such as stair climbing and task switching, making the robot dogs more efficient when completing inspection tasks; the number of effective obstacle avoidances increases by 15%, because multiple obstacle avoidance strategies are integrated; the number of collisions decreases by 30%, because obstacle detection and obstacle avoidance decisions are more accurate, reducing the risk of the robot dogs colliding with obstacles during inspection; and battery consumption decreases by 10%, because the optimized path planning and obstacle avoidance strategies lower the energy consumption of the robot dogs during inspection.
From the above comparative analysis, it was concluded that: the multi-bionic robot dog cross-floor three-dimensional inspection method has remarkable advantages in the aspects of task completion time, obstacle avoidance performance, safety performance, energy consumption and the like, and is more efficient, reliable, safe and energy-saving compared with the traditional method.
As shown in fig. 3, during testing of the multi-bionic robot dog cross-floor three-dimensional inspection method, when the threshold value is too large, missed identification occurs: an obstacle is identified but not judged, which can adversely lead to accidents; when the threshold value is too small, false identification occurs: an obstacle is judged although none has been identified, which affects cross-floor three-dimensional inspection. According to the experimental results, the thresholds were finally determined as follows: when the distance a is more than 3 meters, it is judged as level a3, and the robot dog keeps running at the current speed while adjusting its running direction in real time according to the navigation and obstacle avoidance algorithm; when the distance a is more than or equal to 1 meter and less than or equal to 3 meters, it is judged as level a2: if the obstacle is stationary or moving opposite to the robot dog's direction of travel, the robot dog continues to advance at reduced speed, and if the obstacle is moving in the robot dog's direction of travel, the robot dog slows down and searches for other feasible paths; when the distance a is less than 1 meter, it is judged as level a1: there is a risk of collision, so the robot dog immediately reduces its speed to zero, searches for a feasible path according to the obstacle avoidance algorithm during deceleration and tries to bypass the obstacle, and starts again once the obstruction is cleared or the distance is greater than or equal to 1 meter.
The method is implemented in a certain power plant. Under the complex working conditions of the power plant, inspection tasks are executed by multiple robots at the same time so that production equipment and instruments can run safely. In order to enable the robots to meet the automatic inspection requirements of different areas of the power plant, distributed path planning for multiple robots is adopted: the robots walk and stop independently according to preset routes and stop positions, identify obstacles, and automatically navigate the industrial stairs between the 6.5-meter and 0-meter levels; the parallel map construction and positioning technique (SLAM) based on particle filtering and the Cartographer algorithm are adopted, and surrounding environment information is acquired through the laser radar, other sensors and similar devices to realize map construction. After a robot successfully acquires the inspection map, it performs path planning on the known map information to realize automatic navigation and obstacle avoidance and complete the inspection task.
During the testing of the four-legged robots, multi-robot scheduling is utilized to cooperatively complete the inspection of the covered gas turbine plant buildings, realizing environment detection within the plant area, noise detection, automatic identification and reading of meters, infrared temperature measurement of equipment, identification of running and leakage images, and the like, achieving full-coverage detection of the various facilities, equipment and environmental information in the workshop. On the basis of these functions, unattended operation is finally realized. Traditional manual inspection requires dedicated personnel on duty, is time-consuming, laborious and inefficient, has a relatively low level of standardization, and easily leads to occupational diseases. Through 24-hour inspection by the inspection robots, the time staff spend entering and leaving the workshop area for inspection is reduced, operation management efficiency is effectively improved, parameter abnormalities and fault conditions of equipment are discovered in time, the occurrence of disasters and accidents is greatly reduced, manual operation and maintenance costs are lowered, and the latent risks caused by insufficient personnel allocation are reduced.
It should be noted that the above embodiments are only for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that the technical solution of the present invention may be modified or substituted without departing from the spirit and scope of the technical solution of the present invention, which is intended to be covered in the scope of the claims of the present invention.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310516688.8A CN116804552A (en) | 2023-05-09 | 2023-05-09 | Multi-bionic robot dog cross-floor three-dimensional inspection method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310516688.8A CN116804552A (en) | 2023-05-09 | 2023-05-09 | Multi-bionic robot dog cross-floor three-dimensional inspection method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116804552A true CN116804552A (en) | 2023-09-26 |
Family
ID=88079138
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310516688.8A Withdrawn CN116804552A (en) | 2023-05-09 | 2023-05-09 | Multi-bionic robot dog cross-floor three-dimensional inspection method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116804552A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN119541071A (en) * | 2025-01-21 | 2025-02-28 | 霖久智慧(广东)科技有限公司 | Intelligent inspection method, device, equipment and medium for robot dog based on power management |
CN119541071B (en) * | 2025-01-21 | 2025-04-04 | 霖久智慧(广东)科技有限公司 | Intelligent inspection method, device, equipment and medium for machine dog based on electric quantity management |
CN120095840A (en) * | 2025-05-12 | 2025-06-06 | 浙江大有实业有限公司杭州科技发展分公司 | A kind of intelligent inspection control method and system of power distribution room based on robot collaboration |
CN120095840B (en) * | 2025-05-12 | 2025-07-18 | 浙江大有实业有限公司杭州科技发展分公司 | A kind of intelligent inspection control method and system of power distribution room based on robot collaboration |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111897332B (en) | Semantic intelligent substation robot humanoid inspection operation method and system | |
WO2020192000A1 (en) | Livestock and poultry information perception robot based on autonomous navigation, and map building method | |
CN115200588B (en) | SLAM autonomous navigation method and device for mobile robot | |
CN116804552A (en) | Multi-bionic robot dog cross-floor three-dimensional inspection method and system | |
US11880943B2 (en) | Photogrammetry of building using machine learning based inference | |
KR20210063791A (en) | System for mapless navigation based on dqn and slam considering characteristic of obstacle and processing method thereof | |
KR102792068B1 (en) | Mobile robot for avoiding non-driving area and method for avoiding non-driving area of mobile robot | |
CN117742337A (en) | Inspection route control method and device | |
Choi et al. | Improved CNN-based path planning for stairs climbing in autonomous UAV with LiDAR sensor | |
CN117739990A (en) | Navigation method and device for inspection robot | |
CN118707956A (en) | A robot monitoring device capable of autonomous movement | |
CN119658721A (en) | Robot rescue emergency control system based on voice interaction | |
CN112987720A (en) | Multi-scale map construction method and construction device for mobile robot | |
CN119472732A (en) | A tunnel construction inspection method and system based on multi-UAV collaborative operation | |
CN118533191A (en) | Displacement optimization method of mobile perception robot based on feedback regulation | |
CN117733854A (en) | Operation and maintenance management method and device for composite robot | |
CN116360456A (en) | Method for dynamically adjusting route of electronic dog for inspection | |
Visser et al. | Amsterdam Oxford Joint Rescue Forces-Team Description Paper-Virtual Robot competition-Rescue Simulation League-RoboCup 2008 | |
Liu et al. | A robot obstacle avoidance approach with lidar and rgb camera data combined | |
Shang | Survey of mobile robot vision self-localization | |
CN113064425A (en) | AGV equipment and navigation control method thereof | |
Siemiątkowska et al. | Mobile robot navigation with the use of semantic map constructed from 3D laser range scans | |
KR102647135B1 (en) | Real-time crack detection system for construction site using artificial intelligence-based object detection algorithm and operation method therefor | |
CN119860779B (en) | Intelligent inspection method, control device, system and storage medium for wheeled robot | |
CN119472641A (en) | Control method of an operational intelligent legged robot for power inspection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | Application publication date: 20230926 |
WW01 | Invention patent application withdrawn after publication |